With AI-powered deepfakes proliferating at galactic speed, nobody (especially women) will have a shred of privacy left in a few years. It does not matter whether the photo is fake or not; nobody should be able to see “you” naked unless you allow it. But with the rise of tools that can run on consumer-level hardware, that seems like a losing battle. How can we police what a person can or cannot run on their personal computer? That is another can of worms better left unopened, since the idea of some agency being able to monitor what you do on your PC is its own dystopia. Soon you can never be sure that your neighbor or coworker hasn't deepfaked you, and that every time he looks at you he isn't seeing you as a sexual object. That is a highly uncomfortable thought, for sure.

Since we cannot possibly stop it, what is the best option moving forward? Normalizing it? Marginalizing it, since it is fake after all? Ignoring it? None of these options seems very good.

This goes way beyond the current framework of “revenge porn”. With revenge porn the case is simple: unlawful distribution without consent. But what about unlawful generation for personal use, without consent? I cannot think of legal grounds that could make this a criminal offense, since we would soon have to ban even drawing lewd doodles with a pencil at home.

  • smoothbrain coldtakes@lemmy.ca · 1 year ago

    Deepfakes for porn are not the problem.

    Deepfakes of media, and the propaganda built on them, are the real problem.

    Does it suck that a person can have their clothed photos turned into porn? Sure, but it’s a far smaller-scale problem than the mass creation of propaganda being done with LLMs. In comparison, deepfakes for nudes are practically a non-issue.