With AI-powered deepfakes proliferating at galactic speeds, nobody (especially women) will have a shred of privacy left in a few years. It does not matter whether the photo is fake or not; nobody should be able to see "you" naked unless you allow it. But with the rise of tools that can run on consumer-level hardware, that seems like a losing battle. After all, how can we police what a person can or cannot run on their personal computer? That is a can of worms better left unopened, since the idea of some agency monitoring what you do on your PC is its own dystopia. Soon, you can never be sure whether your neighbor or coworker has deepfaked you, so that every time he looks at you he sees you as a sexual object. That is a highly uncomfortable thought, for sure.

Since we cannot possibly stop it, what is the best option moving forward? Normalizing it? Marginalizing it, since it is fake after all? Ignoring it? None of the options seem very good.

This goes way beyond the current framework of "revenge porn." With revenge porn, the case is simple: unlawful distribution without consent. But what about unlawful generation for personal use without consent? I cannot think of legal grounds that could make this a criminal offense; by that logic we would soon have to ban even drawing lewd doodles with a pencil at home.

  • BrikoX@lemmy.zip · 1 year ago

    All these models do is lower the bar for entry. Fake nudes were a thing long before generative AI; now, instead of needing photo or video editing skills, you can just ask a model to do it for you.

    And the majority of people don't care about privacy until it affects them negatively in person. They put their whole lives online for people to see and for that data to be scraped. There is no stopping these models now; the time for that was 10+ years ago, but everyone dismissed the people sounding the alarm as privacy nuts or conspiracy theorists…