Thanks ahead of time for your feedback

  • originalucifer@moist.catsweat.com

I think it's 'barrier to entry'.

Photoshop took skills that not everyone had, which kept the volume low.

These new generators require zero skill or technical ability, so anyone can do it.

    • Sanctus@lemmy.world

Scale, also: you can create nudes of everyone on Earth in a fraction of the time it would take with Photoshop. All for the lowly cost of electricity.

    • AbouBenAdhem@lemmy.world

      When Photoshop first appeared, image manipulations that would seem obvious and amateurish by today’s standards were considered very convincing—the level of skill needed to fool large numbers of people didn’t increase until people became more familiar with the technology. I suspect the same process will play out with AI images—in a few years people will be much more experienced at spotting them, and making a convincing fake will take as much effort as it now does in Photoshop.

    • HobbitFoot @thelemmy.club

      It would also take a lot more effort to get something even remotely believable. You would need to go through thousands of body and face photos to get a decent match and then put in some effort pairing the two photos together. A decent “nude” photo of a celebrity would probably take at least a day to make the first one.

    • Toes♀@ani.social

Have you tried to get consistent, goal-oriented results from these AI tools?

To reliably generate a person, you need to configure many components, fiddle with the prompts, and constantly tweak.

Doing this well is, in my eyes, a fair bit harder than learning how to use the magic wand in Photoshop.

      • Dojan@lemmy.world

        I mean, inpainting isn’t particularly hard to make use of. There are also tools specifically for the purpose of generating “deepfake” nudes. The barrier for entry is much, much lower.

    • Gigasser@lemmy.world

      Ehhhh, I like to think that eventually society will adapt to this. When everyone has nudes, nobody has nudes.

      • Ephera@lemmy.ml

        Unfortunately, I doubt it will be everyone. It will primarily be young women, because we hyper-sexualize those…

      • kent_eh@lemmy.ca

        You might think so, but I don’t hold as much hope.

Not with the rise of holier-than-thou moral crusaders who try to slut-shame anyone who shows any amount of skin.

        • Gigasser@lemmy.world

I like to be optimistic: eventually such crusaders will have these tools turned against them, and that will be that. Even they will begin doubting whether any nudes are real.

Still, I'm not so naive that I think it can't turn out any other way. They might just do that thing they do with abortions, following the line of reasoning that goes "the only acceptable abortion is my abortion", now changed to "the only fake nudes are my nudes".

  • EveryMuffinIsNowEncrypted@lemmy.blahaj.zone

    Honestly? It was kind of shitty back then and is just as shitty nowadays.

    I mean, I get why people do it. But in my honest opinion, it’s still a blatant violation of that person’s dignity, at least if it’s distributed.

    • Zorque@lemmy.world

      It’s not that now it’s bad… it’s that now it’s actually being addressed. Whereas before it was just something people would sweep under the rug as being distasteful, but not worthy of attention.

  • xmunk@sh.itjust.works

Because previously, if someone had the skills to get rich off making convincing fake nudes, we could arrest and punish them; people with similar skill sets would usually prefer more legitimate work.

Now some ass in his basement can crank them out, and it's a futile game of whack-a-mole to kill them dead.

    • ArbitraryValue@sh.itjust.works

      it’s a futile game of whack-a-mole

      It’s still going to be futile even with this law in place. Society is going to have to get used to the fact that photo-realistic images aren’t evidence of anything (especially since the technology will keep improving).

      • calabast@lemm.ee

It blows my mind when I think about where we might be headed with this tech. We've gotten SO used to the ability to communicate instantly with people far away in the technology age; how will we adapt when we have to go back 300 years and can only trust what someone tells us in person? Will we go back to local newspapers? Or can we not even trust that? Will we have public amphitheaters in busy parts of town, where people gather around to hear the news, and we can only trust those people who have a direct chain of acquaintance all the way back to the source of the information? That seems extreme, but I dunno.

I think most likely we won't implement extreme measures like that to ensure we're still getting genuine information. More likely we'll just slip into completely fabricated news from every source, no longer have any idea what's really going on, but be convinced this AI thing was overblown, and have no idea we're being controlled.

        • ArbitraryValue@sh.itjust.works

          I don’t think it will be quite that bad. Society worked before photography was invented and now we have cryptographic ways to make sure you’re really talking to the person you think you’re talking to.
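
That cryptographic point can be made concrete. As a rough sketch (not something from this thread, and with a hypothetical hard-coded secret that a real system would exchange via a proper key-agreement step), Python's standard-library `hmac` module lets a recipient check that a message really came from someone holding the shared key and wasn't altered in transit:

```python
import hashlib
import hmac

# Hypothetical shared secret; in practice this would be negotiated or
# provisioned securely, never hard-coded.
SECRET = b"shared-secret-key"

def sign(message: bytes) -> str:
    """Produce an authentication tag for the message."""
    return hmac.new(SECRET, message, hashlib.sha256).hexdigest()

def verify(message: bytes, tag: str) -> bool:
    """Check the tag in constant time to avoid timing leaks."""
    return hmac.compare_digest(sign(message), tag)

msg = b"It's really me"
tag = sign(msg)
print(verify(msg, tag))             # True: authentic, untampered message
print(verify(b"Forged text", tag))  # False: content doesn't match the tag
```

Public-key signatures generalize this to the case where sender and receiver share nothing in advance, which is closer to how "are you really who you claim to be" works on today's internet.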

          • calabast@lemm.ee

            Truuue, I hadn’t thought of that. Okay, at least it won’t be as bad as I feared.

            Now I just have to sell all these vintage printing presses I bought…

  • Tarquinn2049@lemmy.world

It's a bit of a blend: it has always been a big deal, and it is indeed still more of a big deal now because of how easy, accessible, and believable the AI can be. Even nowadays, Photoshop hits only one point of that triangle, and it was even less capable back in the day; it could hit half of one of those points at any given time.

    Basically, a nude generated by a good AI has to be proven false. Because it doesn’t always immediately seem as such at first. If you have seen obvious AI fakes, they are just that, obvious. There are many non-obvious ones that you might have seen and not known they were fake. That is, of course, assuming you have looked.

The other reason it can be more of a big deal now is that kids have been doing it to other kids. And since the results can be believable, the parents didn't know they were fake to start with, so it would blow up as if it were real before anyone found out it was AI. And anything involving that is gonna be a big deal.

      • Tarquinn2049@lemmy.world

        I mean, that was an issue in the first month or so. Though I could see if the automated tools people use for this specific purpose might not stay up to date. I haven’t specifically interacted with those. But proper AI tools have in-filling to correct mistakes like that, you can keep the rest of the image and just “reroll” a section of it until whatever you didn’t like about it is fixed. Super quick and easy.

  • Cyteseer@lemmy.world

It's always been a big deal; it just died down as Photoshop became normalized and people became accustomed to it as a tool.

  • shasta@lemm.ee

    If AI is so convincing, why would anyone care about nudes being controversial anymore? You can just assume it’s always fake. If everything is fake, why would anyone care?

  • Snot Flickerman@lemmy.blahaj.zone

    It was a big deal back then, too, but a lot harder to police, and a lot more obvious that they were fakes.

    Gillian Anderson fakes were real fuckin popular during the time the X-Files were on the air.

    EDIT: Searching for women from the time talking about the phenomenon in the 90’s is difficult because it mostly turns up… troves of fake nudes of these women. Of course.

      • Snot Flickerman@lemmy.blahaj.zone

I recall women heavily disliking it back then, but I also recall that people in general viewed the internet as just full of weirdos and creeps. The internet wasn't mainstream, by any stretch of the imagination, so I think it likely "got swept under the rug" because of a general feeling of "who cares what weirdos do online? We're real people, and we never use the internet because we have lives."

        Also, fewer lawyers understood the tech at the time, or how to figure out who was producing these images, and how to prosecute them. So I’d wager that part of going after them was held back by tech-unsavvy lawyers who were like “What’s happening where and how? Dowhatnow? Can you FAX it to me?”

        • Don_Dickle@lemmy.worldOP

Did she ever unleash her wrath like the article says? Maybe it's the nerd in me, but I never wanted to see her naked; I just want to see her in a Princess Leia-esque outfit. IYKYK

  • simple@lemm.ee

Because now teenagers can do it with very little effort, whereas before it at least required a lot of time and skill.

  • Ziggurat@sh.itjust.works

I have a similar opinion. People have been forging/editing photographs and movies for as long as the technique has existed.

Now any stupid kid can do it; the hard part with AI is actually not getting porn. Maybe it can teach everyone that fake photos are a thing, and make nudes worthless (what's the point of a nude anyway? Genitals look like… genitals).

  • atrielienz@lemmy.world

How do you prove it's not you in either case? Photoshop doesn't make a whole video of you fucking a sheep, but AI can, and is actively being used that way. With Photoshop it was a matter of getting ahold of the file and inspecting it. Even the best Photoshop jobs have some key tells: artifacting, layering, all kinds of shading and lighting, how big the file is, etc.

  • someguy3@lemmy.world

    AI is much better. Photoshop was always a little off with size, angle, lighting, etc. Very easy to spot fakes.

  • snooggums@midwest.social

In addition to the reduced skill barrier mentioned, the other side effect is the reduced time spent finding a matching photo and actually doing the work. Anyone can create it in their spare time, quickly and easily.

  • HubertManne@moist.catsweat.com

I sorta feel this way. Before this, people would make cutout mashups, or artistic types might depict something. I do get that it's getting so real that folks may think the people actually did the thing.

  • Melvin_Ferd@lemmy.world

    I got a few comments pointing this out. But media is hell bent on convincing people to hate AI tools and advancements. Why? I don’t know.

The tinfoil-hat take is that it can be an equalizer. Powerful people who own the media like to keep powerful tools to themselves, and want the regular folk to fear them and regulate ourselves out of using them.

Like, could you imagine if common folk rode dragons in GOT? Absolutely disgusting. People need to fear them, and only certain people can use them.

    Same idea. If you’re skeptical, go look up all the headlines about AI in the past year and compare them to right wing media’s headlines about immigration. They’re practically identical.

    “Think of the women and children.”

    “They’re TAKING OUR JOBS”

    “Lot of turds showing up on beaches lately”

    “What if they kill us”

    “THEY’RE STEALING OUR RESOURCES”