• Hello Hotel@lemmy.world · 1 year ago

    Good in theory. The problem is that if your bot is given too much exposure to a specific piece of media, and the “creativity” value that adds random noise (and for some setups forces it to improvise) is set too low, you get whatever impression the content made on the AI, like an imperfect photocopy (a non-expert’s explanation of “memorization”). Too high and you get random noise.
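
    A rough sketch of that trade-off: in most generators the “creativity” knob is a sampling temperature applied to the model’s output probabilities. The toy Python example below uses made-up logits (not any real model’s numbers) and assumes NumPy; it just shows how the same distribution collapses onto one “memorized” token at low temperature and dissolves into near-uniform noise at high temperature.

    ```python
    import numpy as np

    def sample_with_temperature(logits, temperature, rng):
        """Draw one token index from a temperature-scaled softmax."""
        scaled = np.asarray(logits, dtype=float) / temperature
        probs = np.exp(scaled - scaled.max())  # shift for numerical stability
        probs /= probs.sum()
        return rng.choice(len(probs), p=probs)

    # Hypothetical logits where training over-exposed the model to token 0
    # (the "memorized" continuation); the other tokens are alternatives.
    logits = [5.0, 1.0, 0.5, 0.1]
    rng = np.random.default_rng(seed=0)

    for t in (0.1, 1.0, 10.0):
        draws = [sample_with_temperature(logits, t, rng) for _ in range(1000)]
        print(f"temperature={t:>4}: memorized token picked "
              f"{draws.count(0) / len(draws):.0%} of the time")
    # Low temperature  -> nearly always the memorized token (imperfect photocopy).
    # High temperature -> nearly uniform picks (random noise).
    ```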

    • Armok: God of Blood@lemmy.dbzer0.com · 1 year ago

      if your bot is given too much exposure to a specific piece of media, and the “creativity” value that adds random noise (and for some setups forces it to improvise) is set too low, you get whatever impression the content made on the AI, like an imperfect photocopy

      Then it’s a cheap copy, not a noticeably derivative work, and whoever is hosting the trained bot should probably take it down.

      Too high and you get random noise.

      Then the bot is trash. Legal and non-infringing, but trash.

      There is a happy medium where SD, MJ, and many other text-to-image generators currently exist. You can prompt in such a way (or exploit other vulnerabilities) as to create “imperfect photocopies,” but you can also create cheap, infringing works with any number of digital and physical tools.