Why is she claiming that the bill is about liability?
No competent engineer would use NFTs for that purpose. It’s inconvenient, slow, and ridiculously expensive. No one uses the “technology” because it’s rubbish.
Implementing such a feature is trivial. Steam has a marketplace. They don’t let you sell used games because the developers don’t want it.
I can relate to the sentiment, but that just makes it worse. How do you enforce ownership of data?
There’s only 1 thing for it: More internet surveillance.
It’s not.
It’s very tamperable. It lacks common safety features like 2FA. Hacks are common and stolen NFTs cannot be recovered.
It doesn’t provide any evidence of ownership, much less proof. Anyone can mint an NFT of anything, without providing any evidence whatsoever. There is no legal requirement that ownership of anything is transferred along with an NFT.
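To make that concrete, here’s a minimal Python sketch of what “minting” amounts to: the ledger records who called mint and which URI they pointed at, and nothing else. The class and names are hypothetical, not any real contract’s API; the point is what’s absent, namely any check that the minter owns or created the referenced work.

```python
# Hypothetical toy sketch of an NFT-style ledger, not a real contract.
# Note what is missing: no identity check, no proof that the minter
# owns or created the work the token points at.
class ToyNFTLedger:
    def __init__(self):
        self.next_id = 0
        self.owner_of = {}    # token_id -> address of current holder
        self.token_uri = {}   # token_id -> URI of the referenced work

    def mint(self, caller: str, uri: str) -> int:
        """Anyone may call this with any URI whatsoever."""
        token_id = self.next_id
        self.next_id += 1
        self.owner_of[token_id] = caller
        self.token_uri[token_id] = uri
        return token_id

# Someone with no rights to an image can still mint it:
ledger = ToyNFTLedger()
tid = ledger.mint("0xRandomStranger", "https://example.com/not-my-art.png")
print(ledger.owner_of[tid])  # 0xRandomStranger
```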
But it’s also possible to do things like build a mass facial recognition database with image data.
Facebook built one years ago, but ended up destroying it. https://www.theverge.com/2021/11/2/22759613/meta-facebook-face-recognition-automatic-tagging-feature-shutdown
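To illustrate how low the bar is: a sketch like the following, using the open-source face_recognition library, already turns a pile of labeled photos into a searchable face index. The file names are placeholders; this is a toy, not how Facebook did it.

```python
# Hedged sketch: a tiny face-lookup index from labeled images.
# Uses the open-source face_recognition library (pip install face_recognition).
# File names are placeholders.
import face_recognition

labeled_photos = {"alice": "alice.jpg", "bob": "bob.jpg"}

index = {}  # name -> 128-dimensional face encoding
for name, path in labeled_photos.items():
    image = face_recognition.load_image_file(path)
    encodings = face_recognition.face_encodings(image)
    if encodings:  # skip photos where no face was detected
        index[name] = encodings[0]

# Query: whose face appears in an unknown photo?
unknown = face_recognition.load_image_file("unknown.jpg")
for probe in face_recognition.face_encodings(unknown):
    matches = face_recognition.compare_faces(list(index.values()), probe)
    for name, hit in zip(index.keys(), matches):
        if hit:
            print("Possible match:", name)
```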
What about it is wrong?
Artists are allowed to do the exact same thing. That’s probably not a helpful answer, but it’s the correct answer to your question. You’re making some wrong assumptions about the law, and probably about the economics, as well. Writing a proper explanation would take me quite a while and I’m not sure if it would be appreciated.
There are some companies, e.g. Adobe and Shutterstock, that offer “commercially safe” image generators trained on licensed images. Artists who would like to make money by licensing images for AI training can deal with them.
How does that work? By definition, a rich person owns a lot of property. Therefore, laws that give more power to property owners favor the rich. Copyright is a type of property.
@[email protected] How are you feeling about yourself?
Property rights are the only thing that protects the poor from the rich. Sure.
The winners of a system don’t have an incentive to undermine the rules. Quite the opposite. The NYT wants these rules because it would benefit from them. There are at least 2 image generators that adhere to capitalist ethics. I don’t know what Claro uses, but I see no indication that they are being uppity.
The background is that French law requires ISPs to retain the IPs of their customers for some time. That way, an IP address can be associated with a customer.
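Conceptually, the lookup this enables on the ISP’s side is nothing fancier than the query below. The schema and names are made up for illustration.

```python
# Hypothetical sketch of the ISP-side lookup that IP retention enables:
# given an IP address and a timestamp, find the customer who held it.
from datetime import datetime

# Retention log entries: (ip, lease_start, lease_end, customer_id)
retention_log = [
    ("203.0.113.7", datetime(2024, 5, 1, 8, 0), datetime(2024, 5, 1, 20, 0), "customer-1138"),
    ("203.0.113.7", datetime(2024, 5, 1, 20, 0), datetime(2024, 5, 2, 8, 0), "customer-2187"),
]

def who_had_ip(ip: str, when: datetime):
    for logged_ip, start, end, customer in retention_log:
        if logged_ip == ip and start <= when < end:
            return customer
    return None

print(who_had_ip("203.0.113.7", datetime(2024, 5, 1, 21, 30)))  # customer-2187
```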
If I download music in a Starbucks, can they fine the Starbucks CEO then?
A CEO is an employee. You generally can’t sue employees for this sort of thing. It may be possible to sue the company as a whole for enabling the copyright infringement, but that’s not to do with this case. Perhaps in the future, operators of WiFi hotspots will be required to use something like YouTube’s Content ID system.
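For illustration only: a system “like Content ID” boils down to checking fingerprints of transmitted media against a database registered by rights holders. The real systems use perceptual fingerprints that survive re-encoding; the exact-hash version below is a deliberately simplified, hypothetical stand-in.

```python
# Deliberately simplified, hypothetical sketch of fingerprint matching.
# Real systems use robust perceptual fingerprints; a cryptographic hash
# only catches byte-exact copies.
import hashlib

def fingerprint(chunk: bytes) -> str:
    return hashlib.sha256(chunk).hexdigest()

# Fingerprints registered by rights holders (placeholder data)
registered = {fingerprint(b"<bytes of a copyrighted track>"): "Some Label"}

def check_transfer(data: bytes, chunk_size: int = 4096):
    """Return rights holders whose registered chunks appear in the data."""
    hits = []
    for i in range(0, len(data), chunk_size):
        fp = fingerprint(data[i:i + chunk_size])
        if fp in registered:
            hits.append(registered[fp])
    return hits
```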
Anyway, I hope online artists and authors are able to use this to sue AI companies for stealing their copyrighted works.
They can use this to go after “pirates”. It’s got nothing to do with AI.
One of the top tier models would probably do well on a standardized test like that. You don’t get them for free, though.
You can try some different chat models free at DDG. https://duckduckgo.com/?q=DuckDuckGo&ia=chat
It was a reference to the thread next door that revealed, horror of horrors, that photos of children were part of the training data. Sure, you never know who is behind these hit pieces, but there doesn’t really need to be anyone behind it.
Oh no. That’s unethical!
/s
I doubt that. Having a very proprietary attitude towards one’s images and making good images are not related at all.
Besides, good training data is to a large extent about the labels.
I’m sure it works fine in the lab. But it really only targets one specific AI model: that one specific Stable Diffusion VAE. I know that there are variants of that VAE around, which may or may not be enough to make it moot. The “Glaze” on an image may not survive common transformations, such as rescaling the image. It certainly will not survive intentional efforts to remove it, such as appropriate smoothing.
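For example, transformations as mundane as the Pillow snippet below, a downscale/upscale round trip plus a light blur, rewrite exactly the pixel-level detail such a perturbation lives in. The file names are placeholders, and whether this particular recipe defeats Glaze in practice is an empirical question, not a claim.

```python
# Sketch of common transformations a perturbation like Glaze's must survive.
# Uses Pillow (pip install Pillow). File names are placeholders.
from PIL import Image, ImageFilter

img = Image.open("glazed_artwork.png")

# Rescale down and back up: resampling rewrites fine pixel detail.
w, h = img.size
img = img.resize((w // 2, h // 2), Image.LANCZOS).resize((w, h), Image.LANCZOS)

# Light Gaussian smoothing: averages away high-frequency perturbations.
img = img.filter(ImageFilter.GaussianBlur(radius=1))

img.save("cleaned_artwork.png")
```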
In my opinion, there is no point in bothering in the first place. There are literally billions of images on the net. One locks up gems because they are rare. This is like locking up pebbles on the beach. It doesn’t matter if the lock is bad.
I saw a post on Bluesky from someone in tech saying that eventually, if it’s human-viewable, it’ll also be computer-viewable, and there’s simply no working around that. I wonder if you agree with that or not.
Sort of. The VAE, the compression, means that image generation takes less compute; i.e. cheaper hardware and less energy. You can have an image generator that works on the raw pixels, visible to humans. Actually, that’s simpler and existed earlier.
By Moore’s law, it would be many years, even decades, before that efficiency gain is something we can do without. But I think, maybe, this becomes moot once special accelerator chips for neural nets are designed.
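To put a rough number on that efficiency gain: Stable Diffusion’s VAE maps a 512x512 RGB image to a 64x64 latent with 4 channels, so the diffusion model churns through about 48 times fewer values per step.

```python
# Back-of-the-envelope: why running diffusion in VAE latent space is cheaper.
# Figures are Stable Diffusion's: 8x spatial downsampling, 4 latent channels.
pixel_values = 512 * 512 * 3                 # RGB image: 786,432 values
latent_values = (512 // 8) * (512 // 8) * 4  # 64x64x4 latent: 16,384 values

print(pixel_values / latent_values)  # 48.0 -> ~48x fewer values per step
```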
What makes it obsolete is the proliferation of open models. E.g. today, Stable Diffusion 3 becomes available for download. This attack targets one specific model and may work on variants of it. But as more and more rather different models become available, the whole thing becomes increasingly pointless. Maybe you could target more than one, but it would be more and more effort for less and less effect.
You are apparently mistaking me for someone else.
Animals never could own property. PETA sued to get the monkey recognized as author and thus copyright-holder of the selfie. Or, more likely, to generate publicity as that was obviously never going to happen.
The way it looks, Adobe has to do this to comply with EU law.