That’s not how any of this works. Copyright is a legal concept, not a technological one. You can’t strip the copyright off something by deleting part of it; the result is still a derivative work.
That’s not what the paper is about at all; seems like this is just shit journalism again.
All the paper says about copyright is that this method is more secure because AI can sometimes spit out training examples.
Why… why is it more secure? Does that mean AI training is actively abusing copyright law, and this is “more secure” because they can hide it better?
No, you have it the other way around. It means copyright owners can share “corrupted” versions of their works and the AI can still train on them. Possible AI leaks won’t reproduce the original work, since the original was never in the training data.
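To make the idea concrete, here is a minimal sketch of that general mechanism. The `corrupt` function and the uniform-noise scheme are illustrative assumptions for this comment, not the paper’s actual method: the point is only that the owner shares a perturbed copy, so the exact original values never enter the training set.

```python
import random

def corrupt(pixels, noise_scale=0.3, seed=0):
    """Return a 'corrupted' copy of an image (a flat list of pixel
    values in [0, 1]) by adding bounded random noise and clamping.

    Only this corrupted copy is shared for training; the original
    pixel values stay with the copyright owner.
    """
    rng = random.Random(seed)  # fixed seed just to keep the sketch reproducible
    return [
        min(1.0, max(0.0, p + rng.uniform(-noise_scale, noise_scale)))
        for p in pixels
    ]

original = [0.1, 0.5, 0.9, 0.2]
shared = corrupt(original)

# A model trained only on `shared` cannot leak `original` verbatim,
# because the exact original values were never in its training data.
assert shared != original
```

This is of course only the corruption step; the interesting part of such a paper would be how a model can still learn useful structure from the corrupted copies, which this sketch does not attempt to show.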
Of course, copyright is only one of the reasons artists might not share their works, but that’s not the point the paper is trying to make; they’re just pointing out one way the method could be useful.