Opinionated article by Alexander Hanff, a computer scientist and privacy technologist who helped develop Europe’s GDPR (General Data Protection Regulation) and ePrivacy rules.
We cannot allow Big Tech to continue to ignore our fundamental human rights. Had such an approach been taken 25 years ago in relation to privacy and data protection, arguably we would not have the situation we have today, where some platforms routinely ignore their legal obligations to the detriment of society.
Legislators did not understand the impact of weak laws or weak enforcement 25 years ago, but we have enough hindsight now to ensure we don’t make the same mistakes moving forward. The time to regulate unlawful AI training is now, and we must learn from mistakes past to ensure that we provide effective deterrents and consequences to such ubiquitous law breaking in the future.
Sure. Copyright is broken. And it certainly doesn’t help that I’m paying Spotify etc. just so they can pocket the money. But don’t we need something so Hollywood can produce my favorite TV show? I mean, that stuff costs millions and millions to make before it somehow arrives on my screen. Or so an author can make a decent living coming up with a nice fantasy novel series? What’s the alternative until we arrive at Star Trek and money is a thing of the past?
I’m pretty sure the AI companies are stealing copyrighted work. AFAIK Meta admitted doing it. For several older models we know which books were in the training datasets. There are several ongoing lawsuits dealing with books being used to train AI, Scarlett Johansson’s voice, etc.
I agree. As it stands, AI is a plaything for rich companies. They have complete control, since they hired the experts and they have the money for all the graphics cards and electricity. If it’s as disruptive as people claim, that’s bad news for us, because we’re out of the loop.