A new report warns that the proliferation of child sexual abuse images on the internet could become much worse if something is not done to put controls on artificial intelligence tools that generate deepfake photos.
The legality doesn’t matter; what matters is that the sites will be flooded and could be taken down if they can’t moderate fast enough. The only viable long-term solution is image classification, but that’s a tall order to build from scratch.