• 0 Posts
  • 13 Comments
Joined 2 years ago
Cake day: July 22nd, 2023

  • For me, at least, it’s that the perspective is off. When someone is learning to draw or is just a shitty artist and the perspective isn’t very good, you can immediately identify it. It looks like someone drew “front” eyes on the side of the head, or something like that. When AI makes an image, the perspective is off too, just in a different way. The eye doesn’t look like a “front” eye or a “side” eye or a “top” eye. It looks like all of them and none of them. It makes the entire thing unsettling.

  • I completely disagree. It absolutely is the AI doing this. The point the article is trying to make is that the data used to train the AI is full of exclusionary hiring practices. The AI learns this and carries it forward.

    Using your metaphor, it would be like training an AI on hundreds of Excel spreadsheets that were sorted by race. The AI learns this and starts doing it too (the toy sketch below illustrates the effect).

    This touches on one of the huge ethical questions with regulating AI. If you are discriminated against in a job hunt by an AI, whose fault is that? The AI is just doing what it’s taught. The company is just doing what the AI said. The AI developers are just giving it previous hiring data. If the previous hiring data is racist or sexist or whatever, you can’t retroactively correct that. This is exactly why we need to regulate AI itself, not just its deployment.
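
    A minimal sketch of what I mean, assuming synthetic data and scikit-learn (nothing here is from the article): train a classifier on biased historical hiring decisions and it happily reproduces the bias.

    ```python
    # Toy sketch only: hypothetical synthetic data, scikit-learn assumed.
    # The point: a model trained on biased historical hiring decisions
    # reproduces the bias, even though nobody "programmed" discrimination in.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 10_000

    skill = rng.normal(size=n)            # the thing that *should* matter
    group = rng.integers(0, 2, size=n)    # a protected attribute

    # Biased history: qualified candidates from group 1 were hired far less often.
    hired = (skill > 0.5) & ((group == 0) | (rng.random(n) < 0.3))

    model = LogisticRegression().fit(np.column_stack([skill, group]), hired)

    # Same skill, different group -> very different predicted chance of being hired.
    print(model.predict_proba([[1.0, 0], [1.0, 1]])[:, 1])
    ```

    And in practice just dropping the group column often isn’t enough, since correlated proxy features can leak the same signal back in.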