
yea…unfortunately this is not happening. Back in the days when AI didn't just mean LLMs or generative algorithms, people tried to predict crime with algorithms. It has been shown that these systems exhibit the same biases humans do, because they learned from humans. One of many examples: https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing but I think I read stories like this happening well before 2010.
But wait! There is more! This isn't something new at all. Before police used algorithms, even before computers existed, police forces trained dogs to help them in their missions (defending people, detecting drugs…). Guess what? The dogs also learned the biases of their trainers. https://daily.jstor.org/the-police-dog-as-weapon-of-racial-terror/ and https://www.npr.org/sections/thetwo-way/2011/01/07/132738250/report-drug-sniffing-dogs-are-wrong-more-often-than-right are examples for both categories.
So yeah. Unbiased AI, or dogs, or unicorns won't happen for as long as we humans training them are biased.
There are multiple ways to do this. The easiest solution would be to use a second disk of the same size or bigger and clone the current one onto it, using dd or Clonezilla.
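A minimal sketch of the whole-disk clone with dd. The device names /dev/sdX (current disk) and /dev/sdY (new disk) are placeholders, not your actual devices; identify them with lsblk first, because dd overwrites the target without asking:

```shell
# Identify the source and target disks first -- device names below are placeholders.
lsblk

# Raw byte-for-byte clone of the whole disk (partition table included).
# WARNING: dd silently destroys everything on the target; triple-check of=.
sudo dd if=/dev/sdX of=/dev/sdY bs=4M status=progress conv=fsync

# Flush write buffers before unplugging or rebooting.
sync
```

Since dd copies the partition table too, the new disk boots like the old one; if it is bigger, you can grow the last partition afterwards with a tool like parted or growpart.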
If you have enough free space on your disk for a second root partition, you can use dd to clone the existing root partition into the new one, edit /etc/fstab on the new partition so it points to the correct root partition, and, if you use GRUB, run grub-mkconfig to regenerate its configuration. It should automatically detect the new partition.
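The steps above can be sketched as follows. The partition names /dev/sda2 (current root) and /dev/sda3 (new, empty partition of at least the same size) are assumptions for illustration, as is an ext4 filesystem; adjust to your own layout:

```shell
# Clone the existing root partition into the new one (placeholder devices).
sudo dd if=/dev/sda2 of=/dev/sda3 bs=4M status=progress

# The copy shares the original's UUID; check it and give it a new one
# so the two partitions can be told apart (ext2/3/4 only).
sudo e2fsck -f /dev/sda3
sudo tune2fs -U random /dev/sda3

# Point the copy's fstab at the new partition (or at its new UUID).
sudo mount /dev/sda3 /mnt
sudo sed -i 's|/dev/sda2|/dev/sda3|' /mnt/etc/fstab
sudo umount /mnt

# Regenerate GRUB's menu; it should pick up the second root partition.
sudo grub-mkconfig -o /boot/grub/grub.cfg
```

If your fstab references the filesystem by UUID rather than device path, substitute the new UUID (shown by blkid) in the sed expression instead.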