After reading the article "The Hidden Biases in A.I.," I found it very concerning that artificial intelligence can hold prejudices and stereotypes, just as many people do. The people creating these programs are embedding a reflection of their own outlook into the A.I. Research has also shown that artificial intelligence cannot accurately recognize women or people with darker skin tones; instead, it achieves nearly 100% accuracy with light-skinned males. This is a result of the limited diversity on the teams that build these prototypes: the A.I. has picked up the traits of its mostly white, male creators.
This is a very concerning issue because A.I. is being used to hire employees and to identify criminal suspects for law enforcement. Hiring A.I. has been shown to favor light-skinned males as the best fit for a job, since it is trained on data imported from old company systems that had mostly hired men. Police identification systems have wrongfully identified many Black men, and so far none who are white. This article helps us understand that A.I. does not provide equality or a system free from preference. It also highlights a problem that needs to be fixed.
What should we do to help resolve this issue?