
After reading the article “The Hidden Biases in A.I.,” I found it very concerning that artificial intelligence can hold prejudices and stereotypes, just as many people do.  The people creating these programs are putting a reflection of their own outlook into the A.I.  Research has also shown that A.I. is far less accurate at recognizing women and people with darker skin tones.  Instead, it has been found to reach almost 100% accuracy with lighter-skinned males.  This has been attributed to the limited diversity of the teams that come up with these prototypes.  The A.I. has picked up the traits of its mostly white, male creators.

This is a very concerning issue because A.I. is being used to hire employees and to identify suspects for law enforcement.  The A.I. used for hiring has been shown to choose lighter-skinned males as the best fit for a job, since it is trained on data imported from old company systems that historically hired mostly men.  Police identification systems have also wrongfully identified many Black males, and so far none who were white.  This article helps us understand that A.I. does not provide equality or a system free from any type of preference.  It also showcases a problem that needs to be fixed.

What should we do to help resolve this issue?




Youth Voices is an open publishing and social networking platform for youth. The site is organized by teachers with support from the National Writing Project. Opinions expressed by writers are their own. All work on Youth Voices is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

