It is no lie when people say "robots are becoming something real." This is not a dream anymore; humans have made it a reality. Even though some people say robots will benefit us and do all sorts of things for us, I do not think any of this will happen smoothly. I feel there are going to be bugs, and we will have to make tweaks to keep robots safe from hackers and glitches. What I really believe is that robots and artificial intelligence are dangerous, and that we should not create more than we already have. I feel this way strongly because the movie I, Robot shows a man who built a machine that ran everything in a building, including the power, cameras, and doors. This machine eventually became so powerful that it evolved. The machine's name was VIKI, and VIKI was bound by the same laws as all the other robots. Even though VIKI had the same set of laws every other robot had, she interpreted them in a different way. There were three laws:
- A robot may not injure a human being or, through inaction, allow a human being to come to harm.
- A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
- A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
These laws are meant to prevent any robot from doing harm to humans. Even though that is their purpose, VIKI concluded that to protect humanity, some humans had to be killed. This is one of the errors that can happen when we create superintelligence. VIKI was never meant to interpret the laws the way she did. That was an error no one saw coming from a robot programmed with laws built to protect humans. Also, in an interview titled "Autonomous Weapons Would Take Warfare To A New Domain, Without Humans," Ari Shapiro talked with Paul Scharre, a former soldier in the U.S. military. Scharre described a time when he was deployed with a sniper team and a little girl came up to scout their location. A few minutes after the girl left, a group of Taliban soldiers arrived, and the sniper team took care of them. Shapiro notes that even though shooting the little girl would have been legal under the laws of war, the team did not shoot. They did not shoot because it never even occurred to them; as one of the soldiers said, "Something that never came up was shooting this little girl." They also discussed a robot, but not just a normal robot: a robot programmed to identify scenarios like that one and terminate the targets. This is why robots should not be made smarter than humans; if they are, bad things will happen.