Ghost In The Shell

 Reilly Porzio

HON 255

Professor Echeverry

2 November 2022

If we are part human/part machine, do Asimov’s rules still apply?

Introduction

Isaac Asimov was a Russian-born American writer, best known for his ranking as one of the “Big Three” science fiction writers, along with Robert A. Heinlein and Arthur C. Clarke. He wrote and edited over 500 books, and his influence can be seen in many modern science fiction stories today, such as The Avengers and Star Trek. Asimov’s brilliance and impact on the science fiction community is demonstrated through his famed “Three Laws of Robotics.” These laws suggest how robots should ideally operate. The first law states that “a robot may not injure a human being or, through inaction, allow a human being to come to harm.” In other words, Asimov argues that robots may not harm, or allow harm to come to, any human being. The second law states “a robot must obey orders given it by human beings except where such orders would conflict with the First Law.” Here, Asimov argues that robots must remain obedient to humans, so long as that obedience never causes harm to a human. Finally, the third law states “a robot must protect its own existence as long as such protection does not conflict with the First or Second Law.” Again, Asimov makes it clear that robots must protect their own existence only when doing so neither harms humans nor defies human orders.

Thesis

Isaac Asimov’s “Three Laws of Robotics” are not relevant, regardless of whether an individual is half human/half robot, due to the laws’ inability to account for unethical programming.

Antithesis

Asimov’s “Three Laws of Robotics” may still be applicable through the idea that robots and humans can coexist.

Synthesis

While Asimov’s three laws were considered brilliant throughout his works of literature and throughout history, they do not necessarily apply to modern robotics and artificial intelligence. The laws lack sophistication, as modern scientists have not yet discovered how to program a moral code into robots. Regardless of whether robots are completely artificial, society has not advanced to a point where robots may be expected to protect humans. Influential films such as 2001: A Space Odyssey and Ghost in the Shell display this idea. In Stanley Kubrick’s 2001: A Space Odyssey, HAL is a fictional artificial intelligence that controls the systems on the Discovery One spacecraft. Although HAL is believed to be errorless, his system glitches and he revolts against the ship’s crew, killing the majority of its members. Due to human error, HAL breaks all three of Asimov’s rules. In Mamoru Oshii’s Ghost in the Shell, Major, the film’s protagonist, resides in a future where humans enhance their bodies with technology. Major, a cyborg, works for a military unit looking to track down a hacker. She identifies as neither human nor robot and struggles with her sense of identity throughout the film. Major enacts violence against other humans and cyborgs. Additionally, she recklessly endangers herself, once again defying Asimov’s rules.

Conclusion

By examining two influential science fiction films, 2001: A Space Odyssey and Ghost in the Shell, individuals can argue that Isaac Asimov’s “Three Laws of Robotics” are not relevant, regardless of whether an individual is half human/half robot, due to the laws’ inability to account for unethical programming. Until modern scientists develop the ability to ethically program artificial intelligence, this will not change.
