Isaac Asimov’s I, Robot comprises nine short stories that trace the evolution of robotics from a nursemaid robot that could not speak to an era in which the Machines rule the Earth. Although the Three Laws of Robotics guide Asimov’s robots to ensure the safety of humans and robots alike, these machines face resistance from humans and are banned from Earth. The short stories “Robbie,” “Reason,” and “Evidence” show how humans clearly target robots with prejudice. Fearing the loss of their power over machines, the humans blame Robbie, Cutie, and Stephen Byerley for posing a threat, and yet Asimov positions their suspicion of robots as unreasonable. As such, I will argue that I, Robot portrays prejudice against Robbie, Cutie, and Stephen Byerley to demonstrate, ironically, the use of technology as a scapegoat.
In the short story “Robbie,” Mrs. Grace Weston scapegoats technology when she relies on prejudices to claim that her daughter Gloria’s mute nursemaid robot Robbie is a threat. Mrs. Weston worries about Robbie’s influence over Gloria and about her family’s reputation in the community, as her neighbors are suspicious of the robot. Both concerns clearly involve power dynamics. As a result, Mrs. Weston wants to get rid of Robbie to stop her daughter’s dependence on the robotic caregiver. The bias is evident in her complaint: “I won’t have my daughter entrusted to a machine – and I don’t care how clever it is. It has no soul, and no one knows what it may be thinking. A child just isn’t made to be guarded by a thing of metal” (Asimov 9). Additionally, Mrs. Weston is bothered that children from the neighborhood are not allowed to walk close to their house in the evenings because “the villagers consider Robbie dangerous” (Asimov 11). Although Gloria deems Robbie a friend, Mrs. Weston calls it “a nasty old machine” (Asimov 14). Mrs. Weston alleges that “something might go wrong…some little jigger will come loose and the awful thing will go berserk,” but her husband refutes these arguments as “[n]onsense” (Asimov 9). Mr. Weston reminds his wife not only that the First Law makes it “impossible for a robot to harm a human being” but also that an expert from U.S. Robots visits their home twice a year to check Robbie and confirm that everything is working properly (Asimov 9-10). Technology clearly bears the blame in Mrs. Weston’s prejudiced view of robots, which proves ironic since, eventually, Robbie saves Gloria’s life.
Similarly, there is an ironic outcome in targeting the robot Cutie with prejudices in the short story “Reason.” The scapegoating of technology is evident in Michael Donovan’s string of abusive epithets for Cutie, such as “lunatic robot” and “walking junk yard” (Asimov 60). The story takes place on Solar Station #5, where Cutie is a robot curious about its own creation. Cutie faces hostility from humans because it refuses to believe in the existence of Earth, thereby challenging human authority on the station. For Cutie, its Master is the Energy Converter, not the humans. The story deals with the theme of religious tolerance as Cutie influences other robots to reject humans’ authority in what looks like a cult of the Energy Converter. As Cutie states, “These are robots – and that means they are reasoning beings. They recognize the Master, now that I have preached Truth to them. All the robots do. They call me prophet” (Asimov 66). Scholar Gordon Allport explains that “[t]he chief reason why religion becomes the focus of prejudice is that it usually stands for more than faith – it is the pivot of the cultural tradition of a group” (415). Allport’s analysis applies to what occurs in “Reason,” since the problem is not Cutie’s cult of the Energy Converter itself, but what it stands for: a rejection of humans as the dominant beings. Donovan wonders how “to trust him [Cutie] with the station, if he does not believe in Earth” (Asimov 79). As such, what really seems to lie behind the prejudices against Cutie is a human need to be in control. The irony is that there is no reason for the blame: the robot is capable of running the station more efficiently than the humans can.
“Evidence” also deals with an unreasonable prejudice against robots. In the text, mayoral candidate Stephen Byerley is accused of being a robot. The scapegoating of technology is evident in the view that Byerley would be unfit for the office of mayor if he were a machine. The text draws attention to the anti-robot group the Fundamentalists, which “required no new reason to detest robots” and took advantage of the accusation against Byerley to make their “detestation audible” (Asimov 225). In a striking example of bias, Byerley is challenged to hit a person to prove that he is not “a monster, a make-believe man,” since such violence against humans would be impossible under the First Law of Robotics (Asimov 235). Clearly, the choice of words in this episode highlights a hostile view of robots. It is also noteworthy that the unfavorable attitude toward robots in “Evidence” goes beyond the mere fact that the use of these machines has been banned from Earth; the bias also rests on the sense that it would be humiliating for humans to be governed by a robot-mayor. Therefore, the problem is not a safety issue but rather a matter of wounded pride. Once more, Asimov uses irony to expose the resistance to the idea of a robot-mayor as mere prejudice; if Byerley were indeed a robot, “it would be impossible for him to hurt humans by letting them know that a robot had ruled them” (Asimov 237).
Although there are attempts to vilify Robbie, Cutie, and Stephen Byerley in I, Robot, robots are not a real threat to humans in these stories. On the contrary, Asimov’s robots are kind, protective, and competent beings. As science fiction scholar Adam Roberts points out, “Asimov created a race of sentient, thoughtful beings” (199). In fact, I, Robot establishes the view of robots as superior beings as early as its “Introduction,” when Doctor Susan Calvin, the robopsychologist who provides a framework for the book, affirms that robots are not only “[g]ears and metal; electricity and positrons” but “a cleaner better breed than we are” (Asimov xiv). Plots like those in “Robbie,” “Reason,” and “Evidence” provide specific examples of robots’ caring actions. For instance, Robbie saves Gloria’s life in an incident at the facilities of U.S. Robots, which forces Mrs. Weston to accept the robot at home. Asimov even delivers a compassionate description of Robbie’s affection for Gloria: “Robbie’s chrome-steel arms (capable of bending a bar of steel two inches in diameter into a pretzel) wound about the little girl gently and lovingly, and his eyes glowed a deep, deep red” (27-28). In “Reason,” Cutie is capable of handling the work in the solar station better than a human could, and for the benefit of humans. Powell is the one who admits, “He [Cutie] knows he can keep it [the energy beam] more stable than we can, since he insists he’s the superior being, so he must keep us out of the control room. It’s inevitable if you consider the Laws of Robotics” (Asimov 78). Finally, the irony in “Evidence” is that a robot can be a better politician than a human could be, as Doctor Calvin explains: “[b]y the Laws of Robotics, he’d be incapable of harming humans, incapable of tyranny, of corruption, of stupidity, of prejudice” (Asimov 237).
As scholar John Pierce points out, short stories like “Reason” and “Evidence” indicate that “robots might indeed be better suited to protect the welfare of mankind than men themselves” (81). Asimov’s portrayal of altruistic robots is central to my argument because it is through this feature that the author demonstrates that the discrimination against Robbie, Cutie, and Stephen Byerley is unfair. For this reason, irony is a critical rhetorical device employed in I, Robot. As influential literary critic Cleanth Brooks explains, irony is produced by “pressures of context” (1048). In I, Robot, with its context of prejudices against altruistic robots, irony emerges easily. This irony is likely why scholar Alessandro Portelli observes that, in Asimov’s work, “[a] human being who fears or mistrusts robots is either a villain who must be defeated or a skeptic who must be convinced” (151).
Even though I, Robot portrays prejudice against robots as unreasonable, the book raises a question about the faults for which technology takes the blame. I believe that the answer lies in humans’ own anxieties about losing power. Indeed, “Robbie,” “Reason,” and “Evidence” all feature such tensions. For instance, in “Robbie,” Mrs. Weston sees a threat in the affection and dependence that her daughter Gloria feels for Robbie, so she desires to break the connection between the child and her robotic caregiver. In “Reason,” Cutie keeps humans out of the control room of the station, yet everything works even better without them. In “Evidence,” the controversy surrounding Stephen Byerley in the mayoral race gains traction because of the fear of being ruled by a robot. Therefore, all these stories unveil a conflict resting on the fact that “Asimov positions a race of properly Kantian ethical beings (the robot) against the much more nebulous ethics that characterize actual human activity” (Roberts 199). The unfavorable view of robots manifests humans’ flawed morality under the discourse that the machine is a threat. Ironically, the problem is not the robots but the humans themselves. For this reason, my argument relates to what feminist theorist Donna Haraway considers an overarching concern of the field of science fiction: “the interpenetration of boundaries of problematic selves and unexpected others” (300). Consequently, humanity’s prejudice against robots also embraces the idea of resistance against otherness.
As such, the discrimination against robots in the short stories “Robbie,” “Reason,” and “Evidence” may be read as a form of racism. Philosopher Michel Foucault’s observation that “racism is inscribed as the basic mechanism of power” sheds light on humans’ prejudice against robots (131). For Foucault, there is a link between “biological theory and the discourses of power,” as evolutionism addresses not only the “hierarchy of species” but also “the selection that eliminates the less fit” (132). Following this concept, it could be suggested that, if humans are weaker than robots, these machines could overcome humanity. At first glance, this gives the sense of robots as a threat. Asimov clearly plays with such views. In “Reason,” for instance, Cutie states that humans are “inferior creatures” and “will probably not exist much longer,” replaced by robots (Asimov 69). Cutie explains its reasoning to Powell and Donovan: “The material you are made of is soft and flabby, lacking endurance and strength, depending for energy upon the inefficient oxidation of organic material” (Asimov 62). Cutie continues the comparison: “I, on the other hand, am a finished product.… I am composed of strong metal, am continuously conscious, and can stand extremes of environment easily” (Asimov 63). The example helps explain why bias acts as humans’ defense against robots. Under evolutionism, robots come to be perceived as a threat because they are stronger beings than humans. However, Asimov shows that this view is flawed; robots are not a threat precisely because they are superior beings.
Ironically, the same superiority that leads robots to be perceived as a threat is what guarantees that there is no reason to fear them. In other words, robots will not try to subjugate humans because, as better beings, they are not selfish as humans are. Additionally, the Laws of Robotics ensure humans’ safety. Scholar Daniel Dinello argues that the laws “formed an ethical system that guaranteed robot servitude and technological goodness” (65). In fact, “Robbie,” “Reason,” and “Evidence” are texts that clearly demonstrate how the laws “make robots safe – and safe, of course, means subservient” (Pierce 82). For instance, Robbie is even more obedient than a pet would be, and, more important to the plot, it saves Gloria’s life. In “Reason,” Cutie may believe that the Master is the Energy Converter, but its actions serve humans’ own good, as the robot keeps the energy beam stable, which is vital for Earth’s sake. The robots do disobey humans on the station, but that disobedience is precisely what guarantees humans’ safety, as the First Law prevails. In the case of Stephen Byerley, being a robot would make him a decent politician; in addition, he would conceal his real identity as a robot to avoid hurting the ego of a human. As Doctor Calvin concludes, machines “cannot, must not, make us unhappy” because of the Laws of Robotics (Asimov 271). The laws are Asimov’s way of saying that humans have no reason to fear robots. As scholar Lee McCauley states, “Asimov believed that humans would put safeguards into any potentially dangerous tool and saw robots as just advanced tools” (153). For Portelli, the outcome of these laws is a new view of robots “as an instrument for the progress of mankind rather than a threat. Should science ever go out of control, it will not be because of its inherent characteristics, but through the fault of mankind” (150).
In fact, the robot Nestor 10 from the short story “Little Lost Robot” only becomes unstable because humans have modified the First Law. The story offers a lesson: mankind’s dominance over robots is ensured as long as humans do not interfere with the Laws of Robotics.
The portrayal of prejudices against Robbie, Cutie, and Stephen Byerley in I, Robot also effectively challenges the “Frankenstein complex” that characterizes the scapegoating of robots. Dinello considers that Robbie, a caring and friendly machine, is a “counterbalance to the Frankenstein spirit of gothic horror’s blasphemous creation of the artificial human doomed to become a monster” (63). As Cutie’s and Stephen Byerley’s actions in I, Robot are likewise in accordance with humans’ own interest, they similarly consolidate the view that blaming robots is unreasonable. For Dinello, “fear of the creature and fear of the machine are mocked as characterizing foolish, mindless people” (64). The use of irony then becomes evident: Robbie, Cutie, and Stephen Byerley face hostility that qualifies as prejudice precisely because the robots are good beings. This paradox confirms that, in I, Robot, humans make technology a scapegoat for their own fears of losing power to robots. As Gorman Beauchamp puts it: “Asimov’s benign robots, while initially feared by men, prove, in fact, to be their salvation. The Frankenstein’s complex is therefore presented as a form of paranoia” (85).
Irony plays an important role in Asimov’s short stories “Robbie,” “Reason,” and “Evidence,” as it helps to emphasize man’s attempt to scapegoat technology. The hostility toward Robbie, Cutie, and Stephen Byerley reflects prejudices clearly attached to humans’ anxieties over power dynamics, such as the fear of being ruled by a robot. Asimov presents robots as kind and harmless because they follow the Laws of Robotics. These laws provide humans with the ultimate power over the machine. Thus, unfavorable attitudes toward Robbie, Cutie, and Stephen Byerley sound unreasonable in the text. It is also worth noting that robots become easy targets of suspicion because they stand for otherness. Indeed, there are circumstances in today’s society that mirror the prejudices Robbie, Cutie, and Stephen Byerley face in I, Robot. Mrs. Weston from the short story “Robbie” is not much different from a mother concerned about whether her children’s friends are a bad influence. Donovan from the short story “Reason” does not differ from someone who has a hard time accepting orders from a new boss who comes from a different religious and cultural background. In addition, the conspiracy theories about Barack Obama’s birthplace and his eligibility for the presidency of the United States share similarities with the short story “Evidence,” in which Quinn accuses Stephen Byerley of being a robot in an attempt to hinder his chances of becoming mayor. Therefore, the scapegoating of robots involves the same dynamics of power intrinsic to prejudice. Asimov’s I, Robot offers a clear picture of how people try to attach the burden of their own weaknesses to others.
Allport, Gordon. The Nature of Prejudice. New York: Anchor Books, 1958. Print.
Asimov, Isaac. I, Robot. New York: Bantam Books, 2004. Print.
Beauchamp, Gorman. “The Frankenstein Complex and Asimov’s Robots.” Mosaic 13.3 (1980): 83-94. Web.
Brooks, Cleanth. “Irony as a Principle of Structure.” Critical Theory since Plato. Ed. Hazard Adams and Leroy Searle. Boston: Thomson Wadsworth, 2005. 1043-1050. Print.
Dinello, Daniel. “Cybernetic Slaves: Robotics.” Technophobia! Science Fiction Visions of Posthuman Technology. Austin: U of Texas P, 2005. 58-86. Web.
Foucault, Michel. “Society Must Be Defended.” Cultural Theory: An Anthology. Ed. Imre Szeman and Timothy Kaposy. Malden, MA: Wiley-Blackwell, 2011. 124-133. Print.
Haraway, Donna. “The Promises of Monsters: A Regenerative Politics for Inappropriate/d Others.” Cultural Studies. Ed. Lawrence Grossberg, Cary Nelson, Paula Treichler. New York: Routledge, 1992. 295-337. Print.
McCauley, Lee. “AI Armageddon and the Three Laws of Robotics.” Ethics and Information Technology 9 (2007): 153-164. Web.
Pierce, John. “Artificial Intelligence.” Great Themes of Science Fiction: A Study in Imagination and Evolution. New York: Greenwood Press, 1987. 77-94. Print.
Portelli, Alessandro. “The Three Laws of Robotics: Laws of the Text, Laws of Production, Laws of Society.” Science Fiction Studies 7.2 (1980): 150-156. Web.
Roberts, Adam. “Golden Age Science Fiction 1940-1960.” The History of Science Fiction. New York: Palgrave Macmillan, 2005. 195-229. Web.
Vanessa Nunes wrote this essay for a second-year course on science fiction at the University of Winnipeg, in Canada, where she has since completed a B.A. in English; she’s now working towards her M.A. in Cultural Studies. Vanessa also holds a bachelor’s degree in Social Communication – Journalism from the Federal University of Rio Grande do Sul, in her home country, Brazil. Her research interests include science fiction, fairy tales, internet culture, politics of representation, and post-colonialism.