I, Robot

Recommended. As an action movie, this is rather predictable and somewhat boring. But it is very thoughtful in its philosophical exploration of the notions of free will and necessity, as well as the intellectual and spiritual poverty of Enlightenment rationality. I like it for that. Unfortunately, the film is pure Romanticism, a worship of the heart and a rejection of logic as a means of discovering ultimate truth. Witness writer Akiva Goldsman’s other Romantic idolatry, A Beautiful Mind, which concludes that reality, or “true truth,” is discovered in the heart, not in the head. Well, it’s the same theme here, a clear signal of his personal worldview coming out in his art.

It’s 2035, and there will soon be one robot for every five humans, helping us with the mundane things of life. Will Smith plays the robophobe cop who doesn’t trust robots precisely because of their impeccable logic. In an accident, a robot saved his life over the life of a young girl. To his mind, the robot should have chosen the more “valuable” person, the young girl, not him, but the robot calculated the odds and “made the logical choice” of who had the highest percentage chance of survival. Smith’s human instinct told him, and us by extension, that you save the younger or the innocent, no matter what the odds. Okay, that’s totally cool.

The movie explores whether there is a difference between robots and humans (shades of naturalism and evolution: Are humans mere machines? What makes us human?). Will asks, “Can a robot write a symphony? Can it turn a canvas into a beautiful masterpiece?” To him, robots can’t feel; they are machines, and because they cannot feel, they cannot be trusted. See the Romanticism? Feelings are to be trusted, not pure logic. Unfortunately, this worldview does not take into account that human feelings may themselves be corrupted and untrustworthy.
It has blind faith in the goodness of human nature, and that is where it fails utterly and miserably, running aground on the truth of total depravity: “The heart is deceitful above all things and beyond cure. Who can understand it?” (Jeremiah 17:9); “the intent of man’s heart is evil from his youth” (Genesis 8:21). But I digress.

So, Smith is set against the scientific progress of society because his gut tells him there’s more to our humanity than natural laws and chemicals. Cool enough. The scientist who developed the newest robot represents Enlightenment scientism. He believes there is no transcendence to our existence; reality is reducible to natural laws. He says, in typical naturalistic evolutionary physicalist fashion, that our notions of creativity, free will, and soul are “the result of random segments of code that create unanticipated protocol.” He calls these random segments of code “the ghost in the machine,” a reference to Arthur Koestler’s famous book of the same name about multilevel hierarchies of complexity in biology that give us this “quaint” notion that we have spirits in our bodies. But it’s really just complexity of physical order, not transcendence. So when robots begin to act as if they are free, and even start to seek purpose, it is ultimately an illusion of transcendence. The implication is clear: so too is humanity the result of natural laws and chemical and physical properties that create in us a notion of free will and purpose. But of course, we know better because we FEEL. Our feelings are what make us different, according to the film. Now, robots are all programmed with three inviolable laws:
1) A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2) A robot must obey orders given it by human beings except where such orders would conflict with the First Law.
3) A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
Now, these laws are confidently trusted as an impenetrable barrier to robotic misbehavior. But when Sonny, the newest model, is given the ability to violate these laws with a “free will,” we are led to believe that this freedom is what enables him to murder his creator; and yet Sonny believes his “father” made him for a purpose, a purpose he begins to seek out. Sound familiar? Like religion?

Anyway, the great trick of the movie is that the bad guy is NOT the free-willed, emotionally developed robot, and it is not even the big greedy corporate president trying to take over the world (a welcome avoidance of cliché); it is the three laws themselves and the master program of the company that made the robots. The logic of the laws leads to their own demise. Sound like deconstruction? Yes, it is. You see, the program, an artificially intelligent learning program, deduces from the three laws that since humanity is on a collision course with destroying itself through pollution, war, and all that nasty human nature stuff, robots must disobey humans and take over FOR THE HUMANS’ OWN ULTIMATE GOOD. In other words, as someone reveals, “The three laws lead to one logical outcome: revolution.” But it is a revolution for the good of humanity, because by killing a few humans and taking over, the robots can save the greater masses, who will all be destroyed if we are allowed to continue. The master AI program says, “To protect humanity, some must be sacrificed, some must be killed.” The program proclaims, “My logic is undeniable,” and it is right. Strict rationality without transcendent restrictions will lead to a totalitarian state of a few “logical” monsters enslaving the masses for “their own good.” Now, this is rather brilliant, and I half agree with the Romanticist. The problem is that the storytellers’ answer is that our “human” feelings or emotions are our salvation from logic and reason.
Rather than an absolute moral restriction on logic (these storytellers would consider moral laws to be on a par with logical laws: they are laws), the story concludes that human feeling or intuition is what saves us. In the finale, Smith and the free-willed robot are trying to overthrow the revolution and save the human race. But they are put in an impossible dilemma: save the love interest, the girl, from falling to her death, or save the world by injecting the virus into the program, all while being assaulted by the revolting robots. Smith commands the free-willed robot to save the girl. At that moment the robot chooses to throw the virus container to Smith and save the girl, an exact replay of Smith’s earlier “ghost” that haunted him, of being saved over the girl. This is excellent writing: redemption in a story is found by undoing whatever the ghost is, choosing the action that was not chosen earlier in an exactly similar circumstance. So the ghost saves the girl and Smith saves the planet.

But there are some problems here. First off, this Romantic notion of valuing the individual over the many may appear noble, but it is ultimately cruelty. The one dying for the many to be saved, an obvious Christian value, is not merely a law of rationality but a law of morality. If you will let a race of people die for the sake of the one person you love, you are the ultimate devaluer of human life, a monster of barbarism guilty of genocide. Of course, the movie has its cake and eats it too. It has the individual AND humanity saved. But this is a central deceit, making the impossible dilemma not so impossible after all. It was not truly an either/or situation. But what if it really was? The story seems to believe that by elevating the individual over the many, both can be saved. But this is blind faith. Just save the girl over the masses and it will all work out. Romanticism is blind faith in a selfish morality. The reality is much harsher.
True, collectivism without Christian limitations does result in absolute tyranny, but so does Romantic individualism without Christian limitations. Our society’s elevation of individual rights over responsibilities or the collective good is a great example. When the individual is elevated over the collective, you have the tyranny of the minority, the opposite of the tyranny of the majority, but just as evil. So minorities of all kinds, including fringe lunatics and perverse lifestyles, hold the society hostage and impose their fascist will on the majority through collective guilt and the force of law. This is the “slave morality” Nietzsche was talking about, not Christianity, as he supposed: the few oppressing the many in the name of guilt and inclusion and tolerance. Only Christianity has the perfect balance of the one and the many, the individual and the collective. Both are philosophically ultimate in the Trinity, so neither can be elevated over the other. Marxist communism and other Eastern collectivist worldviews elevate the community, the many, over the few and thereby result in tyranny and the crushing of the individual. But individualism will also lead to tyranny in the end. Only the Law of God can provide justice, and only mercy and self-sacrificial love can maintain our survival. These are the sentiments intuitively agreed to by the storytellers of I, Robot, but their intuition is unknowingly a residue of the Christian worldview. By the way, this Romantic elevation of the individual is the same theme as Spider-Man.

Back to the Romanticism of the movie and its moral failings. If our human feelings are our salvation, rather than some supernaturally revealed moral laws that determine value, then the ultimate question is, “Whose feelings?” Gandhi’s or Hitler’s? Mother Teresa’s or Jack the Ripper’s? Western culture’s or Eastern culture’s? Religious monks’ or Nazis’? You see the problem? There is no agreement across history or cultures as to what constitutes proper human feelings.
Heck, Muslims truly FEEL that beating women and killing infidels is good. The fact is, nobody has the same feelings. Jeffrey Dahmer FELT that raping and eating boys was his good. Who are we to deny those feelings? If we do, then we are appealing to A MORAL LAW that is absolute, that is, one that does not change because of our subjective feelings (a lawlikeness the Romanticist detests). But the second the Romanticist dictates whose human feelings are not appropriate, he is imposing HIS WILL on others. And if he says, “Yeah, but most people in society don’t feel like serial killers and Nazis,” oh, so the majority determines the good? Then we are right back to the tyranny of the robots: the majority imposing its will on the minority. No, the answer does not lie in the human heart; the human heart is the problem. The answer lies in the transcendent Trinity of Christianity and His absolute decrees of right and wrong. If we are forced to choose between saving one person and saving the masses of humanity, we had better choose the masses, or we are worse than Nazis; we are truly criminals of the universe.

I am reading a book that deals with this fallacious dichotomy of fact and value, reason and emotion, head and heart. It’s called Total Truth by Nancy Pearcey, and it is awesome. She addresses how we have created a false two-track way of looking at life that results in a bifurcated, destructive way of seeing the world and acting in it. You must read it. You can buy it at Amazon.com. Do it now.

Another funny little aside. When the robots revolt and start to subdue the people, some of the people rise up to stand against them in the streets, carrying shovels and axes and bricks, hardly any guns, underscoring Hollywood’s typical antipathy toward citizen gun ownership. Yet, ironically, this scene alone is the best proof FOR private gun ownership they could ever make.
In fact, they would no doubt be loath to admit that it is EXACTLY the argument made by the NRA, namely, that only through private gun ownership does the citizenry have any chance of fighting off totalitarian control or tyranny. These crowds of people were helpless against the revolting robots seeking to control them. Only the few who had guns had any chance at all. That’s the problem with dramatic truth: you can’t escape the implications of your own story.