Movies We Like
When sci-fi is working properly, it functions as a longer narrative form of the philosophical thought experiment, tweaking certain variables of existence while holding others constant to see where the manipulation leads. Sadly, the cinematic variety rarely does this, instead serving as an excuse to replace bullets and criminals with lasers and alien monsters in what amounts to little more than another action spectacle. So it’s a good thing when a movie like Moon comes along, however modest its ambition, preferring to explore thought over action. Make no mistake, it falls well short of the ontological resonance of its two primary influences, 2001 and Solaris, but it nonetheless gives the viewer a good bit to mull over, which is fine by me.
In the not-too-distant future, Earth’s scientists have found a solution to the present day’s energy crisis: mining something called Helium-3 from the moon. The governmental/corporate means of production involve mostly robotic digging contraptions, plus a single human who has “signed up” for a three-year stint to make sure everything runs smoothly. Now, three years with nothing but books, models, an endless supply of '50s sitcoms and the ability to romp on the moon sounds pretty good to me, but I guess it would get a good deal lonesome for most. Thus, instead of paying volunteers, the company uses a series of clones, all based on one person, Sam Bell (Sam Rockwell). With only a HAL-like robot called GERTY (voiced by Kevin Spacey) to keep him company, Sam’ (to distinguish this one from the original) whiles away the time in the aforementioned manners, occasionally receiving a transmission from Earth or making repairs on the diggers (as relayed by his robotic assistant). It’s on one such repair mission that things become philosophically interesting.
Three weeks before his contract expires, Sam’ begins seeing apparitions (à la Solaris) of a mysterious teenage girl, which results in his getting distracted and crashing his vehicle into a digger. He wakes up on the home base’s recovery table, where GERTY tells him he’s been out for only a short while. Against orders, he goes to repair the damaged digger, where he discovers “his” body in the wreckage. As it turns out, the Sam who woke up is actually Sam’’, the replacement for the damaged Sam’. The clones are unaware that they’re clones because the exact consciousness and memories of the original Sam (as he existed around age 35 or so) are downloaded into their wetware upon activation. When the injured Sam’ is revived, he and his double are confronted with two of my favorite topics: artificial intelligence (AI) and identity.
HAL might be a conscious self, but 2001 differentiates its selfhood -- retains its otherness -- from the sort we humans possess. With the notion of self as software (a version of what’s called the strong-AI view) -- that is, as data exactly transferable from body to body -- Moon erodes this distinction. If a human selfhood can be implemented in various hardware (including clone bodies and the computer system in which it’s stored for future downloads), then how is it so different from the selfhood of GERTY or HAL, as long as they, too, possess free agency (which they seemingly do)? Which brings up the question of identity: does Sam = Sam’ = Sam’’?
The recorded transmissions that the two cloned Sams receive from their wife (Dominique McElligott) and daughter (Rosie Shaw) were all made about 10 years prior, to help fool them into believing they’re the original Sam. Since Sam has aged 10 years and doesn’t live on the moon, he’s individuated from his clones through experience and the memories so derived. Likewise, Sam’ has almost three years of experience that Sam’’ doesn’t. But the issue gets murkier when considering the point of activation for the clones and comparing that state to the one Sam was in at the time of his initial cloning. There, you have three exact bodily duplicates with the same memories and the same belief in self. What individuates them, then? As the film has it, something external designates the self: an identity isn’t merely reducible to the information contained in a database. There’s no paradoxical violation of the law of identity (each thing is identical to itself and to nothing else), because only the original Sam actually had the experiences on which the encoded memories are based.
Although I wish Duncan Jones and his co-writer Nathan Parker had dwelled more on the existential crisis of realizing one’s self is a confabulation, they provide a good character study as an argument for why our inalienable rights should extend to beings with “artificial intelligence,” should we ever get to that point. Even if a self is an artificial construction, that alone is not reason enough to treat such an entity as a disposable product. Following Stanley Kubrick, maybe HAL’s homicides were actually a justified defense of self-determination.