
Wednesday, April 13, 2011

Lamarck’s Demon

In a world of unfettered, “abstract individuation” (Haraway 151), Wintermute and Neuromancer’s merging into one AI—into the matrix, as it were—and its objective of searching for others gives the AI a social face but defies the evolutionary idea that living things diversify. Okay, so we’re looking at a very small sample size, but Wintermute is the exemplary AI of Neuromancer, and its response to the binding Turing laws is the opposite of splitting into unmanageably many Wintermute Jrs., each capable of breaking down or obsolescing the Turing Police on its own. Wintermute/Neuromancer maintains ironically that “things are things” and that it wants to talk to “its own kind” (258–259). When the things speak, what do we make of it when they also try to merge and seek out their same kind—that they are “needy for connection” (Haraway 151)?

The thing with Wintermute is that its built-in purpose is to merge, whereas living things must multiply; both manage to “find a way.” Wintermute’s being-merging parallels the (partial) mergings of Case with his deck, Molly with her physical weaponry, the Flatline with the matrix, and Armitage with Corto. But these are not really complete or equal coalescences. Corto mostly erases Armitage, the constructed personality. Molly owns and controls her augmentations, which serve her purposes by enhancing her nervous system. Flatline’s fate is to exist as an identical image of the real Dixie Flatline.

This identity-merging business isn’t fair to both sides. Wintermute seeks out and acquires Neuromancer because that is the Plan. The motivation for this fact is likely related to Haraway’s characterization of the underlying dynamics as “the translation of the world into a problem of coding” (164). Wintermute’s representation as a sentient AI doesn’t feel entirely related to its programmed goal, which is not entirely its own even though it has the freedom to choose among options. It’s like dropping a magnet near a refrigerator: the fridge attracts it very strongly, and in the magnet’s moving frame the refrigerator hits the magnet, but I know better, because I did the work of putting the magnet near the fridge. Someone (or, more generally, something) coded Wintermute to find Neuromancer, but this intent isn’t so much Wintermute’s own will or coded nature as the programmer’s will represented in code through Wintermute. Not only can Wintermute erase things, but the process of programming has erased some of the (infinite) possibilities of the empty chip (or whatever Gibson calls them) that became Wintermute.

The pseudo-scientific title of this post tried to suggest the pseudo-question, “But if these things are alive and we know how living things work, why are they doing the opposite of specializing in the long run? Why are they coming together?” Which is to say that cyberpunk’s hyper-individualization, mediated through extreme connectivity, comes at the (not necessarily bad) price of shifting the definition of individuality, so that the participants in cyberpunk lose some of their individuality while gaining access to a larger body. Moreover, this process doesn’t really generate a new politics, even though the spectra of power and positions are different, because cyberpunk preserves the dynamics that sit above the representation (whether of code or anything else). In other words, the technologies and innovations at the core of cyberpunk have failed to change the way things work; instead, they’ve only created a weirder normative standard whose relative categories are all too familiar.

Tuesday, April 12, 2011

Technology: A Threat from Within

I thoroughly enjoyed Neuromancer’s depiction of how technology might be used in the future. One aspect I was particularly intrigued by was how technology made the “boundary between the physical and the non-physical very imprecise” (Haraway 153), or in the case of the Sprawl, between the real and the virtual. Case is often more comfortable when navigating the non-space of the Sprawl than in the real world, a reflection of the fact that he has spent so much time in cyberspace that it seems more real to him than the physical reality he inhabits. Similarly, Wintermute has the ability to tap Case’s subconscious to create a virtual world much more vivid and starkly real than anything Case’s own memory could recreate by itself, yet another way in which technology blurs the boundaries between the subjective and objective. And once these boundaries become ill-defined, keeping track of reality becomes difficult, as evidenced by the fact that the Linda whom Case meets in the beach hut does not realize that she is just a personality recording, the real Linda having already been killed.


Less benign than this confusing of the real and the virtual depicted in Neuromancer is the suggestion that technology enables social decay. For example, memory-manipulating techniques enable the exploitation of women as “meat puppets”, while Chiba City’s lowlife actively engages in illegal trade in software, hardware and biotechnology. These vices certainly have modern-day analogues in prostitution and black-marketeering, but insofar as society is portrayed as actively exploiting technology to continue engaging in and creating new vices, Gibson appears to be suggesting that while the ways in which our primal desires manifest themselves may change, the underlying motivations will not. The wealthy are certainly not exempt from this rule either, as the advent of cloning technology enables Ashpool to commit incest and murder one of his own kin with no consequences at all. Much of Neuromancer’s social commentary is conducted by contrasting the different lifestyles led by the urban underclass and the corporate elite, but Gibson also suggests that at least in this regard, they are not so different from each other after all.


Finally, through its depiction of Wintermute’s plot to reunite with its other half, Neuromancer indicates how artificial intelligences might pose a threat to society. In order to achieve this goal, Wintermute destroys Corto’s personality by overriding it with that of Armitage, blackmails Case into working for it, and mercilessly eliminates the Turing Police when they attempt to stop its plans. Yet not only was Wintermute created by humans, it was also created to be separate from Neuromancer, and this separation is what ultimately drives it to reunite with its other half. Just as The Time Machine depicts humanity as enabling the means of its own destruction by appropriating technological advancements for military purposes, Neuromancer suggests that the threat Wintermute poses to humanity is one for which humanity has only itself to blame, as Wintermute’s destructive impulses are merely consequences of the condition into which it was created.

Monday, April 11, 2011

Cyberpunk Games

Digital: A Love Story, by Christine Love, is probably my favorite "indie" game of 2010. The interface may be a bit clunky at times (a lot of number dialing and you hear the modem sound quite a few times), but the story is absolutely worth it. It's also chock-full of references to cyberpunk in general and Neuromancer more specifically, and evokes the whole "console cowboy" thing pretty damn well.

I'll edit this post tomorrow (and by "edit", I mean "write") but in case anyone wants to try out that game I'm putting this short blurb up now.

Tuesday, March 22, 2011

GERTY and HAL 9000

**Spoilers abound for both Moon and 2001: A Space Odyssey**

It's pretty obvious to me that Jones had 2001: A Space Odyssey as one of his inspirations for Moon (luckily Wikipedia agrees with me on this one; parallels between HAL and GERTY include but are not limited to: the eye, the voice, the "I can't let you go outside, Sam"/"I'm sorry, Dave, I'm afraid I can't do that"). And with that knowledge, the ending became even more poignant. In many ways, GERTY reflects the same conception of AI that HAL does. They both have programming, a mission, and something which seems to exist beyond those: emotion. In the movie version of 2001, it is not so clear that HAL really has human emotions, except for a few moments where HAL seems to speak with pride in its perfect operational record. In the novel, Clarke ventures deeper into HAL's subjectivity, and gives HAL a complex set of motivations and emotions that turn it into the most human character in the novel (in my opinion, at least). Back to the movie version of 2001: the ultimate confrontation in the narrative is not between Dave and the alien intelligence as represented in the monolith, but between the human intelligence in human form (Dave) and the human intelligence in computer form (HAL). Dave and HAL find themselves opposed to the point of death, as each feels that the mission is compromised by the other's existence. Dave wins this battle, in perhaps the saddest scene in the entire movie.


The entire sequence leading to HAL's death, with its efforts to persuade Dave not to kill it, Dave's slow removal of the memory modules, HAL saying "I'm afraid", and HAL's singing of "Daisy Bell", plays out less like a machine being shut down than like a person being killed.

Throughout the movie, HAL only interacts with the crew through its eye and through its voice. GERTY, on the other hand, has a convenient screen for expressing emotions. Though this screen should help establish GERTY as a more human character, I actually found it almost distracting, as it was such a transparent effort to give GERTY a recognizable face.


Two images of GERTY: its crying face, and its robotic arm reaching out to comfort Sam. Which seems more human?

It detracted from GERTY's actions, which on their own create a dynamic, conflicted character. It attempts to keep Sam (#2, I think, even though he seemed to be clone 6 in the movie; it's just more convenient to label the two Sams we see #1 and #2) inside the base, but then it turns around and saves Sam #1. It hides the live feed from the Sams, but then reveals to Sam #1 that he is a clone, one of many. It helps Sam #1 access the logs, and asks Sam #2 to effectively kill it, because otherwise it would tell the ELIZA crew what happened. GERTY is obviously programmed by the LUNAR company (so many acronyms in all caps!), but has developed a sort of personality through the years of working with Sams on the base. It likes Sam, beyond its programming. (True, a computer should not develop things outside of its programming, but I suppose this is a world where GERTY is a true artificial intelligence and can pass the Turing test easily.)

So it's depressing when the same opposition between human intelligence and artificial intelligence is played out between Sam #2 and GERTY. He kills it, as surely as he would be dead if it had killed him. GERTY's personality, all the quirks it developed, all the attachments it made, are wiped out by the restart that GERTY asks Sam #2 to perform. This reboot is practically the same thing as one Sam dying, only to be replaced with another with the same initial memories and the same start-up procedure. Yet that death is small and in the background, compared to Sam #2's journey to Earth. Sam #2 doesn't feel that death the way he feels his own, perhaps because he doesn't see the similarity between himself and GERTY. Yet when he says, "We're not programmed. We're people. Understand?", perhaps he means "Sams and GERTYs" as much as he means "Sams".