In a Wired article, Kyle Vanhemert remarks that “much of [Her]’s dramatic tension ultimately hinges not just on the ways artificial intelligence can be like us but the ways it cannot.” That statement got me thinking. The article’s purpose is essentially to praise the somewhat futuristic setting of Her, directed by Spike Jonze, particularly the AI that accompanies the protagonist through his life and serves as the centerpiece of the film’s stylized future. I can see why someone would want to write a think-piece about the validity of this vision of the future, but, at the risk of sounding cynical, I can’t say I expect the real future to be terribly similar. It’s petty, I admit, but these are merely my own thoughts and opinions.
There are a few examples floating around in the article, one of which is the keyboardless computer. Vanhemert is correct that eliminating keyboards in favor of universal voice control works well from a storytelling perspective in a fictional world, but otherwise the design choice seems impractical, backwards, and limiting. Admittedly, voice control serves a purpose, but what if you are in a noisy room? What if multiple people are trying to write on computers in the same space? These are fundamental questions to ask when imagining an all-voice-controlled future, and I imagine a UI expert would ask the same ones. That is a recurring fault of the article: it ignores obvious objections in favor of asserting a “cool” point in a somewhat superficial way.
That’s one of the good things about Her, and about speculative fiction as a whole. Speculative fiction isn’t always saying, “Wow! The whole world should be like the world in this film/book”; rather, it uses a central element of the created world to make a point about something bound to pop up in our own world’s future. In other words, despite the intentions of the director and crew to create a plausible, futuristic world, I would argue Jonze would be willing to sacrifice the world-building to get to the meat of the film if he had to. To be fair, “the point” of speculative fiction is often tied to its world-building, but in Her it is light enough that if the film simply dropped highly intelligent AI into what is essentially our world, most people would probably buy it. Not every sci-fi world needs to look like Cloud City to count as sci-fi.
The quote I opened with focuses on the AI featured in Her: Samantha. It interested me because it acknowledges that an AI wouldn’t be able to perfectly replicate human behavior. If that’s the case, even in a highly technological world with near-human AI, I have to ask: what is the point?
Not the point of AI in general, mind you. Artificial intelligence will have its place in the future for an abundance of reasons, but even for lonely individuals like the film’s protagonist, why pursue an AI for a romantic or even remotely sexual relationship if it isn’t fully human? I understand the appeal of, say, an AI psychotherapist, whose value lies in its objectivity, but romantic relationships are built on subjectivity and human error, and that is what makes them interesting and appealing to humans. That is not to say the AI can’t cause emotional problems, because in the film it certainly does; it just seems like more of a hassle than an emotional thrill, a bit like talking to a child who doesn’t know any better because they are young.
Also, if finding a lover in the future is as simple as locating an articulate AI, wouldn’t that simply perpetuate the culture of instant gratification, something widely considered a negative of our era? Either way, the future has its problems, but sometimes people’s perception of it is messier than the reality of things.