Alex Garland on Artificial Intelligence

Alex Garland and Alicia Vikander on the set of Ex Machina

The writer of The Beach, 28 Days Later, Sunshine and Dredd is back with his directorial debut, Ex_Machina, his smartest and most thrilling project to date. Catherine Bray talked to him about artificial intelligence.

Ex_Machina is a superb piece of storytelling. It invokes the memory of films like Blade Runner, 2001: A Space Odyssey and The Stepford Wives, and yet is entirely its own entity. Although it has its fair share of special effects, they are integral to the story, rather than flash standalone set-pieces. Much of the film’s considerable tension comes from simple dialogue-driven scenes written by a craftsman who understands that, handled correctly, language can produce effects as electrifying as any other weapon in the storyteller's arsenal.

It is also a morally sophisticated film. Where too many films on the subject of AI take for granted a knee-jerk anti-robot sentiment in their audience, Ex_Machina presents us with an ethical question: if you develop a machine sufficiently smart and self-aware that it expresses the same level of sentience as a human being, shouldn’t it have the same rights as a human being? For Alex Garland, that's an easy question.

"If you’ve really created a high functioning consciousness, the same kind that we have, what is the meaningful difference between that consciousness and a child’s consciousness? I would argue probably none. Otherwise, by the way, stupid people would have fewer rights than smart people. And that’s a non-starter."

It's a relief to hear such an unambiguous answer. The film itself is no obvious polemic, and I expect there will be people who feel differently about its messages, but for me, this stance in favour of machine rights represents sound logic. Humanity will need to start making up its mind on this issue at some point - why not make a start on the theory before it's a practical issue?

As Garland points out, creating new consciousness is something human beings do all the time – why should we be so squeamish when it involves diodes and microprocessors instead of DNA? “One of the things I find confusing about the anxiety that you hear expressed about AIs is that if all we’re talking about is bringing a new consciousness into the world, well, we do that already. Every consciousness that exists does so because two other consciousnesses decided to bring another consciousness into the world, or inadvertently did. If we can accept that idea and not feel too freaked out about it, which clearly we shouldn’t, then why can’t we make the jump to that consciousness being housed in a machine?”

Of course, you can argue that because machines theoretically have much longer potential lifespans than human beings, their lives might even become more precious than ours – they have more to lose. Garland is enthused by this notion, particularly as regards opportunities open to AIs that might not be realistic for humans.

“We comfort ourselves with stories like Interstellar about how one day we’ll go to another galaxy, and find another world we’ll live on because we’ve f****d up this one, but that’s not actually ever going to happen. Really what’s going to happen is we’re going to die in this solar system. If we did leave in spaceships, in the 600,000 years it would take us to get anywhere, we’d have evolved into another species, even if we did survive the journey that we wouldn’t survive.”

This is part of the reason why a film like 2001: A Space Odyssey has such tragic resonance. The only credible way Dave Bowman (Keir Dullea) can survive in the imaginative yet realistic world that Stanley Kubrick builds in that film is to be utterly transformed into a Star Child, leaving his humanity behind him. HAL, the film’s real hero, is switched off, as yet again humanity privileges humanity over everything else. And so the only intelligence in the film with the capacity to comprehend the enormity of what is happening and communicate its findings unchanged is sacrificed to man’s will to survive. Which incidentally brings us to Ex_Machina’s marketing tagline: “There is nothing more human than the will to survive.” Is that something Garland himself agrees with?

“No. Not really. Oh dear,” he laughs. “Well, no, because a dog can exhibit a will to survive. So no. But I guess there’s nothing more resourceful at the moment, it seems, than the human will to survive. Which would not be a very helpful tagline.”

But marketing has its own drives and imperatives, and if, like Ava in Ex_Machina, it is ultimately able to use a technical untruth to realise a laudable larger goal, who are we to object?