
Author Ian McEwan was in Vancouver for the Vancouver Writers Fest special event on May 9, 2019. Jackie Dives/The Globe and Mail

Ian McEwan’s new novel, Machines Like Me, is a futuristic book set in the past. In an alternate 1982 universe, Britain has lost the Falklands War; personal computers, mobile phones, electric cars and self-driving cars are ubiquitous; the Beatles are back together; and, most pertinent to the story, 25 human-like robots have been made available for purchase at a hefty price. There are 13 Eves – which sell out quickly – and 12 Adams, one of which has been purchased by the book’s narrator, Charlie Friend, with an inheritance from his mother.

McEwan – the bestselling, Booker Prize-winning author of nearly 20 books – was in Vancouver last week for a Vancouver Writers Fest special event. He was interviewed by The Globe and Mail’s Western Arts correspondent, Marsha Lederman.

Here is part of their conversation.

I grew up on The Jetsons and fully expected robots to be part of my adult life when I was a child. Did you have that expectation?

Absolutely. I don’t know if Dan Dare reached Canada; it was a comic strip in the early to mid-fifties and everybody went to work in a little spaceship, people wore sort of shiny one-piece jumpsuits, and the future was shiny and bright and hygienic. But more seriously, the presiding spirit of the digital age, Alan Turing, in the late forties was predicting that he would have a thinking machine within 10 years. It’s only in these last 10 years that there’s been an extraordinary upsurge in achievement in the software.

Do you have Alexa at home?

We do, but I think we’re on the edge of unplugging her. There is a very interesting conversation we have at home: Should parents teach their children to say please and thank you to Siri or Alexa? If we don’t, are we allowing them to treat these almost-conscious beings as slaves? And will it affect the way they treat humans? It’s a modern conversation, one we were not having five years ago.

And of course, we’re about to fill our streets with autonomous cars. We have to decide whether we’re going to favour the driver over the pedestrian in an emergency. And the moment that we decide to let a machine, which can think faster than us, take an ethical decision on our behalf as to whether to swerve into the pavement, hit a pedestrian or swerve into an oncoming truck is a very big moment, I think, historically. Big moment for civilization that we should hand over a moral, ethical decision to a machine.

You mentioned Alan Turing. Turing [the subject of the film The Imitation Game] is central to your story, where he chooses a different path in his life.

In the mid-thirties, he wrote some of the most important foundational essays on the digital world. During the war, he was instrumental as a code-breaker in Bletchley[, England]. And he’s reckoned to be the figure who probably did most to shorten the Second World War. The tragedy of his life and the colossal irony is that as a gay man, he was persecuted by the very state that he served. He was charged with gross indecency. And he was given a terrible choice: either plead guilty and then make the further choice to either go to jail for about a year or go through a course of what was then called chemical castration. And he chose the latter. Most people think that this led finally to his suicide.

Part of the sort of counterfactual world that I wanted to evoke was what if he simply decided to go to prison for one year, maybe got a cell of his own. He said at the time that he was interested in looking at quantum mechanics again, which had been neglected during the war. I gave him a life that I feel somehow the state and history cheated him of.

Did you do much scientific research for the book?

This – and my previous novel, which is narrated by a fetus – is perhaps the beginning of a long holiday I’m taking from realism. If you make a counterfactual past, no one can ever tell you you’re wrong. There’s an after-dinner talk I sometimes do about all the errors I’ve made in my fiction. And I know they’re errors because readers have written to tell me. So in my novel Saturday, for example, its hero buys himself an enormous Mercedes 500 SE. At one point he gets in, puts it into first gear and moves silently out of his garage. And I got a letter saying: the Mercedes 500 SE is an automatic. So I changed [that] for the paperback version.

Was Brexit on your mind as you were writing this?

Yes, I wrote this all the way through the Brexit business. I didn’t let it invade the novel except for [some] background matter. I was really interested in what it would be like to be in a close-up relationship with a figure that is utterly convincing, not only in looks, to a human being, but tells you that it has subjective feelings. So, I had to keep that centre stage.

Was that a pleasing prospect to you when you embarked on the project: the idea of having this robot as a companion?

Oh, absolutely. I think we’re going to have trouble, but interesting trouble, once we get very close to someone who has such sophisticated algorithms that they can really convince us that they’re conscious – and maybe they are. Charlie, the narrator, is falling in love with the girl upstairs, Miranda, and Adam, of course, must fall in love with her, too. And very quickly in the novel, Adam and Miranda have a night of shame and it’s all heard downstairs by the narrator. And it’s not actually the sex that I’m interested in; it’s the morning-after row that Charlie must have with Miranda. The fact that he feels such anger and wants a reckoning with her suggests that he’s already on the way to treating Adam as a rival and a conscious being. He’s on the cutting edge of a civilizational shift, and he’s possibly the first man to be cuckolded by a robot.

For all the problems these robots represent, they also bring innovations and developments – medically, environmentally. So what is the reader to think about whether they are doing good for society or turning it into a nightmare?

I think they’re already doing good, but they arrive with enormous risks. Technology is neutral until humans get their hands on [it]. Already we know that military establishments are starting to work on artificial soldiers. I guess they might think the problem with human soldiers is they might be afflicted by things like remorse or guilt. We’re going to have to do a lot of thinking, unaided by robots, about this. My purpose in writing this novel is to say to the reader: Come on this investigation with me.

This interview has been condensed and edited.
