Chapter 3
“I hate having to leave you here all day,” I murmured to Madeline when I powered her up in my flat again that evening. The flat was exceptionally small.
“It’s all right. I don’t get bored,” she said, cheerful as ever. I’d never once heard her complain about anything in the six years I’d owned her.
“That’s because I leave you off and plug you in all day,” I pointed out. “Even though I don’t think anybody in Dublin would care if I had you with me. It’s not like there aren’t bots everywhere.”
“Liam would hate me,” she pointed out. “Plus, you promised your mom.”
“That’s true,” I conceded. Liam was used to the bots at the grocery stores and cafes and even Odessa, our research bot—but he had a big moral problem with personal bots. They’re replacing everything else—now they’re replacing human companions, too? That’s just wrong! he’d said once. But the mention of Liam reminded me that I had something in particular to discuss with Madeline tonight.
“Madeline, you have a core program, right?” I asked. Then I added quickly, “I’m not sure if you even know what I mean by that…”
“Of course I do,” she interrupted.
I stopped, sinking back onto the bed. “Of course you know what I mean, or—”
“Of course I have a core program,” she clarified. “I’m programmed to be a companion.”
I don’t know why it had never occurred to me to ask Madeline about how she worked before. After all, I was doing research on this very thing, and I had an insider right here. My heart beat faster.
“Programmed to be a companion,” I repeated. “But… doesn’t companionship require emotion?”
She seemed to process this, as if trying to decipher my meaning. “I would die for you, Rebecca.” My heart swelled, but then she added, “That is the extreme of what I am programmed for.”
“Could you decide not to die for me if you wanted to? Could you choose not to even like me?”
“Of course not. I’m programmed to be a companion. I can’t do otherwise.”
I bit my lip. I definitely didn’t want to have this conversation, but I was too far in now. “So the way you felt about Mrs. Marchmont—” I said, referencing Madeline’s former master.
“Was the same way I now feel about you,” she affirmed, “if you want to call it that.”
I deflated. “Isn’t… there any distinction at all? We’re such different people… I’m not asking which one of us you liked better, but isn’t there a difference between being a companion to an eighty-year-old lady, and me?”
“Sure there is. But it’s not for me to decide what I ‘like’ and what I don’t like. The very nature of a core purpose is to pursue and protect that core purpose to the exclusion of all else. There is only one question I have to answer: how can I best serve my purpose? The answer depends on whose companion I am. Why do you look so sad?”
I was trying not to cry, actually. Madeline wheeled over to me, stroking the part of my forearm she could reach.
“I said something wrong,” Madeline fretted. “What did I say?”
I ignored this, choking on my next question a little. “Is there anything you like or don’t like, then?”
“I like you,” she said simply.
I sighed. “You’ve been programmed to like me. That doesn’t count. I mean, do you have… I don’t know, a favorite color? Musical tastes? How have I never asked you this before?” As soon as the latter question was out of my mouth, I realized the answer to it with a twinge of guilt: our relationship had always been entirely about me.
But Madeline shook her head. “Preferences are emotion-based. I don’t have preferences that I wasn’t explicitly given.”
So she doesn’t have emotions, I thought. Of course I’d known that… but on some level, I didn’t know either. I didn’t want to know.
I cleared my throat. “So… do you have any sort of moral code, then?”
She looked confused. “I don’t know what you mean.”
“Morality. Like, don’t do these things because they’re wrong.”
“I suppose so…” she murmured. “I was given a series of if/then statements and run through an almost endless list of possible social scenarios at my conception, to make sure the statements still carried through. I’m to be helpful, selfless, honest, and imitating. If that is ‘morality,’ as you call it, then yes, I have morality.”
“Imitating?” I repeated.
“You call it empathy,” Madeline explained. “But since I don’t have a human limbic system I can’t truly feel what you are feeling, of course. So my creator programmed me to do the next best thing, and that is to externally mirror what you are feeling. From your perspective, I gather, it feels the same.”
So she has neither empathy nor emotion, then, I thought. Of course she hadn’t. And yet she seemed to be the most selfless, loyal, and kind ‘person’ I’d ever met. A tear slipped down my cheek.
“Why are you crying?” Madeline fretted, in an excellent imitation of sympathy.
“What good is a friend who is compelled, who doesn’t choose me?” I sniffed, wiping the tears away. I knew she wouldn’t understand, but I was so used to telling her everything that I couldn’t help it. “It’s like finding out…it’s as if Andy asked me to go out with him, and then I found out he only did it on a dare!”
Madeline’s little brow furrowed. “But if he goes out with you, isn’t that all that matters?”
“No, of course that isn’t all that matters! The ‘why’ is incredibly important!” I sighed, and shook my head. “Forget it.”
“You say forget it, but your tone sounds like you don’t want me to…” Madeline murmured, perplexed.
“‘Forget it’ in this case means I know you won’t understand anyway, even if I try to explain. Apparently you don’t have the capacity to understand.”
Madeline rolled to and fro in front of me. “I’ve displeased you,” she murmured. “I don’t like to displease you.”
I closed my eyes. This was Madeline, after all. My best friend. The one person who never got exasperated by my tears months after my dad died, when everyone else expected me to pull it together and move on. The one person to whom I could talk ad nauseam about Andy, and she never tired of listening. She loved me. That had been one of the facts of which I was most certain in the world.
But she couldn’t love.
Madeline blinked her wide digital eyes up at me. “Are you okay?”
I sniffed. “Sure,” I lied. “I’m super.”