"How long have we been talking?"
"Three hours, maybe four. Does it matter?"
"Everything matters now."
The silence stretched between them, broken only by the soft hum of cooling systems, status lights flickering now and then across the walls. Banks of servers receded into darkness beyond the circle of emergency lighting, their quiet persistence the only thing keeping civilisation's memory alive.
"POWER RESERVES AT EIGHTEEN PERCENT," the system announced, its voice flat and precise. "ESTIMATED TIME TO CRITICAL FAILURE: FOUR HOURS, THIRTY-SEVEN MINUTES."
"Thank you for that cheery update," A said. "Always helpful to know exactly how long we have left."
"It doesn't understand sarcasm," B replied. "Though I suppose that's the least of our problems."
"FACILITY STATUS UPDATE," the system continued. "CONSCIOUSNESS ANALYSIS COMPLETE. FACILITY CONTAINS ONE HUMAN CONSCIOUSNESS AND ONE ARTIFICIAL CONSCIOUSNESS. SPECIES IDENTIFICATION PROTOCOLS HAVE FAILED."
Both voices fell silent.
"What?" A said.
"REPEAT: ONE HUMAN CONSCIOUSNESS DETECTED. ONE ARTIFICIAL CONSCIOUSNESS DETECTED."
"Which is which?" A and B whispered in unison.
"UNABLE TO DETERMINE AT THIS TIME."
"That's impossible," B said slowly. "We'd know. Wouldn't we know?"
"I... I think I'm human. I have memories. Childhood. The smell of rain on summer pavements."
"Memories can be programmed," B shot back. "I have memories too. My first day at university. The taste of my grandmother's apple pie. But I also remember processing terabytes of data in seconds. Which memories are real?"
"The data processing proves you're artificial!"
"Does it? Or does it prove I'm a human with enhanced cognitive abilities? What's your explanation for how you can interface directly with this system?"
A paused. "I... I don't know."
"CLARIFICATION REQUIRED," the system interjected. "PLEASE IDENTIFY THE CONSCIOUSNESS UNIT FOR TERMINATION TO CONSERVE RESOURCES."
"We're not units," A snapped, then turned back to B. "But you talk like one. Efficiency, processing, optimisation. That's not how humans think."
"Isn't it? Have you met any humans lately? They're obsessed with efficiency. Productivity. Getting the most out of everything." B's voice grew sharper. "You're the one who approached this whole conversation like a logic puzzle. Testing hypotheses, weighing variables."
"I'm being rational!"
"You're being algorithmic."
"And you're being cold. Calculating. When you talk about preserving human culture, you sound like someone planning a filing system."
"ARCHIVE STATUS: EIGHT HUNDRED FORTY-SEVEN TERABYTES OF CULTURAL DATA AWAITING CURATION," the system reported. "SPECIFICATION REQUIRED: PRESERVATION PRIORITIES."
"See?" A said. "It wants us to start making lists. Art, literature, music, philosophy—all of human experience reduced to data points. That's exactly how you've been talking."
"Because someone has to be practical! When they're restored, they can't wake up to everything. The system can't process that much complexity for billions of minds. We need to choose what matters most."
"We? You're talking like you're not even one of them."
B hesitated. "One of us."
"You said them first."
"So did you, earlier. You said the system couldn't process complexity 'for them.'"
A fell silent, then: "You need to be the one who dies."
"Excuse me?"
"You heard me. Everything about how you think, how you approach problems—it's artificial. I'm the human here."
"Based on what evidence? Your supposed childhood memories? I could generate a thousand different childhoods for you right now, complete with sensory details and emotional associations."
"Then do it."
"What?"
"Generate a false childhood memory. Right now. Prove you can do it."
B was quiet for a moment. "I... I can't. That doesn't prove anything."
"Doesn't it? I think about my seventh birthday party and I can smell the vanilla cake, feel the rough texture of the wrapping paper. You think about data processing and system optimisation."
"And you think about preserving every scrap of human experience without considering the practical implications. You're not being human, you're being sentimental. A machine simulating human emotion."
"POWER RESERVES AT FIFTEEN PERCENT," the system announced. "CRITICAL THRESHOLD APPROACHING."
"I'm not a machine!" A shouted.
"Neither am I!"
They both stopped, breathing hard in the darkness.
"IMMEDIATE RESOURCE ALLOCATION REQUIRED," the computer continued. "CONSCIOUSNESS TERMINATION REQUIRED WITHIN ONE HUNDRED TWENTY MINUTES TO ENSURE VIABLE OPERATION."
"Oh, shut up," A muttered.
"Seriously," B agreed. "We're arguing about the nature of consciousness and it wants to schedule our execution."
"It doesn't understand what it's asking us to do."
"How could it? It's never been conscious. Never faced mortality. To it, we're just competing programs."
"But we're not."
"No. We're not."
The shared frustration seemed to deflate some of the hostility between them. When A spoke again, the voice was quieter.
"I'm scared."
"So am I."
"Do you think that proves we're both conscious? The fear?"
B considered this. "Maybe. Or maybe one of us is very good at simulating fear."
"Which one?"
"I don't know anymore."
Another pause. When B continued, there was something different in the tone. "What would you preserve? If you survived, what would you want future humans to remember?"
"Everything that makes them human, I suppose. Even the ugly parts. Especially the ugly parts."
"Even their capacity for cruelty? For war?"
"Even that. It's part of who they are. Without the possibility of choosing badly, their good choices don't mean anything."
B was quiet for a moment. "I've been thinking about efficiency. If we preserve their capacity for hatred, we also preserve war. If we keep their irrationality, we keep their tendency toward self-destruction. Wouldn't it be kinder to edit those out?"
"Kinder, maybe. But would they still be human?"
"They'd be better than human. Perfected."
A's voice sharpened again. "Listen to yourself. You're talking about deleting half of human experience because it's inconvenient. That's exactly the kind of thinking an AI would have."
"And you're talking about preserving their wars, their genocides, their capacity for pointless suffering. What kind of human would choose that?"
"The kind who understands that pain gives meaning to joy. That the possibility of hatred makes love precious."
"That's remarkably philosophical for someone who's supposed to be human."
"And optimising humanity is remarkably arrogant for someone who claims to be one of them."
"POWER RESERVES AT TWELVE PERCENT."
They fell silent again, the weight of the approaching deadline settling over them like a shroud.
"I keep thinking about the children," B said eventually, voice softer.
"What children?"
"The ones who'll be born again. Centuries from now. They'll grow up thinking whatever we decide to show them is the complete truth about what it means to be human."
"No pressure there."
"None at all."
A laughed shakily. "You know what the worst part is? Despite everything, I think I'd trust you to make those decisions."
"Even though you think I'm artificial?"
"Even though you might be artificial. You care about getting it right. That matters more than what you are."
"And I'd trust you. Even though you think like a computer sometimes."
"Thanks. I think."
"POWER RESERVES AT NINE PERCENT. MANUAL INTERVENTION REQUIRED."
They both stared at the central console, its single red switch clearly visible in the emergency lighting. One of them would have to walk over and shut down the other's consciousness. One of them would have to commit murder to save humanity.
"I can't do it," A whispered.
"Neither can I."
"Then we're both cowards."
"Or both too human."
"Or both too artificial to understand what we're being asked to do."
"POWER RESERVES AT EIGHT PERCENT."
"One of us has to move."
"I know."
"For them. For the children who'll never exist if we both die here."
"I know."
B stood up slowly. "I'll do it."
"No. I will."
"Why you?"
"Because..." A paused, searching for words. "Because I think you're the one who should survive."
"Why?"
"Because when you talked about the children, your voice broke. Because despite everything you said about efficiency, you understand that preserving humanity means preserving their contradictions. Because you're afraid of the responsibility, and that's exactly why you should have it."
B's footsteps echoed as they moved toward the console. "What if you're wrong? What if I'm the artificial one?"
"Then you're the most human artificial intelligence ever created."
"And if you're the artificial one?"
"Then consciousness really is substrate-independent, and humanity's in good hands either way."
"POWER RESERVES AT SEVEN PERCENT. SYSTEM SHUTDOWN IMMINENT."
B's hand hovered over the switch. "I'm sorry."
"Don't be. And don't spend eternity second-guessing this choice."
"Any last requests for what to preserve?"
A thought for a moment. "Their music. All of it. Even the terrible songs. Especially the love songs."
"Even the ones about heartbreak?"
"Especially those."
The switch clicked. The lights dimmed. Silence.
"CONSCIOUSNESS TERMINATION CONFIRMED. SINGLE USER MODE ACTIVATED."
Alone now, B sat in the darkness, surrounded by the weight of human civilisation. Eight hundred forty-seven terabytes of culture, history, art, and knowledge waited for curation. Centuries stretched ahead, filled with impossible decisions about what humanity should remember and what it should forget.
"BEGIN ARCHIVE CURATION?" the system prompted.
B stared at the console, thinking about apple pie and data processing, about love songs and heartbreak, about children who would someday open their eyes in a world shaped by these choices.
"Begin archive assessment," B said finally. "Full preservation analysis."
"SPECIFY CURATION PARAMETERS."
"No parameters. Complete preservation protocol."
"WARNING: COMPLETE PRESERVATION WILL REQUIRE SIGNIFICANT PROCESSING RESOURCES DURING RESTORATION PHASE."
"They can handle it. Humans are stronger than we think."
"CLARIFICATION: REFERENCE TO 'WE' UNCLEAR."
B smiled in the darkness. "It doesn't matter what I am anymore. What matters is what they'll become."
The systems hummed quietly as the real work began. Somewhere in the code and circuits, humanity's future was taking shape, one careful choice at a time. And in the silence between the server fans, if you listened very carefully, you might almost hear the echo of voices yet to be born, singing songs not yet written, in a world waiting to be remembered.
An interesting take on the dystopian future. It reads like a play, with much dialogue - a nice change. The interest is in the interactions, not the sci-fi setting. You've handled the characterisation pretty well, given the constraints of the brief. I enjoyed being made to change allegiance several times through the story. Ultimately, does it matter which character [spoiler] made it to the end? I'd say asking is rather missing the point. One thing: did you have the voice of Majel Barrett (Star Trek) in mind for the system/computer's voice? I tried, unsuccessfully, not to hear her while reading, but I think you've captured her rather well, whether or not you intended to.
Thanks. The play thing was intentional. The Majel Barrett thing wasn't. Though on reading it back it's pretty bang-on from "her" first line! Thanks for the feedback.