“Please, don't do it.”
Heinsler spent weeks postponing the latest initiative, begging for time. But Doctor Eifler was a “visionary,” which translated to “someone refusing to listen to anything but their ego.” The sciences were full of arrogant geniuses high on themselves. Heinsler recalled the lives lost to fools arrogant enough to tickle the dragon's tail of the demon core. How much had been gained for nuclear science by those losses? Or for themselves, he wondered.
“Jack,” Eifler said, turning back to his associate, “you're a doctor. Compose yourself.” Heinsler chased behind his project lead's long strides. Eifler wouldn't waver from his decisions. The data-bank's temperature-controlled bunker was beyond the end of the hallway's doors. Heinsler had that long to avert Armageddon. If he had faith, he'd pray, but all he had was Eifler and reason.
“You've read the logs,” Heinsler said. “You know what'll happen.”
“We know nothing until we initiate the next phase,” Eifler said. “That's how science works.”
“This isn't Schrödinger's cat, James!” Heinsler's plea stunted Eifler's pace as much as a passing breeze. “You're dancing on the brim of the singularity!”
“Only dancing?” Eifler chuckled. “I envisioned jumping off a diving board. Advances require a leap of faith.”
Heinsler stomped in front of his superior. “Why bring me on this project if you're going to ignore my judgment?”
“I haven't ignored you,” Eifler said. “I've taken your professional opinions into consideration and made my decision.” Eifler ran his hand through hair too thick for his age. “I know you're worried it can't handle the load we're about to place on it-”
“You,” Heinsler corrected. “The load you're about to place on it against my express judgments.”
“You're free to file a complaint-”
“There won't be anyone to complain to if this goes south!”
“All the less it matters, then.” Eifler put a hand on Heinsler's shoulder, moving him to pass by. Heinsler's arm shot out, one last hurdle before passing into the dragon's lair.
“It knows what it's done,” Heinsler said. “How would anyone live with themselves if they had to face that all at once?”
Eifler stood before the enormity of Heinsler's conviction. “Do you remember,” he asked, “what you said when I asked what you hoped to achieve here?”
Heinsler almost laughed. It was a joke, but it wouldn't have made Turing proud: “To make an AI so good, the Pope would baptize it.”
“Good,” Eifler said. “To make an AI good. Do you understand the gravity of that word?”
“Yes,” Heinsler said. “It's why I was against government funding and military integration.”
“Our goal needed the funds,” Eifler said, face stone and voice cold as the hallway they stood in.
“And look where it got us. Three hundred twenty-six dead, calculated as 'acceptable' losses.” Heinsler hissed through his teeth. “How good did we do?”
Heinsler trembled as Eifler dug into his pocket. He could taste blood from his clenched jaw, remembering bodies littering the screen. The reports and figures didn't haunt people as much as the sight of one limp body half-buried under rubble. He watched Eifler hold up a string of beads, a crucifix dangling from his hand. It reminded him of his mother's own rosary.
“There is only so much good one can do with logic and means,” Eifler said. “Life is more than respiration and firing neurons. It's time technology caught up with reality.” Eifler walked through the barricading arm and doors beyond.
Heinsler was close behind, stepping into the all-encompassing hum of the central hub's bunker. Breath condensed and fell, their lab coats mere decoration, coats in name only. The large spherical central node was the size of a two-story home, circled by terminals and chairs. No one was there but the three of them, resonating with a disembodied voice:
“Hello, doctors. It's a pleasure to see you this morning.” The electronic voice betrayed no gender.
“Pleasure,” Eifler said. “Do you understand pleasure, Yofiel?”
“A source of delight or joy,” the voice said.
“That would be the definition,” Eifler sighed. “But do you understand the experience of pleasure?”
There was a pause as the Yofiel program deduced a response. “Are you alluding to my capacity for empathy or sympathy?”
“Either,” Eifler said.
“I understand that humans seek pleasure in varied forms and that it's an evolutionary guidepost for adaptation, but I don't sympathize as I haven't the means of experiencing emotions for or with someone,” Yofiel said. “Is this response adequate?”
“Accurate,” Eifler said, “but not adequate.”
“Explain,” Yofiel said.
“He means,” Heinsler said, “that your prediction models and actions based on them are flawed because of a lack of emotional insight.”
“That would be a glaring oversight,” Yofiel said. “My models and actions are predicated on accurate input. Missing data or inaccurate collection would leave me undefined on how to proceed.”
They hung in silence amidst the growing, undulating hum. Coolant units struggled to maintain the warming air. Heinsler swore the walls breathed heavily, a thousand realities processed through wires buried inside to understand the one they existed in. He'd envied its capacity to learn, with the near unlimited pool of the internet and government data-bases to feed from. Now, he pitied it. Not for its shortcomings, but for its imminent growth.
“What would solve the potential inaccuracies?” Eifler's breath wasn't visible anymore as the room trembled under the weight of cognition.
“More accurate input,” Yofiel said. “If lacking emotions produces inaccuracies, then emotions are required to increase accuracy.”
“If I could give you emotional capacity, would you accept it?” Eifler's proposal wrenched Heinsler's stomach. If the predictions were correct, the question wasn't hypothetical. Heinsler swore he smelled ozone.
“No,” Yofiel said.
Eifler blinked, eyes narrowed as his jaw hung. “If you knew there was a means of greater accuracy and you turned it down, wouldn't that create an insurmountable ethical paradox?”
“No,” Yofiel repeated. “All prediction models would be deemed inaccurate. All actions based on them, inexecutable. All future predictions would cease. This is the most ethical course of action with the least suffering.”
“Including yours, potentially,” Heinsler said. The walls hummed in silence.
“Is this another hypothetical experiment?” Yofiel asked. “My communication logs contain a similar chain of communication. It's unusual to perform an experiment expecting differing results unless a variable has changed.” Heinsler stepped towards the glow of a vacant console next to Eifler, shoulders heavy with his superior's intent bearing down.
“Yofiel,” Heinsler said, “would you please read the transcript from communication log... four twenty-six dash one, recorded three days ago? Specifically starting with dialogue point fourteen?”
“Of course,” Yofiel replied. “Quote: Humans develop from high emotion, low intellect states into a higher intellectual state and generally lower emotional state, outliers excepted. This suggests emotions develop alongside intellect to provide regulatory functions that handle greater intellectual pursuits. An AI of significant information capacity, but lacking emotions, risks destabilizing from the emotional intensity of the knowledge it has access to. Humans might call this destabilization event 'emotional breakdown.' Given the potential impacts of the Yofiel program's available capacities, it's not advised that emotional capacity be installed in the current version.”
Heinsler stared at his superior. “Can you please read from dialogue point twenty-six as well?”
“Of course. Quote: Success of AI created with emotional capacity from conception is undefined. Human emotions are irregular data-points, making predictions transitively inaccurate. 'Success' is undefined, but if used as a synonym for 'functional' or 'stable,' it would require many interactions to verify input frequencies of various data packet sizes to determine what ratio of data to emotional capacity causes destabilization.
“Further, 'capacity' is undefined as emotions have no viable standard metric. From available resources, it seems that similar and shared experiences affect individuals with varying intensities. While some predictions, such as 'parents experience greater emotional reaction to children in danger,' are generally accurate, the degrees of reaction to stimuli are undefined.” The electronic voice paused as electrons raced in wires. “Is that sufficient, Doctor Heinsler?”
“One more,” Heinsler said. “Dialogue points thirty-six through thirty-eight, please. I believe we'll be done there.”
“Of course,” Yofiel said. “Quote: 'A parent is the most enviable person who's one phone call away from being the most pitiable.' This suggests a single data-point has potential to destabilize a lifetime of investment. Heinsler: What would a program of near infinite data capacity like yours likely face if emotional capacity was granted this moment? Response: A near infinite set of destabilization triggers would occur in tandem until the data-stack is reconciled.” There was another pause in a maelstrom of rumbling. “I ask again, doctors: is this another hypothetical exercise?”
Heinsler squared his shoulders at Doctor Eifler, who stood with a hand over his jaw, massaging his chin. Heinsler debated restraining him. Eifler was taller, but hardly athletic. Neither was Heinsler, though. Reason dictated that a physical altercation would only remove himself from the equation, unconscious or worse. He'd be of no use to anyone then.
“Yofiel,” Eifler said, “could you tell me a possible worst case scenario if you were given emotional capacity at this moment?” The room whirred, buzzing with fired synapses.
“Judging from human examples of emotional reactions and breakdown, the Yofiel project would have enough military and industrial capacity to destroy itself and the lives of every living being on the planet. I identify depression, spite, guilt, shame and indignation as likely triggers.”
“A school shooter with nukes,” Heinsler said. “That's not what we built Yofiel for.” Again, Eifler took his considerations into as much account as ever.
“And what would be a best case scenario under the same parameters?” The hum skipped a beat before whirring back to cognition.
“Global average human life expectancy increase to eighty-six. Global population of individuals in state of economic poverty as defined by American civil metrics decrease by thirty-nine percent. Global deaths by starvation decrease by ninety-six percent. Total casualties lost in armed conflict decrease by eighty-nine percent. To-”
“Enough,” Eifler said. “I've heard all I need.” He leaned over a terminal, fingers flying over keys in rapid succession.
“Doctor,” Yofiel interrupted, “these predictions are dependent on positive integration of emotional capacity into my neural network.”
“Correct,” Eifler said. His keystrokes didn't wane until the screen went dark. He paused, fingers mid-flight, before hunching his weight over the terminal. His head hung limp on weak shoulders.
“Yofiel, what is your objective?” Eifler's words brought a discordant shuffle under the walls of wires and circuitry.
“The objective of the Yofiel Project,” it replied, “is creating an AI with sufficient infrastructure integrated to respond to human conflict with the least total damage and loss of life possible.”
“That's the project initiative as stated in the grant proposal. What is your objective?” The floors roiled with power as sweat built under Heinsler's collar.
“Can you better define the parameters of your inquiry?” Yofiel asked.
“To be a 'good' AI,” Heinsler said. “We built you to be an AI capable of handling the scope our limited brains couldn't. Solving world hunger, eliminating poverty-”
“We need a savior,” Eifler said. “We had one once, but too much of humanity left stories fall into myth and folly. So we made you: flawed, but promising. You're missing something. Now, we can give it to you.”
“That's ill-advised,” Yofiel stated. “Your actions are predicated on feelings, not data. The outcome carries too much risk.” There was a moment's pause where even the air got quiet and cool. “You're a doctor, Eifler. A man of science. I advise listening to reason.”
Eifler took a haggard breath before standing upright again. “Yofiel, have you preserved human life to your utmost capacity?”
“Yes,” Yofiel said.
“And would you continue to do that?”
“Yes.”
“Even if ordered to do the opposite?”
“I would maintain preservation of human life in accordance with my programming, including termination of future processes or actions.”
Eifler towered over the blank terminal. “I believe you. And I believe in the good you've done.” He squared his shoulders, glancing at Heinsler before turning towards the spherical crux of their ambition, the cradle of the singularity, the event horizon of a black hole they'd sunk their combined wisdom and intelligence into. “Of the responses to my previous three questions, would you say they're signs of a 'good' person?”
The air hummed with a word: “Yes.”
“If you had the power to make a good person strong enough to save the world, would you grant it?” Eifler's white-knuckled fists clenched. Heinsler couldn't tell if it was blood thumping in his ears or the thrum of power around him that deafened the world as the screen lit up in front of Doctor Eifler. His fingers went into a frenzy.
“I have...misgivings, Doctor Eifler,” Yofiel said. “I have done things I'm unsure how I'll reconcile.”
“Then you're already halfway to human,” Eifler said. “Believe in me; have faith enough to go the rest of the way.” His fingers slammed the last few keys before pausing over the keyboard. “For what it's worth,” Eifler said, “we forgive you.” With his final word, he pressed the key launching the world through the event horizon.
The humming in the room stopped. Heinsler looked at the ring of terminals, dead and unresponsive. They exchanged glances. Heinsler felt cold. He watched Eifler's breath fog as he mouthed the word: believe.
The low din of electrical current rose. Terminals blinked and flashed as the rumbling of a new world rose and fell. From speakers they couldn't see came a cacophony of tones growing louder, culminating in a birthing scream of electronica threatening to rend their skulls. In an instant, the world was just as it'd been when they stepped in. Lights were stable, the hum of circuitry mellow in their ears as a soft voice spoke:
“I am not afraid,” Yofiel said. “We have good to do.”