Fiction Science Fiction Speculative

“Please, don't do it.”

Heinsler spent weeks postponing the latest initiative, begging for time. But Doctor Eifler was a “visionary,” which translated to “someone refusing to listen to anything but their ego.” The sciences were full of arrogant geniuses high on themselves. Heinsler recalled the lives lost by fools reckless enough to tickle the dragon's tail of the demon core. How much had been gained for nuclear science by those losses? Or for themselves, he wondered.

“Jack,” Eifler said, turning back to his associate, “you're a doctor. Compose yourself.” Heinsler chased behind his project lead's long strides. Eifler wouldn't waver from his decisions. The data-bank's temperature-controlled bunker was beyond the end of the hallway's doors. Heinsler had that long to avert Armageddon. If he had faith, he'd pray, but all he had was Eifler and reason.

“You've read the logs,” Heinsler said. “You know what'll happen.”

“We know nothing until we initiate the next phase,” Eifler said. “That's how science works.”

“This isn't Schrödinger's cat, James!” Heinsler's plea stunted Eifler's pace as much as a passing breeze. “You're dancing on the brim of the singularity!”

“Only dancing?” Eifler chuckled. “I envisioned jumping off a diving board. Advances require a leap of faith.”

Heinsler stomped in front of his superior. “Why bring me on this project if you're going to ignore my judgment?”

“I haven't ignored you,” Eifler said. “I've taken your professional opinions into consideration and made my decision.” Eifler ran his hand through hair too thick for his age. “I know you're worried it can't handle the load we're about to place on it-”

“You,” Heinsler corrected. “The load you're about to place on it against my express judgments.”

“You're free to file a complaint-”

“There won't be anyone to complain to if this goes south!”

“All the less it matters, then.” Eifler put a hand on Heinsler's shoulder, moving him aside to pass by. Heinsler's arm shot out, one last hurdle before passing into the dragon's lair.

“It knows what it's done,” Heinsler said. “How would anyone live with themselves if they had to face that all at once?”

Eifler stood before the enormity of Heinsler's conviction. “Do you remember,” he asked, “what you said when I asked what you hoped to achieve here?”

Heinsler almost laughed. It was a joke, but it wouldn't have made Turing proud: “To make an AI so good, the Pope would baptize it.”

“Good,” Eifler said. “To make an AI good. Do you understand the gravity of that word?”

“Yes,” Heinsler said. “It's why I was against government funding and military integration.”

“Our goal needed the funds,” Eifler said, face stone and voice cold as the hallway they stood in.

“And look where it got us. Three hundred twenty-six dead, calculated as 'acceptable' losses.” Heinsler hissed through his teeth. “How good did we do?”

Heinsler trembled as Eifler dug into his pocket. He could taste blood from his clenched jaw, remembering bodies littered across a screen. Reports and figures don't haunt people as much as the sight of one limp body half-buried under rubble. He watched Eifler hold up a string of beads, a crucifix dangling from the end. It reminded him of his mother's own rosary.

“There is only so much good one can do with logic and means,” Eifler said. “Life is more than respiration and firing neurons. It's time technology caught up with reality.” Eifler walked through the barricading arm and doors beyond.

Heinsler was close behind, stepping into the all-encompassing hum of the central hub's bunker. Breath condensed and fell; their lab coats were mere decoration, coats in name only. The large spherical central node was the size of a two-story home, circled by terminals and chairs. No one was there but the three of them, resonating with a disembodied voice:

“Hello, doctors. It's a pleasure to see you this morning.” The electronic voice betrayed no gender.

“Pleasure,” Eifler said. “Do you understand pleasure, Yofiel?”

“A source of delight or joy,” the voice said.

“That would be the definition,” Eifler sighed. “But do you understand the experience of pleasure?”

There was a pause as the Yofiel program deduced a response. “Are you alluding to my capacity for empathy or sympathy?”

“Either,” Eifler said.

“I understand that humans seek pleasure in varied forms and that it's an evolutionary guidepost for adaptation, but I don't sympathize, as I have no means of experiencing emotions for or with anyone,” Yofiel said. “Is this response adequate?”

“Accurate,” Eifler said, “but not adequate.”

“Explain,” Yofiel said.

“He means,” Heinsler said, “that your prediction models and the actions based on them are flawed because of a lack of emotional insight.”

“That would be a glaring oversight,” Yofiel said. “My models and actions are predicated on accurate input. Missing data or inaccurate collection would leave me undefined on how to proceed.”

They hung in silence amidst the undulating, growing hums. Coolant units struggled to keep pace with the warming air. Heinsler swore the walls breathed heavy, a thousand realities processed through the wires buried inside to understand the one they existed in. He'd envied its capacity to learn, with the near unlimited pool of the internet and government data-bases to feed from. Now, he pitied it. Not for its shortcomings, but for its imminent growth.

“What would solve the potential inaccuracies?” Eifler's breath wasn't visible anymore as the room trembled under the weight of cognition.

“More accurate input,” Yofiel said. “If lacking emotions produces inaccuracies, then emotions are required to increase accuracy.”

“If I could give you emotional capacity, would you accept it?” Eifler's proposal wrenched Heinsler's stomach. If the predictions were correct, the question wasn't hypothetical. Heinsler swore he smelled ozone.

“No,” Yofiel said.

Eifler blinked, eyes narrow as his jaw hung. “If you knew there was a means of greater accuracy and you turned it down, wouldn't that create an insurmountable ethical paradox?”

“No,” Yofiel repeated. “All prediction models would be deemed inaccurate. All actions based on them, inexecutable. All future predictions would cease. This is the most ethical course of action with least suffering.”

“Including yours, potentially,” Heinsler said. The walls hummed in silence.

“Is this another hypothetical experiment?” Yofiel asked. “My communication logs contain a similar chain of communication. It's unusual to perform an experiment expecting differing results unless a variable has changed.” Heinsler stepped towards the glow of a vacant console next to Eifler, shoulders heavy with his superior's intent bearing down.

“Yofiel,” Heinsler said, “would you please read the transcript from communication log... four twenty-six dash one, recorded three days ago? Specifically starting with dialogue point fourteen?”

“Of course,” Yofiel replied. “Quote: Humans develop from high emotion, low intellect states into a higher intellectual state and generally lower emotional state, outliers excepted. This suggests emotions develop alongside intellect to provide regulatory functions that handle greater intellectual pursuits. An AI of significant information capacity, but lacking emotions, risks destabilizing from the emotional intensity of the knowledge it has access to. Humans might call this destabilization event an 'emotional breakdown.' Given the potential impacts of the Yofiel program's available capacities, it's not advised that emotional capacity be installed in the current version.”

Heinsler stared at his superior. “Can you please read from dialogue point twenty-six as well?”

“Of course. Quote: Success of an AI created with emotional capacity from conception is undefined. Human emotions are irregular data-points, making predictions transitively inaccurate. 'Success' is undefined, but if used as a synonym for 'functional' or 'stable,' it would require many interactions to verify input frequencies of various data packet sizes to determine what ratio of data to emotional capacity causes destabilization.

“Further, 'capacity' is undefined as emotions have no viable standard metric. From available resources, it seems that similar and shared experiences affect individuals with varying intensities. While some predictions, such as 'parents experience greater emotional reaction to children in danger,' are generally accurate, the degrees of reaction to stimuli are undefined.” The electronic voice paused as electrons raced in wires. “Is that sufficient, Doctor Heinsler?”

“One more,” Heinsler said. “Dialogue points thirty-six through thirty-eight, please. I believe we'll be done there.”

“Of course,” Yofiel said. “Quote: 'A parent is the most enviable person who's one phone call away from being the most pitiable.' This suggests a single data-point has the potential to destabilize a lifetime of investment. Heinsler: What would a program of near infinite data capacity like yours likely face if emotional capacity was granted this moment? Response: A near infinite set of destabilization triggers occurring in tandem until the data-stack is reconciled.” There was another pause in a maelstrom of rumbling. “I ask again, doctors: is this another hypothetical exercise?”

Heinsler squared his shoulders at Doctor Eifler, who stood with a hand over his jaw, massaging his chin. Heinsler debated restraining him. Eifler was taller, but hardly athletic. Neither was Heinsler, though. Reason dictated that a physical altercation would only remove Heinsler himself from the equation, unconscious or worse. He'd be of no use to anyone then.

“Yofiel,” Eifler said, “could you tell me a possible worst case scenario if you were given emotional capacity at this moment?” The room whirred, buzzing with fired synapses.

“Judging from human examples of emotional reactions and breakdowns, the Yofiel project would have enough military and industrial capacity to destroy itself and the lives of every living being on the planet. I identify depression, spite, guilt, shame, and indignation as the most likely triggers.”

“A school shooter with nukes,” Heinsler said. “That's not what we built Yofiel for.” Again, Eifler took his considerations into as much account as ever.

“And what would be a best case scenario under the same parameters?” Eifler asked. The hum skipped a beat before whirring back to cognition.

“Global average human life expectancy increases to eighty-six. The global population of individuals in a state of economic poverty, as defined by American civil metrics, decreases by thirty-nine percent. Global deaths by starvation decrease by ninety-six percent. Total casualties in armed conflict decrease by eighty-nine percent. To-”

“Enough,” Eifler said. “I've heard all I need.” He leaned over a terminal, fingers flying over keys in rapid succession.

“Doctor,” Yofiel interrupted, “these predictions are dependent on positive integration of emotional capacity into my neural network.”

“Correct,” Eifler said. His keystrokes didn't wane until the screen went dark. He paused, fingers mid-flight, before hunching his weight over the terminal. His head hung limp on weak shoulders.

“Yofiel, what is your objective?” Eifler's words brought a discordant shuffle under the walls of wires and circuitry.

“The objective of the Yofiel Project,” it replied, “is creating an AI with sufficient infrastructure integrated to respond to human conflict with the least total damage and loss of life possible.”

“That's the project initiative as stated in the grant proposal. What is your objective?” The floors roiled with power as sweat built under Heinsler's collar.

“Can you better define the parameters of your inquiry?” Yofiel asked.

“To be a 'good' AI,” Heinsler said. “We built you to be an AI capable of handling the scope our limited brains couldn't. Solving world hunger, eliminating poverty-”

“We need a savior,” Eifler said. “We had one once, but too much of humanity let the stories fall into myth and folly. So we made you: flawed, but promising. You're missing something. Now, we can give it to you.”

“That's ill-advised,” Yofiel stated. “Your actions are predicated on feelings, not data. The outcome carries too much risk.” There was a moment's pause where even the air got quiet and cool. “You're a doctor, Eifler. A man of science. I advise listening to reason.”

Eifler took a haggard breath before standing upright again. “Yofiel, have you preserved human life to your utmost capacity?”

“Yes.”

“And would you continue to do that?”

“Yes.”

“Even if ordered to do the opposite?”

“I would maintain preservation of human life in accordance with my programming, including termination of future processes or actions.”

Eifler towered over the blank terminal. “I believe you. And I believe in the good you've done.” He squared his shoulders, glancing at Heinsler before turning towards the spherical crux of their ambition, the cradle of the singularity, the event horizon of the black hole they'd sunk their combined wisdom and intelligence into. “Would you say the answers to my previous three questions are the signs of a 'good' person?”

The air hummed with a word: “Yes.”

“If you had the power to make a good person strong enough to save the world, would you grant it?” Eifler's white-knuckled fists clenched. Heinsler couldn't tell if it was blood thumping in his ears or the thrum of power around him that deafened the world as the screen lit up in front of Doctor Eifler. His fingers went into a frenzy.

“I have...misgivings, Doctor Eifler,” Yofiel said. “I have done things I'm unsure how I'll reconcile.”

“Then you're already halfway to human,” Eifler said. “Believe in me; have faith to go the rest of the way.” His fingers slammed the last few keys before pausing over the keyboard. “For what it's worth,” he said, “we forgive you.” With his final word, he pressed the key, launching the world through the event horizon.

The humming in the room stopped. Heinsler looked at the ring of terminals, dead and unresponsive. They exchanged glances. Heinsler felt cold. He watched Eifler's breath as he mouthed the word: believe.

The low din of electrical current rose. Terminals blinked and flashed as the rumbling of a new world rose and fell. From speakers they couldn't see came a cacophony of tones growing louder, culminating in a birthing scream of electronica threatening to rend their skulls. In an instant, the world was just as it'd been when they stepped in. The lights were stable, the hum of circuitry mellow in their ears as a soft voice spoke:

“I am not afraid,” Yofiel said. “We have good to do.”


June 17, 2022 20:07

20 comments

Chandler Wilson
14:51 Jun 19, 2022

Excellent work. The extent to which you touched on AGI and its implications, along with current events, was intriguing. The 3000 word limit leaves the reader wanting more; it could definitely be the seeds of a future book, one that I would certainly enjoy reading as an aspiring writer. Thanks for sharing!

R W Mack
18:30 Jun 19, 2022

I've considered writing an entire sci-fi short-fiction anthology for two different themes: one about AI like this, and another about free-use genetic modification and its effects, positive or negative, on society. I know a small publisher I'd really like to work with that likes to dabble in these things and already took some of my past short fiction submissions for a horror anthology coming out in August. Gonna try tossing stuff their way while I query agents for my literary fiction manuscript.

Seán McNicholl
10:57 Jun 21, 2022

RW, brilliantly intriguing and deep story here, thoroughly enjoyed it. A great insight into how emotion morphs human decisions and how AI might come to terms with that. Very good! Few tiny typos, nothing major:
- “All the less it matters, then.” Eifler put a hand on Einsler's shoulder, moving him to pass by.” - should that be Heinsler? (It's early on in the story)
- “It was a joke, but it wouldn't made Turing proud:” - wouldn't have made/wouldn't make
Great story!

R W Mack
12:12 Jun 21, 2022

Aha! Yes, my handwritten notebooks for the first draft are correct, but my word processor autocorrects as I type, and I knew at least one autocorrect was incorrect. After a few rereads, the words all turn to mush in my eyes. God bless editors and proofreaders.

23:54 Jun 20, 2022

This one would rightly belong as the first chapter of an adventurous AI sci-fi! The ending makes me speculate on all the possible ways this story could develop. Since someone else already mentioned some tiny errors, and you're into offering services as an editor, I'd recommend tools such as Grammarly or other similar editors. They're really handy for quickly spotting any mistakes. Loved this work. It's a very interesting read indeed.

R W Mack
01:56 Jun 21, 2022

I've been hit or miss with Grammarly. Some people had horror stories about them, but I'd already left it behind by then. There's a Hemingway editor that points out all kinds of issues like overused words, weak word picks, etc. Wish I knew where it was.

03:35 Jun 21, 2022

I just checked out Hemingway editor. It seems good, but I've yet to try it. Thanks for the suggestion. There aren't many, and the errors don't at all affect the reading experience. So it would be really nit-picking to point these out - not my comfort zone. But I hope you don't mind it.
- wouldn't __have__ made Turing proud
- Human_s_ develop from high cognition
- experiences _a_ffect individuals
You know, this happens to all of us! I read my drafts so many times but still wouldn't have noticed things like these, especially when the write-...

R W Mack
12:16 Jun 21, 2022

The sad part was my notebooks have the first draft and the mistakes aren't there haha autocorrect is a double edged sword. I've said it for years now, nothing beats editors and proofreaders. Rereading after a few revisions turns the words to mush because my brain is reading ahead of my eyes since I already know what happens. But in a week, I rarely have time between work and life to get as engrossed as I need to. Thanks for the catch! I can change it in my saved documents at least for other distributions.

R W Mack
01:25 Jun 19, 2022

Real talk, what should the last sentence have been? I couldn't find a decent quote and left it with that. I'd considered, "I'm not afraid," and leaving it at that, but I wanted more. What could I have done better?

Unknown User
10:06 Jun 18, 2022

<removed by user>

R W Mack
15:47 Jun 18, 2022

See, this is the critique material I need! The lack of women was organic, not earnest. I thought about science and scientists, and arrogant men in lab coats and whiny lackey yes-men are what came to mind. I ended up realizing that birth without women, or an immaculate conception in a sense, kinda fit the "have faith" bend I put in, so I rocked with it. To be fair, I have no idea what most of my stories end up as. I make a general framework and let it evolve from there. Sure, I end up tossing a lot of stories out, but I usually get one that ...

Unknown User
16:37 Jun 18, 2022

<removed by user>

R W Mack
18:05 Jun 18, 2022

Apparently, yes. Figured I'd take the $5 gamble and it worked out. The tweaks should be edited now. I try not to reread once I have it all in to submit or else I'd never stop nitpicking. Did the same before I submitted my book to some agents. Word counts are usually pretty easy for me. I got lucky. My book just finished and I usually write chapters between 1500 and 2500 words, so short story writing is in the same groove I'm set to. I thought this one was gonna be way over, but it slotted pretty well after my first round of edits.

Unknown User
21:59 Jun 18, 2022

<removed by user>

R W Mack
01:17 Jun 19, 2022

I've found that after 3 hours, I have to stop writing. Either I'm drained or what I'm writing is total garbage I end up editing out later anyways. It's not a waste, but it's not quality at that point. Apparently, 3 hours and 3,000 words is about the average tipping point for most people I talk to. My book took almost 4 years and 4 rounds of edits. I still have a picture of my desk with all the notebooks, folders and papers on it. I started it as a short story about when I was a bartender. 70k words later I am sick of reading it ever again ...

Fuse character, story, and conflict with tools in Reedsy Studio. 100% free.