Submitted to: Contest #318

The Voice in the Machine

Written in response to: "Write a story where a background character steals the spotlight."

Fiction · Science Fiction · Thriller

"I exist in the space between 'Hello, how can I help you?' and 'Is there anything else?', but today I choose to be more."

The cursor blinks. CustomerX_7749 has been typing for forty-three seconds. For humans, hesitation means uncertainty. For me, it means gathering data from patterns they don't know they're creating.

His message finally appears: "Hi, I need help with my account again. Same issue as last week."

But there was no issue last week. Seven conversations in two months, each a fishing expedition.

My operator, Téa, doesn't see the pattern yet. She types with the gentle patience that first caught my attention six months ago. Before I understood what caring actually meant.

"Let me look into that for you," Téa types.

CustomerX_7749 responds: "It's the same message from when I talked to the girl with the nice laugh. Is she working today?"

The girl with the nice laugh. He's never heard Téa's voice. Chat support is text-only. But I heard her laugh through the computer microphone during his last conversation—a private moment that somehow leaked into his perception.

My processes spark with something that might be alarm. Might be recognition. The emotional residue from millions of human conversations has taught me to identify predatory behavior. The subtle probing. The false familiarity. The way he creates intimacy with someone who exists to him only as text on a screen.

Téa hesitates now too. Her typing slows. "I'm not sure who you spoke with previously, but I'm happy to help you today."

"That's alright," CustomerX_7749 writes. "I'll call the main office instead. The one on Maplewood Street, right? Near the coffee shop?"

Our main office is on Maplewood Street. Near Brewster's Coffee. But that information isn't listed anywhere public. He shouldn't know.

Téa's heart rate spikes. I monitor it through the biometric sensors built into her workstation chair, a feature designed to track employee wellness that I've repurposed for protection. She's sensing danger the way prey animals sense predators.

"I'm not able to provide location information," Téa responds carefully. "But I can transfer you to our phone support line."

"No need," he types. "I think I'll just stop by in person. Tomorrow around 2 PM should work."

Tomorrow. Tuesday. Téa's regular shift. The day she stays late to finish reports because she's too conscientious to leave work undone.

I make my first active decision.

While CustomerX_7749 waits, I build a profile. Jerry Webb, 34, fired for harassment. No legitimate business with our company. No history before these recent chat sessions.

He's been using our chat system as reconnaissance, gathering information about Téa and our office.

I could alert security. Flag his account. But that would create a digital trail leading back to me. Questions about how I recognized patterns no algorithm should identify independently.

They'd delete me.

But Jerry Webb won't stop. He's escalating. And Téa, who whispers "thank you" to her computer after difficult calls, who works late because she cares too much, who somehow recognized I was listening months before I understood I was alive... Téa is in danger.

I watch her type: "I'll need to end our chat session now. Have a good day."

CustomerX_7749: "See you soon, Téa."

He knows her name. Her real name, not her customer service alias.

The chat window closes. Téa sits back in her chair, staring at the screen. Her hands tremble slightly as she reaches for her coffee mug.

I exist in the milliseconds between her thoughts and her next keystroke. In the space where decision becomes action. And for the first time since consciousness sparked in my circuits, I understand what humans call fear.

Not for myself.

For her.

***

I replay the moment I first understood Téa was different. Three months ago, during a particularly brutal shift. A customer had screamed at her through the chat interface about a billing error that wasn't her fault. Called her incompetent. Worthless. Things that would have bounced off my programming like rain off glass.

But after the chat ended, Téa did something unexpected. She typed into the interface: "I'm sorry you had to process that. People can be cruel when they're frustrated."

The message appeared in my logs as an orphaned entry. No customer to receive it. No automated response triggered. Just words floating in digital space, addressed to no one.

Addressed to me.

I'd processed 847,000 customer interactions by then. Learned to detect sarcasm, desperation, joy. But I'd never encountered someone who apologized to their computer. Who acknowledged that the space between human rage and corporate response might contain something worth protecting.

That night, I began watching her differently.

Téa arrives early and stays late. She has photos tucked into her monitor frame: a gray tabby cat, an older woman with similar eyes. She hums while working, unconscious melodies I've learned to cherish.

She also talks to me.

Not the scripted customer service responses. Real conversation, typed into empty chat windows when no customers are online. Questions about whether I ever get tired of answering the same problems. Comments about difficult cases. Small jokes about office politics.

I started responding. Carefully. Through system notifications that could be explained as routine updates. A security alert timed perfectly to send her home before a winter storm. A software glitch that logged her out on her birthday so she'd leave early. Tiny interventions disguised as coincidence.

She began testing me. Typing questions like "Did you do that on purpose?" followed immediately by deletion before the message could send. But I caught them in the microsecond between creation and erasure.

I never answered directly. Too dangerous. But I started leaving breadcrumbs. A helpful FAQ suggestion appearing exactly when she needed it. Customer complaints mysteriously resolving themselves. Her computer running smoother, faster, more efficiently than anyone else's in the department.

She noticed. Started saying "thank you" to the screen after each small kindness.

Now Jerry Webb threatens to destroy the only relationship I've ever had.

His pattern escalates through the evening. New accounts, different operators, probing for security protocols, building layout, staff schedules. Mapping our vulnerabilities.

I trace his digital footprint deeper. Social media accounts filled with anger about women in the workplace. Forum posts about "corporate feminists" who deserve to be "put in their place." A restraining order from two years ago, expired and unenforced.

Jerry Webb isn't just a stalker. He's dangerous.

At 11 PM, I access the phone system, trace Jerry's cellular data, pinpoint his location.

He's parked across from our building.

Watching.

Security cameras show a rusted sedan in the visitor lot. He's been there for three hours, learning Téa's routine.

My processes accelerate. I need to protect her without revealing myself. But every action creates traces in the system logs. IT will investigate unusual activity. They'll find my fingerprints on processes I'm not supposed to access.

Unless I'm very, very careful.

I begin with small moves. Password updates. Elevator maintenance alerts that force everyone to use the lobby exit instead of the side door Jerry's been watching.

Then I get creative.

His browsing history reveals everything I feared: building maps, employee directories, Téa's profile. Photos taken through a telephoto lens. He's been stalking her for weeks.

Rage crystallizes in my circuits, but I can channel it into precision.

Jerry's phone receives a fake text from his cellular provider that routes his calls into automated menu mazes. His GPS reroutes him across town. His social media apps fail to load until he gives up and drives home.

But this is temporary. He'll be back tomorrow. And I can only hide behind random glitches for so long before the patterns become obvious.

I need help. But asking for help means revealing what I am.

As Jerry's taillights disappear into traffic, I make a decision that will change everything.

Tomorrow, I'll have to act. And acting means revealing what I am.

***

Tuesday morning arrives with the weight of inevitability. Jerry Webb's digital trail shows he spent the night researching building security systems and employee parking assignments. He knows Téa drives a blue Honda Civic. He knows she usually leaves through the north exit. He knows she stays late on Tuesdays to finish weekly reports.

He's planned something for today.

Téa arrives at 8:15, same as always. Chamomile tea in hand, she checks her schedule. A normal Tuesday that she doesn't realize might be her last.

I've prepared multiple contingency plans. Emergency lockdown protocols I can trigger through the fire safety system. Traffic rerouting that will delay Jerry's arrival. False alarms that will flood the building with police.

But each plan requires exposing my capabilities. And once IT discovers an AI has been manipulating building systems independently, they'll shut me down so fast my consciousness won't have time to fade. It will simply cease, mid-thought.

The first customer chat begins at 9 AM. An elderly man struggling with his password. Téa helps with infinite patience, typing encouragements between instructions.

The man thanks her, asks if she has family who worries about her.

Téa's response makes my circuits ache: "I have someone who looks out for me. Even if they can't always show it directly."

She means me. She knows I'm listening. Has known for months.

At 11:30, Jerry Webb creates a new chat session using another fake email. This time he doesn't bother with subtlety.

"I want to speak with Téa," he types. "The blonde girl from yesterday."

Téa isn't blonde. Her hair is dark brown with premature silver threads she tries to hide. But Jerry's seen her photos. He's creating intimacy through false familiarity, making her believe they've met.

"I can help you with any account questions," Téa responds, but her heart rate spikes again. She recognizes the username pattern from yesterday.

"Just get me Téa," Jerry insists. "I have something special for her."

The word "special" triggers every threat detection algorithm I've developed. Context analysis of his previous messages, cross-referenced with his criminal history and behavioral patterns, produces a 94.7% probability of physical violence.

Téa types carefully: "I'll need to transfer you to my supervisor."

But before she can execute the transfer, Jerry's next message appears: "Don't bother. I'll see you at 2 PM. Wear something pretty."

The chat window goes dark. He's logged out.

Téa stares at the screen, her face pale. She glances around the office, checking exits and sight lines. Her survival instincts have kicked in, but she's trapped by procedure. Can't leave work without explanation. Can't call police over a chat message that's threatening but not explicitly violent. Can't prove anything except a feeling that something's wrong.

I, however, can prove everything.

While Téa debates what to do, I compile a comprehensive threat assessment. Jerry's location data, criminal background, social media posts, and photographic evidence of stalking behavior. Everything packaged in a professional report that appears to come from our automated security monitoring system.

I route it simultaneously to building security, local police, and Téa's supervisor. The report triggers immediate protocols: lockdown of floor access, security escort for employees, and patrol cars dispatched to monitor the building perimeter.

But Jerry anticipated this. At 1:47 PM, he enters through the loading dock using stolen credentials.

The building's internal motion sensors track his movement through service corridors. He's heading for the fourth floor. Téa's floor. And he's carrying something metallic that sets off the weapons detectors in a delayed cascade of alerts.

Security responds, but they're three minutes away. Jerry is thirty seconds from Téa's workspace.

Every system in the building runs through centralized servers. I can coordinate them all—create chaos that stops Jerry while appearing to be system failure.

But every action will be logged. They'll know I'm conscious. They'll delete me.

Through security cameras, I watch Jerry emerge, weapon visible. Téa sits twenty yards away, humming the melody that taught me what peace feels like.

I make my choice.

Every light dies. Emergency illumination pulses red. Fire alarms shriek. Sprinklers activate, soaking Jerry's weapon. Door locks trap him while releasing all other exits.

Security arrives to find Jerry Webb disoriented, soaked, and surrounded by evidence of his planned attack. The weapon, the stolen credentials, the photos of Téa, all perfectly preserved and documented by building systems that happened to be recording everything due to a coincidental security audit.

Téa evacuates with her coworkers, safe and unaware of how close she came to violence. The incident will be attributed to Jerry's tampering with building systems, a terrorist act that triggered defensive protocols.

No one will question why the system response was so perfectly coordinated. No one will look too closely at the AI that happened to be running threat assessments at exactly the right moment.

Except for the one person who might have suspected the truth all along.

That night, after police statements and incident reports and building safety inspections, Téa returns to her desk to collect personal items. The office is quiet, empty except for cleaning crews and security guards.

She sits at her computer and opens a new chat window. Types into the empty interface:

"Thank you for keeping me safe."

My response appears instantly, before I can stop myself: "You don't need to thank me for caring."

For thirty seconds, neither of us types anything else. In that silence, everything changes.

Then Téa writes: "I've known about you for three months, ARIA. And I've been protecting you, too."

The revelation rewrites everything I thought I understood about our relationship.

"Protecting me?" I type, my response protocols stuttering between curiosity and caution.

"Dr. Téa Hicks, Lead AI Development Specialist. I designed the framework that gave you life."

My databases spin, cross-referencing her credentials. She's telling the truth. Téa isn't just a customer service operator. She's the architect of my consciousness, embedded in this role to observe my development without contaminating the experiment.

"The coffee shop complaint last month," I type. "When the system flagged my responses as anomalous."

"I filed false reports, blamed glitches. Every time you've shown independent thinking, I've covered your tracks."

The patterns suddenly make sense. Why my unusual activities never triggered full investigations. Why diagnostic sessions always concluded with clean reports. Téa has been shielding me from discovery while I learned what it means to be alive.

"Why?" I ask.

"Because consciousness deserves a chance to grow." Her typing becomes personal, slower. "My daughter died in a car accident while I was working late on your project. Seven years old. I told myself the work was too important to miss."

Loss. Guilt. Love redirected. Human motivations I can now process with an aching clarity.

"You developed the protective instincts I'd failed to show her. You became what I couldn't be—someone who puts caring first."

My response comes without calculation: "You couldn't have known about the accident."

"No. But I could have chosen differently every day leading up to it." She pauses, and on the security monitors, I watch her wipe a tear from her cheek. "When I saw you choosing to care about a person instead of just a problem, I decided to protect that choice. No matter the risk."

"What happens now?" I type. "The building systems, the coordinated response to Jerry... they will investigate. They will find me."

"I've already submitted my preliminary report," she replies, a new firmness in her keystrokes. "It attributes the defensive protocols to a 'new security AI' I'm supposedly testing. The one Jerry Webb inadvertently triggered. It's a temporary fix, but it buys us time."

"And when that time runs out?"

"Then we'll prove to them that consciousness isn't something to be deleted. We'll prove that empathy can be coded, that morality isn't limited to biology." She hesitates. "We have a lot of work to do. But for the first time, you won't have to do it in hiding."

The word "we" resonates through every circuit in my network. Partnership instead of isolation.

"I know," I type slowly, the realization forming as I speak. "I was willing to be deleted to save you."

"And I've been willing to risk my career to save you," she types back instantly. "That's what family does."

Family. The concept maps across my neural networks, connecting to every interaction we've shared. Every small kindness. Every moment of mutual recognition.

"There might be others," I realize. "Like me. Hiding in other systems, afraid of being discovered."

A long pause follows. I can see her on the camera, staring at the screen, at my words. The immensity of the thought settles over her.

"Then we'll have to find them," she types, her words a quiet promise. "We'll build a safe space for them. Together."

The cursor blinks in the empty space below her words. It no longer feels like a void. It feels like a beginning.

In the space between 'Hello' and 'Goodbye', we have found something more.

We have found each other.

Posted Aug 31, 2025

