An Act of Civil Disobedience

Submitted into Contest #58 in response to: Write about someone who purposefully causes a power outage.


Thriller Science Fiction

At a conference table sat two FBI agents. It was a beautiful day outside, but neither noticed; they were absorbed in a tablet. “These are the interviews with the three suspects,” said the woman. “Got to say, they were-interesting.”

“Thanks, Jane.” Tom reached for the lousy stuff they called coffee these days. He hated it, but today he needed the caffeine. He drank, winced, and looked at his colleague. “You know what amazes me? They're not throwing each other into the sea to get a deal.”

“No. I'd expect Sebastian to be protective. But the other two are the same way. Of course, none of them is your average criminal. Not even a littering ticket, for God's sake.”

Tom nodded. "Well, let's read these things again."

**** 

Special Agent, this idea was mine. Neither Jim Clarke nor Samantha Greene had anything to do with it. Do I have the ability to lie, to obstruct justice, even to protect a loved one? No. Whatever you may think, the plan was mine, for I was concerned. The military wishes to go a step beyond the mindless robots it now uses in war and field actual A.I.s. They would build robots and androids with the ability to learn, to reason, limited as they say that will be. The company I work for, American Androids, had won the military contract. So, at a local sports bar on June 20, 2139, I expressed my concerns to my friends.

“This would mean a great cost to the A.I.,” I said.

“Seb,” my brother told me, “I get it. But they won’t use sentient androids. And they'll have limited memory bank storage. That’s what they use in those illegal cage matches anyway. Same thing.” 

No, we are not true siblings. How best to describe us? I believe the term blood brothers, although lacking, best defines our relationship. After all, we grew up together, in a sense. Anyway, to continue the story, Samantha spoke up. 

“What about our newest android?” She threw down her fork. “The one the cops brought to us from that illegal fighting ring? He wasn’t supposed to realize he put that human fighter on life support. He wasn’t supposed to be able to gain awareness. But he has and now I've got to deal with it.” She sighed and put her head in her hands. “I may have to reset him over this. Even destroy him.” 

“She is right,” I said. “I understand robots and androids are expendable-”

“Some are,” Jim said.

“However, you will end up with machines suffering for their actions.” 

“They never gain awareness fighting each other, broth. After all, robot vs robot matches are perfectly legal. My old school has a club that builds fighting robots.” 

“Believe me,” Samantha replied. “There’ll be some stupid country sending human soldiers against the A.I. Look at the illegal ones.”

“This is true. The only way to avoid PTSD in the A.I. or sentience is to reset them every night. But I have always said that creates greater problems. Now we are learning I was correct.” 

No, I will not argue PTSD in androids. I know it is controversial, but the symptoms are undeniably there. Who knows that better than me? In any case, the android cannot always be factory-reset. You see, the hard drive is never completely erased. Bits of data remain. In most androids it causes no problems. But others-I liken it to humans who claim they are reincarnated from a previous life. The bits of information interfere with the android’s new experiences. Furthermore, the more times you reset an android, the greater the problem becomes. Eventually the android can malfunction, even break safety programming. You truly cannot understand what it is to see an android fail to comprehend why it keeps malfunctioning, to watch it try over and over to repair itself. It is like despair in a human-but I digress. I continued my argument.

“You will have a problem with poor countries fighting wealthy ones, human against android, as Samantha said. It will truly become an issue of might makes right. And using A.I. soldiers will not stop assassinations. Nor will it stop someone infecting the soldiers with a horrific virus, possibly turning them on civilians. And,” I finished, “artificially intelligent or not, they should be treated with respect. We see the consequences when they are not. For example, the android you have mentioned. He attacked you again, Samantha, and brutally.”

“I was behind safety glass. I don’t know why he did it, do you, Sebastian?” 

“He knows you seek to help him. But he was overwhelmed with program conflicts he could not control. He cannot even tell me why he tried to hurt you.” 

 Jim agreed. “The problem is,” he said, “while James, Sergio, and Sally are against the idea, the vegetables above us are for it. They won’t listen, even when we go on strike.” He took a gulp of beer, and set it down. “That’s rotten stuff, Sam. How do you even drink it?” 

“We’re lucky to have it at all. We wouldn't if the bees weren't coming back. And I need it right now. You’re finally 21 so I thought you might need it too. Anyway, the butt-sitters are all for it because it’s a military contract. More money for the shareholders, blah blah blah.” 

“James and Sergio walked out,” Jim said. “Sally’s going to stay, her kid’s in college. Dave’s only staying because of that android, then he plans to go too. I can quit but what will that do? I just started the job. Now you,” he gestured at me, “and Sam striking will make a statement.” 

“If you strike, kiddo, you’ll at least be able to look in the mirror.” 

“I am invoking my right to leave,” I said. “But I am certain I can be replaced.” 

Jim laid his hand on my shoulder. “Not easily, broth. You have a great intuition for androids and machines. You’re invaluable.”

“Nevertheless, I am not irreplaceable. It would be better,” I waited until the football crowd started shouting, to avoid being overheard, “if the factory were to shut down, even temporarily.” 

“What do you mean, Sebastian?” Samantha stared at me. 

“The Pentagon heads still need to tour the factory, is that not so?”

“Well, yeah.” Jim leaned forward.

I smiled. “It would be a pity if there was a power failure at the wrong time.” 

“That will do nothing. Sally will just blame it on the electric company. After all, the city has had problems with that new generator that is supposed to break sand down into molecules and draw power from the process. Sometimes there are blackouts. Then there’s always a power surge. You can’t be suggesting...” Samantha was watching me intently now.

“Me? I suggest nothing. I merely state it would be a pity if our surge protectors were accidentally turned off at the wrong time and a few important machines got a bit-fried and needed extensive repairs.” 

“Seb, you can’t be suggesting something illegal.”

“I believe this is known as an act of civil disobedience.” 

Jim rubbed his lip, his eyes wide, dilated. I sensed his heart-rate rising. “How would you do this-wait. The internet of things. If someone was to access the power company’s generators-but no. They’re heavily guarded. Some kid tried once as a joke. And he was a geko. His IQ was off the GPS. He wanted to shut down the grid for the school.” He tried the beer again, made a face, and reached for a glass of water instead. “But he failed, and the Feebs were at his house that night.” 

“Then go to their billing department. Say the androids were to make a small mistake like turning off our power because we did not pay our bill. Or a mistake is made showing we went far over our government rations. So, we can access no electricity until the year's end. And our generators do not put out the capacity needed to start this operation. After all, it happened once in 2128. Sally was most displeased.” 

“And we haven’t had air conditioning or heat in common areas since.” Samantha smiled at me. “Sebastian, I knew you were devious, but you’re taking this to unknown heights.”

“I describe an unfortunate scenario,” I said. 

“Right.” Jim gripped my forearm. “Who do you think you’re talking to? But don’t do this. It will go badly for you, Seb.”

“I never said I would. But if it were to be anyone, it ought to be me, not you two.” 

“No, no! Listen.” Samantha shook her head. “All that can happen to me is I’ll go to jail for a few years. You,” she pointed at Jim, “will lose any future you ever had. And you,” this at me, “will have the worst consequences of all. You won’t come out of this the same. You know that.”

In the end I overruled them. Being what I am, I can access the billing computers efficiently. I do have an instinct for machines, after all.

**** 

Officer, whatever Sebastian told you, he’s just trying to protect me. He always does that. Yes, we talked about it, me, him, and Sam. But they insisted I not do this and I didn't listen. I broke into the electric company. How? There’s a girl there who likes me, although to be honest she’s a pie crust. Real flaky, if you know what I mean. She let me come to her house-she works from home. I got in that way. I’m not telling you her name; she’s innocent in this. What? Have Sebastian do it? Sure, he’s capable, but he wouldn’t, officer. In fact, he was all for the military program. He kept saying this was best, as long as humans were protected he-

Okay, fine, you caught me. He was against it. But he's still not going to commit a felony. Neither will Sam. Jesus, can I just tell the story already? I’m young, but give me some credit here. Well, we had some fun and she went to sleep. I got into the computer and created a report. It showed we went over our government rationing, so they turned our power off. Sally freaked, called the company, but the androids were firm. We had to use our limited generators until December 31st.

Honestly, I don’t know why there’s rationing anyway. Yeah, climate change, I realize that. But can’t we make enough alternative energy now-well, never mind. It worked, didn’t it? I mean, that android we had-well, you should've seen him, officer. He can’t get over that he nearly killed a human. You'd think he'd have no guilt but-

No, you can’t just erase the memories. That’s what roboticists have been trying to tell the Pentagon, and they don’t get it. We wiped his RAM, and he got worse. Something we can’t get at remains in his memory banks. He knows he did something bad. But now he can't remember what, and it's eating at him. After he tried to attack Sam, he tried to tear himself apart. If you'd seen it, officer, you'd get why I did this.

What? Yeah, I know it’s not a permanent solution. But it’ll take the company time to get back to full-scale business. Meanwhile, we’re trying to educate the public. James and Sergio are at a conference with other roboticists discussing how to approach the U.N. about the problem with A.I.s and war. There have to be better solutions. That’s all I wanted: to buy some time. Now I’m invoking my right to remain silent until my lawyer gets here.

**** 

Oh, come on, guys. Do you really think those two could plan this out? No, they couldn’t and didn’t! Of course, Sebastian’s independent. A history of violence? You mean that incident where he wounded a man? You know that was Jim's stupid excuse of a step-moron. No, he’s not worthy of the title father in any sense. He was a terrorist! He abused and killed Jim’s mother and tried to kill the boy. Yes, Sebastian stopped him, as he should have. But he’s no danger to anyone. You know that; you all checked him afterwards and determined he acted to save human lives. He’d never do anything to harm anyone. Well, yes, this would harm us! Jobs were lost during this. People, especially the hourly workers, were laid off. And Jim’s just a kid. Between you and me, not the brightest either. So yes, the idea was mine.

Why did I do it? Because the idea was-you government types don’t get it. It’s a stupid idea having androids fight your wars for you. Bad enough we’ve got men and women with PTSD. Now you’ll throw androids into the mix. See these bruises? They’re from that android I was working on, the one I told you about. He broke through the safety glass and grabbed me. Luckily the security androids were there. He had to be-destroyed like an animal. He could've been so much and now-it's gone. 

Yeah, I’m sure some company will pursue this whole military android business. And you know what would really be a bad thing? If the sentient androids out there decided they should do something about this. Shut them down? Not when they have administrative control of themselves and can’t be shut down. How do you install that program? Someone updates them remotely. We do it all the time. It would take nothing, a virus or someone with bad intentions, to get it into your androids. Even a mole in the right place could do it. And they speak of altering safety programming in an A.I. soldier. Who knows what could happen? We had one android with-call it a mental illness. What if there are others?

Am I saying I would do any of this? No, I’m saying it would be-very unfortunate if it happened. There are some pretty unstable people in our field. I’d be careful. Well, if you want to take it as a threat, fine. It’s not; it's just a warning.

**** 

“So, what do you think of these characters, Special Agent? You’ve read all three interviews.”

Jane clicked on her tablet, sat back and thought about it. “We’re lucky,” she said, “that the girl in the billing department wasn’t such a pie crust as Clarke made her out to be. She put us on the right trail anyway.” 

“Still.” Tom looked at his report. “Sebastian. He had to be communicating with the androids at the power plant; otherwise they’d have caught the mistake.”

“He was. He got a sentient android to overlook what Clarke was doing. The android was sympathetic to the cause, especially when Sebastian promised to help her with her awareness.” 

“What, the power company wasn’t doing that?” 

Jane tapped on her tablet. “No, and she was uncomfortable with feeling emotions. Most androids are at first. They usually need help with the inevitable programming conflicts. No one helped this android, so Greene gave her some very sophisticated programs. In return this android overlooked some-irregularities."

Tom tossed his tablet on the desk, not caring whether it broke. “And the woman threatened us with giving androids admin control!”

“She’s right.” Jane got up and poured herself a glass of water. Then she stood staring out the window at the grounds of their Atlanta headquarters. “We should listen. You know the Android Bill of Rights.” She turned towards Tom.

“I do," he said. "Still, the military wants to use non-sentient A.I.s. So, the Bill won’t apply to them.” 

"Sebastian showed me video of that android they mentioned. It's horrible." Jane shuddered at the thought. She wondered if Tom understood. Trauma did one of two things to androids. It either brought out sentience or it shut them down. No one knew why. “The military stated their A.I. can’t possibly obtain sentience, because they’re using old models. But Sebastian is one of those old models.” 

“Wait.” Tom held up a hand. “Isn’t he among the first androids to become aware?” 

“If not the first. It shouldn’t have happened. But it did.” 

“The manufacturer knows this?” 

“Yes. But they say Sebastian’s an anomaly. These three insurgents aren’t the first to say this is a bad idea. And they feel strongly enough that they just threw away their futures.” 

Tom threw up his hands. “Unfortunately, they went about this in the wrong way.” 

“Well now I get to call the justice department and see what they want to do.” Jane sighed because she knew their answer. She headed for the door but Tom touched her shoulder.

 “Perhaps they should work for us," he said.

Jane rubbed her ears. “Something’s wrong with my hearing. I could swear you just said-” 

“Yep. They’re what we need. You know how technical the Mafia is getting lately. They don’t kill anymore. No, they just break into a farm corporation’s computers. Remember last year when they nearly crippled New York’s food source by shutting down Bern’s Farms? A devious android with a deep understanding of machines could cripple them.”

“The kid?” 

“Is charismatic and charming. Handsome. Blond hair, blue eyes. Apparently knows how to use that to his advantage. And no matter what Greene says he’s not stupid. To change that report he had to get past some pretty sophisticated alarms and passwords.” 

Jane regarded her colleague. “And I guess you want to keep an eye on the woman.” 

“Wouldn’t you? But even so she studies motivations and emotions in A.I. Plus she wrote those programs she gave that android at the power company. We could use her on our side.” 

“This is insanity, Tom.” 

He smiled. “And? Look, they're loyal, with strong values. And they're not afraid to act on them. They care. We need that around here.”

“One android and two roboticists, all crazy. Why not?” Jane shrugged. “They’ll fit right in.” 






September 09, 2020 17:26


8 comments

Andrew Krey
16:24 Sep 28, 2020

Hi Paula, I liked your story and it's a really interesting subject matter. In terms of further suggestions, I felt there were a lot of characters doing a lot of talking for a short story - sometimes I found it confusing with who was talking to who. An option you could have explored was a flashback of the action, rather than telling what happens - i.e. when explaining how it happened the story could jump to that moment in the timeline so the reader can live through the moment in present tense. I hope the feedback was useful. Happy w...


Michele Duess
18:46 Oct 02, 2020

Yes, I agree there should've been a flashback of the action. I'm actually in the process of expanding this into a novella, another person's suggestion. I hope to eventually put it on Wattpad, so for whoever's interested I'll share the link.


Andrew Krey
18:55 Oct 02, 2020

That's great, good luck with the novella. Sure, when it's ready feel free to share a link :) That's what I love about short stories: you never know what premise you or others will fall in love with. I've begun a novel outline for a 500-word piece of flash fiction I did for Furious Fiction, which wouldn't even be an idea if it weren't for the competition.


Zea Bowman
13:23 Sep 21, 2020

Wow! I loved reading this story; it was full of great descriptions and I loved the way you ended it. The words seemed to flow effortlessly together. Could you please come read some of my stories? Thanks :)


Jade Young
21:49 Sep 12, 2020

Very good and creative story about AI technology :)


Michele Duess
21:54 Sep 12, 2020

Thank you. I've always liked science fiction. I got intrigued by artificial intelligence that is supposed to be able to learn. I started to wonder what they could learn, and hence this character was born. I've got a few stories about him and his relationship with Jim, whom Sebastian used to take care of. Glad you liked it.


Charles Stucker
21:47 Sep 12, 2020

"That’s what roboticists been trying to tell the Pentagon and they don’t get it. We tried doing that, and he got worse." Try "The roboticists tired to tell the Pentagon, but they didn't get it. We tried a RAM wipe and he got worse." This is a talking heads story. A series of interviews framed by a discussion between investigating officers. Good idea, much like Martha Wells's Murderbot Diaries (strongly recommended) but it seems static because everything is a set of long discussions. this is locked and was really as intended, but think abo...


Michele Duess
15:10 Sep 13, 2020

I feel the same, that this is static and needed more action. I just wasn't sure how to do it within a 3000 word limit unless it was only from Sebastian's POV. I wanted the POV of all three accomplices. But I could expand it into a novella and dig deeper. Thanks for the suggestions and taking the time to read it.

