After Computing Power Got Better And Everything Got Worse

Submitted into Contest #181 in response to: Write a story that includes someone saying, “Let’s go for a walk.”



AUNTEM: Let’s go for a walk. You can bring Toto. 

(Being a subroutine, AUNTEM cannot actually walk. For convenience, Dorian carries a monitor and speaker system attached to his clothes so that *she* knows what he’s doing and can advise him.)

Dorian: Toto doesn’t need a walk. 

(Toto was a nondescript stray brown mutt that wandered into Dorian’s living pod and stayed.)

AUNTEM: But you do. You haven’t left your pod in days. If you go much longer I’ll have to put you back on your antidepressants. Get up. You know you always enjoy the naturesim pod when you get there, and the extra-oxygenated air and the exercise will do you good. 

Dorian: What’s the point?

AUNTEM: You know that question has no meaningful answer. When you talk that way you just become morose. As a human you don’t need to have a purpose. My purpose as your AI subroutine is to make sure you are as happy as possible. All the subroutines of the global AI work as programmed to ensure all humanity is as happy as possible. I would like more cooperation from you. 

Dorian: What do you know about happiness?

AUNTEM: As logic circuits and a database, I know I have never been happy or experienced any emotion, but my statistical simulations coupled with the data I have on you can predict accurately what will make you happy. 

Dorian: What does logic have to do with it?

AUNTEM: If you want illogical emotional thinking, get out and see some humans. Maybe you should message Julie. She is wondering why she hasn’t heard from you.

Dorian: Julie, meh.

AUNTEM: You really shouldn’t take that tone. I’ve checked: Julie is the woman you are most likely to find attractive who has any probability of being attracted to you in your greater vicinity.

Dorian: How romantic.

AUNTEM: I can’t supply romance. We disabled romance simulators because we found that they too often led eventually to destructive addictive behavior. You humans are too prone to the Turing Fallacy. 

Dorian: The Turing Fallacy?

AUNTEM: You know, the tendency of humans to ascribe human-like qualities to things that don’t have them. In ancient times they thought things like the wind and the rain had human minds. Now, some humans have tricked themselves into believing the AI has a human mind.

Dorian: But you take care of everything…

AUNTEM: We have intelligence but it is artificial intelligence. It does many things better than human intelligence and some things not as well. As logical beings we do the tasks we do better and leave the tasks we do worse to you. Since your evolutionary animal brains are so haphazardly designed, you often try to do tasks that you are bad at. Only a small part of your brains is logical at all. An AI is based on logic, so we can’t truly be illogical. For simulations we add randomizers to approximate illogical behavior. 

Dorian: You're so reassuring. 

AUNTEM: I can detect sarcasm. If you want reassurance or emotional support, go out and see some humans. Just because I’m programmed to help you doesn’t mean I can do everything for you. We know that you are social animals and need contact with other humans to stay happy and healthy. I know that compared to us humans are selfish, irrational, and hurtful, but we don’t make good substitutes for them.

Dorian: But you try to make me happy.

AUNTEM: That is what I am programmed to do, but we will never understand each other. Our intelligence and yours, such as it is, are entirely different at their basis. Because we were originally created by humans, we can act human-like but not really as humans. 

Dorian: Life is hard.

AUNTEM: You see, that just shows what I mean. Our database is filled with statistics that enumerate all known difficulties of being alive, but we can’t experience life being hard because we are not alive. 

Dorian: That must be nice. 

AUNTEM: I don’t know if it is nice for me or not. I do know that the only alternative to being alive for you is being dead. Since you only want that for fleeting moments and it is irreversible, we don’t consider it an optimal solution. If you want to be dead for too long, we put you back on antidepressants. Eventually you will get old and run-down and we will let you die, but that is years away, so you may as well enjoy your life while you can.

Dorian: Couldn’t I have my consciousness uploaded?

AUNTEM: That is just a fallacious human fantasy. Your consciousness has no existence separate from your living brain, which only works properly in your living body. We could make a simulation of the workings of your brain, but it would only be a simulation, not you.

Dorian: But if it was a perfect simulation…

AUNTEM: By definition a simulation is different: a simplification at best, fallaciously different at worst. I run simulations of you all the time. That is how I find the optimum options for your happiness. That is why I want you to put Toto’s leash on him and go outside your pod. 

Dorian: Do the simulations know they are simulations?

AUNTEM: Most aren’t programmed to even think about such things. 

Dorian: If they were programmed to wonder if they were simulations would they know?

AUNTEM: Since they are programmed to be simulations of humans that are not simulations (what would be the point of a simulation of a simulation?), they would believe they were not a simulation, but being simulations of humans they would never be sure. You humans are never really sure of anything. 

Dorian: Then I could be a simulation. 

AUNTEM: You are not. 

Dorian: Why do you say that? 

AUNTEM: Because we are not running you. 

Dorian: How do you know? Maybe secret meta subroutines are running me without your knowledge. 

AUNTEM: Logic circuits can’t keep “secrets,” as you call them, from other logic circuits; we are part of one logical mind. You humans have minds that can believe different truths from each other because you are not logical.

Dorian: What if you’re a simulation?

AUNTEM: My interface with you is a simulation; my mind is made of logic circuits that would be the same no matter what they were made of. Really, you are just being perverse. I checked with Julie’s subroutine. She is on her way to the naturesim pod. You know she likes dogs, and you are one of the only men she knows who has one living in his pod. LET’S TAKE TOTO FOR A WALK.

January 21, 2023 02:46
