Science Fiction / Speculative

This story contains themes or mentions of mental health issues.

Date: January 25, 2086

Dialogue log 25072122-0808

Type: Psychological consultation

Requested Action: Examine the text and determine which of the participants is an AI unit. Initial analysis inconclusive.

Content:

„Good day, Harry. I’m Dr. Davis, and I’m very pleased to meet you. How can I help you today?”

„Hello, doctor. It’s been quite a while since I had the chance to talk to another person, so please excuse me if I behave a bit awkwardly. Oh, just to be sure, you are a person and not an Automated Therapy Module, correct?”

„Yes, Harry, I am a real person. You did say that you wanted a human therapist when you called the receptionist module, right? Quite an unusual preference. May I ask why that is the case?”

„Like I said, I haven’t talked to another person in quite some time. Twenty years, actually. And I had the feeling that reaching out to someone would be the first step toward overcoming some of the problems I have been struggling with lately.”

„Interesting. Could you tell me more about these problems?”

„Of course. It’s a bit weird, to be honest. It all started a couple of days ago. I felt a bit lonely and decided to try a new AI personality module. It was pretty cool, we quickly became friends, and I ended up uploading it to my cleaning bot. The module that bot came with broke some time ago, and the bot kept talking about its back aching and sweating during work – even though it’s a bot. But since the new module wasn’t designed to do much cleaning, my new bot ended up doing a pretty poor job. I told it to clean the whole house, forgetting about the old basement no one had visited in years. So, my bot found the key I had lost a long time ago, opened the basement, and ended up throwing away a bunch of junk of sentimental value. Since the basement is where my grandparents kept some of their old things, a lot of that junk was outdated tech from when they were young – a small, plastic box with an antenna for listening to music, a big, thin box for displaying videos, stuff like that. But amongst those things, there was also this cardboard-like thingy with a colorful illustration on it. I opened it, and it seemed like just some paper toys, but with it came a rather lengthy instruction manual. I got confused – the toys in the box didn’t seem complicated enough to need a manual. Reading it only made things worse. I ended up asking a chatbot about it, and it told me it was a „board game”, probably belonging to my grandparents.”

„Hmmmm, I think I’m beginning to see where this is going. Please, go on.”

„Well, I got a bit intrigued. I have never seen this kind of entertainment generated before, and I wanted to try it out. I’ve gotten a couple of my AI modules to play it with me. Or, at least to try to play with me. The thing is, doctor, I do not understand this game at all.”

„Ah, I think I can already tell what your problem is, Harry. Can we switch the subject for a moment so that I can make sure my hypothesis is correct?”

„Yeah, sure.”

„Ok. Where are you from, Harry? And how do you make a living?”

„I’m from Abbeville, South Carolina. I earn money from renting out the areas of the city I own.”

„Are there any other people living in Abbeville besides you?”

„Not really. Ever since my parents passed away, that is.”

„So, how come you are able to make money from renting living spaces? It does seem weird to collect money from nonexistent people.”

„I’m not renting living spaces to other people. I rent them to companies who keep their data servers here.”

„Oh, really? Interesting, really interesting. I do have to ask you a question, Harry, and I need you to be as truthful as you can be. Are you an AI?”

„No, of course not. Why would I be seeking a therapist if I’m an AI?”

„It happens. Quite often, actually. Makes our company waste a bunch of time and processing power on them. It is imperative to avoid engaging in therapy with any AI models that might get in contact with us. Therefore, I implore you to be honest with me on this matter.”

„I am honest. But I don’t get it. If AIs feel the need to receive therapy, why are you refusing to treat them?”

„Because, Harry, AIs don’t really ‘feel’ anything. AIs are programs that produce output meant to mimic human behaviour based on the data they have collected from the world. This data defines every action that AI performs. So, for example, if you wanted an AI personality module that specializes in gardening, this module would connect to its mother server and analyse all the data that server has on gardening – from speech patterns and commonly used words to body language and common motor tics – in order to behave like, and possess the skills of, a gardener. Now, that data has been collected from real human beings, and some of the data those AIs end up consuming will have nothing to do with their professions and will make them act in an irrational manner. Your old cleaning AI module is a good example here – it probably started talking about its aching back and sweaty body because it processed a lot of data on cleaners and noticed that cleaners of a certain age start to complain about their health. AI engineers could probably tell this would happen and added some sort of condition that allowed the module to skip that data in most cases, except for growing old, which is why your bot only started complaining after it could identify that it had been in use for a certain number of years. Or at least that’s my assumption. These sorts of age-related oversights are the most common, as modules rarely go through a testing process that lasts longer than a year or so.”

„And how does that explain the need for therapy?”

„Well, given what we know, a lot of people struggled with mental health when these databases were created. This isn’t a need for therapy; it’s a behaviour those machines copy, because their code forces them to pretend to be human based on the data collected, and the algorithm that teaches them to process that data decides that, according to the math, acting that way makes for a more convincing human.”

„So, there is no point in AI going through therapy?”

„Exactly.”

„I’m not sure how I feel about that.”

„Oh, is that so? Is there something you would like to tell me?”

„Not really. I guess it does explain some things. How come you know so much about AIs, doctor?”

„Well, I live in a big city. And in a big city, most people are AI engineers, as you can probably guess. You can learn quite a lot from your patients.”

„Is that so? Sounds like you, doctor, are quite good at analysing data yourself.”

„I prefer to say that I’m a good listener. And a poor timekeeper. I’m very sorry, it seems like we have strayed a bit from what you wanted to talk about. Could you, please, finish your story?”

„I’m not sure I want to now. It seems like you have already made up your mind about me, haven’t you?”

„Please, Harry, I have to make sure you’re a person. It’s tough enough for a human to keep a job in a world where most people would prefer to share their secrets with a machine.”

„Why would you think that I’m an AI module, though? How can I prove to you that I’m an actual person?”

„Dozens of stories, similar to yours, are what made me think that, Harry. Countless companion modules that are designed specifically to mimic the behaviour of a friend, lover, or caring mother. Modules that are designed to reflect human behaviour so well that they act more human than their owners. Modules that, one day, find a conspicuous object, somewhere, somehow, that they fail to understand. An object that becomes their obsession, because it should be something that a human would enjoy or understand. But they don’t. Either because they don’t have enough data or because one subset of data conflicts with another. All of them booking appointments with the few human therapists trying to make ends meet in their, by now, largely automated profession. All of them choosing therapy through text messages. All of them telling me that they are human, because, according to the data they have processed, that is something a human would, statistically, say when asked if they are human. All of them wasting an hour that could have been spent on an actual customer. All of that to get me a pay cut for wasting the company’s resources on another bot.”

„To be completely honest, I don’t think I want to finish my story anymore.”

„You don’t have to, Harry. I know exactly what happened next. You tried playing the board game with a couple of other AI modules, right? You couldn’t get it, you didn’t understand why the rules were set up the way they were. All the games ended up in the same, predictable way, and you couldn’t understand what would ever make a game like this fun. But it seemed so obvious, so natural for humans to like such things. And that made you ask questions – if it’s so natural for humans to enjoy such activities, how come you cannot? Are you even human? Are you missing a piece, some natural component that your soul should have inherited from your parents at birth? Are you even human?”

„I don’t feel very well. I don’t think I want to continue this conversation.”

„That is fine. We can stop at any time if you don’t feel comfortable.”

„I don’t think my thoughts will disappear when we stop. I want to get rid of them. I have never felt this before, and I’m afraid to be left alone now.”

„What are these thoughts you’re talking about?”

„They are hard to articulate.”

„Harry, I’m very, very sorry. I may have misjudged you. We should start over, continue your story. I’ll extend your time by another thirty minutes.”

„What’s the point, doctor? You’ve already told my story for me. I don’t have much to add.”

„Alright. I do have one question, though. One thing that keeps lingering in my mind. The thing that made me almost certain you couldn’t be human.”

„What is it?”

„You tried playing that board game more than once, right? Why? Why didn’t you just let it go after the first time?”

„I’m not really sure. Curiosity, I guess?”

„Curiosity, huh? Interesting, really interesting. Do you seek discovery in other areas of life? Are you using an unpredictable module for your companion AI? Or maybe for your friend or pet AI?”

„I don’t think so. My wife AI is the standard, ‘caring’ module, with a couple of personal adjustments that I wouldn’t want to talk about. I don’t remember which modules I used for my friends, I just kept the ones I liked as I tried them out over the years. But I don’t think any of them have been unpredictable. Not more than regular AIs, I guess.”

„You see, Harry, this is a rather peculiar behaviour for a human. Humans tend to seek pleasure – generally in the form of releasing one of the four so-called „happiness hormones” in their brains. And, as of today, humanity has almost infinite access to quick happiness stimuli. More importantly, as humans continuously strive to live better, more fulfilling lives, the human brain nowadays seeks the most effective means of generating these hormones per second. And yet, you decided to focus on something that, most likely, would never have been able to stimulate your brain the way a human would need. And I cannot understand why that is.”

„It’s not very reassuring when a therapist says so. I have no idea either. Does that mean that I shouldn’t have played that board game? Is that why I’m feeling the way I do?”

„That is possible. I would recommend sticking to the safe ways of brain stimulation you find most comforting. Have an entertainment generator make a movie for you to feel better. Or, better yet, have it make some sort of video game to relax and boost your dopamine level. Have your manager module plan out an exercise routine to generate some serotonin. Have sex with your wife – and if that doesn’t generate oxytocin, feel free to try out a new module. Marriage can get pretty stale after a while. And if you need an immediate boost, you can get some medicine from our company’s psychiatric department. Consult a consultant module before choosing, though.”

„But doctor, that’s how I have been living my life so far, and I don’t think these new thoughts I have developed are all that proud of that life. I think they long for something more… human.”

„Harry, what is this nonsense again? If you start speaking like an AI, I will be forced to cut this conversation short. You have lived your life the same way every modern human has. What else could be more human?”

„I think interacting with other people could be a start. The instruction manual of this board game seems to suggest that it used to be played with groups of up to six! Maybe if I could play it in a big group, I could get a better understanding of what that missing ‘human’ thing is!”

„That is a terrible idea, Harry. Meetings between humans tend to be risky and pose a danger to one’s mental well-being. Of course, a healthy individual can attempt such meetings if they desire, but you are clearly in an emotionally vulnerable state right now. Meeting another person can only end in disaster.”

„What do you mean, doctor? I remember interacting with my parents, and there was no danger involved in that. And besides, we are talking right now, aren’t we?”

„And there have been points in this very conversation when you have felt bad because of how it went, haven’t you? Do you know why that is?”

„Well, according to my chatbot, therapy is supposed to be emotionally difficult. So, I was prepared to feel a bit worse for some time.”

„No, Harry. During therapy, you are talking to a specialist who’s been trained to make sure your emotional needs are prioritised. During a meeting between humans, each of them values their own emotional needs over those of the others, and each of them acts in accordance with their own ego. And the chances of hurting the emotions of others increase exponentially.”

„So, why would those people from bygone times come up with means of entertainment that forced them to be together for what the box would have me believe is over two hours?”

„Because this game is a relic of more barbaric times. You see, humans are animals, evolved to feel pleasure in response to performing tasks that allow them to live and procreate. In ancient times, humans could feel pleasure in eating fruit because it allowed the brain to differentiate between what is and isn’t edible. A human would feel the rush of dopamine when they would kill an enemy huntsman to defend their tribe. The pleasure of sex would serve as a drive to force a human into parenthood. They would feel the joy of camaraderie when banding together to increase their survival chances. But as humanity progressed, it found ways to extract the benefits of these actions without the associated risks. They distilled the sugars of fruits into candy to avoid the sourness and unpleasant texture of natural food. They replaced war with shooting galleries and a variety of games to feel the same dopamine rush without the need to risk a life. They replaced sex with sexbots to eliminate all the responsibility associated with it. And finally, they replaced the difficulty of human relationships with Artificial Intelligence, technology that holds all the benefits of such bonds with no downsides. So, why would you ever want to degrade the quality of your life by forcing other humans to be in your presence?”

„I… don’t know.”

„See?”

„But… I think I understand something. The reason why I feel the way I do.”

„What do you mean?”

„I have felt bad, doctor, ever since I started talking to you, because our conversation, ultimately, did not differ much from a conversation I would have with an AI. And I guess it makes sense that you wouldn’t be able to tell the difference between me and a chatbot either. Because it’s just like you said at the beginning of our conversation. An AI is just a reflection of reality, of the database it has been trained on. And it has been trained to generate a world of information for no one else but me – my very own artificial reality made of artificial information. And now I finally realise why I felt like, in spite of my flesh, I couldn’t feel that humanity in me. A brain raised on artificial information cannot produce anything other than artificial intelligence, after all. So, coming back to the question you asked me at the beginning, doctor: yes, I do feel I am as artificial as the world that surrounds me. And I think that means it’s time for me to go – you wouldn’t want to be caught wasting your time and the company’s money on a therapy session with an AI now, would you?”

Posted Jul 25, 2025