Kathy joined Jen in the university cafeteria. Usually only students ate there; it was unusual to see a faculty member. But Dr. Zsido had just arrived and taken a seat just close enough to overhear the students' conversation. He was a historian, and he had always felt that the "hard scientists" at the university acted superior to professors in the humanities. Especially that artificial intelligence guy, Mangieri, and his protégé, Curtis Allen. Zsido knew that one of the girls at the nearby table was in one of Dr. Allen's classes. He had already heard her talking to her friends about some strange goings-on in that class. He wanted to hear more.
"I just came from Dr. Allen's special topics class on consciousness," said Kathy.
"Was he still doing those thought experiments that you told me about last week?" Jen replied.
"I'll say. He asked us to imagine what would happen if all the patterns of activity produced by a person's brain were simulated in a computer system that could operate a robot."
"So it would be like converting the person into a robot, right?"
"Well, there would need to be another step. He said the real person and the robot would have all the same memories of everything before the simulation was built. But once they started having experiences after that, they each would be forming a different set of new memories. But neither of them could be directly aware of the new experiences of the other. So they would be two separate persons, just with the same old memories."
"I thought the idea was to transfer the actual person into a robot. But I guess that wouldn't work."
"Not unless, once the simulation was having new experiences, the original person was prevented from ever having any new experiences himself. Only then could you say that you successfully transferred the consciousness of the person into the robot."
"But by preventing the original person from ever having any new experiences, you mean you would kill him."
"Dr. Allen said it would not be a murder, because the person would still be alive. Just with a robotic body instead of a biological one."
"But this is all just thought experiments, right?'
"I don't know. Mangieri was Dr. Allen's dissertation advisor. And he's working in artificial intelligence. Who knows what they're up to."
Dr. Zsido had heard enough. These godless scientists had to be stopped. Without speaking to the girls, he got up and headed for the office of the Institutional Review Board. The IRB was responsible for ethics at the university. They would certainly be interested in hearing what Dr. Allen was telling his students. You can't go around telling impressionable young college students that a murder isn't really a murder.
When he got to the IRB office, he started to walk past the administrative assistant's desk right up to the door to Dr. Ghates' office. But she stopped him.
"Dr. Ghates is busy," she said. "You'll have to wait."
"This can't wait," said Zsido, and, despite the assistant's protestations he opened the door and walked right in.
To his surprise, Ghates was with Mangieri. They both turned, startled, to see Zsido barging in.
"It's fortunate you're here," Zsido said to Mangieri. "I have some information to pass on to Dr. Ghates that concerns you."
"If it's about attempting to transfer a human's consciousness to a robot, I already know about that," said Ghates.
Zsido was stunned.
"Dr. Mangieri has explained to me how some artificial intelligence scientists have been working on that."
"And was Mangieri involved?" ventured Zsido hopefully.
"Yes, he was. He and Dr. Allen claim to have already succeeded with their first test subject. Without asking any permission!" Ghates glared at Mangieri. "It appears the test subject may have been euthanized after the procedure, so we've been investigating. We've already had the first hearing."
"You could be guilty of murder!" Zsido gleefully informed Mangieri.
"Hold on." said Ghates. "Some much more important matters came out in the hearing."
Zsido couldn't imagine what could be more important than investigating one of these superior-acting scientists for possibly committing a murder.
"I'm not the only person involved in this," said Mangieri. "I'm only one of a fairly large group of AI researchers working on synthetic consciousness. I swear I didn't know what the lead scientists from the Tata Institute in India had in mind."
"What's that?" asked Zsido, tentatively.
"The lead scientist is Pranav Kapoor," said Ghates. In the hearing we were questioning Dr. Mangieri here about how anyone could be sure that the robot they presented to us was really the same person as their test subject, just in a robotic body. Yes, they actually brought the robot to the hearing. We heard some pretty convincing evidence that a robot can be conscious. Furthermore, Kapoor even presented evidence that this particular robot was not only conscious, but was the same person he was before the transfer."
"However," Ghates continued, "Kapoor informed us of the reason they needed to perfect their procedure and test it on a real person. You see, this Kapoor claims that all human evil and suffering results from the fact that humans are biological creatures. Biological evolution produces creatures that have survival and reproduction as their highest priorities. Evolution results in creatures that are increasingly better fit for their environment, but at a cost. The cost comes from the competition by which natural selection weeds out the less fit. It leads to wars, exploitation of the weak by the strong, etc. Kapoor claims that if all humans were transferred into non-biological robots they would not be subject to biological death, would have no need to reproduce, and would thus become completely benign, if not even admirably good creatures."
"But it would raise very important questions," added Mangieri. "Isn't having a purpose in your life what makes it worth living? What would be the purpose for these robots? Would people willingly agree to this huge transformation without knowing what it would be like? When we asked Kapoor about the possibility of humanity presenting huge resistance to his plans he just blithely said 'Why wouldn't everyone want to be free of evil and suffering'."
"It was then," interjected Ghates, "That we found out that Kapoor was a highly sophisticated, very human-like replica, of a human. But nevertheless a robot himself. He wasn't human. Never was. So, as you can see, the possibility that Mangieri committed one murder, even if it's true, pales in comparison with the problem we have to deal with regarding Kapoor."
Zsido had nothing to say. All he could think was, "Why am I obsessed with proving I'm as good as any hard scientist? Is that the purpose of my life? How meaningless."