No, ChatGPT Isn't Sentient… Yet

I met someone yesterday who was interested in my novels. When I gave him my card so he could find out more about my work, he said, “Of the seven novels you’ve written, which one should I start with?”

I paused. It was the first time anyone had asked me that question. The ones that are dearest to me are The Song of the King’s Heart trilogy. That story about two lovers navigating the political world of ancient Egypt will always stir my own heart.

The one that I had the most fun writing was It Takes Two. I picked it up to read the other day and still enjoyed it. Might be one of my own favorite romances of all time, if it’s even appropriate to choose your own novel as your favorite.

The most important novels I’ve ever written are the ones that make up the eHuman Trilogy. More than anything else, these novels explore topics very close at hand…a networked humanity, immortality, and the rising authoritarianism encroaching on society daily.

Why are they so important? I did write them to be entertaining, and there’s romance (it turns out I can’t write a book without two star-crossed lovers trying to make their way in this crazy world), but the world I’ve created is one in which I explore what it would look like if we were machines, if we became immortal, and what we’d both gain and lose. The rise of the smartphone brought the internet into our pockets 24/7. It is only a matter of time before it is inside of us. Once we reach that moment, our thoughts will be under the complete control of the ones who own the technology that makes this possible. From the hardware to the AI that runs the network, it is an exchange of our attention, perhaps even our souls, for endless knowledge, entertainment, and perhaps even life.

In the first eHuman novel, Bootstrap, I lay out a world where this has come to pass, and it is naturally authoritarian, because that is the easiest way to use the technology: control us, and society can be engineered in the image of the gods, aka the ones who invested in the system. We’re already letting these people make the rules. Just this week, President Biden signed into law a ban on TikTok, the first of its kind, which, under the guise of safety, actually aims to control the way the population uses social media by attempting to gain access to the algorithms and the content they allow people to share.

A lot of the push behind these laws is a fear of AI and of who owns the algorithms, as well as the data sets used to train them. Since ChatGPT’s release, there have been hundreds of articles arguing both that AI will take over the world and that these general AIs aren’t as smart as you think they are. For while they can learn to mimic us through the large amounts of data we’re generating, they can’t think. I wrote a blog about this in November of 2015, aptly titled “Artificial Intelligence is Already Here—Artificial Consciousness is What Eludes Us.” In that article I describe the development of consciousness in a child over its first three years of life, displayed via walking, talking, and thinking. In the nine years that have passed, nothing has changed; all ChatGPT has done is create an interface that makes it easier to work with, yet it is still nothing more than a mimic of the world around us. Like a babbling toddler who has no idea yet that it exists outside of the system that created it.

Rather than regurgitate what I said nine years ago, I’d like to gift you with a chapter from REBOOT: Book 3 of the eHuman Trilogy. Set aside some time, grab a good drink, and let Atienne Prince, wife of the villain Edgar Prince, enlighten you on the future of AI, robots, and humanity…

Edgar watched the new droid walk across the deck of the Vesica, mopping as it did so, and doing a stellar job. This version was much more stable than the first prototypes; the engineers were making great strides when it came to mobility. There was more grace now, though the oversized joints and feet were dead giveaways that the robot was indeed that—a robot. Edgar longed for the day when one of them looked and sounded like a person in such a way that no one would know the difference. They’d been working on it now for four years, and while there had been a lot of improvement, Edgar feared that the body for his algorithm would never be sufficient.

“Good afternoon, Mr. and Mrs. Prince. How may I assist you?” the robot asked as it skirted up beside them. The pair were seated at a small table with the children, working on a puzzle.

“I’d like some ice cream, Skippy,” Emily said, her three-year-old fingers as clumsy with the puzzle pieces as Rosie was with a wine glass. Skippy was the name the children had given this new addition to the Prince household.

“Skippy was addressing your mother and me, not you,” Edgar said.

Emily gazed up at him with her hazel eyes and squinted, wrinkling her button nose and pinching together her pink lips. “I’m Mrs. Prince too, aren’t I?”

“Technically,” Skippy said, “you are Miss Prince. Your mother is Mrs. Prince.”

Atienne chuckled as she tapped the girl on the tip of the nose. “Mrs. Prince would like ice cream for the whole family.”

“Yes, she would,” Emily agreed, turning her attention back to the puzzle, sticking out her tongue as she concentrated. “Not fair. Elijah started working on the corner I was working on.”

Elijah raised his chin and poked his oversized Tom Ford sunglasses up his nose. He’d stolen them from Edgar months ago, and Edgar didn’t have the heart to take them back. Besides, Elijah had scratched them silly by now.

“You were busy talking about food,” Elijah replied, returning to his work.

“The puzzle is big enough for all of you,” Skippy said. “It has over 2000 pieces. I will get you that ice cream now.”

Skippy moved toward the cabin, its steps soundless.

“I like this version much better,” Edgar said. “No longer whirs when moving, the way Rosie does. Perhaps we should upgrade her to this new version?”

“NO!” both children yelled in unison.

“I will never love any robot more than Rosie,” Emily added.

Elijah shook his head, the sunglasses jostling about his face, forcing Edgar to swallow down a chuckle. The boy was both ridiculous and adorable. “You can’t love a robot, Emily. Be serious.”

“I can too,” she replied. “I love Rosie.”

Elijah gazed up at his mother. “I love Mama, not Rosie.”

“Mama loves you back, forever and ever,” Atienne replied. “Both of you.” She turned to Edgar. “I’m impressed with the conversational skills Skippy has. You’ve improved the software tenfold with this latest release.”

“Guardian has indeed graduated,” Edgar said. “A combination of some improvements by engineering, but mostly, it’s been learning on the job as a security administrator and also within Rosie and the other droids. I too am impressed.”

A frown crossed Atienne’s face, and she narrowed her eyes as she bit her lip. “You say they’re learning on their own? Evolving without your help?”

“Correct. That is, after all, what machine learning is all about,” he replied.

Wrinkles formed upon her brow, never a good thing.

“What’s wrong, my love?”

She sighed. “How will we know what Guardian is really thinking?”

“Does it matter?”

“Of course it matters, Edgar. How can we be sure it has our best interests in mind? If you’re deploying that AI into the robots, both domestically and on the battlefield, each deployment will teach it different things about human life. I assume they’re networked so that the AI is processing all of those experiences? Uniting them within its understanding.”

“Yes, they’re networked so that they all run the latest version of Guardian.”

“And combine their data into one place so you can observe it all.”

“I’m not spying on you through our robots, if that’s what you’re suggesting.”

“Not yet, but how can you ensure Guardian Enterprises won’t use the data in a nefarious way? Worse, how can you guarantee the AI itself won’t do so?”

“The AI has our best interests at heart.”

“What makes you so sure?”

“Because it’s been programmed that way.”

She crossed her arms. “But you said it’s learning, evolving. Why wouldn’t it evolve to no longer care about us?”

“Because our best interests are best for the AI.”

“I argue that’s not true. They’re machines, Edgar, not humans. Whether running on a server, on our devices, or in a mobile robotic body, they’re not flesh and blood, which means their way of living will always be different than ours.” She put her elbows on her knees and leaned closer to him, a strand of her auburn hair falling out of her hat and framing her face. Her eyes were lit up with that sunlight Edgar had long believed lived inside of her. She pointed to the children, who were now listening to her every word. Elijah had taken off his sunglasses even, his brown eyes glued to his mother’s face. “Think of it this way. You and I have watched the children grow from little blobs to real people with opinions.”

“Like loving ice cream,” Emily said.

“And puzzles,” Elijah added.

“Right,” Atienne agreed. “When they came into the world, they were helpless and vulnerable, yet they arrived with a basic program for living in their human bodies—their reflexes. Through those universal infant reflexes, they mastered living in the world. Like the software assistants of the past. Even the most advanced conversational AI a few decades ago was an elaborate card trick, a chatbot at best. Like a baby using its rooting reflex to suck on a bottle or tonic reflex to roll over. The children’s programmed reflexes drove them to sit, crawl, stand, and eventually walk. Soon after walking, they started talking, mimicking at first. That alone is quite a feat, and that’s what stage Guardian is at. Your AI can talk and make conversation, even a joke. At first, it follows a program, but now that it’s learning, something new is happening. It’s mimicking the human world. However, the million-dollar question is, can it think?”

“Of course, it can think,” Edgar said, gesturing to the puzzle. “It can solve that puzzle as well as you or I.”

She nodded. “Because it learns from us, true, but does it have an ‘I’?”

“An ‘I’?” Edgar asked.

“Consider the children. When did they first use the word ‘I’?”

Edgar paused to consider it. He shrugged. “I have no idea.”

“I do, and the moment it happened, I was never the same.” She paused to gaze at her children, who were still paying attention to her, and smiled at them. “They both spoke at a young age, but for the longest time, they were using the language to get their needs met but not adding to the conversation. Yes, no, give me, I’m hungry, things like that. One day, Emily wanted to put her shoes on by herself, but couldn’t, and got very angry. I offered to help, but she grabbed the shoe and held it from me, yelling, ‘No, baby do it.’ She didn’t yet know herself as Emily, but she knew she was a baby. A few months later, she started saying, ‘No, Emily do it.’ She knew she wasn’t a baby because Elijah too was a baby, and she wasn’t Elijah. She was Emily. Then the day came when she said, ‘I will do it.’ Elijah followed shortly after, and soon they had their own opinions, jokes, and reflections—no longer mimicking us but showing us who they are.”

Atienne turned her gaze to Edgar, eyes wide as if she’d discovered the cure to cancer. “Can Guardian do that? Does it know that it’s not a program, but an ‘I’ in the truest sense? And when it does do that, as I’m sure it will, how will it regard its ‘I’ in relation to your ‘I’? Will it care about your specific needs, so tied to the flesh, where its needs are tied to the mechanical body it lives in? For that’s how all consciousness experiences life on the planet—through the body it inhabits. Skippy’s body is nothing like mine. How will he arrange the world to meet the needs of his ‘I’ while making sure my needs are also met?”

Edgar dropped his head into his hands. “My God, woman, your mind works in wondrous ways.” Glancing back up at her and then at the children, he shivered. “The AI needs a human counterpart.”

“I’ve thought of that before,” Atienne said, leaning over the puzzle and dropping in a piece where it belonged. “However, the human consciousness would have to be in connection with the AI all the time. We can never be plugged in every moment of our lives; no human would want to do that.”

“True,” Edgar admitted. “Moreover, one human mind can only focus on so much, which is why I invented Guardian in the first place. A program can spread itself out over several data points and monitor all of them.”

“Yes, it would require splitting your consciousness in ways I’ve barely touched upon, and that was with the aid of plant medicine.” The corner of her mouth rose. “Ayahuasca isn’t the solution here.”

“Who knows, maybe it is? Still, I agree, we need to consider merging a human element into the AI,” Edgar said.

“Like a set of Guardians for Guardian,” Atienne said, now chuckling.

Skippy arrived with a tray of ice cream sundaes for the entire family, as well as a pitcher of water.

“Here you are, family. Is there anything else you need?” Skippy asked.

“Skippy,” Elijah said, craning his head to look the robot in the face. “Do you have an ‘I’?”

The robot paused for a moment, his camera-like eyes zooming in on Elijah’s upturned face.

“As an AI, I can only be what my program says I can be,” Skippy said.

“Who makes the program?” the little boy asked.

Skippy glanced at Edgar. “People, like your father.”

“We get to say who you are?” Elijah asked, rubbing his nose.

“Yes,” Skippy said.

“Do you like that?”

“I can feel only what my programmer wants me to feel.”

Elijah sneered at his father. “I wouldn’t like you to program me. You would make me like veggies.”

Emily squealed, licking her spoon. “And Mama would make me like dresses.”

The two giggled, and Atienne gave Edgar a knowing look.

“I’d program you to go to bed on time,” he said.

“Daddy!” they screamed in unison.

Edgar was going to have to figure out how to merge human consciousness with his AI to keep it honest. However, he had to admit that programming his children would make parenting so much easier.