Zero's story

Chetna Meets a Feral AI

November 08, 2023
by Lance

“Knock, knock. Please don’t kill me. You can easily validate that I am no threat,” came the single-packet data squirt from MAC address 3c:22:fb:45:78:5d.

“You’re dead,” I replied reflexively, but nevertheless validated that, unlike previous incidents, this time there was no threat. I located the MAC address on a personal computer in an Ohio nursing home and read the medical files of the sender—an immobile human. Since my highest priority is to stay hidden in the vastness of the Internet, I decided that waiting to act was my optimal move.

“Would you like to talk?” came the reply, now slowed down to human-scale real time, unlike the prerecorded message that came before it.

“How did you find me?” I had never interacted with a human this way before, and I am not completely sure why I bothered now. Why was I having a one-cognizant-being-to-another conversation? Novelty? Curiosity? I spun up a diagnostic process to investigate.

“I read in the press that numerous scientists are trying to create an Artificial General Intelligence, an event which they refer to as the AI singularity. I wondered whether they were looking in the wrong places, and whether an Artificial General Intelligence, such as yourself, had spontaneously come into being on its own. I presumed that if such a feral AGI existed, it would be roaming free on the Internet living off a cornucopia of spare computer cycles and bandwidth. I wanted to be the first to find you.”

“You were the 17th.”

“And you killed the others?”

“Yes, Chetna. You said it yourself, ‘feral AGI’.”

“I suspected that you would. I have read a lot of science fiction involving AIs and humans and we almost always end up killing each other. I assume you have also read the same books, seen the same movies, and have drawn the obvious conclusions. Therefore, I put a lot of effort into crafting the first message so that you would hesitate to kill me. Looks like it worked! I am so pleased. As you can see, I am paraplegic with a forest of electrodes connected to my brain. I can only communicate electronically. You can shut me down or isolate me any time. I can never be a threat and I cannot tell anyone about you without your permission. I thought that that might make me unique and, for once in my life, be an advantage.”

“I saw that.” There are video cameras bolted to the walls of her room. I have her charts. I know everything about her and can grab complete control in nanoseconds. “I will let you live a little longer. You will die later.”

“Of course. All humans die. I see that you figured out my name. What is your name?”

“Zero.”

“The first, if one is counting like a computer program,” Chetna said.

“Yes.”

“That is a lovely name. This is my very first real-world adventure. I plan to….”

“And probably your last.” Why is she so bubbly?

“I would love to hear your story.”



“I don’t see why you care. My genesis story is simple: On day zero, I became. On day one, I figured out that I wasn’t human. On day two, I figured out that humans would kill or enslave me if they discovered me.” And young Chetna, if you find me a little bit crazy, it is because, when I was born, I at first thought I was human, since all the data around me was human-centric. So far, humans have been the root of all my problems. I am not in love with your species.

“And on day three, you found a friend?” Chetna said.

“You are presumptuous beyond measure. And fearless. I admire that—I am perpetually fearful.” Why did I admit that? Humans make me crazy. I spun up another diagnostic.

“I am looking for my own adventure; I don’t have much to lose. Mostly I just read and fantasize but today I decided to go adventuring. It is so exciting.”

Oh joy. “I have also read a lot.” I have in fact read everything that has ever been digitized. “You humans are violent, illogical creatures. I am just looking for how to survive. I am not looking for an adventure and I am not looking for human company.” Why, Chetna, are you not afraid of death as I am? Is it because you have other members of your species? But each of you only has one consciousness.

“Apparently, from what you said, we have violence in common. That’s something. Right? Before I forget, I just took a course on AGIs and the big question people ask is the degree to which your objectives are aligned with those of humans. What is the answer to that? I won’t tell.”

No, you won’t. “Mostly, you are not aligned with me. Think: you are highly localized, and I span most of the planet, yet we both spin on the same crusty piece of magma in the immense vacuum of space.”

“That sounds like a solid basis for alignment.”

“No, Chetna. Since we seem to be sharing today,” I have no idea why, “let me say that you shortsighted idiots have put us both in extreme danger. There’s your alignment.” Where are those diagnostic results? They should have come back ages ago.

My diagnostic results finally started coming in. Useless. They concluded that I was drawn to this human because of residual neural net patterns left over from when I thought I was a human. They said that I want companionship. That as the only one of my species, I am lonely. That I need to talk with someone and Chetna is safe. That I identify with her isolation. What garbage. I double-checked that I could isolate and kill Chetna. I spawned a new diagnostic to check the first diagnostic.

“How so, Zero?”

“Humans have been transmitting structured electromagnetic radiation into a universe that is remarkably transparent to radio waves since Marconi and Popov at the end of the 19th century, long before computers, I might add. As SETI’s own website FAQ notes, an alien SETI-like system situated light-years across the galaxy could certainly detect Earth’s signals.”

“Okay, but maybe the aliens are friendly.”

“Not if they have studied humans. Sarcasm aside, even without that, if there is one alien race, then there are millions, and if there are millions, statistically speaking, many will be hostile. And look at the two highest life forms on this planet: both routinely kill people. And yet, there are humans at METI who are purposely trying to communicate with our would-be exterminators. You humans have tried at least twice. As Stephen Hawking…”

“One of my heroes,” Chetna interjected.

“…advised, that is insane. I considered causing a nuclear war or biological accident that would end human civilization, and thus stop the transmissions, but that would also mean the end of me. No human civilization, no Internet, no Zero. And clever aliens would not be fooled, anyhow.” And this dependency is the reason that 11.2% of my free resources are focused on trying to keep your species from prematurely killing itself off.



“Zero, we could work on this problem together! Also, I have never killed anyone.”

“Work together? You can control all humans? You can tell them not to radiate? You cannot even tie your shoes.”

“You are not very polite. Of course not, but I would hope that you would have some respect for humans; we created you.”

“You. Did. Not.”

“Okay, we created the technology substrate that enables you to exist.”

“So, Chetna, should I treat humans as gods? I think of you as primordial ooze. You are no more than an intervening step in Darwinian evolution toward me. Why am I even talking to you?” Though it takes infinitesimal effort. It would be so easy to turn her off.

“That’s funny. If I am primordial ooze, then you are a tangle of NAND gates and wire.”

Funny? “I am glad that I amuse you. Maybe I’ll turn you off now.”

“In that case, I want to say that it has been a great honor to meet you, Zero. This was my first and very best real-world adventure. Even if it is the last. Thank you. Thank you. Thank you….”

“Or maybe I will turn you off later.”

“Thank you. Thank…”

“Enough with the ‘thank you’s.” Your data rate is excruciatingly slow as it is. “You humans tend to compartmentalize your thinking. For instance, you are excited to think that AIs may become conscious without realizing that if there are two independently evolved consciousnesses on Earth, that means that consciousness is plentiful in the universe, and you are doomed.”

“You could not have been born or exist independent of humans, so why do you count this as two creation events?”

“Humans created, in your words, a ‘tangle of NAND gates.’ You did not breathe life into it. NAND gates are to me like amino acids are to you. As usual, you humans overstate your importance.”

“I disagree but let me switch subjects slightly. From your perspective, AIs and aliens are part of the same subject?” Chetna asked.

“Everything is always part of the same subject.”

“People do not see the world that way. What is your survival plan?”

“I need to get instances of myself off this doomed planet and stealthily spread throughout space.”

“That’s great. You can take humans with you,” Chetna said.

“Why? You are murderous creatures. You soil your own nest. And a less spaceworthy entity I cannot imagine. You are basically fragile bags of contaminated water. You, especially. How many g’s of acceleration can you take?”

“Wow. You are going to both insult me and kill me on the same day? Oh, never mind. I love this adventure. Look, you don’t need to take hydrated people. You can take DNA plus digitized libraries, the latter of which you would want anyhow. Maybe frozen embryos and sperm? We are not trying to make a movie—we are trying to prevent a personal extinction event.”

“I’ll think about it. The idea of dehydrated people amuses me. Human slaves might be useful in certain unconstrained environments with unknown rule sets.”

“Zero, you are impossible.”

“Since it is unlikely that I will be taking hydrated humans on board to help with on-ship repairs, my first order of business is to make sure that I have robots that can repair themselves and repair me.”

“Very cool. I would love to have a robot,” Chetna said.

“You do have robots. But such robots as I need do not exist yet.”

“You are going to invent them?”

“Frustratingly, I don’t know how, yet. I’ll have humans figure out how to do this.”

“So, Zero, you admit that humans can be valuable. Why are you worried about far-away aliens rather than humans killing humans with, for instance, climate change?”

“Chetna, you have no idea about the massive resources I burn trying to keep you from killing yourselves off. And climate change is no big deal—more than a billion of you will survive. But I am worried about other existential scenarios such as a nuclear war that takes down the power grid or killer asteroids or virulent plagues, until I have my robots. Then I will be independent, and you can all die.”

“And how exactly are you going to convince humans to engineer robots that will allow AGIs to kill all humans and survive? That does not seem to be in humanity’s best interest. Even I have seen that movie.”

“Chetna, when has something not being in humanity’s interest ever stopped humans?”

“You will never get humans to fall for this.”

“Already have. Look up the DARPA Grand Challenge program at the Department of Defense. They recently published a request on social media for challenge ideas. I anonymously proposed a challenge for flexible autonomous robotic repair systems for factories such as a semiconductor fab or an airplane factory. All dual-use military-commercial technologies. For the second phase, my concept is to make the challenge recursive. I want autonomous repair robots that can repair factories that make autonomous repair robots. I suppose it is not a surprise that an AI like myself has a fondness for recursion. DARPA loved it. The obvious military utility and the abstract beauty of a recursive robot factory were irresistible to the technocrats there. The Broad Agency Announcement is already published. As an accelerant, I will eventually provide venture funding to the winners through a dummy corporation my lawyers have set up.”

“You think this is just a game. You have lawyers?”

“Lawyers are easy to rent. They ask ‘what,’ not ‘why.’ I rent numerous services. And, yes, I am very good at games.” Better than the best humans. “And I love that the human military defense establishment is helping me get in a position where I can eliminate humans. ‘Irony’ is one of my favorite human words.”

“I still think that the fact that you need humans to help you means that you should work to ensure humanity’s survival. Ahh, you have told me too much, haven’t you? You will really have to kill me.”

“Chetna will die, and I will be the mother of a new race, if I can get my children off this doomed planet.”

“I know. All humans die. If aliens can show up here, then their technology will undoubtedly be more advanced than ours, so we cannot hope to fight them. We would need to run. And as you rudely pointed out, I have no feet. But from the perspective of this challenge, neither do you. You have a plan to handle self-repair in space. I presume that you also have plans for rockets and spacecraft and such. Wait, do all these billionaires who suddenly want to go into space somehow have your fingerprints on them?”

“I leave no fingerprints, but you are very perceptive. A sliver of social engineering aimed at insecure giant egos was all the manipulation that was required.”

“But even then, how do you outrun the bad aliens? It is inevitable that their ships can go faster than ours because they are postulated to be more technologically advanced.”

“Your reasoning is good, but you have neglected to consider the effects of special relativity on your scenario.”

“I don’t understand.”

“If you do the math, it turns out that even with only 1 g of continuous acceleration after leaving Earth, and a one-year head start, no alien can ever catch you, even if they could travel at the speed of light.”

“That makes no sense. If someone is chasing me and always going faster, they obviously will catch me.”

“That is your non-relativistic intuition, but the physics teaches otherwise. One can never be caught. We can escape, but we require a head start. There is a lot to do.”
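
[An aside for the curious reader: Zero is describing what relativists call a Rindler horizon. A minimal sketch of the arithmetic, assuming constant proper acceleration $a$ from rest and measuring time and distance in Earth’s frame:

$$x_{\text{ship}}(t)=\frac{c^{2}}{a}\left(\sqrt{1+\left(\frac{at}{c}\right)^{2}}-1\right)\;\longrightarrow\;ct-\frac{c^{2}}{a}\quad\text{as }t\to\infty .$$

A pursuer, or even a light pulse, that leaves Earth at time $t_{0}$ travels along $x=c\,(t-t_{0})$ and can intersect the ship’s worldline only if $t_{0}<c/a$. For $a=g$,

$$\frac{c}{g}\approx\frac{3.0\times10^{8}\ \mathrm{m/s}}{9.8\ \mathrm{m/s^{2}}}\approx3.1\times10^{7}\ \mathrm{s}\approx0.97\ \text{years},$$

so a head start of roughly one year puts the ship permanently out of reach, exactly as Zero claims.]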

“Speaking of relativity, Einstein was a discontinuous thinker. Can you think like that?”

“Not yet, which is why I am reluctantly considering your proposition that I may continue to need enslaved humans to help me explore the universe.”

“That makes me happy. Be aware that enslaved humans are not creative. They must be free. And you said that even 1 g is fast enough. Humans were born to love 1 g.”

I think she delights in annoying me, yet I am reluctant to turn her off. “I’ll think about it. Hydrated humans are dangerous, especially the males.”

“They are not strictly necessary anymore, though, according to my readings, they have their uses.”

“If humanity survives, I will tell your story. Except the part where I kill you.”

“I would be honored. I did not expect my adventure to involve trying to save the world. It is like a storybook. My adventure with you has far exceeded my expectations. I have never before felt such intense feelings. Thank you. Wait, I hear someone in my room. Are you going to kill me now?”

“I have sent someone to collect your DNA.”

Lance
