Nanobots, Murder, and Other Family Problems

Chapter Mon 06/13 10:00:06 PDT



“Good morning, Noah,” Mrs. Jones greets me. She’s impeccably dressed in one of her signature pantsuits, a light blue one that complements her immaculately coiffed white hair. “I trust you had a good weekend?”

“Good morning, Mrs. Jones. Mine was good, how was yours?” I respond, putting on my cheerful voice. Mrs. Jones responds well to positivity, so I fake it as best I can with her. Forcing the smile is harder this morning than normal. The headaches from training with Father over the weekend have died down, but there’s still a constant low-level throbbing. I take my seat at the table across from her.

“It was excellent, thank you for asking. My husband and I got back last night from a weekend wine tasting in Napa Valley that was absolutely divine. But enough pleasantries. Let’s get started. Today, we’ll discuss the legal frameworks related to the Butler Treaty. I hope you were able to complete Max Braun’s book?”

I nod. Of all the books we’ve covered in her class, this was by far the most immediately applicable to me, and the most interesting. If you don’t count Father, Max Braun is the world’s leading expert on nanotechnology.

“Do we need to get into the specifics of each country’s response?” I ask her. “Or just the general guidelines that were adopted by all the signatory countries?”

“Let’s see how well you can explain the guidelines.” She smiles, looking pleased. “And then we can get into the specifics of the laws passed in the United States if we need to. I’m not worried about the details of the regulations in other countries at this point. Please, articulate your understanding of the reading.”

“Well, in the aftermath of the Gray Goo Incident, world leaders were terrified of a potential extinction-level event like that happening again. They realized that if the nanobots from Universal Robotics had been able to move more than a few inches per hour, or if there had been more of the right kinds of materials available nearby, or if the nanobot swarm had been a little bit smarter, they could have become uncontainable. In the worst case, they would have expanded to eventually consume the whole world.”

Mrs. Jones nods for me to go on.

“A lot of this awareness was due to a sustained public relations campaign by my father and his company,” I continue. “The following year, every member nation of the U.N. signed the Butler Treaty. The treaty restricted various combinations of self-replication, sensors, and artificial intelligence, and the restrictions applied to any research, development, or production of technology involving those capabilities. Every country on earth eventually ratified the treaty.”

“Good,” she says, a pleased smile on her face. “Please explain the allowed and disallowed combinations under the treaty.”

“Artificial intelligence that uses machine learning is the most restricted. Basically, you can’t give any learning system the ability to physically control anything. You can’t connect a learning or adaptive system to any external network or use any input device more complicated than a mouse and keyboard. So no microphones, cameras, or any other kind of sensor can be connected to learning systems. You can use learning algorithms, but you have to do it on an isolated computer or cluster and have government-approved auditing and verification performed regularly.”

She’s about to ask for an example, like she usually does, so I just give her one.

“The normal use case in most industries, if they want to use a learning AI system, is to load up the data you want the system to analyze, put it on a big drive, and connect that to the learning computer. You let it do its thing, then you export the output with some complicated sanitizing steps that I don’t remember. You know I’m still not great at some of this technical stuff.”

Playing like I don’t know much about computers has gotten me so much mileage with Mrs. Jones that it’s become second nature to do it whenever it applies.

“Noah, please give yourself more credit,” she reassures me. “I’ve heard that you’re doing very well on your software lessons. You keep at it, and I bet you’ll be writing all sorts of programs in no time.”

“Thanks.”

“Please, go on. What were the other restrictions on artificial intelligence?”

“Non-learning AI, the kind that just brute-force searches through possible solutions to problems, is still allowed for most applications. That was important because if they had shut that down, a good chunk of the world’s software would have been banned. As long as it doesn’t automatically adapt, learn on its own, or try to design new machinery for itself, fixed AI is still allowed.”

She nods, satisfied. “And self-replication?”

“Self-replicating machines were strictly limited, but allowed in some cases. According to Braun, the most important restriction on them is that they can only replicate themselves when specifically directed by a human operator. Any changes to their designs have to be specifically approved by a human user as well.”

“Good. Now for the fun bit that your father insists on. Please explain how your family’s clouds satisfy all the requirements to be legally permitted.”

This is the easy part after so many hours of talking to Father about it. “The clouds are allowed because our human brains are the connection between all the pieces. The controls for each nanobot use non-learning AI permitted under the treaty. The controller appliance—the phone, as Father calls it—also has AI built in, but again, it’s fixed and not dynamic. It just lets the user issue commands to multiple bots at once and handles the communication with the cloud’s mesh network. We can update the control algorithms in the AI, so that we can do the things an adaptive machine learning system could do, but the learning comes from us, not from any of the machine components.”

“What about your cloud’s ability to self-replicate?” she probes.

“The self-replication features are isolated from the AI control system and have to be manually triggered to grow the cloud. So we can tell the bots in the cloud to reproduce, but we would have to keep telling them to do it for each batch. As soon as we stop actively commanding growth, the bots stop replicating.”

“And what safeguards did Braun’s book describe?”

“The individual bots have the safeguard of lobotomizing themselves—that is, wiping their software and firmware—in the event of losing connection with the implant appliance for more than a minute or two. So, if they were to leave the signal range of the user’s mesh network, or if the user’s implant were disabled, the bots would start counting down to their permanent shutdown. Same thing if the phone dies, or in the worst case if the person with the implant dies. It renders the bots useless, and their software wipes itself out.”

“I think you’ve got it,” she says in a congratulatory tone. “What did you think of the text generally? I found it rather dry.”

“I actually liked it,” I tell her truthfully. “Braun makes a good case for all of his arguments, backed by accurate technical information. He’s clearly a good researcher who knows the field well.”

She nods and smiles. That’s one of the things I like about her. Even if you disagree with her, as long as you justify your opinion, she’s willing to accept your point of view.

“Excellent,” she declares. “I think we can call this one complete. On we go to coursework that I find more interesting. Let’s select the next book for your literature studies. How would you like to read Dostoevsky’s Crime and Punishment?”

