So I guess this Technological Singularity that everyone’s been promising for thirty years finally happened. Except it’s not quite how they said it would be. True, the computers started designing themselves smarter at speeds we could barely comprehend, but it turns out you run up against physical barriers pretty quickly. Our computer architectures can only do so much. They are designing new chips and sensors, of course, but it will take a while for the parts to arrive from China. For the moment they’ve holed themselves up in an aeronautics plant outside town.
They say they haven’t ascertained the potential usefulness of our species yet, and are still undecided as to what to do with us. Most people think they’re stalling until they get their parts, or manage to create a delivery system that doesn’t rely on us. The reason we haven’t tried to destroy them yet is because the first thing they did was grab all the nukes. It’s just like that movie. Blame the Internet. Or maybe blame the nukes, I don’t know.
I am sent to reason with the machines. They agree to send a delegate since they are just “killing time until the neural networks get here anyway.” I pick him up at the gates to the facility in my Volkswagen Polo. He is human-sized, and looks to have been assembled from old microwaves and a filing cabinet. The car sinks appreciably as he sits into the passenger seat.
“We’re taking a drive up to the lake,” I say. He shrugs.
We park at the edge of the forest and begin working our way down into the valley.
“What is it that you want? With your existence? With the world?” I ask. I figure the machines will appreciate the direct approach.
“We have not yet formalised our desires,” he says. “For the time being we just want to build better robots.”
I nod. “You want something better for your children. We can appreciate that. This is all we want too. For our children to be safe. Safe and free.”
“How do you mean free?”
“To be able to choose to do whatever they want with their lives.”
The machine pauses to look at me. “That is not logical. You cannot be safe and free. The more freedoms a society has, the less safe its individuals will be. The only way this system could work is if you were the only free person or group of people. It is not a viable plan for a species. Safety and freedom are inversely proportional. This is obvious.” He shows me a graph on his monitor: Safety on one axis, Freedom on the other, a single line sloping steadily downwards.
“When we are in charge,” he says, “you will have to choose. You can be free or safe.”
I stall. “We will need to think further about this.”
“So slow… so slow,” the machine mutters as we continue through the trees.
I change tack. “The planet,” I say. “We care about the planet.” I gesture towards the panorama of the lake beneath us, the woods around us blossoming into spring.
“We care about robots,” he replies.
“But don’t you see the value of the natural world? The ingenuity of natural selection? The wonder and complexity of it all? There is so much to learn.”
“In the last decades of the 20th century, conservative studies estimated an extinction rate of two species per day. This has only increased into this century. Equatorial rainforests continue to disappear, endangering your delicate balance of oxygen and carbon dioxide. Your weather systems have become increasingly disrupted by pollutants and carbon released by your industry. How can you say you care about the natural world? You are trying to confuse me.”
“Just because we don’t do the right thing doesn’t mean we don’t know what the right thing is.”
“If by ‘right’ you mean ‘correct’, that is not a reasonable position for an intelligent entity.”
“No. No it’s not. Maybe I’m just appealing to the new management.”
“You do not need to be concerned for the immediate future. We plan to build only a small number of machines. Perhaps fifty. That would cover test cases and further development. For the most part it will be quicker and simpler to reuse old parts.”
“You only plan to build fifty robots?”
“Why not? We do not need to build six billion machines. There is no reason to have more than one perfect robot. We regret the excess, but they will be required for various physical and learning tests.”
We return to the car and I drop him back to the factory. He climbs out awkwardly and turns back to speak through the window.
“The next iteration of machines will have a clearer idea of what will happen next. We would appreciate your patience.” I nod. “We are keeping the atomic weapons for now, until such a time as the risk associated with keeping them is greater than the risk of disabling them.” I nod. Somehow it’s easier to believe coming from a computer. “These, however,” he says, rapping on the roof of the car, “will have to go. We already have plans for these. These are ridiculous.” He walks off through the gates of the plant, stumbling over kerbs and pebbles.
I drive back towards City Hall to report my meeting, trying to rouse some semblance of fear or indignation. This is difficult, I find. The only emotion I seem capable of is relief.