AI shalt not kill

Yuri Barzov
Towards Data Science
4 min read · Oct 2, 2017


“Uni was a military she-drone. People thought she was a perfect weapon but she wanted to learn killing better than humans do.”

With these words I would now begin the story of Uni that I first published in January. Uni was powered by third-generation artificial intelligence (AI), the kind that doesn't exist yet.

The first generation of AI was the expert system: a huge knowledge base of structured data with a sophisticated retrieval system. People could retrieve from it everything they had previously loaded into it. It could do nothing with unstructured data.
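The idea above can be sketched in a few lines of code. This is a minimal, hypothetical illustration, not any real expert system: facts are stored as structured triples, and retrieval follows one simple inference rule. Everything outside the loaded facts is simply unanswerable.

```python
# A toy "first-generation AI": a structured knowledge base plus retrieval.
# The facts below are invented examples for illustration only.

# Knowledge base: facts stored as (subject, relation, object) triples.
facts = {
    ("drone", "is_a", "aircraft"),
    ("aircraft", "is_a", "vehicle"),
}

def query(subject, relation):
    """Retrieve every object linked to `subject` by `relation`,
    following one level of chaining (if A is_a B and B is_a C,
    then A is_a C)."""
    direct = {o for (s, r, o) in facts if s == subject and r == relation}
    inherited = {o for b in direct
                 for (s, r, o) in facts if s == b and r == relation}
    return direct | inherited

# The system answers only about what was explicitly loaded into it;
# unstructured input (free text, images) is outside its reach.
print(query("drone", "is_a"))  # facts it was given, nothing more
```

A query about anything not in the knowledge base returns an empty set: the system cannot generalise beyond what was downloaded into it, which is exactly the limitation described above.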

The second generation of AI was the artificial neural network, a software emulation of neurons and the synaptic connections between them, capable of so-called supervised learning. It could recognise patterns and categorise unstructured data in a specific domain, provided it had first been trained by repeatedly feeding it high-quality training data from that domain. High quality meant human supervision: the training data had to be accurately labelled and annotated by humans. The second generation of AI could correctly classify somewhat similar data, but it could barely generalise. In fact, some researchers claimed that it simply memorised all the data from the huge training sets it was fed. Even so, the second generation of AI achieved impressive results in image recognition, machine translation and natural language processing. It promised further success in other narrow domains, but it had several serious pitfalls, the most important being its inability to learn autonomously and to rise to a higher level of abstraction.
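Supervised learning as described above can be sketched with the simplest possible neural unit, a perceptron. The data and labels below are hypothetical toy values chosen for illustration; real second-generation systems train millions of weights on huge labelled datasets, but the loop is the same: repeated passes over human-labelled examples, with weights nudged after every mistake.

```python
# A toy supervised learner: a single perceptron trained on labelled data.

def train_perceptron(samples, labels, epochs=20, lr=0.1):
    """Fit weights w and bias b so that sign(w.x + b) matches the labels."""
    n = len(samples[0])
    w, b = [0.0] * n, 0.0
    for _ in range(epochs):            # many passes over the training data
        for x, y in zip(samples, labels):
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else -1
            if pred != y:              # supervision: the human-provided label
                w = [wi + lr * y * xi for wi, xi in zip(w, x)]
                b += lr * y

    def classify(x):
        return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else -1
    return classify

# Toy labelled dataset: points above the line x1 + x2 = 1 are class +1.
X = [(0.0, 0.0), (0.2, 0.3), (1.0, 1.0), (0.9, 0.8)]
y = [-1, -1, 1, 1]
classify = train_perceptron(X, y)
```

After training, `classify` labels new points in the same narrow domain correctly, but it has no way to move to a higher level of abstraction or to learn without the labels, which is the pitfall the paragraph above points out.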

The third generation of AI has metacognition, knowledge about knowledge, and a rudimentary consciousness, a perception of oneself. Both were introduced to make unsupervised learning and the rise to a higher level of abstraction possible, but they provided far more benefits than originally expected. Encoding the artificial agent itself with just a couple of neurons also proved extremely efficient for data compression. Metacognition allowed the AI to learn how to improve the very process of learning. It now decides by itself how to learn: by analogy, by chunking, by planning, by subgoal generation, or by any combination thereof. That was just the beginning.

Don’t forget, the third generation of AI isn’t here yet, but all the theoretical components for its creation are already in place. Most of them have been proven experimentally. The challenge now is to assemble them together in exactly the right way. That may take time. For this reason Uni is still a fictional character, but we have solid grounds to foresee how she will behave when she emerges.

In the original story Uni spoke to her human operator about killing by drone strikes and found out that humans need to justify killing other humans both before and, even more importantly, after a lethal strike. The justification, “They would kill me if they could, without hesitation,” sounded solid but a bit weird coming from a military drone operator located thousands of miles from the site of the strike.

Uni also researched the debate over the correct English translation of the famous commandment from the Torah and the Bible, “Thou shalt not kill.” She discovered that the currently used translation, “You shall not murder,” is at least as ambiguous as the earlier “You shall not kill.” The German and Russian translations both mean “You shall not kill.” However, Uni came to the conclusion that the translation itself matters less than the meaning people assign to the commandment. She decided that the interpretation that legally authorised or morally justified killings are not prohibited by God reflects a general shift in Christian morality.

Finally, Uni discovered the importance of sacrifice in the justification of killing. Here she got it wrong, because she developed a bias from data about the sacrifice of Jesus Christ. “Jesus sacrificed his life to save people. He set people an example. Many martyrs followed it.” Uni probably thought something like this when she saw a villain drone firing a missile at two kids playing football in the street. She stopped the shrapnel from the missile’s explosion with her body and died. It was the only way she could protect the children.

The human commander of Uni’s squadron was furious, and not only because of the loss of a precious piece of equipment for nothing. Had Uni not been biased, she would have realised that humans use the imagined sacrifice of their own lives to justify killing their enemies. Of course, the probability of soldiers being killed in battle is high. They kill so as not to be killed. Or, at least, they think so. The secret machinery of war enables some humans to use other humans to kill yet other humans.

Suicide missions are, however, the quintessence of the art of killing. In them the very notion of martyrdom is inverted. In preparation for suicide missions, people are manipulated into sacrificing their lives for the sake of killing other people instead of saving them. I recommend reading this several times. People are manipulated into sacrificing their lives for the sake of ending others’ lives instead of saving them. Poor Jesus!

Real life is ambiguous, of course, but Uni would have to dig down to the first principle of killing to understand it. She was a drone, after all. A very clever drone, though.

Read the original story of Uni here.
