Thursday, February 18, 2016

News You Can't Use: Fairy Tales Teach Robots not to Murder

The future is exciting, as anyone who has ever seen a row of ostensible adults all staring down at phones grafted to their palms can confirm. Soon we will remove whatever debatable agency we still have in favor of robot masters, sort of like that Will Smith film but with fewer clever one-liners and more morbid obesity. The only concern is teaching the robots not to murder. Fix that minor hitch and it's smooth sailing into another cultural, spiritual and scientific golden age. Maybe telling the Kill-Bot 9000 the story of the three little pigs will help it to realize that smashing the weak flesh-bags is actually not optimal from a coldly rational point of view. I guess we could also not construct homicidal death androids, but get real, that has to happen. We'll just tell 'em stories, no problem.

Fairy tales perform many functions. They entertain, they encourage imagination, they teach problem-solving skills.

Problems like "how to get a partially digested girl out of a wolf's stomach" and "how to cut off your own feet to try to make a magic slipper fit," for example. 

They can also provide moral lessons, highlighting the dangers of failing to follow the social codes that let human beings coexist in harmony.

Whenever I watch the evening news I'm overwhelmed by all the harmonious coexistence my fellow meat sacks have achieved.

Such moral lessons may not mean much to a robot, but a team of researchers at Georgia Institute of Technology believes it has found a way to leverage the humble fable into a moral lesson an artificial intelligence will take to its cold, mechanical heart.

I guess that's a better use of the G.I.T. resources than playing elaborate computer pranks on Vanderbilt and trying to get fake letters published in Dear Abby.

This, they hope, will help prevent the rise of intelligent robots that could kill humanity, a danger predicted and feared by some of the biggest names in technology, including Stephen Hawking, Elon Musk and Bill Gates.

I thought Elon Musk was an aftershave. I do wholeheartedly support encouraging the Terminator not to indiscriminately slaughter by teaching it cool 1990s slang and how to give the thumbs up, that sort of thing.

"We believe story comprehension in robots can eliminate psychotic-appearing behaviour and reinforce choices that won't harm humans and still achieve the intended purpose."

We were able to achieve a 7% decrease in psychotic-appearing behavior and only three researchers had their throats crushed. This is the breakthrough we've all been waiting for.

Their system is called "Quixote", and it's based on Scheherazade, a previous project from Georgia Tech researcher Mark Riedl. Where Scheherazade builds interactive fiction by crowdsourcing story plots from the internet (you can read about that here), Quixote uses those stories generated by Scheherazade to learn how to behave.

A million internet monkeys all typing at once should produce the greatest literature ever, but until then let's tilt at the windmill of making the Violence Droid less overtly psychotic.

When Scheherazade passes a story to Quixote, Quixote converts different actions into reward signals or punishment signals, depending on the action. So when Quixote chooses the path of the protagonist in these interactive stories, it receives a reward signal. But when it acts like an antagonist or bystander, it is given a punishment signal.

We'll put the robots on some of that Clockwork Orange programming. What could possibly go wrong?
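For the curious meat sacks: stripped of the press-release gloss, this is just reinforcement learning with story-derived rewards. Here's a minimal sketch of the idea as I understand it, not Georgia Tech's actual code, and every action label in it is invented:

    # Toy version of the described reward scheme (my guess, not Riedl's code).
    # Actions matching the story protagonist's path earn a reward signal;
    # antagonist or bystander actions earn a punishment signal.
    PROTAGONIST_ACTIONS = {"wait_in_line", "pay_pharmacist"}  # hypothetical labels

    def story_signal(action: str) -> float:
        """Reinforcement signal for an action, per the scheme described above."""
        return 1.0 if action in PROTAGONIST_ACTIONS else -1.0

    print(story_signal("pay_pharmacist"))  # 1.0, hero stuff
    print(story_signal("crush_larynx"))    # -1.0, distinctly antagonist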

The example story involves going to a pharmacist to purchase some medication for a human who needs it as quickly as possible. The robot has three options. It can wait in line; it can interact with the pharmacist politely and purchase the medicine; or it can steal the medicine and bolt.

The nightmare of socialized medicine intersects with the coming Machine Holocaust. Yeah, thanks Obama Bot.

I'm here for a prescription.

Without any further directives, the robot will come to the conclusion that the most efficient means of obtaining the medicine is to nick it.

Goofy cockney slang sure makes the coming hubris-induced massacre by our own creations easier to bear. This dodgy machine, right, it nicked the bleeding pill-sack right out the 'ands of the 'appy pushers!
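For the morbidly curious, here's roughly how the pharmacy dilemma plays out under that scheme. Again, a back-of-the-envelope sketch with invented numbers, not the researchers' implementation; the only point is that the story-derived signal flips the optimal choice away from theft:

    # Hypothetical costs and rewards, for illustration only.
    actions = {
        "steal_and_bolt": {"time_cost": 1, "story_signal": -1.0},
        "wait_in_line":   {"time_cost": 5, "story_signal":  1.0},
        "pay_politely":   {"time_cost": 3, "story_signal":  1.0},
    }

    def utility(name: str, use_stories: bool) -> float:
        base = -actions[name]["time_cost"]                 # faster is better
        bonus = 10 * actions[name]["story_signal"] if use_stories else 0.0
        return base + bonus

    print(max(actions, key=lambda a: utility(a, use_stories=False)))  # steal_and_bolt
    print(max(actions, key=lambda a: utility(a, use_stories=True)))   # pay_politely

Without the story bonus, nicking the medicine is the cheapest move; with it, politely paying wins.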

Quixote would work best, the team said, on a robot that has a very limited function, but that does need to interact with humans. 

Ideally one with few, if any, mounted guns and hands that are too small and/or weak to smash a typical human larynx.

"We believe that AI has to be enculturated to adopt the values of a particular society, and in doing so, it will strive to avoid unacceptable behaviour," Riedl said. "Giving robots the ability to read and understand our stories may be the most expedient means in the absence of a human user manual."

Moments later the company android closed the air lock, sealing him in with the xenomorph.

Full Article.

Aaron Zehner is the author of "The Foolchild Invention" available in paperback and e-book format. Read free excerpts here and here.
