Wednesday, September 17, 2014

A Major Flaw: "Ethical Trap: Robot Paralyzed by Choice of Who to Save"

You don't want hesitation in your robo-trader.
From New Scientist via Communications of the ACM:
Bristol Robotics Laboratory's Alan Winfield and colleagues recently tested an ethical challenge for a robot, programming it to prevent other automatons, representing humans, from falling into a hole.

When researchers used two human proxies, the robot was forced to choose which to save. In some cases it saved one proxy while letting the other perish; in others, it saved both. However, in 14 out of 33 trials, the robot spent so much time making its decision that both proxies fell into the hole.

Winfield describes his robot as an "ethical zombie" that has no choice but to behave as it does...MORE
At the same time, you want the computer to discriminate between the command "Execute the trade" and the command "Execute the trader".
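The fix for the paralysis above is old hat in trading systems: bound the deliberation with a deadline and commit to the best option found so far rather than losing both. A minimal sketch (purely illustrative; the function name, the time budget, and the scoring callback are all assumptions, not Winfield's actual code):

```python
import time

def choose_with_deadline(options, score, budget_s=0.01):
    """Pick the best-scoring option, but never dither past the deadline.

    If the time budget runs out mid-evaluation, act on the best option
    seen so far instead of freezing (hypothetical illustration of a
    deadline-bounded decision rule).
    """
    deadline = time.monotonic() + budget_s
    best, best_score = None, float("-inf")
    for opt in options:
        # Only bail out once we have at least one viable choice in hand.
        if time.monotonic() > deadline and best is not None:
            break
        s = score(opt)
        if s > best_score:
            best, best_score = opt, s
    return best
```

The design choice is that a timely, merely adequate decision beats a perfect one that arrives after both proxies are in the hole.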