freeztar Posted September 19, 2009

A recent paper published in the International Journal of Reasoning-based Intelligent Systems describes a method for computers to prospectively look ahead at the consequences of hypothetical moral judgments. The paper, Modelling Morality with Prospective Logic, was written by Luís Moniz Pereira of the Universidade Nova de Lisboa in Portugal and Ari Saptawijaya of the Universitas Indonesia. The authors declare that morality is no longer the exclusive realm of human philosophers. Pereira and Saptawijaya believe they have been successful both in modeling the moral dilemmas inherent in a specific problem called "the trolley problem" and in creating a computer system that delivers moral judgments that conform to human results.

...more here: Can Robots Make Ethical Decisions? - Yahoo! News
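The paper itself does this with a logic-programming system ("prospective logic"), but the basic idea of looking ahead at hypothetical choices before judging them can be sketched in a few lines of Python. Everything below, the scenario names, the outcomes, and the crude stand-in for a "harm as a means" test, is an illustrative assumption of mine, not the authors' code or data:

```python
# Toy sketch of consequence look-ahead for two trolley-problem variants.
# Not the paper's prospective-logic program; scenarios and the moral test
# here are illustrative assumptions only.

from dataclasses import dataclass

@dataclass
class Outcome:
    deaths: int
    harm_used_as_means: bool  # was someone's harm the means to the end?

# Hypothetical consequences of each available action in two variants.
SCENARIOS = {
    "bystander": {
        "do_nothing":  Outcome(deaths=5, harm_used_as_means=False),
        "flip_switch": Outcome(deaths=1, harm_used_as_means=False),
    },
    "footbridge": {
        "do_nothing":  Outcome(deaths=5, harm_used_as_means=False),
        "push_man":    Outcome(deaths=1, harm_used_as_means=True),
    },
}

def permissible(outcome: Outcome) -> bool:
    # Crude rule: harm may be a side effect of saving people,
    # but not the means of saving them.
    return not outcome.harm_used_as_means

def choose(scenario: str) -> str:
    options = SCENARIOS[scenario]
    allowed = {a: o for a, o in options.items() if permissible(o)}
    # Among permissible actions, prefer the one with the fewest deaths.
    return min(allowed, key=lambda a: allowed[a].deaths)

if __name__ == "__main__":
    for name in SCENARIOS:
        print(name, "->", choose(name))
    # bystander -> flip_switch   (diverting the trolley is allowed)
    # footbridge -> do_nothing   (pushing the man is ruled out as a means)
```

Run it and the toy chooser diverts the trolley in the switch case but refuses to push the man off the footbridge, which is roughly the pattern of human judgments the article says the system reproduces.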
DFINITLYDISTRUBD Posted September 20, 2009

Neato....robot lovers take one more step towards reality. ;) Was it good for you?
lemit Posted September 20, 2009

Wouldn't it be nice if moral choices came so neatly packaged? If we had the body count beforehand, we could all make the right choice. There would never be unintended consequences.

Since our robot knows the results of its action before it takes the action, it should know that the guy it's going to push under the trolley is on the verge of a discovery that will save millions of lives.

Besides, the Footbridge example ignores Asimov's third rule, which would require the robot to sacrifice itself; if it's close enough to push an innocent bystander, it's close enough to jump. I would hope we would expect the robot to take the least presumptuous, least selfish course. But then, I guess we're trying to model the robot after ourselves.

Since morality is an accumulation of experiences, either direct or indirect, that inform decisions, I think all robots should be required to read Friedrich Dürrenmatt's "Der Besuch der alten Dame" ("The Visit of the Old Lady") and discuss it among themselves.

--lemit