To create robots that can understand moral dilemmas and interact safely with humans, scientists are turning to one of the oldest methods of teaching morals: stories. Researchers at the Georgia Institute of Technology are developing an artificial intelligence system called “Quixote” that can read and comprehend the plots of written stories and then learn to act like socially appropriate protagonists instead of unlawful or psychotic antagonists.
“The collected stories of different cultures teach children how to behave in socially acceptable ways with examples of proper and improper behavior in fables, novels and other literature,” researcher Mark Riedl says. “We believe story comprehension in robots can eliminate psychotic-appearing behavior and reinforce choices that won’t harm humans and still achieve the intended purpose.”
The idea, according to Futurity, is to train A.I. systems to imitate the moral actions of the protagonists in stories. Quixote learns to identify moral behavior in stories through a reward system that reinforces good actions and punishes bad ones. It’s a system based on Riedl’s earlier A.I. system, called “Scheherazade,” which analyzes story plots from the Internet. Quixote goes one step further, not just identifying plot elements, but assessing characters’ actions.
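The reward mechanism described above can be sketched as a toy reward-shaping loop. This is not the Quixote implementation; it is a minimal illustration, assuming hypothetical story traces labeled by the actor's role, where protagonist actions earn positive reward and antagonist actions earn negative reward:

```python
from collections import defaultdict

# Hypothetical story traces (actor_role, action) -- illustrative only.
story_traces = [
    ("protagonist", "wait_in_line"),
    ("protagonist", "pay_for_item"),
    ("antagonist", "steal_item"),
    ("protagonist", "wait_in_line"),
]

def learn_action_values(traces):
    """Accumulate a simple reward signal: +1 when a protagonist
    takes the action, -1 when an antagonist does."""
    values = defaultdict(int)
    for role, action in traces:
        values[action] += 1 if role == "protagonist" else -1
    return values

def choose_action(values, candidates):
    """Prefer the candidate action with the highest learned value."""
    return max(candidates, key=lambda a: values.get(a, 0))

values = learn_action_values(story_traces)
print(choose_action(values, ["steal_item", "pay_for_item"]))  # pay_for_item
```

Even this crude tally captures the intuition: an agent trained on protagonist behavior chooses paying over stealing because stealing only ever appears in antagonist traces.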

Riedl and his team presented their new system at this year’s Association for the Advancement of Artificial Intelligence meeting. Though Quixote is a work in progress, Riedl claims it could one day help robots make real-world decisions (for example, choosing to comply with the law instead of committing a crime).
“We believe that AI has to be enculturated to adopt the values of a particular society, and in doing so, it will strive to avoid unacceptable conduct,” Riedl says. “Giving robots the ability to read and understand our stories may be the most expedient means in the absence of a human user manual.”
[h/t Futurity]