
Liulishuo English, Day 23



Listen to the explanation with these questions in mind:

Q1: What does "preference" mean?

Q2: Which groups of people do respondents most strongly prefer to save first?

Q3: Where do criminals rank in the priority list?

How do people think self-driving cars should behave in an accident?

In a paper just published in Nature, a team of psychologists and computer scientists describe a different approach. They created the “Moral Machine”, a website which presents visitors with a series of choices about whom to save and whom to kill.

The strongest preferences, expressed by respondents from all over the world, were for saving human lives over animal ones, preferring to save many rather than few and prioritising children over the old. There were weaker preferences for saving women over men, pedestrians over passengers in the car and for taking action rather than doing nothing. Criminals were seen as literally subhuman—ranking below dogs in the public’s priority list, but above cats. It is easy to imagine the utilitarian argument for preserving the lives of doctors over others. Humanity’s (weak) preference for saving athletes seems less intuitive.
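As a rough illustration of how the kind of pairwise save-or-kill choices the website collects could be turned into a priority ranking like the one above, here is a minimal Python sketch. The choice records and the simple win-rate tally are invented for illustration only; they are not data or methods from the Moral Machine paper, whose statistical analysis is more sophisticated.

```python
from collections import defaultdict

# Hypothetical pairwise choices: (saved, sacrificed) picked by respondents.
# These records are illustrative only, not data from the Moral Machine study.
choices = [
    ("child", "elderly"), ("human", "animal"), ("child", "elderly"),
    ("pedestrian", "passenger"), ("dog", "criminal"), ("criminal", "cat"),
    ("doctor", "adult"), ("human", "animal"), ("woman", "man"),
]

# Count how often each category is chosen over the alternative it was paired with.
wins = defaultdict(int)
appearances = defaultdict(int)
for saved, sacrificed in choices:
    wins[saved] += 1
    appearances[saved] += 1
    appearances[sacrificed] += 1

# Rank categories by "win rate": the fraction of pairings in which they were saved.
ranking = sorted(appearances, key=lambda c: wins[c] / appearances[c], reverse=True)
for category in ranking:
    share = wins[category] / appearances[category]
    print(f"{category}: saved in {share:.0%} of pairings")
```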

Many people, says Dr Rahwan, a computer scientist at MIT and one of the paper’s authors, dismiss the trolley problem as a piece of pointless hypothesising that is vanishingly unlikely to arise in real life. He is unconvinced. The specific situations posed by the website may hardly ever occur, he says. But all sorts of choices made by the firms producing self-driving cars will affect who lives and who dies in indirect, statistical ways. He gives the example of overtaking cyclists: “If you stay relatively near to the cycle lane, you’re increasing the chance of hitting a cyclist, but reducing the chance of hitting another car in the next lane over,” he says. “Repeat that over hundreds of millions of trips, and you’re going to see a skew in the [accident] statistics.”
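Dr Rahwan's point about statistical skew can be made concrete with a tiny Monte Carlo sketch. The per-overtake collision probabilities below are invented purely for illustration and are not taken from the paper; the point is only that two driving policies that are both almost always safe on any single trip diverge visibly once aggregated over millions of trips.

```python
import random

# Toy sketch of the overtaking example: hugging the cycle lane slightly raises
# the chance of hitting a cyclist per trip and slightly lowers the chance of
# hitting a car in the next lane; keeping wider does the opposite.
TRIPS = 1_000_000

def simulate(p_cyclist, p_car, trips=TRIPS, seed=0):
    """Count cyclist and car collisions over many independent trips."""
    rng = random.Random(seed)
    cyclist_hits = car_hits = 0
    for _ in range(trips):
        if rng.random() < p_cyclist:
            cyclist_hits += 1
        if rng.random() < p_car:
            car_hits += 1
    return cyclist_hits, car_hits

# Policy A: stay near the cycle lane; Policy B: keep wider of the cycle lane.
near_lane = simulate(p_cyclist=3e-5, p_car=1e-5)
wide_of_lane = simulate(p_cyclist=1e-5, p_car=3e-5, seed=1)

print("near cycle lane    -> (cyclist hits, car hits):", near_lane)
print("wide of cycle lane -> (cyclist hits, car hits):", wide_of_lane)
# Either policy is safe on almost every individual trip, yet over a million
# trips the two produce clearly different accident statistics: the "skew"
# Dr Rahwan describes.
```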



Original post: https://www.cnblogs.com/qianyindichang/p/10043176.html
