2016 Kaoyan English Reading Daily Selection: Robot Wars

2016-01-13 09:59  Author:  Source: New Oriental Online (compiled)

In the Kaoyan English test, reading accounts for roughly half of the total score; as the saying goes, "win reading and you win the exam." Students preparing for the 2016 exam should broaden their reading and follow a wide range of topics in their daily review, as that is the only way to raise their overall reading level. The New Oriental Kaoyan channel shares these "2016 Kaoyan English Reading Selections"; let's study together!

Worried over robot war

Introduction: Technology changes with each passing day, and the rapid development of robotics has brought many conveniences to our lives. But if robots become the war machines of Terminator, will humanity be able to cope?

It sounds like a science-fiction nightmare. But "killer robots" have the likes of British scientist Stephen Hawking and Apple co-founder Steve Wozniak fretting, warning that the machines could fuel ethnic cleansing and an arms race.

Autonomous weapons, which use artificial intelligence to select targets without human intervention, are "the third revolution in warfare, after gunpowder and nuclear arms," about 1,000 tech bigwigs wrote in an open letter on July 28.

Unlike drones, which require a human hand in their action, this kind of robot would have some autonomous decision-making abilities and the capacity to act on its own authority.

“The key question for humanity today is whether to start a global AI (artificial intelligence) arms race or to prevent it from starting,” they wrote.

“If any major military power pushes ahead with AI weapon development, a global arms race is virtually inevitable,” said the letter released at the opening of the 2015 International Joint Conference on Artificial Intelligence in Buenos Aires.

The idea of an automated killing machine – made famous by Arnold Schwarzenegger’s Terminator – is moving swiftly from science fiction to reality, according to the scientists.

“The deployment of such systems is – practically if not legally – feasible within years, not decades,” the letter said.

Lower bar for entry

The development of such weapons, while potentially reducing the extent of battlefield casualties, might also lower the threshold for going to battle, noted the scientists.

The scientists painted an apocalyptic scenario in which autonomous weapons fall into the hands of terrorists, dictators or warlords hoping to carry out ethnic cleansing.

The group concluded with an appeal for a “ban on offensive autonomous weapons beyond meaningful human control.”

In a 2014 BBC interview, Hawking said the development of full artificial intelligence could spell the end of the human race.

“It would take off on its own, and re-design itself at an ever increasing rate. Humans, who are limited by slow biological evolution, couldn’t compete, and would be superseded,” he said.

Authorities are gradually waking up to the risk of robot wars. Last May, for the first time, the United Nations brought governments together to begin talks on so-called “lethal autonomous weapons systems” that can select targets and carry out attacks without direct human intervention.

In 2012, the US government imposed a 10-year requirement that automated weapons remain under human control.

There have been examples of weapons being stopped in their infancy.

After UN-backed talks, blinding laser weapons were banned in 1998, before they ever hit the battlefield.


