One argument for why robots will never fully measure up to people is that they lack human-like social skills.
But researchers are experimenting with new methods to give robots social skills to better interact with humans. Two new studies provide evidence of progress in this kind of research.
One experiment was carried out by researchers from the Massachusetts Institute of Technology (MIT). The team developed a machine learning system for self-driving vehicles that is designed to learn the social characteristics of other drivers.
The researchers studied driving situations to learn how other drivers on the road were likely to behave. Since not all human drivers act the same way, the data was meant to teach the driverless car to avoid dangerous situations.
The researchers say the technology uses tools borrowed from the field of social psychology. In this experiment, scientists created a system that attempted to decide whether a person's driving style is more selfish or selfless. In road tests, self-driving vehicles equipped with the system improved their ability to predict what other drivers would do by up to 25 percent.
In one test, the self-driving car was observed making a left-hand turn. The study found the system could cause the vehicle to wait before making the turn if it predicted the oncoming drivers acted selfishly and might be unsafe. But when oncoming vehicles were judged to be selfless, the self-driving car could make the turn without delay because it saw less risk of unsafe behavior.
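The left-turn behavior described above can be sketched as a simple rule: wait when the oncoming driver appears selfish, proceed when the driver appears selfless. The sketch below is purely illustrative; the scoring function, names, and threshold are hypothetical and are not the MIT team's actual system.

```python
# Illustrative sketch of the left-turn decision described above.
# The "selfishness" score, its inputs, and the 0.6 threshold are
# hypothetical assumptions, not the MIT researchers' model.

def estimate_selfishness(gaps_taken: int, gaps_yielded: int) -> float:
    """Toy estimate: a driver who takes many gaps and yields few reads as selfish."""
    total = gaps_taken + gaps_yielded
    if total == 0:
        return 0.5  # no evidence observed; assume neutral
    return gaps_taken / total

def should_wait_before_turn(selfishness: float, threshold: float = 0.6) -> bool:
    """Wait for a larger gap when the oncoming driver appears selfish."""
    return selfishness >= threshold

# An aggressive driver: took 9 gaps, yielded 1 -> score 0.9, so wait.
aggressive = estimate_selfishness(gaps_taken=9, gaps_yielded=1)
# A courteous driver: took 2 gaps, yielded 8 -> score 0.2, so turn now.
courteous = estimate_selfishness(gaps_taken=2, gaps_yielded=8)

print(should_wait_before_turn(aggressive))  # True: wait before turning
print(should_wait_before_turn(courteous))   # False: turn without delay
```

The point of the sketch is only the shape of the decision: an estimate of another driver's social tendency feeds a risk judgment about when to act.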
Wilko Schwarting is the lead writer of a report describing the research. He told MIT News that any robot working with or operating around humans needs to be able to effectively learn their intentions to better understand their behavior.
Another social experiment involved a game competition between humans and a robot. Researchers from Carnegie Mellon University tested whether a robot's "trash talk" would affect humans playing a game against the machine. To "trash talk" is to talk about someone in a negative or insulting way, usually to get them to make a mistake.
A humanoid robot, named Pepper, was programmed to say things to a human opponent like "I have to say you are a terrible player." Another robot statement was, "Over the course of the game, your playing has become confused."
In the study, each human subject played a game against the robot 35 times. The game, called Guards and Treasures, is used to study decision making. The study found that players criticized by the robot generally performed worse in the games than players receiving praise.
Words in This Story:
characteristic – n. a quality that makes one person or thing different from others
style – n. a way of doing something
intention – n. something a person plans to do
tendency – n. something someone often does
collaborative – adj. working together for a particular purpose