
Want to Get Humans to Trust Robots? Let Them Dance
人機共舞,建立信任

2024-02-19 11:42:53
英語世界 2024年2期
關鍵詞:動作;情感;研究

薩姆·瓊斯/文 袁峰/譯

A performance with living and mechanical partners can teach researchers how to design more relatable1 bots.

人機搭檔表演能教會研究人員如何設計更讓人認同的機器人。

A dancer shrouded in shades of blue rises to her feet and steps forward on stage. Under a spotlight, she gazes at her partner: a tall, sleek2 robotic arm. As they dance together, the machine’s fluid movements make it seem less stereotypically3 robotic—and, researchers hope, more trustworthy.

舞者從深淺不一的藍色光暈中站起身來,走上舞臺。聚光燈下,她凝視著舞伴——一架頎長、優美的機械臂。人機共舞時,機械臂動作流暢,看起來并不刻板機械,研究人員希望這也會讓它看上去更可靠。

“When a human moves one joint, it isn’t the only thing that moves. The rest of our body follows along,” says Amit Rogel, a music technology graduate researcher at Georgia Institute of Technology. “There’s this slight continuity that almost all animals have, and this is really what makes us feel human in our movements.” Rogel programmed this subtle follow-through4 into robotic arms to help create FOREST, a performance collaboration between researchers at Georgia Tech, dancers at Kennesaw State University and a group of robots.

“當人活動關節時,不只是關節在動,身體其他部位也順勢而動。”佐治亞理工學院音樂技術研究生研究員阿米特·羅杰爾說,“幾乎所有動物都具有這種細微的動作連貫性,而這確實讓我們感覺自己是人而非機器。”羅杰爾將這種微妙的順勢動作編入機械臂的程序,助力創作“福雷斯特”——佐治亞理工學院研究人員、肯尼索州立大學舞者和一組機器人三方協作的表演項目。
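The “slight continuity” Rogel describes, where motion in one joint echoes through the rest of the body, can be sketched as a simple chain of smoothed, attenuated copies of the driven joint’s trajectory. This is only an illustrative sketch of the idea, not the FOREST code; the function name and the `coupling` and `smoothing` constants are assumptions.

```python
# Hypothetical sketch of "follow-through": each downstream joint trails
# the joint before it with a smaller, time-smoothed echo of its motion.
# Constants are illustrative, not taken from the FOREST system.

def follow_through(primary_motion, n_joints=4, coupling=0.5, smoothing=0.3):
    """Propagate one joint's angle trajectory to downstream joints.

    primary_motion: list of joint angles over time for the driven joint.
    coupling: fraction of motion each joint passes along to the next.
    smoothing: exponential lag that delays and softens each echo.
    Returns one trajectory per joint, from driven joint outward.
    """
    trajectories = [list(primary_motion)]
    for _ in range(1, n_joints):
        upstream = trajectories[-1]
        state = upstream[0] * coupling
        echoed = []
        for angle in upstream:
            target = angle * coupling
            # exponential smoothing: drift gradually toward the target
            state += smoothing * (target - state)
            echoed.append(state)
        trajectories.append(echoed)
    return trajectories
```

Each successive joint thus moves a little later and a little less than the one before it, which is roughly the whole-body continuity the quote is pointing at.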

The goal is not only to create a memorable performance, but to put into practice what the researchers have learned about building trust between humans and robots. Robotics is already widely used, and the number of collaborative robots—which work with humans on tasks such as tending factory machines and inspecting manufacturing equipment—is expected to climb significantly in the coming years. But although they are becoming more common, trust in them is still low—and this makes humans more reluctant to work with them. “People may not understand how the robot operates, nor what it wants to accomplish,” says Harold Soh, a computer scientist at the National University of Singapore. He was not involved in the project, but his work focuses on human-robot interaction and developing more trustworthy collaborative robots.

該項目的目標不僅是創作令人難忘的表演,而且還要將研究人員對建立人類與機器人互信的認識付諸實踐。機器人技術已得到廣泛應用,與人類協同執行照看工廠機器和檢查制造設備等任務的協作機器人的數量有望在未來幾年大幅攀升。但盡管機器人日益常見,人們對它們的信任度卻仍很低,因而越加不愿意與其協作。“人們可能不了解機器人如何運作,也不明白它想要完成什么任務。”新加坡國立大學計算機科學家蘇順鴻表示。他雖未參與該項目,但其工作側重于人類與機器人交互和開發更值得信賴的協作機器人。

Although humans love cute fictional machines like R2-D2 or WALL-E, the best real-world robot for a given task may not have the friendliest looks, or move in the most appealing way. “Calibrating5 trust can be difficult when the robot’s appearance and behavior are markedly different from humans,” Soh says. However, he adds, even a disembodied6 robot arm can be designed to act in a way that makes it more relatable7. “Conveying emotion and social messages via a combination of sound and motion is a compelling approach that can make interactions more fluent and natural,” he explains.

人類喜愛R2-D2或WALL-E之類可愛的科幻機器人,但現實世界中執行特定任務的最佳機器人未必是外表最友善或動作最迷人的。“當機器人的外表和行為與人類迥然不同時,難以通過調試建立信任。”蘇順鴻說,但他又指出,即便是無軀體的機械臂,也可設計得行為舉止更讓人認同。他解釋說:“通過聲音與動作相結合的方式表達情感和傳達社交信息具有說服力,能使交互更加順暢、自然。”

That’s why the Georgia Tech team decided to program nonhumanoid8 machines to appear to convey emotion, through both motion and sound. Rogel’s latest work in this area builds off years of research. For instance, to figure out which sounds best convey specific emotions, Georgia Tech researchers asked singers and guitarists to look at a diagram called an “emotion wheel,” pick an emotion, and then sing or play notes to match that feeling. The researchers then trained a machine learning model—one they planned to embed in the robots—on the resulting data set. They wanted to allow the robots to produce a vast range of sounds, some more complex than others. “You could say, ‘I want it to be a little bit happy, a little excited and a little bit calm,’” says project collaborator Gil Weinberg, director of Georgia Tech’s Center for Music Technology.

正因如此,佐治亞理工學院團隊決定為非人形機器編制程序,使其看似能通過動作和聲音表達情感。羅杰爾在這一領域的最新工作建立在多年研究的基礎上。例如,為了弄清哪些聲音最能表達特定情感,佐治亞理工學院研究人員讓多名歌手和吉他手觀看《情感輪盤》示意圖,挑選一種情感,然后詠唱或演奏匹配的樂音來表達該情感。然后,研究人員運用由此獲得的數據集訓練一個機器學習模型——他們計劃將該模型嵌入機器人。他們想讓機器人發出各種各樣的聲音,其中一些聲音比其他聲音更復雜。佐治亞理工學院音樂技術中心主任、項目協作者吉爾·溫伯格說:“你可以說,‘我希望它有些許快樂、些許興奮、些許平靜。’”
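Weinberg’s example of asking for “a little bit happy, a little excited and a little bit calm” suggests blending per-emotion sound settings into one parameter set. The sketch below illustrates that mixing idea only; the actual FOREST system uses a machine-learning model trained on musicians’ recordings, and the emotion presets and parameters here (pitch, tempo, brightness) are invented for illustration.

```python
# Illustrative emotion-to-sound blending. Preset values are assumptions;
# the real system learns its mapping from singers' and guitarists' data.

EMOTION_PRESETS = {
    # emotion: (pitch in semitones above a base note, tempo scale, brightness 0-1)
    "happy":   (7.0, 1.3, 0.9),
    "excited": (12.0, 1.6, 1.0),
    "calm":    (0.0, 0.7, 0.3),
    "sad":     (-5.0, 0.6, 0.2),
}

def blend_emotions(weights):
    """Mix sound parameters as a weighted average of emotion presets."""
    total = sum(weights.values())
    params = [0.0, 0.0, 0.0]
    for emotion, weight in weights.items():
        for i, value in enumerate(EMOTION_PRESETS[emotion]):
            params[i] += (weight / total) * value
    return {"pitch": params[0], "tempo": params[1], "brightness": params[2]}

# e.g. blend_emotions({"happy": 0.4, "excited": 0.3, "calm": 0.3})
# yields one intermediate set of sound parameters for the robot.
```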

Next, the team worked to tie those sounds to movement. In 2020, the researchers had demonstrated that combining movement with emotion-based sound improved trust in robotic arms in a virtual setting (a requirement fostered by the pandemic). But that experiment only needed the robots to perform four different gestures to convey four different emotions. To broaden a machine’s emotional-movement options for his new study, which has been conditionally accepted for publication in Frontiers in Robotics and AI, Rogel waded through9 research related to human body language. “For each one of those body language [elements], I looked at how to adapt that to a robotic movement,” he says. Then, dancers affiliated with Kennesaw State University helped the scientists refine those movements. As the performers moved in ways intended to convey emotion, Rogel and fellow researchers recorded them with cameras and motion-capture suits, and subsequently generated algorithms so that the robots could match those movements. “I would ask [Rogel], ‘can you make the robots breathe?’ And the next week, the arms would be kind of ‘inhaling’ and ‘exhaling,’” says Kennesaw State University dance professor Ivan Pulinkala.

接下來,研究團隊將這些聲音與動作結合起來。2020年,研究人員證明,將動作與情感性聲音相結合,增進了人們在虛擬環境中對機械臂的信任(這是疫情催生的要求)。但這項實驗只需機器人做出四種不同手勢來表達四種不同情感。羅杰爾費心進行與人類肢體語言相關的研究,從而在自己的新研究中拓寬了機器的情感動作選項。該研究已被《機器人與人工智能前沿》雜志擬錄用。他表示:“對于其中每一種肢體語言[元素],我都在研究如何使其適配機器人動作。”隨后,肯尼索州立大學下屬的舞者們協助科研人員優化了這些動作。當表演者舞動傳情時,羅杰爾與其他研究人員用相機和動作捕捉服予以記錄,然后生成算法,以便機器人能適配這些動作。“我會問(羅杰爾),‘你能讓機器人呼吸嗎?’于是在下一周,機器臂將會做‘吸氣’‘呼氣’動作。”肯尼索州立大學舞蹈學教授伊萬·普林卡拉說。
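The “inhaling” and “exhaling” Pulinkala describes can be approximated as a slow sinusoidal sway layered onto the arm’s resting pose. This is a minimal sketch of that one effect, assuming made-up joint amplitudes and breath rate, not the motion-capture-derived algorithms the team actually generated.

```python
import math

# Hypothetical "breathing" layer: a slow sine wave scales each joint's
# small offset in phase, like a rib cage expanding and contracting.

def breathing_offsets(t, rate_hz=0.2, amplitudes=(0.05, 0.1, 0.05)):
    """Return per-joint angle offsets (radians) at time t seconds.

    rate_hz of 0.2 gives roughly a five-second breath cycle; each
    joint sways by its own small amplitude, all rising and falling
    together.
    """
    phase = math.sin(2 * math.pi * rate_hz * t)
    return tuple(a * phase for a in amplitudes)
```

Adding these offsets to whatever pose the arm is otherwise holding gives the gentle periodic rise and fall the dancers read as breath.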

Pulinkala choreographed10 the FOREST performance, which put into practice what the researcher-dancer team learned about creating and deploying emotion-based sounds and movements. “My approach was to kind of breathe a sense of life into the robots and have the dancers [appear] more ‘mechanized,’” Pulinkala says, reflecting on the start of the collaboration. “I asked, ‘How can the robots have more emotional physicality11? And how does a dancer then respond to that?’”

普林卡拉編排了這場“福雷斯特”表演,將研究人員與舞者聯合團隊在創作和運用富于情感的聲音和動作方面的認識付諸實踐。“我的做法是給機器人注入生命感,而讓舞者[顯得]更具‘機械感’。”普林卡拉說,并回想起合作之初:“我自問,‘機器人如何才能有更多的情感性體征?舞者對此又作何回應?’”

According to the dancers, this resulted in machines that seemed a little more like people. Christina Massad, a freelance professional dancer and an alumna of Kennesaw State University, recalls going into the project thinking she would be dancing around the robots—not with them. But she says her mindset shifted as soon as she saw the fluidity of the robots’ movements, and she quickly started viewing them as more than machines. “In one of the first rehearsals, I accidentally bumped into one, and I immediately told it, ‘Oh my gosh, I’m so sorry,’” she says. “Amit laughed and told me, ‘It’s okay, it’s just a robot.’ But it felt like more than a robot.”

舞者們說這導致機器看上去更有點像人。克里斯蒂娜·馬薩德是一名自由職業舞者,也是肯尼索州立大學校友。她記得加入這個項目時,還以為自己會圍著機器人跳舞,而不是與其共舞。但她說,看到機器人動作流暢,她的看法頓時改變,隨即不再將它們視為單純的機器。“在最初的一次排練中,我不小心撞到一個機器人,立馬就對它說,‘天哪,真對不起。’”她說。“阿米特笑著對我說,‘沒事,它只是個機器人。’可它給人的感覺卻不僅僅是機器人。”

Soh says he finds the performance fascinating and thinks it could bring value to the field of human-robot relationships. “The formation and dynamics of trust in human-robot teams is not well-understood,” he says, “and this work may shed light on the evolution of trust in teams.”

蘇順鴻說,他發現機器人的表演引人入勝,認為這或可對研究人類與機器人的關系具有意義。“人們尚未充分了解人類與機器人組合中信任的形成和發展變化。”他說,“這項工作可使人進一步了解人機組合中信任的演化。”

(譯者為“《英語世界》杯”翻譯大賽獲獎者)

1 relatable 能讓人認同的,能讓人產生共鳴的。
2 sleek 線條流暢的,造型優美的。
3 stereotypically 模式化地,刻板地。
4 follow-through 順勢動作。
5 calibrate 調諧,調適。
6 disembodied 脫離軀體的;由看不見的人發出的。
7 relatable 可明白的,可理解的。
8 nonhumanoid 非人形的,非類人的。
9 wade through 艱難地處理,費力地閱讀。
10 choreograph 設計舞蹈動作,編舞。
11 physicality 身體特征,肉體性。
