英语新闻丨人工智能成为美国欺诈者的得力工具

People in the United States are being warned to stay vigilant against a growing number of scams using artificial intelligence that mimics a person's voice during a phone call to a concerned relative or friend, who is then asked to send money for ransom.

美国民众被提醒保持警惕,防范越来越多利用人工智能的诈骗:骗子在电话中模仿某人的声音,打给其忧心忡忡的亲友,再要求对方汇款赎人。

The Federal Trade Commission, or FTC, issued the consumer warning alert this year after an increase in the number of people reporting they had been asked to send money after receiving a frantic phone call from a person who they believed was their loved one but was in fact a cloned voice using AI.

今年,联邦贸易委员会(FTC)发布了消费者警示,因为越来越多的人报告称,他们接到一通惊慌失措的电话,以为是亲人打来的,随后被要求汇款,而实际上对方的声音是用人工智能克隆伪造的。

Jennifer DeStefano from Scottsdale, Arizona, experienced the crime firsthand. She told a US Senate judiciary hearing last month that she got a call from an unlisted number in April, and when she picked up, she could hear her daughter, Briana, crying.

来自亚利桑那州斯科茨代尔的珍妮弗·德斯特法诺亲身经历了这种骗局。上个月,她在美国参议院司法委员会的听证会上表示,她四月份接到一个未显示号码的来电,接听后听到女儿布里安娜在哭泣。

"Mom! I messed up," her daughter said sobbing on the phone call.

“妈妈!我搞砸了,”她女儿在电话里哭着说道。

DeStefano asked her daughter, "OK, what happened?"

珍妮弗问她女儿:“好吧,发生了什么事?”

She then heard a man's voice on the phone telling her daughter to "lay down and put your head back".

然后,她听到一个男人的声音在电话里告诉她女儿“躺下,把头往后仰”。

He then told the worried mother: "Listen here, I have your daughter. You tell anyone, you call the cops, I am going to pump her stomach so full of drugs."

然后,那个男人告诉这位忧心忡忡的母亲:“听好了,你女儿在我手上。你敢告诉任何人,你敢报警,我就让她肚子里灌满毒品。”

DeStefano was at her other daughter Aubrey's dance rehearsal when she picked up the phone. She put the phone on mute and asked nearby parents to call 911.

接到电话时,珍妮弗正在另一个女儿奥布里的舞蹈排练现场。她把电话调成静音,请身边的其他家长拨打911。

The scammer first asked her to send $1 million, but when she said she did not have access to that much money, he asked for $50,000 in cash and arranged a meet-up spot.

骗子先是让她汇款100万美元,当珍妮弗说她没这么多钱时,骗子又要了5万美元现金,并安排了见面地点。

The terrified mother said the man on the phone told her that "if I didn't have all the money, then we were both going to be dead".

惊恐万分的母亲说,电话中的男子告诉她,“如果我拿不出所有的钱,我们母女俩都会没命”。

However, she contacted her husband and daughter and found out Briana was safe, and it was a hoax.

不过,她联系了丈夫和女儿,发现布里安娜平安无事,这只是一个骗局。

Cybercrimes on rise

网络犯罪日益猖獗

Last year, frauds and scams rose 30 percent compared with the previous year, the FTC said. Cybercrimes are also increasing with losses of $10.2 billion last year, the FBI said.

美国联邦贸易委员会表示,去年的欺诈和诈骗案比前一年增加了30%。美国联邦调查局(FBI)表示,网络犯罪也在不断增加,去年损失达102亿美元。

Scammers use AI to mimic a person's voice by obtaining "a short audio clip of your family member's voice from content posted online and a voice-cloning program", the consumer protection watchdog said. When they call, they will sound just like the person's loved one.

这家消费者保护机构称,骗子只需“从网上发布的内容中获取一小段您家人的语音,再配合语音克隆程序”,就能利用人工智能模仿他人的声音。当他们打来电话时,听起来就和当事人的亲人一模一样。

In another scam, a Canadian couple was duped out of C$21,000 ($15,940) after listening to an AI voice that they thought was their son, The Washington Post reported in March.

据《华盛顿邮报》3月报道,在另一起骗局中,一对加拿大夫妇在听到他们以为是自己儿子的人工智能合成声音后,被骗走了21,000加元(约合15,940美元)。

According to a recent poll by McAfee, an antivirus software organization in San Jose, California, at least 77 percent of AI scam victims have sent money to fraudsters.

根据位于加利福尼亚州圣何塞的杀毒软件组织McAfee最近进行的一项民意调查,至少有77%的人工智能诈骗受害者曾向骗子汇款。

Of those who reported losing money, 36 percent said they had lost between $500 and $3,000, while 7 percent got taken for anywhere between $5,000 and $15,000, McAfee said.

McAfee说,在报告损失钱财的人中,36%的人说他们损失了500到3000美元,而7%的人被骗了5000到15000美元。

About 45 percent of the 7,000 people polled from nine countries — Australia, Brazil, France, Germany, India, Japan, Mexico, the United Kingdom and the US — said they would reply and send money to a friend or loved one who had asked for financial help via a voicemail or note.

在来自澳大利亚、巴西、法国、德国、印度、日本、墨西哥、英国和美国九个国家的7,000名受访者中,约45%的人表示,如果朋友或亲人通过语音邮件或纸条请求经济帮助,他们会回复并汇款。

Forty-eight percent said they would respond quickly if they heard that a friend was in a car accident or had trouble with their vehicle.

48%的人表示,如果听说朋友出了车祸或车辆出了问题,他们会立刻作出回应。

Although phone scams are nothing new worldwide, in this AI version, fraudsters are getting the money sent to them in a variety of ways, including wire transfers, gift cards and cryptocurrency.

尽管电话诈骗在全球范围内已不是什么新鲜事,但在人工智能的加持下,诈骗分子通过电汇、礼品卡和加密货币等各种方式获得汇款。

Consumers are being encouraged to contact the person that they think is calling to check if they are OK before ever sending cash.

消费者被鼓励在寄送现金之前,先与他们认为是打来电话的人取得联系,确认他们是否安全。

FTC Chair Lina Khan warned House lawmakers in April that fraud and scams were being "turbocharged" by AI and were of "serious concern".

美国联邦贸易委员会主席莉娜·汗在4月份警告众议院立法者,人工智能正在“加速”欺诈和诈骗,需引起“严重关切”。

Avi Greengart, president and lead analyst at Techsponential, a technology analysis and market research company in the US, told China Daily: "I think that it is hard for us to estimate exactly how pervasive (AI) is likely to be because this is still relatively new technology. Laws should regulate AI."

美国技术分析和市场研究公司Techsponential总裁兼首席分析师阿维·格林加特告诉《中国日报》:“我认为,我们很难准确估计(人工智能)的普及程度,因为这仍然是一项相对较新的技术。法律应该对人工智能进行监管。”

The software to clone voices is becoming cheaper and more widely available, experts say.

专家说,克隆声音的软件越来越便宜,也越来越普及。

AI speech software ElevenLabs allows users to convert text into voice-overs meant for social media and videos, but many users have already shown how it can be misused to mimic the voices of celebrities, such as actress Emma Watson, podcast host Joe Rogan and columnist and author Ben Shapiro.

人工智能语音软件ElevenLabs允许用户将文本转换为社交媒体和视频的画外音,但许多用户已经展示了它如何被滥用来模仿名人的声音,如女演员艾玛·沃森、播客主持人乔·罗根和专栏作家兼作家本·夏皮罗。

Other videos mimicking the voices of US President Joe Biden and former president Donald Trump have also appeared on platforms such as Instagram.

其他模仿美国总统乔·拜登和前总统唐纳德·特朗普声音的视频也出现在Instagram等平台上。

Scammer

英/ˈskæmə(r)/ 美/ˈskæmər/

n.骗子

Cybercrime

英/ˈsaɪbəkraɪm/ 美/ˈsaɪbərkraɪm/

n.网络犯罪


