Can you trust your ears? AI voice scams rattle US

Photo: Chris Delmas - AFP

The voice on the phone seemed frighteningly real -- an American mother heard her daughter sobbing before a man took over and demanded a ransom. But the girl was an AI clone and the abduction was fake.

The biggest peril of artificial intelligence, experts say, is its ability to blur the line between reality and fiction, handing cybercriminals a cheap and effective technology to propagate disinformation.

In a new breed of scams that has rattled US authorities, fraudsters are using strikingly convincing AI voice cloning tools -- widely available online -- to steal from people by impersonating family members.

"Help me, mom, please help me," Jennifer DeStefano, an Arizona-based mother, heard a voice saying on the other end of the line.

DeStefano was "100 percent" convinced it was her 15-year-old daughter in deep distress while away on a skiing trip.

"It was never a question of who is this? It was completely her voice... it was the way she would have cried," DeStefano told a local television station in April.

"I never doubted for one second it was her."

The scammer who took over the call, which came from a number unfamiliar to DeStefano, demanded up to $1 million.

The AI-powered ruse was over within minutes when DeStefano established contact with her daughter. But the terrifying case, now under police investigation, underscored the potential for cybercriminals to misuse AI clones.

- Grandparent scam -

"AI voice cloning, now almost indistinguishable from human speech, allows threat actors like scammers to extract information and funds from victims more effectively," Wasim Khaled, chief executive of Blackbird.AI, told AFP.

A simple internet search yields a wide array of apps, many of them free, that can create an AI voice from a small sample -- sometimes only a few seconds -- of a person's real voice, which can easily be lifted from content posted online.

"With a small audio sample, an AI voice clone can be used to leave voicemails and voice texts. It can even be used as a live voice changer on phone calls," Khaled said.

"Scammers can employ different accents, genders, or even mimic the speech patterns of loved ones. [The technology] allows for the creation of convincing deep fakes."

In a global survey of 7,000 people from nine countries, including the United States, one in four said they had experienced an AI voice cloning scam or knew someone who had.

Seventy percent of respondents said they were not confident they could "tell the difference between a cloned voice and the real thing," according to the survey, published last month by US-based McAfee Labs.

American officials have warned of a rise in what is popularly known as the "grandparent scam" -- in which an imposter poses as a grandchild in urgent need of money.

"You get a call. There's a panicked voice on the line. It's your grandson. He says he's in deep trouble —- he wrecked the car and landed in jail. But you can help by sending money," the US Federal Trade Commission said in a warning in March.

"It sounds just like him. How could it be a scam? Voice cloning, that's how."

In the comments beneath the FTC's warning were multiple testimonies from elderly people who had been duped that way.

- 'Malicious' -

That also mirrors the experience of Eddie, a 19-year-old in Chicago whose grandfather received a call from a voice that sounded just like his grandson, claiming he needed money after a car accident.

The ruse, reported by McAfee Labs, was so convincing that his grandfather urgently began scraping together money and even considered remortgaging his house before the lie was discovered.

"Because it is now easy to generate highly realistic voice clones... nearly anyone with any online presence is vulnerable to an attack," Hany Farid, a professor at the UC Berkeley School of Information, told AFP.

"These scams are gaining traction and spreading."

Earlier this year, AI startup ElevenLabs admitted that its voice cloning tool could be misused for "malicious purposes" after users posted deepfake audio purporting to be actor Emma Watson reading Adolf Hitler's manifesto "Mein Kampf."

"We're fast approaching the point where you can't trust the things that you see on the internet," Gal Tal-Hochberg, group chief technology officer at the venture capital firm Team8, told AFP.

"We are going to need new technology to know if the person you think you're talking to is actually the person you're talking to," he said.

P.Gashi--NZN