Scammers are using AI to clone kids’ voices and fool parents in India

Audio deepfakes can be created with a real clip of the target’s voice. PHOTO: LIANHE ZAOBAO

When a father in India received a call from an unknown overseas number in January, he did not expect to become the latest victim of an elaborate fraud scheme involving artificial intelligence (AI).

The scammer, who claimed he was a police officer, told Mr Himanshu Shekhar Singh that his 18-year-old son had been caught with a gang of rapists and needed Rs30,000 (S$486) so he could clear his name, Indian media reported on Feb 12.

Mr Singh told The Indian Express of the Jan 8 incident: “The next minute, I heard a voice saying, ‘Papa please pay him, they are real policemen, please save me’.

“I could not doubt even for a second that he was not my boy. The style of speaking, crying… everything was the same.”

Though he was suspicious, he feared the caller might be a kidnapper, so he made an initial payment of Rs10,000 (S$162).

He then decided to look for his son himself, and later found the teenager unharmed, taking a test at an education centre.

This was one of three prominent cases to rock New Delhi and its National Capital Region in recent weeks as scammers tap into AI to develop convincing fake voice recordings of children to trick their parents into transferring money.


In a similar case, a mother from Noida, in Uttar Pradesh, received a scam call where the scammers used technology to mimic her son’s voice.

Fortunately, her son was studying in front of her when the scammers called.

The mother, a journalist who was not named in the article, said: “It is a huge concern that cyber criminals are targeting children now. From where they are getting details of kids and their parents?… This must be thoroughly investigated with utmost priority.”

“Such cases are not very frequent, but recently there has been an uptick in cases of ‘cloning’. We are trying to understand how exactly cyber criminals are creating cloned voices to dupe people,” said Delhi police officer Manish Kumar Mishra.

Such voice cloning cases have happened in other parts of the world.

In May 2023, police in a region of Inner Mongolia were alerted to a case where a scammer used face-swopping technology to impersonate a victim’s friend during a video call.

Believing that his friend needed to pay a deposit to complete a bidding process, the victim transferred 4.3 million yuan (S$805,000) to the scammer.

He realised he had been duped only after the friend said he knew nothing of the situation.

In Hong Kong, a multinational company was scammed out of HK$200 million (S$34 million) after an employee attended a video conference call with deepfake recreations of the company’s Britain-based chief financial officer and other employees.

The “fake” colleagues ordered the employee to transfer the sum to separate accounts, and the victim complied.
