Image: Italian Defense Minister Guido Crosetto was the first to issue a warning about a ‘serious ongoing fraud’ through a post on social media.
A group of scammers, perfectly imitating the voice of Italian Defense Minister Guido Crosetto, called Italy’s top business leaders and said, “We urgently need ransom money to free Italian journalists kidnapped in the Middle East.” Who would doubt the voice of the defense minister on the other end of the line? Renowned fashion designer Giorgio Armani, Prada chairman Patrizio Bertelli, and other high-profile executives received these calls. One of them, former Inter Milan owner Massimo Moratti, believed the call and transferred nearly 1 million euros to an account in Hong Kong. (reuters.com)
The technology behind this scam was artificial intelligence (AI), which was able to flawlessly clone Crosetto’s voice. The scammers called from a government number and used an AI-processed voice to gain credibility. They even completed the scam by promising, as representatives of the Bank of Italy, to return the money. (ft.com)
As shocking as this incident is, it is far from the only one; AI-driven scams are increasing worldwide. According to a McAfee survey, one in four people has either been targeted by an AI voice-cloning scam or knows someone who has. Among those affected, 77% suffered financial losses. (mcafee.com)
Another survey by Starling Bank revealed that 28% of adults in the UK have been targeted by AI voice-cloning scams, and 46% are unaware that such scams exist. Scammers can clone a voice from just a few seconds of audio and then call victims’ acquaintances requesting urgent financial help. (ffnews.com)
To protect yourself from these scams, experts advise establishing a “safe phrase” with your contacts that only you and those close to you know. In an emergency, asking for this phrase can help verify that a call is genuine. (cbsnews.com)
How to Protect Yourself from AI Voice-Cloning Scams:
✅ Be cautious with unexpected calls or requests: If someone you know contacts you unexpectedly and asks for money, always verify first.
✅ Verify through another channel: If a caller’s voice seems suspicious, confirm their identity via message, email, or video call.
✅ Use a safe phrase: Establish a code word or phrase with family members or trusted contacts that scammers would not know.
✅ Avoid suspicious links and phishing: Don’t click suspicious links in emails or messages, especially if they claim to be from banks or government agencies.
✅ Stay informed about new AI scams: Keep up with technology and stay updated on new scam tactics.
✅ Protect your personal information: Don’t overshare your voice or personal details on social media, as scammers could use these to set up fraud schemes.
As AI technology advances, so do scamming techniques. If even Italy’s top business leaders can fall for these tricks, none of us can afford to be complacent. If you receive an unexpected or urgent request, verify it through another channel. Stay alert, stay safe.
Now, let’s see if you can detect an artificial intelligence-generated voice. Take the test at the website below:
http://mcafee.com/ai/news/ai-voice-scam/