The use of deepfake technology took a concerning turn this week when scammers reportedly used artificial intelligence (AI) to clone the voice of a traveling teenager and issue a ransom demand to her mother. By all accounts, the voice of the virtual kidnapping “victim” carried the same inflections and emotion as the real girl, who was on a ski trip and unaware of the scam while her mother negotiated with the fake kidnappers over their $1 million ransom demand.
According to the FBI, the abundance of easily available personal information, combined with the widespread use of social media, has created a fertile environment for virtual kidnapping scams. Beyond a kidnapping, the most common scenarios involve the “victim” having been injured in a car accident, arrested, or caught up in some other incident that prevents them from coming to the phone.
There is concern that this type of scam will proliferate in corporate settings: many executives have resumed overseas travel, and companies may lack travel security programs that can verify employees’ whereabouts while they are on the road.
Insite has authored an Advisory that details the tactics used by perpetrators of virtual kidnappings and offers mitigation strategies. The key preventative step is to reduce your digital footprint, both on social media and in the personal information sold by data brokers. Insite provides programs to manage these risk-reduction strategies for our clients.
Click here to request more information.