I am reading about deep faked images. What else should I be concerned about?
Deep fake technology is advancing rapidly because of AI, and that advance creates significant risk for employers.
The risk that concerns me the most is deep fake voice replication used in wire scams. Criminals are already using AI voice replication to scam parents and grandparents by mimicking a loved one's voice.
Recently, a 73-year-old grandmother and her husband were scammed with voice replication into believing that her grandson needed bail money. The caller on the other end of the line was not her grandson, but a scammer. The elderly couple wired money and were planning to wire even more when a banker trained to spot these scams intervened to warn them of possible fraud, leading to the discovery of the scam. The technology was so convincing that the couple believed they were speaking to their grandson. https://twistedsifter.com/2023/04/grandmother-tricked-into-believing-her-grandchild-is-jailed-by-scammers-using-voice-cloning/
The same risk exists for employers. Deep fake voice technology has already been used to convince bankers to provide access to accounts, and it will likely be used in other ways as well. https://www.euronews.com/next/2023/03/25/audio-deepfake-scams-criminals-are-using-ai-to-sound-like-family-and-people-are-falling-fo
Of real concern is coupling deep fake voice technology with business email compromise. For example, suppose CEO Joe has spyware on his computer and is out of the country negotiating a deal. A criminal sends a carefully crafted email to Emily, head of accounts payable, asking for a wire transfer of $500,000. Having been trained to spot email compromise scams, and with wire transfer security standards in place, Emily does not act on the email without voice confirmation from Joe. Using deep fake voice technology, the scammer calls Emily, sounding just like Joe (having heard voicemails from Joe via the spyware) and providing details of the deal (having learned of it through the spyware). Emily wires the money - scam complete.
The final takeaway is that voice replication technology is a real risk for everyone from grandparents to CEOs. To counteract wire transfer scams, organizations should require a pre-arranged verification code before any money is wired. To be effective, however, the code must be shared by voice, in person, in case malware is monitoring email and text communications.
This is one of the ironies of communication technology - the more we depend on technology to communicate, the more we must rely on old-fashioned methods to avoid the scams.
Jack McCalmon, Leslie Zieren, and Emily Brodzinski are attorneys with more than 50 years combined experience assisting employers in lowering their risk, including answering questions, like the one above, through the McCalmon Group's Best Practices Help Line. The Best Practice Help Line is a service of The McCalmon Group, Inc. Your organization may have access to The Best Practice Help Line or a similar service from another provider at no cost to you or at a discount. For questions about The Best Practice Help Line or what similar services are available to you via this Platform, call 888.712.7667.
If you have a question that you would like Jack McCalmon, Leslie Zieren, or Emily Brodzinski to consider for this column, please submit it to firstname.lastname@example.org. Please note that The McCalmon Group cannot guarantee that your question will be answered. Answers are based on generally accepted risk management best practices. They are not, and should not be considered, legal advice. If you need an answer immediately or desire legal advice, please call your local legal counsel.