BitVex is a scam cryptocurrency trading platform. It steals the money innocent victims deposit into it. But inducing people to use the platform isn’t easy.
So, the scammers hit upon an ingenious idea.
They unscrupulously claimed that BitVex was founded by none other than Elon Musk!
However, their deception didn’t end there. To “validate” the claim that BitVex was established by Musk, the scammers set about creating “proof” from the man himself: a fake video showing Musk espousing the benefits of BitVex, which they then uploaded via hacked YouTube accounts.
Inspect this fake recording of Musk closely and it is possible to identify it as a forgery. Busy viewers who don’t look carefully, however, may well be deceived.
This is an example of a Deep Fake – a new and worryingly effective deception tactic. Deep Fakes are not only being used to dupe people into signing up with fake crypto platforms; they are also being used to deceive Accounts Payable (AP) teams into processing illegitimate outgoing payments.
Imagine you are the CEO of a large energy firm.
One day you receive a phone call from the boss of your German-based parent company. He requests a favour. He needs you to make an urgent payment of US$243,000 to a supplier on his behalf.
He promises that the parent company will reimburse your firm with the funds later the same day.
You have spoken to the boss of your parent company numerous times before and immediately recognise his German accent. In fact, the incoming call displays his correct mobile phone number.
Naturally, you go ahead and process the payment.
This scenario actually happened.
After the CEO of the energy firm sent the requested funds to the supplier, he received a second phone call. This call was also supposedly from the parent company’s boss, who stated that the reimbursement had been sent, as promised in the initial phone call.
However, not long thereafter, a third phone call came through to the energy firm’s CEO. This time the number displayed was an Austrian phone number. A request was made for additional urgent funds to be sent to the same supplier.
Finally, the energy firm CEO became suspicious.
The reimbursement for the first payment had not yet been received. The fact that the third phone call came from an unrecognised Austrian number was another red flag.
He therefore refused to process the additional payment request and started investigating.
After considerable digging, it turned out that the CEO had been scammed.
The fraudsters had apparently used artificial intelligence (AI) software to successfully mimic the German boss’s voice.
Once the funds had been transferred to the “supplier,” they had then been sent on to a bank account in Mexico, before being dispersed to a range of other accounts. This made tracing the fraudsters all but impossible.
This incident is an example of a new attack vector being employed by scammers.
Known as “Deep Fakes,” these scams use sophisticated AI technology to mimic people accurately. Deep Fakes give fraudsters the ability to impersonate individuals realistically in both video and audio formats.
According to the Australian Strategic Policy Institute:
“A Deep Fake is a digital forgery created through deep learning (a subset of AI). Deep Fakes can create entirely new content or manipulate existing content, including video, images, audio and text. They could be used to defame targets, impersonate or blackmail elected officials and be used in conjunction with cybercrime operations.”
As the technology behind Deep Fakes becomes increasingly sophisticated, not to mention cheaper and more accessible, we are likely to see an uptick in this type of fraud. Presently, Deep Fakes require a significant investment of time and labour to achieve the level of authenticity needed to deceive people.
However, that is rapidly changing.
Lyrebird is a subscription-based voice generation tool. It allows users to create a synthesised voice based on a small audio sample of an authentic voice. It is even possible to use a tool called Overdub to replace the words a person has spoken in a recorded message with new words.
Neither humans nor machines have any way to easily detect a high-quality Deep Fake.
The use of Deep Fakes by fraudsters presents a significant challenge for any organisation’s Accounts Payable team.
Typical internal controls, such as conducting call backs before processing EFT payments, are clearly no match for such technologies. With no easy way to detect a Deep Fake, every organisation needs to seriously consider implementing additional layers of security.
Eftsure’s unique platform aggregates banking data from over 3 million Australian organisations. Before your Accounts Payable team processes any EFT payments, the payee’s banking details are cross-referenced against this database to verify their validity.
Whilst call back controls remain important, they may be vulnerable to Deep Fakes. Eftsure gives you a level of knowledge that call backs alone can’t achieve.
Eftsure automatically notifies you when the banking details match our database, and alerts you when you should pause to investigate further.
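As a rough illustration of this kind of control (and not Eftsure’s actual platform or API), here is a minimal Python sketch that cross-references a payee’s banking details against a verified dataset before a payment is released. Every name, record and value in it is hypothetical.

```python
# Illustrative sketch only: a cross-referencing control for outgoing EFT payments.
# The dataset, names and values below are hypothetical; this is not Eftsure's API.
from dataclasses import dataclass
from enum import Enum


class CheckResult(Enum):
    MATCH = "match"        # details agree with the verified record -- safe to pay
    MISMATCH = "mismatch"  # details conflict with the record -- pause and investigate
    UNKNOWN = "unknown"    # payee not in the dataset -- verify through another channel


@dataclass(frozen=True)
class BankDetails:
    bsb: str      # Bank-State-Branch code, e.g. "062-000"
    account: str  # account number


# Hypothetical stand-in for an independently aggregated database of verified details.
VERIFIED_DETAILS: dict[str, BankDetails] = {
    "Acme Supplies Pty Ltd": BankDetails(bsb="062-000", account="12345678"),
}


def cross_reference(payee: str, details: BankDetails) -> CheckResult:
    """Compare the banking details on an invoice against the verified record."""
    record = VERIFIED_DETAILS.get(payee)
    if record is None:
        return CheckResult.UNKNOWN
    return CheckResult.MATCH if record == details else CheckResult.MISMATCH


if __name__ == "__main__":
    # An invoice arrives with an altered account number, as in a BEC-style attack.
    invoice_details = BankDetails(bsb="062-000", account="87654321")
    result = cross_reference("Acme Supplies Pty Ltd", invoice_details)
    if result is not CheckResult.MATCH:
        print(f"{result.value}: hold the payment and verify via a second channel")
```

The point of the sketch is that the check runs against data maintained outside the invoice-and-phone-call channel the fraudster controls, so even a convincing Deep Fake voice is not enough on its own to push a payment through.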
Contact Eftsure today for a no-obligation demonstration of the power of aggregated knowledge-sharing.