Singapore Issues Multi-Agency Advisory On AI, Deepfake Scams

March 21, 2025
Authorities in Singapore have urged firms to introduce technology-based solutions to prevent the spread of AI-generated deepfake content that may be used to scam their employees.

Last week, the Singapore Police Force (SPF), Monetary Authority of Singapore (MAS) and Cyber Security Agency of Singapore (CSA) issued a joint advisory on the new scam typology.

The three agencies said that scammers are using AI to create or manipulate digital content to produce deepfakes.

The advisory calls attention to the use of this technology to impersonate senior executives at companies where the target of the scam is employed.

Victims receive unsolicited WhatsApp messages from scammers claiming to be senior executives at their company, who then ask them to join a live Zoom call to discuss company business.

Victims are then shown a manipulated video in which their senior colleagues appear to instruct them to transfer funds out of corporate accounts, typically under the guise of project financing or investments.

The three agencies said the deepfake videos sometimes include representations of investors or regulatory staff, such as MAS officials.

Some victims are also asked to disclose personal information, such as their National Registration Identity Card (NRIC) number or passport details.

In a more advanced variant of the scam, victims are directed to a second deepfake video purporting to show a legal counsel from the company, who asks the victim to sign a non-disclosure agreement (NDA) or receive a letter from the board.

In one case cited by the CSA, an employee of a multinational firm was tricked into sending $25m to scammers after receiving a deepfake video that appeared to show his chief financial officer giving payment instructions.

Other known scams have used deepfake videos of Prime Minister Lawrence Wong and Senior Minister Lee Hsien Loong.

Verification measures recommended

The SPF, MAS and CSA are calling on firms to establish protocols that employees can use to verify the authenticity of any video calls or messages they receive from colleagues.

Firms are encouraged to educate their employees on the scam typology and to urge extreme caution when they are asked to carry out fund transfers.

“Be mindful of any sudden or urgent fund transfer instructions and verify the authenticity of the instructions with the relevant departments or personnel directly,” the advisory states.

“Analyse the audio-visual elements of the video call. Check for tell-tale signs that could suggest the manipulation of the audio or video through AI technology.”

The CSA has also provided further information that firms can use to train their employees on how to spot an AI-generated deepfake.

For example, faces and facial features may appear blurred around the edges, image resolution may be uneven, and there may be unnatural shadows.

The background may be inconsistent or out of focus, skin tones may be distorted, and the movement of the mouth and lips may not sync properly with the audio.
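These visual signs can also be screened for programmatically. The snippet below is a minimal sketch, not a recommended tool: it flags video frames where the detected face looks markedly blurrier than the rest of the frame, one of the tell-tale signs described above. The 0.5 ratio threshold is an arbitrary illustrative value, and it assumes the opencv-python package.

```python
# Minimal sketch: flag frames where the detected face region is unusually
# blurred relative to the full frame (one of the tell-tale signs described
# by the CSA). The ratio threshold is an illustrative assumption, not a
# calibrated value. Requires opencv-python.
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def sharpness(gray_region):
    """Variance of the Laplacian: lower values mean a blurrier region."""
    return cv2.Laplacian(gray_region, cv2.CV_64F).var()

def suspicious_frames(video_path, ratio_threshold=0.5):
    """Yield indices of frames where the face is much blurrier than the frame."""
    cap = cv2.VideoCapture(video_path)
    idx = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        for (x, y, w, h) in face_cascade.detectMultiScale(gray, 1.1, 5):
            face_sharpness = sharpness(gray[y:y + h, x:x + w])
            frame_sharpness = sharpness(gray)
            if frame_sharpness > 0 and face_sharpness / frame_sharpness < ratio_threshold:
                yield idx
        idx += 1
    cap.release()
```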

‘Nascent’ technology solutions

The CSA notes that there are ongoing efforts in multiple jurisdictions to create tools that can be used to detect AI-generated deepfake content.

For example, pixel analysis is a technique used to detect inconsistencies in facial expressions and lighting.
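As a rough illustration of what a pixel-level check can involve, the sketch below tracks the mean brightness of the detected face from frame to frame and flags abrupt jumps, which can indicate spliced or generated frames. The jump threshold is an assumption chosen for illustration, and the example again assumes opencv-python; it is not one of the detection tools referred to in the advisory.

```python
# Minimal sketch of a pixel-level lighting-consistency check: track the mean
# brightness of the detected face across frames and flag abrupt jumps. The
# jump threshold is an illustrative assumption. Requires opencv-python.
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def face_brightness_jumps(video_path, jump_threshold=25.0):
    """Return frame indices where face brightness shifts sharply between frames."""
    cap = cv2.VideoCapture(video_path)
    jumps, previous, idx = [], None, 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = face_cascade.detectMultiScale(gray, 1.1, 5)
        if len(faces) > 0:
            x, y, w, h = faces[0]
            brightness = float(gray[y:y + h, x:x + w].mean())
            if previous is not None and abs(brightness - previous) > jump_threshold:
                jumps.append(idx)
            previous = brightness
        idx += 1
    cap.release()
    return jumps
```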

However, the agency warns that deepfake detection tools for general use are still “nascent”, so it cannot, at present, recommend a particular technology for firms to employ. 

For now, firms are advised to monitor the CSA’s correspondence to stay up to date with the latest developments in AI detection tools.

The agency mentioned that both Meta and OpenAI are currently developing metadata tags, or “watermarks”, that can indicate whether content is AI-generated.
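By way of illustration only, the sketch below scans a file's raw bytes for the kinds of provenance markers that content-credential schemes embed, such as C2PA manifest labels or IPTC's "trainedAlgorithmicMedia" digital source type. The marker strings and the file name are assumptions made for the example; this is not a description of Meta's or OpenAI's implementations, and the absence of a marker proves nothing about a file's origin.

```python
# Crude, illustrative scan for common AI-provenance markers in a media file.
# The marker strings are assumptions about how a file *might* be tagged
# (C2PA manifests, IPTC "trainedAlgorithmicMedia" digital source type);
# absence of a marker proves nothing about the file's origin.
from pathlib import Path

PROVENANCE_MARKERS = [
    b"c2pa",                      # C2PA / Content Credentials manifest label
    b"trainedAlgorithmicMedia",   # IPTC digital source type for AI-generated media
]

def find_provenance_markers(path):
    """Return the markers found anywhere in the file's raw bytes."""
    data = Path(path).read_bytes()
    return [m.decode() for m in PROVENANCE_MARKERS if m in data]

if __name__ == "__main__":
    hits = find_provenance_markers("suspect_image.jpg")  # hypothetical file name
    print("Provenance markers found:" if hits else "No provenance markers found.", hits)
```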

In the meantime, the CSA advises firms to practise its “3A” approach: assess the message; analyse the audio-visual elements; and authenticate content using tools.
