Don’t believe everything you see or hear! Understand the tell-tale signs that could help you identify a deepfake.
Deepfakes are videos, images or audio in which faces and voices have been swapped or digitally altered with the help of artificial intelligence, usually for the purposes of fraud or misinformation. Criminals can transform existing content, swapping one speaker for another, or create entirely new scenarios in which someone appears to do or say something completely fictional.
This type of fake media is increasingly common and is used by fraudsters to give scams a sense of authenticity, with the aim of manipulating victims into sending money, investing in fake schemes or divulging personal information.
As deepfake technology continues to mature, it is vital to remain aware of how this content can be used. Learning how to detect a deepfake will help maintain your security and reduce the chance of falling victim to cybercrime, misinformation campaigns and fake news.
Most AI-generated videos show front-on faces and rarely show side profiles. You can also check whether the video has a ‘too perfect’ or ‘airbrushed’ appearance.
Watch for lips that are out of sync with the audio, and be alert for unnatural lip movements and speech without natural human pauses and hesitations.
Check whether the person is blinking or looking around naturally. AI fakes sometimes appear more focused than an average person when talking, staring fixedly straight ahead.
Although much improved from the first generation of AI, hands may still look odd in AI-generated images. Look out for strangely shaped or ‘flickering’ fingers.
Does the light on the individual’s face and body match the illumination in the room? Are there shadows where there shouldn’t be?
AI-generated speech tends to lack natural hesitancy, such as pauses and the usual ‘ums’ and ‘errs’.
Finally, trust your instincts: if something seems off about a person’s voice, manner or behaviour, be suspicious. Exercise extreme caution if what you are being asked to do is unusual, and confirm that the source material is reliable before taking any action. Always check and verify before you part with money or personal details, or act on any request.
Where to get help?
If you believe you have received a video, call or image from someone posing as a Fidelity employee, please contact our UK-based Customer Services team on 0800 3 68 68 84.