You’ve seen the videos: a too-perfect Taylor Swift giving away free cookware. A fake Tom Hanks offering dental insurance.
They look real, but they’re not.
New research from McAfee Labs reveals just how widespread these scams have become.
Our 2025 Most Dangerous Celebrity: Deepfake Deception List ranks the celebrities and influencers whose likenesses are most often hijacked by scammers, and reveals a growing market for AI-powered fake endorsements.
At the top of the list? Taylor Swift, followed by Scarlett Johansson, Jenna Ortega, and Sydney Sweeney. Globally, names like Brad Pitt, Billie Eilish, and Emma Watson also appear among the most exploited.
McAfee also released its first-ever Influencer Deepfake Deception List, led by gamer and streamer Pokimane, showing that scammers are now targeting social platforms just as aggressively as Hollywood.
Top 10 Most Dangerous Celebrities (2025): U.S.

Top 10 Most Dangerous Celebrities (2025): Global

Top 10 Most Dangerous Influencers (2025): Global

Why Scammers Love Famous Faces
The formula is simple: use someone people trust to sell something that doesn’t exist.
Criminals clone celebrity voices and faces with AI to promote fake giveaways, skincare products, crypto investments, or “exclusive” deals that lead straight to malware or payment fraud.
According to McAfee’s survey of 8,600 people worldwide:
- 72% of Americans have seen fake celebrity or influencer endorsements.
- 39% have clicked on one.
- 1 in 10 lost money or personal data, an average of $525 per victim.
Scammers exploit trust. When you see a familiar face, your brain automatically lowers its guard. And that’s exactly what they count on.
How Deepfakes Are Making Headlines
AI has made these scams look frighteningly real.
Modern deepfake generators can mimic voices, facial movements, and even micro-expressions with uncanny precision. Only 29% of people feel confident identifying a fake, and 21% admit to having low confidence spotting deepfakes.
That’s how fake endorsements and AI romance scams have exploded online.
- A woman in France lost nearly $900,000 to a scammer posing as Brad Pitt, complete with AI-generated photos and voice messages.
- TV host Al Roker was recently targeted by a deepfake video claiming he’d suffered heart attacks.
- Tom Hanks, Oprah, and Scarlett Johansson have all been used in fraudulent ads for products they never touched.
“Seeing is believing” doesn’t apply anymore, and scammers know it.
The Psychology of the Scam
Deepfake scams don’t just rely on technology; they prey on parasocial relationships, the one-sided emotional bonds fans form with public figures.
When a “celebrity” DMs you, it doesn’t always feel strange. It feels personal. That sense of intimacy makes people act before thinking.
It’s the same psychological playbook behind romance scams, now supercharged by AI tools that make fake videos and voice messages sound heartbreakingly real.
Protect Yourself
- Pause before you click. If an ad or post seems out of character or “too good to be true,” it probably is.
- Verify at the source. Check the celebrity’s verified account on social media. Scammers often copy profile photos and bios but miss subtle details like posting style or engagement patterns.
- Look for signs of AI manipulation. Watch for off-sync lip movements, robotic tone, or inconsistent lighting.
- Never share personal or payment details via messages, even if the sender appears to be verified.
- Use McAfee’s Scam Detector, included in all core plans, to automatically analyze texts, emails, and videos for signs of fraud or deepfake manipulation.
Key Takeaways
Celebrity and influencer culture has always shaped what we buy, but now it’s shaping how scammers deceive. These deepfakes don’t just steal money; they chip away at our trust in what we see, hear, and share online.
The celebrities at the center of these scams aren’t accomplices; they’re victims, too, as criminals hijack their likenesses to exploit the bond between fans and the people they admire. And as deepfake tools become easier to use, the line between real and fake is vanishing fast.
The next viral “giveaway” might not be an ad at all…it might be bait.
You can’t stop scammers from cloning famous faces, but you can stop them from fooling you. Use McAfee’s Scam Detector to scan links, messages, and videos before you click.
