Question:

I showed up to work early on a Monday after pulling a weekend sprint to meet a client deadline. Before I even made it to my desk, the front office coordinator told me HR wanted to see me. I figured it could wait until I hit send on the project I’d just wrapped.

Ten minutes later, HR walked into my office, shut the door, and told me I was being suspended “pending investigation,” barred from the workplace and locked out of my computer.  

When I asked, “What?!”, she told me I’d left sexually explicit voicemails for one of the company’s executives.

I thought I misheard her.

It took three weeks to prove those voicemails weren’t mine and were AI-generated fakes. Even now, no one knows who targeted me or why.

What I do know is this: I was treated as guilty until proven innocent. Although I received a “we’re so sorry this happened to you,” HR and my employer insist they “did the right thing.” Can you write about this? It can happen to anyone.

Answer:

You’re absolutely right to sound the alarm. Deepfake abuse is no longer a fringe tech horror story—it’s happening, and the fallout can wreck reputations, careers, and lives.

Deepfakes—hyper-realistic audio, video, or image fabrications powered by AI—are increasingly used as weapons in harassment, blackmail, and smear campaigns. Explicit deepfake content now accounts for tens of thousands of new videos each day, often targeting women and often spreading across anonymous websites before the victim even knows they exist (see https://sensity.ai/blog/deepfake-phishing-attempts-in-corporate-and-personal-communications/).

Many employers are now receiving manipulated recordings simulating offensive conduct—from insubordination to sexual harassment.

Unfortunately, some employers still treat digital evidence like gospel. They assume that if something shows up on audio or video, or comes with metadata, it must be real—so many skip straight to discipline. That response is outdated. Your employer took a more measured approach—they investigated—but still placed much of the burden on you to prove your innocence.

Employee actions

If you’re an employee who’s been targeted:

  • Document everything. Request copies of the alleged recordings, save emails or notifications you receive, and take notes (dated, detailed) on every HR conversation.
  • Push for an independent investigation. Demand your company verify authenticity before acting—if possible, through forensic media experts.
  • Get legal support. Many states now have laws addressing deepfake harassment, and on May 19, 2025, President Trump signed a new federal law, the TAKE IT DOWN Act (Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks Act). The Act creates a fast-track takedown process for nonconsensual intimate content. A lawyer can help you navigate this and protect your job, your rights, and your name.
  • Tell your story—when you’re ready. Anonymous reporting to watchdog groups or speaking out (as you’ve done here) helps build awareness and momentum for change.

Employer actions

If you’re an employer, get ahead of deepfake damage by taking the following actions:

  • Update your policies. If your employee handbook doesn’t mention AI-manipulated media, you’re already behind. Add explicit language about deepfakes, including a commitment to verify any suspicious media before enacting consequences.
  • Train your managers and HR teams. They need to know how to spot possible fakes—and how to respond without bias or kneejerk assumptions.
  • Support your employees. Even when a victim is cleared, the social and emotional damage lingers. Offer counseling, restore access, and issue a public statement or apology if appropriate. Don’t leave the targeted employee hanging.

The AI age demands a higher standard of care from employers, coworkers and investigators. We can’t afford to fall into the quicksand of believing every byte.

© 2025 Lynne Curry, PhD, SPHR, SHRM-SCP


