Why Would Scammers Collect Recordings of a Real Estate Pro’s Voice?

The short answer: to create voice impersonations for future fraud.

Most people look to real estate professionals when buying or selling a home. Imposter agents can do tremendous damage. We’ve already discussed AI-enabled deed fraud. Voice cloning raises the stakes.

Here’s the latest.  

We-Buy-Homes Companies Using AI Voice Agents

Are you interested in selling your home? You could get a call from an AI voice agent asking exactly that. This is not your ordinary robocall. AI technology can now easily clone real people’s voices, making solicitations convincing and uncannily real. The call holds a conversation with you, and you may well believe you’re speaking with another human being.

But do you sense something a little off? A little lag before answering? Unusually formal language?

You grill the voice agent. It admits to being an “AI assistant” working for an investment company.

These are not the kind of interactions you want, so you add your number to the National Do Not Call Registry. But that doesn’t stop the calls. And the story gets worse.

Using Voice to Manipulate Emotions

Today’s artificial intelligence can draft legal petitions, compose symphonies, and simulate human emotion, reports the American Bar Association. When bad actors put that technology to work, the results can be tragic in real estate deals, or in any situation where large sums of money are targeted. To get a feel for just how nasty the setup can be, and just how cautious we all need to be, the ABA asks its members to consider a real case. It happened this year in Florida.

Sharon B. picked up the phone to hear her daughter’s voice, crying. The caller told Sharon she’d just survived a road accident, suffered a miscarriage, and needed legal help immediately to avoid criminal charges.

Sharon sent $15,000 in cash in response.

It wasn’t her daughter at all. It was a tech-enabled replication of her daughter’s voice.

Just a few seconds of real audio, the ABA explains, is enough for machine-learning models to copy a voice and produce new recordings through AI-enabled voice-generation software.

As you can see, the same old scammer techniques are in play: create a sense of urgency, impersonate someone in distress, demand immediate cash. But now, with AI, the emotional manipulation is far more potent, and targets are far easier to convince.

Well-Known, Tech-Savvy Broker? Or Slick Replica?

Scam accounts that impersonate brokers are now all over the place. They lure hopeful renters with fabulous deals on apartments, supposedly marketed by well-known real estate brokers.

Fleeced home-seekers then call the real agents, demanding refunds of their fees and deposits.

Some agents are even visited in person by victims, who show up expecting to get the keys to apartments that aren’t actually on the market.

Scammers advertise the bogus deals under generic brokerage names on WhatsApp or Telegram. When it’s time to interact with hopeful clients, they impersonate the high-profile agents on TikTok and Instagram. They display fake business cards. They’ll often direct their targets to actual agents’ listing pages.

The rough-and-tumble New York City rental market is fertile ground for these swindles. No one should ever send a fee or deposit by wire transfer or payment app for an apartment or a house, sight-unseen. But in competitive markets, many people do. Bad actors know this. They seek out parents looking to reserve rental properties for their young adult children, then direct those parents to send immediate deposits to hold the units.

Scammers might offer video tours of the apartments. Kim Velsey at Curbed observes that “it’s easy enough to lift a reel of a $15,000-a-month rental and slap on a new voice-over and a $2,200 price.”

Gird Yourself Against Swindlers

Many real estate pros have published videos of themselves, so samples of their voices are easily available to bad actors. But anyone can be a target, whether for their money or their identity. Anyone’s voice is available to phishing scammers, who will try to get you on the phone to harvest the raw material they want (voice samples) for future fraudulent calls.

Voice cloning is a remarkable tool in the hands of swindlers. Who would love this technology more than someone who profits off fakery? And as time goes on, machine-learning systems only get better at replicating speech cadence, inflection, and emotion. Forewarned is forearmed. Train yourself to take the following precautions against these sickening ploys:

  • Resist the urge to give out financial or personal information or to send immediate funds — even when the requester sounds familiar.
  • Slow down and contact the actual person, at a number you know, before responding.
  • Use agreed-upon code phrases to confirm who people are.
  • Look up the phone number. And watch out for international users of Voice over Internet Protocol (VoIP), which enables voice calls over a broadband internet connection instead of a traditional phone line.
  • Do an online search for any wire-transfer address you’re given.
  • Set up multi-factor authentication for your accounts. (This is helpful, but not foolproof.)

Keep your deed safe from scams by setting up an auto-alert with your county. Many county deed office websites offer alert systems to let you know about any activity on your deed. You have to opt in for the messages. Signing up only takes a minute or two, so please do this now.

Can Law Enforcement Keep Up With AI?

People targeted by voice scams can and should notify local law enforcement promptly. Doing so might help them (and others) recover lost assets.

That said, there’s a limit to what law enforcement is prepared to say and do. Stealing brokers’ identities in order to steal other people’s funds is not yet effectively addressed as a crime. And scammers who are reported tend to change their numbers and keep perpetrating their frauds.

Lawyers have tried to help real estate brokers deal with online financial platforms and social-media sites, often in vain. It’s notoriously difficult to reach a real person at a social-media company. Pressing tech companies to delete fake videos and accounts can turn into a part-time job for agents.

One agent told the Curbed reporter that the scammers who impersonated him on Instagram had accumulated so many followers that Instagram decided he was the imposter. Frustrated agents are paying Meta, the corporation that owns Instagram, just to get verified as themselves.

Words to the Wise

Stay alert. The use of AI to clone voices and impersonate people is a clear and present danger.

And if you do get fooled? Share your story with others. Telling the people you know might feel embarrassing, but it will strengthen their ability to steer clear of similar attacks.

Supporting References

Jonathan Delozier, for HousingWire (HW Media, LLC): EquityProtect Foils Alleged Title Fraud Scheme in Ohio (Oct. 2, 2025).

Kim Velsey, for Curbed.com, from New York magazine: It’s Stunningly Easy to Impersonate a Broker Online (Sep. 18, 2025).

Public Affairs Specialist Amy Thoreson, for the FBI Field Office in Newark, New Jersey: Fraudsters Are Stealing Land Out From Under Owners (May 28, 2024).

Jeffrey M. Allen, of Graves & Allen (Oakland, California), for the American Bar Association: Voice of Experience – The Rise of the AI-Cloned Voice Scam (Sep. 10, 2025).

And as linked.

More on topics: Deepfakes, AI-generated fraud in real estate

Photo credits: August de Richelieu and Ketut Subiyanto, via Pexels/Canva.