Now It’s the Deepfakes. Protect Deeds and Accounts From AI-Generated Imposters

Real estate transactions involve large money transfers, and creative swindlers love them. Now, advances in artificial intelligence (AI) are opening new opportunities for real estate fraud.

Shady actors are using samples of real people's voices and images to create convincing recordings, then persuading buyers and sellers to answer questions that reveal key information. With those details in hand, the scammers hijack their targets' accounts. That's right: a new breed of AI-powered wire fraud can mimic familiar voices and pull pictures from the internet to fabricate whole recordings, all to get at people's funds.

Remaining in the dark is not an option. To keep transactions safe, what do we need to know? 

AI Is Hot, and It Will Stay Hot.

AI is hot. And it's much more than a flash in the pan. According to a recent Bloomberg interview with Kristin Roth DeClark, Barclays' Global Head of Technology Investment Banking, AI will continue to shape human productivity throughout the decade to come.

A high-profile example of this game-changing technology is ChatGPT, from OpenAI. In response to a person's prompts, the tool produces impressive text and visuals.

This is generative AI, which digests existing data to create new content. Today's real estate professionals have embraced it, using it to generate listing descriptions, graphics, and marketing copy.

Of course, there are also people who misuse AI tools, and some are perpetrating fraud. So we all need to know at least some basics about today's AI and cybersecurity.

Telltale Signs of Fakery Are Getting Hard to Spot.

It’s getting much harder to notice anything “off” with newer fakes. Those super-obvious scam emails, littered with grammatical errors and misspellings, are a dying breed. AI-generated text can be very, very good at getting people to expose their personal details to the sender. And that’s how identity theft happens.

Swindlers can now whip up practically any kind of real estate or financial content — from webpages to text messages to voicemail. Of all these ways of stalking their targets, though, they especially like email as an easy starting point for their manipulations. Once they have a recipient's attention, they can distribute misinformation — even false deeds and legal documents.

AI-created messages can also spread malicious links. Click, and the recipient goes to a site requesting valuable account details. Or the scammer hijacks the target’s phone.

Scam artists request bank account details and ask for money to be wired to an account they control. Some manage to fool buyers and sellers. Sometimes, even bankers, county officials, and real estate professionals get sucked in.

Here Come the Deepfakes to Thicken the Plot.

What are deepfakes? They're computer-generated images and recordings designed to fool the viewer or listener. For instance, a scammer could appropriate a person's YouTube footage to concoct a lip-synched video that imitates that person and conveys false information.

Thanks to generative AI and deepfake tech, today’s bad actors can:

  • Post deep-discount listings on real estate or vacation home-sharing websites, and request deposits from unsuspecting consumers.
  • Take over property listings to steal down payments. AI-assisted advertising can spruce up a property's looks, while concocted listing descriptions overstate its selling points. Deepfake videos can even manipulate legitimate brokers into listing nonexistent homes.
  • Create virtual sales contracts, luring buyers who have funds to wire.
  • Create fake voicemails by feeding AI samples of a person's actual voice, making it say things that person would never say.
  • Impersonate a buyer, seller, or agent over a video call. New deepfake technology is so advanced, it can replicate a real person on a live video call, and dupe co-workers into thinking they’re interacting remotely with their real colleague. This is already happening.

This year, tech watchers expect deepfakes to become indistinguishable from real videos of people. And anyone, in the U.S. or in another country, can easily get the tech used to make them. So, does AI help swindlers to get account numbers and misdirect funds, in what can be a person’s biggest spending decision ever? Yes, and we all need to be on the lookout.

Here’s how to…

Know What’s Real and What’s Not.

Keep this in mind any time you encounter recordings, messages, or listings, as well as legal documents like approval letters, account statements, employment records, and identity cards: anything can be AI-assisted. Remember that swindlers may be trying to get you to send money, give up sensitive personal data, or sign legally binding agreements.

Where money is at stake, stop! Don’t click or release funds. Verify information by separately checking records and references. Initiate your own response to a request, using a known number or email address.

Use helpful verification tools. For example:

  • Verify properties, images, and histories using Google’s set of viewing and mapping tools.
  • Review all documents and messages for possible AI influence. Watch videos at slow speed to pick up voice/mouth mismatches. (Again, the imperfections are becoming much harder to detect.)
  • Use secure connections. Avoid free Wi-Fi networks and free email accounts.
  • Set up biometrics or multifactor authentication on all accounts and devices.
  • Maintain anti-virus software and keep it updated on all devices.
  • Ask about watermarks on remotely accessible real estate documents, to spot manipulation.

Note to agents: If you're asked to list a home, pull the deed or run a title search first. AI detection tools are still in their early stages, but the steps above can help you avoid scams and protect both your interests and your online reputation.

Beware the Social Media Bot.

Unfortunately, the big social media, messaging, and video platforms have shrunk their online safety staffs. According to recent reporting by Axios, many of these platforms have also weakened their misinformation control policies.

Stay on guard for robotic messages. And if an acquaintance starts sending out-of-character emails or social media posts, do not engage. Call the person at a known phone number instead. It could protect your financial privacy and safeguard your funds.

If you think you’ve been targeted and your confidential information was exposed, you may need to notify those affected. Contact an attorney for situations in which money has moved improperly or for other legal questions. Meanwhile, dear Deeds.com readers, use AI responsibly, and stay on the lookout for those who don’t.  

Supporting References

Genady Vishnevetsky for the Virginia Land Title Association in the VLTA Examiner via VLTA.org: Generative AI Impact on Real Estate (Centreville, VA; Dec. 21, 2023).

Robin Gwaltney, News-Talk 1340 KROC-AM and 96.9 FM: Warning – AI in Real Estate Deep Fakes. Scammers Are Using Deep Fakes on Home Buyers and Sellers (Rochester, MN; Mar. 9, 2024).

Melissa Dittmann Tracey in REALTOR® Magazine, from the National Association of REALTORS®: Technology Scammers Use Agent Deepfakes to Fool Buyers, Sellers (Mar. 7, 2024; citing a report titled 2024 State of Wire Fraud by CertifID).

Tracey Hawkins for Inman.com: The Dark Side of ChatGPT, Deepfakes and Real Estate (updated Sep. 28, 2023). 

Jim VandeHei and Mike Allen for Axios via Axios.com: Behind the Curtain: What AI Architects Fear Most in 2024 (updated from Nov. 8, 2023).

Bloomberg Television via YouTube.com: Wall Street Week – Tech IPOs Search for Fed Cues (interview with Kristin Roth DeClark, Barclays' Global Head of Technology Investment Banking; Mar. 22, 2024).

And as linked.


Photo credits: RDNE Stock Project and Andrea Piacquadio, via Pexels.