DeAnn Wiley was on the hunt for a new rental in Detroit earlier this month when she had the displeasure of arriving at a property that looked nothing like what was advertised online.

“The photos made the home look brand new, only to get there and see the usual wear and tear and the old ‘landlord special,’ ” she told Slate. She tweeted the stark, even hilarious differences between what was posted and what she saw in person.

Her listing appeared to show a pristine, albeit A.I.-generated, house with smooth textures, clean walls and windows, a nice green lawn, and a bench out front under bucolic lighting. However, the photo she said she took in person showed a much shabbier house, featuring uneven grass and cluttered with yard equipment where the bench was supposed to be.

Like this had me screaming. This the pic in the listing v.s what shows up on google maps. 😂 https://t.co/8fmJt9b6zF pic.twitter.com/ybpPPlPXWy

— I appreciate you. (@DeeLaSheeArt) October 6, 2025

Wiley appeared to be yet another victim of A.I. slop proliferating online. Although the technology has encroached on people’s everyday lives in small and large ways, this seemed to be a more egregious example of it in housing. With the prospect of homeownership slipping out of reach for most Americans, who are struggling to afford their basic needs, the search for a home, even a rental, now comes with more stressors. And we can thank the overzealous adoption of A.I. for that.

In recent months, homebuyers, renters, real estate agents, and photographers have noticed an uptick in A.I.-rendered images in listings—some with fake staging and altered details, and others that seem to show entirely different houses. Although cosmetic edits and digital staging are nothing new to real estate, these A.I.-ified creations are causing clients and professionals to ask themselves if this is a straight-up deceptive practice.

Wiley says she’s OK with a landlord’s staging of a house, adding a few aesthetic touches here and there, but notes that the house she toured was completely “false advertising.”

It “ends up wasting a lot of time. That’s gas money, time and energy you have to waste during your search,” she said. “A.I. is making an already tedious task even more difficult.”

Santiago Torres, a freelance real estate photographer, has also been in the market for a home and tells Slate that of 100 listings he has seen, he believes that at least a dozen of them have been enhanced or entirely materialized with A.I. He predicts that this number is “definitely going to grow” this year.

Torres worries that this practice will become normalized and pervasive in his industry, leading to fewer jobs for human photographers like himself and his peers. “I think many photographers haven’t realized how many listings already use A.I.,” he added.

The technology is being deployed more widely in real estate, but housing professionals note that few of these nefarious listings are being posted to official housing databases that agents are using right now.

“What we are seeing is experimentation on the edges, mostly in consumer marketing or social media content,” Kevin Greene, the general manager of real estate solutions for a data solutions company, told Slate. “The ability to generate a full property listing from scratch is here, but it’s not being used in the MLS.” (MLS stands for “multiple listing service,” a private digital database that realtors and brokers use to circulate listings.)

Greene says most property agents are using A.I. to generate the descriptions for these listings, something he argues does save them a lot of time. In fact, he believes that it’s the “subtle” use of this tech that should be more concerning for buyers and renters.

“The bigger risk isn’t full fabrication; it’s subtle manipulation. Tools that can brighten a photo can also remove power lines, add trees, or replace grass with a pool—and that’s where things start to cross the line,” he said. “What matters most is whether that content reflects ‘ground truth data,’ which means the verified, factual attributes of a property drawn from public records, imagery, and on-site validation.”

Derek Leben, an associate professor at Carnegie Mellon University who teaches courses on A.I. and ethics, agrees that using A.I. language models to render descriptions of a house is not irresponsible, but that presenting images that do not live up to reality is deceptive.

“If you provide information that’s misleading, that’s an instance of deceptive practice,” Leben explained. “In traditional ethics law, this happens around fraud or false advertising. For example, if a company shows a picture in their ad of a cheeseburger, but it’s not a real cheeseburger and it looks nothing like a cheeseburger you’d order in the restaurant, is that falsified information? Not really. You’re not giving lies. It’s not untrue, but it’s perhaps misleading.”

And, as Leben notes, these practices can result in wasting people’s time if they go and tour the place. In the worst-case scenario, a client can enter a contract under false premises.

“In cases where people buy properties sight unseen, it would be a stronger case that they made that contract under misleading pretenses and for it to be null and void,” Leben said.

As for real estate photographers, Leben shares their concern about the loss of their work. “The ‘A.I. is going to take our jobs’ worry is real, and we call this job displacement.”

Torres and his peers are also concerned about widespread A.I. adoption, namely how easy it may seem for agents to overlook hiring professional photographers for listings. “In the future, agents will only need to know from which angle to take each photo—something they can learn in just a few hours with a cheap tripod—and leave the editing to A.I.,” he said. “It won’t be the same result as a professional photographer, but it will be good enough for the listing to look decent.”

Buyers and renters who already feel stifled by housing options believe that this practice should be heavily regulated. For Wiley, who experienced this firsthand with her rental search, the calculus is simple: If you’re going to make the effort to search and visit properties—something that often requires untold hours poring over Zillow listings, Craigslist ads, and Facebook Marketplace posts—the places you visit should at least resemble the photos online. You shouldn’t have to deal with the torrent of A.I. slop—especially when you’re finding a home.

“People are focusing so much on whether or not people can spot an A.I. photo and missing the point that no one should have to worry about being misled and taken advantage of while looking for housing,” Wiley said. “The focus shouldn’t be on renters to have discernment but on these rental apps to regulate their platform so that users can avoid potential scams or manipulation during their search.”

Sign up for Slate’s evening newsletter.