AI is one of the most important technological developments of our time. It has the potential to automate almost any repetitive task and even to interact with humans. Most industries are already being transformed by AI, but that doesn’t mean everyone is on board. A growing number of people are concerned about the technology’s negative effects. The actors’ strike in Hollywood is the most well-known example, but there is pushback on the use of AI in real estate as well.
There are concerns that AI might exacerbate discrimination in tenant screening. The Fair Housing Act prohibits landlords from discriminating against people on the basis of race, color, religion, sex, familial status, national origin, or disability, but an AI algorithm might do exactly that if not given proper safeguards. Wonyoung So, an urban technologist and Ph.D. candidate in Urban Science at MIT, has been studying how AI can, as he puts it, “automate inequality.” This can happen even when a landlord has no intention of discriminating, since the technology companies that sell AI tools are not required to disclose how their algorithms make decisions. “Tenant screenings are prime examples of technology using seemingly neutral data to justify discrimination against different races or genders,” So said. To give both renters and landlords more transparency into the impact of tenant screening algorithms, So has launched “Countering Tenant Screening,” an initiative that examines and explains how these systems work.
There has also been a recent proliferation of AI-assisted “rent collection” software. These products market themselves as an easy way to get tenants to sign up for automatic payments and to remind them of past-due bills, but for tenants they are much more than that. There is a fine line between reminding tenants to pay and harassing them, one that an AI could easily step across. Using a chatbot to communicate with tenants at such an emotionally charged time removes the humanity from the situation and could anger them into making life harder for landlords, something tenants can do easily and legally in many states. Collecting back rent is one of the worst parts of renting out space, so I understand wanting to automate it, but siccing a chatbot on delinquent renters might be a bit too much for the general public to accept.
Right now you might be thinking, “I am just using someone else’s product; they are the ones liable for any harm it might do, not me.” That is a reasonable assumption, but a wrong one. The current lawsuit over RealPage’s price calculation software is a good example. Besides RealPage, the legal action also names some of the company’s biggest landlord clients as defendants, meaning these companies could be punished for a tool they had no hand in creating. I have mixed feelings about this lawsuit; part of me thinks that software is taking the blame for America’s unaffordable housing situation. But the fact that it is being heard in court and has garnered so much media attention means that people are willing and able to hold landlords accountable for their AI-assisted actions.
Being a landlord can be a difficult job, especially right now. Rising expenses and falling demand are forcing property owners and managers to scramble for ways to cut costs. For that reason, AI has come at a great time and is poised to proliferate. But before landlords decide to use a “black box” AI program to do some of their dirty work, they need to think about the consequences, both in the court of law and the court of public opinion.
Overheard
I'd bet many are not even aware it's happening or how it's being used. I'm hearing it's becoming common for large landlords to run rental applications through some kind of AI screening (for credit risk for example) and I can imagine the likely bias in the training data
— infrastructure is politics (@steveglista) June 22, 2022
Mapped
Without any federal regulations on AI (yet), many states are taking it upon themselves to create a regulatory framework for the technology. Here is a map of the states that have passed or are working on AI regulations.
Good reads
Bribemade
Developers have been known to cozy up to politicians in return for project approvals, but one developer in L.A. is now headed to federal prison for six years for cash bribes he paid to a council member in 2017.
Lossless refinance
Bank regulators have issued guidance to lenders on how to restructure and extend loans on struggling commercial buildings without taking losses or foreclosing.
Prime example
So much can change when just one large company returns to the office. Amazon recently required its employees to return to the office, and there has already been an uptick in economic activity in downtown Seattle.