More landlords and lenders are using AI. Fewer regulators are checking them for bias.
The housing industry is rapidly adopting artificial intelligence tools to decide who gets a home loan or lease. The Trump administration is rolling back long-established protections used to keep those evaluations fair.
Technology that uses algorithms to predict outcomes — such as a home’s selling price or how likely someone is to afford their rent — isn’t new to the housing and lending industries. But significant improvements in user-friendly AI have made these tools more accessible for mortgage and real estate businesses, spurring increased interest in expanding the role of computerized systems in housing.
The widespread adoption of AI has renewed some hope that the technology can be more objective, reduce discriminatory bias and reverse entrenched inequalities.
But because AI models are trained with data that reflect historic patterns of discrimination, some are warning that these new systems could have the opposite effect. That fear has intensified since the executive branch began narrowing federal anti-discrimination enforcement.
“Artificial intelligence might advance civil rights if it's used properly… but it might also reinforce discrimination in our society if we're not careful, because AI is ingesting everything out there in the world,” Federal Reserve Governor Michael Barr, an outspoken critic of the administration’s deregulatory agenda, said at a recent fair housing advocacy event. “There's a lot of things out there in the world that are deeply, deeply discriminatory.”
The technology developers spearheading efforts to apply AI to the housing sector say they train their systems to avoid accidental bias. But industry observers like Barr are worried that diminished government oversight will weaken incentives to continue taking those steps seriously.
Since taking office, the Trump administration has sought to prevent the federal government from enforcing rules based on “disparate impact” — a method for determining whether a practice amounts to illegal discrimination by looking at its effects on groups of people, regardless of intention. Because disparate impact methods focus on provable outcomes instead of intent, they’ve been used to challenge decisions influenced by algorithmic technology.
In 2024, a federal court approved a more than $2 million payout to rental applicants who said they were denied housing due to an algorithm that disadvantaged Black and Hispanic people. The plaintiffs used disparate impact standards to argue that the tool relied heavily on credit scores without accounting for other factors, like housing vouchers, that increased applicants’ ability to pay rent. A 2022 Urban Institute study found that white communities had higher median credit scores than communities of color.
Under the first Trump administration, the Department of Housing and Urban Development acknowledged in a 2019 rulemaking that disparate impact methods are “an important tool to root out” potential discrimination in algorithmic systems, while stressing the need for updated policies as technology evolves.
But in President Donald Trump’s second term, HUD has changed its tune, arguing that disparate impact enforcement by federal agencies was unfair to businesses and led to illegal racial preferences.
“The issue isn’t AI – it’s disparate impact, a discredited theory that requires individuals and entities to consider race on the front end to avoid legal liability on the back end. That runs counter to the core purpose of the Constitution’s Equal Protection Clause,” HUD spokesperson Robbie Myers told POLITICO. “HUD will continue to hold bad actors accountable under the Fair Housing Act.”
HUD and the Consumer Financial Protection Bureau have proposed rules to roll back their disparate impact protections.
“One hallmark of the second Trump administration has been the eradication of discriminatory race-based policies that have permeated every aspect of government under the banner of ‘diversity, equity and inclusion,’” said Rachel Cauley, a spokesperson for the Office of Management and Budget who is also acting as spokesperson for the bureau, in a statement. “The administration of fair lending laws is no exception.”
But proponents of government enforcement against disparate impact discrimination say having those standards available to regulators is critical for challenging harmful housing provider decisions influenced by algorithmic tools.
Disparate impact methods continue to apply under congressionally passed civil rights law, so individuals can still use them to bring lawsuits. However, the lack of transparency in many AI tools can make these cases prohibitively difficult for non-experts to build by themselves, said Lisa Rice, president of the civil rights group National Fair Housing Alliance.
“A typical consumer, it's very hard for them to bring these complaints,” Rice said. “There's a lot of work. It's very labor-intensive, and it can be extremely expensive.”
Rice said leaving disparate impact enforcement up to lawsuits from individuals who often lack resources isn’t sufficient, so government agencies with experts and significant budgets should be involved.
“Regulators have the capability to dig into the system, to see what's in that system and to ferret out whether or not there's discrimination,” Rice said. Federal agencies can “then compel the institution to correct that discriminatory algorithm.”
Still, many industry advocates supported federal agencies’ rollback of disparate impact rules. The Community Home Lenders of America and others submitted comment letters in support of HUD’s and CFPB’s proposed rule changes, agreeing with the agencies that the previous regulations were overly expansive and misaligned with past court decisions.
The industry association represents local mortgage lenders. Its members were worried that overly zealous agency oversight would prevent them from bringing in tools that could both benefit their small businesses and prevent human bias, said the group’s spokesperson, Rob Zimmer.
“Let's make sure we have a system that evaluates people based on math — not based on personal characteristics that arguably should have nothing to do with whether or not you recommend access to credit,” Zimmer said. Without this administration’s policy changes, “we had two choices: get regulated to death or not adopt AI and become non-economic.”
Some argue that the Biden administration’s focus on preventing discriminatory outcomes in housing technologies could actually hurt the communities it sought to protect.
Tobias Peter, co-director at the American Enterprise Institute’s Housing Center, warned that overcorrecting real estate industry processes based on assumptions of bias could pose more harm than good.
“Getting someone in a house is nice and well, but they also need to be able to afford that home,” Peter said. “If you get them in a home and they default relatively quickly, and they get foreclosed on, and they lose whatever capital they've invested in that home — I don't think you're helping anyone.”
Still, many in the industry are aware that policy changes by the agencies can be easily undone by the next administration.
David Dworkin, president of the affordable housing industry coalition National Housing Conference, said he’s worried that the sector will face retroactive consequences for moving forward with helpful, new tools that it has “received no guidance on.”
“I'm particularly concerned about the impact that lenders will face after the Trump administration — when the pendulum swings again,” Dworkin said. “Because it always does.”