Mortgage’s AI Crisis Is Coming. The Industry Isn’t Ready to Talk About It.

A borrower in forbearance gets the wrong information about their reinstatement options. It comes from an automated servicing communication, routed through a system nobody fully owns and nobody fully monitors. By the time someone catches it, hundreds of borrowers have received the same message. Some have acted on it.

Now there’s a reporter making calls. 

Who speaks? What do they say? Who decided the system would handle that communication in the first place? 

This is the AI crisis scenario mortgage isn’t preparing for. Not a rogue algorithm making loan decisions. Something quieter and more insidious: a system operating exactly as designed, at scale, with consequences nobody anticipated and no communications infrastructure to manage them. 

The playbook assumes a person  

Every crisis playbook assumes a human decision-maker somewhere in the chain. Someone made a call. You can trace it, contextualize it, own it. You can build a response around accountability because accountability has a face. 

AI dissolves that. Responsibility spreads across product managers, data teams, compliance officers, and third-party vendors whose contracts don’t include the word “crisis.” When something surfaces, the question of who decided becomes genuinely difficult to answer. That’s not a philosophical problem. It has financial and reputational consequences attached, and the mortgage industry is accumulating exposure faster than it’s building the capacity to manage it. 

The risk is already deployed  

Underwriting gets the attention in this conversation. That’s not where the near-term risk sits.

The live surface is servicing communications, fraud detection models, automated valuation tools, customer-facing chat systems, and document processing. These are deployed today, at scale, with limited public scrutiny and almost no communications governance around them.

Automated valuation models carry fair lending exposure that mirrors the underwriting conversation without triggering the same legal sensitivity. A valuation that patterns differently across geographies is a story. If a model produced it and the company can’t explain how, that story writes itself. 

Customer-facing chat systems are a category of their own. When a borrower gets wrong guidance from an automated system about a payment plan, a modification, or a fee dispute, the company owns that guidance regardless of how it was generated. The communications exposure doesn’t follow the technology architecture. It follows the customer relationship. 

Your vendor isn’t your spokesperson  

Most mortgage AI isn’t proprietary. It’s licensed, embedded, or layered onto platforms built and maintained by third parties. The contractual relationship runs to the vendor. The accountability relationship runs to the borrower, the regulator, and the reporter outside your headquarters. 

“Our vendor’s system generated that communication” is not a crisis response. It’s an invitation to a worse story. The company whose name is on the mortgage statement owns the narrative, whether the technology agreement says so or not. 

This is the piece of AI governance almost nobody in this industry has thought through. Vendor contracts address liability. They don’t address what you say on day two when the story is running and your vendor’s PR team isn’t returning calls. 

Why governance alone won’t save you  

The mortgage industry knows how to build governance structures. Model risk management frameworks, audit trails, compliance documentation, fair lending testing. These exist because regulators demanded them and the industry responded. They are necessary and they are not sufficient. 

Governance is designed to answer questions after the fact: who approved this model, what data trained it, when the anomaly first appeared. Those answers matter in an examination.

They don’t help you on day one of a news cycle when a reporter has a borrower on the record and your communications team is hearing about the incident for the first time. 

The gap isn’t technical and it isn’t legal. It’s organizational. Communications is treated as a downstream function in most mortgage companies, brought in after the position has already been set by legal and compliance. That sequencing works for routine matters. It fails under pressure, because the window for shaping a narrative closes faster than most organizations expect, and the first position you take publicly is the one you’re stuck defending. 

There’s a second gap that’s specific to AI: the people who understand how the system works and the people who speak for the company are rarely the same people and rarely in the same room. When a journalist asks a specific question about how a valuation model weights certain inputs, or why an automated servicing message went to borrowers who shouldn’t have received it, the answer exists somewhere in the organization. Getting it into the right hands fast enough to matter is a coordination problem most companies haven’t solved because they haven’t had to yet. 

That changes when the first significant AI-related crisis hits a mortgage company publicly. And it will. 

The window is narrowing  

AI deployment across mortgage servicing, valuation, and customer communications is accelerating. Regulatory posture is unsettled but directional. Consumer advocates and plaintiff attorneys are already paying attention to how these systems behave at scale. 

The companies that navigate this well won’t be the ones with the most sophisticated models. They’ll be the ones that understood, before the story broke, that narrative risk is a distinct category of risk with its own exposure and its own requirements. 

Most companies in this industry haven’t made that determination yet. The window to do it on their own terms is open. It won’t stay that way. 

Mitch Cohen is the founder of ClearLine, a crisis communications readiness platform for mid-market organizations. He has 25 years of experience in strategic communications across fintech, data, and regulated industries.


This column does not necessarily reflect the opinion of HousingWire’s editorial department and its owners. To contact the editor responsible for this piece: zeb@hwmedia.com.