Tales from the HMDA Trenches
I was fresh out of college, working as a Clipper programmer for a large independent mortgage banker. One of my first assignments was handling the company’s Home Mortgage Disclosure Act (HMDA) submission. Changes in the law in the early 1990s had brought our company under HMDA’s coverage, and we had to scramble to get ready for our first submission of approximately 33,000 applications.
We were still massaging the data until the morning of March 1, the day the submission was due to HUD. When everything finally checked out, I packed up my floppies, got on a plane to Washington, hopped a cab to HUD, and found my way to the HMDA office. When I peeked inside, I saw piles of FedEx and UPS envelopes stacked to the ceiling and a few tired-looking guys hunched over green-screen terminals. One of them looked up, saw me holding my envelope, and motioned toward one of the piles. I dropped the envelope and left; no receipt, no nothing. Mission accomplished!
While Reg C’s overall goal continues to be the promotion of fair lending and the elimination of discriminatory practices like steering and redlining, much has changed from a technology perspective (no more floppies, for instance!). In the old days, a costly and difficult-to-maintain mainframe would have been required to store and process the massive amounts of HMDA data; today, almost any company can access the tools and computing power needed to leverage “big data.” The ubiquity of high-speed internet access lets us easily pull in data from external sources. Back then, if we wanted to compare our geographic performance against demographic statistics, for example, we had to look the data up and key it in, or acquire and host it ourselves. Today we can make a simple web service call or download the data from any number of providers. And no more last-minute flights to HUD!
One area where the HMDA process has lagged technically is the format of the Loan Application Register (LAR) itself. The GSEs are implementing their Uniform Mortgage Data Program (UMDP), an effort to standardize on MISMO-compliant, XML-based transactions. HMDA, however, is a different animal. While the LAR will graduate in 2017 to a pipe-delimited, variable-record-length format, it remains the same flat-file, de-normalized structure used in 1991. It is inefficient and limits the depth and richness of the data that can be communicated. For example, even if there is only one applicant/borrower, five gender, race, and ethnicity fields and four Reasons for Denial fields are still required; the unused ones must be included as “empty” fields. And if the Action Type is “Originated,” a “Code 10” must be reported as the Reason for Denial. Why include Reasons for Denial at all?
The recently announced MISMO HMDA Implementation Toolkit is a step in the right direction: MISMO has provided guidance on how to map and convert from the MISMO-compliant XML of the UMDP to the LAR format, hopefully reducing some of the technical complexity. XML is a more efficient and flexible structure, but this conversion is an extra step that would not be required if the CFPB could accept native XML.
Perhaps as a byproduct of the availability and accessibility of nationwide data, there has also been a marked increase in the visibility of HMDA over the years. And it may get even more scrutiny. In a recent Mortgage Banking article (September 2016), John Vong, president of ComplianceEase, stated, “The real issue with HMDA…[is] the potential risk in terms of compliance, legal liability and reputational damage.” Because lenders will be required to report many more data fields, there is a real possibility that regulators, community activists, and attorneys will find statistical anomalies in lenders’ data and attempt to exploit them. This makes it more important than ever for lenders to analyze their own data regularly, not just once a year, and take action to correct any issues. These could be operational, educational, or human-resource related, or could stem from poor data collection procedures.
The larger lenders will be subject to quarterly reporting under the new HMDA changes, so there’s a built-in process for them. But even for smaller lenders, an ongoing program for data collection, aggregation, analysis and proactive correction makes sense. Why wait for CFPB or an external entity to discover issues in your data? Visionet’s tools and experience can help with the following:
- Classify and index loan documents to ensure you have a complete loan file, then extract data to validate against the LOS and capture any missing fields.
- Run the FFIEC’s Validity Edits and identify any areas that require correction.
- Analyze application data with business intelligence (BI) tools, allowing multi-dimensional views of summary data, identification of possible trends, and “drill-down” into loan-level detail.
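The ongoing validity checking described above can start as simply as scripted rules run against each row of the register. A minimal sketch follows, using the one rule the text itself cites (an originated loan must report denial reason code 10); the field names and code values are illustrative assumptions, not the official FFIEC edit set.

```python
def check_denial_reasons(row: dict) -> list[str]:
    """One validity-edit-style rule, modeled on the requirement noted
    earlier: an originated loan must carry denial-reason code 10.
    Field names and codes here are hypothetical."""
    errors = []
    if row.get("action_taken") == "1":  # assume "1" = Originated
        if row.get("denial_reason_1") != "10":
            errors.append(
                "Originated loan must report Code 10 as Reason for Denial"
            )
    return errors

def audit(rows: list[dict]) -> dict[int, list[str]]:
    """Run the rule set over a batch of rows; return errors keyed by index."""
    findings = {}
    for i, row in enumerate(rows):
        errs = check_denial_reasons(row)
        if errs:
            findings[i] = errs
    return findings

rows = [
    {"action_taken": "1", "denial_reason_1": "10"},  # clean
    {"action_taken": "1", "denial_reason_1": ""},    # flagged
]
print(audit(rows))  # only the second row is reported
```

Running a growing set of such checks monthly or quarterly, rather than once a year, is exactly the kind of proactive correction loop the text advocates.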
Over my career, I’ve worked with lenders and technology vendors from startups through the largest institutions in the country and abroad. While our industry is generally a “late adopter”, today everyone understands that technology reduces cost, improves operations and creates revenue opportunities. Done the right way, it also softens the impact of those inevitable regulatory changes. At Visionet Systems, we support our clients every day to ensure they are efficient internally and externally, as well as flexible and proactive in their stances toward regulation. We are actively involved with MISMO, and continue to implement and improve on ways to utilize these important mortgage industry standards – for vendor management, document management and loan delivery. I still have a personal soft spot for HMDA though. The upcoming changes will create a trove of data that should be handled carefully – it will open lenders up to new levels of external scrutiny, but also presents an opportunity for internal process improvement and better process governance.