When the Consumer Financial Protection Bureau (CFPB) decided to enforce fair lending laws using proxy-based methodologies to assess compliance, it spawned a whole new area of compliance work that focuses almost entirely on data analysis.
Fair Lending Laws
Fair lending laws govern a wide range of a financial institution’s loan transaction activities. The Equal Credit Opportunity Act, along with its accompanying Regulation B, prohibits discrimination by a creditor against an applicant regarding any aspect of a credit transaction and applies to all types of credit transactions, including consumer loans, automobile loans, and credit cards. The Fair Housing Act makes it unlawful to discriminate in residential real estate–related transactions because of race, color, religion, national origin, sex, handicap, or familial status.
The CFPB has been aggressively pursuing claims of discrimination using two legal theories: disparate treatment and disparate impact. Disparate treatment occurs when a lender treats a consumer differently based on one of the statutorily prohibited bases. Disparate-impact discrimination occurs when a lender applies a policy or practice equally to consumers but the policy or practice has a disproportionate adverse impact on a statutorily protected group. A plaintiff does not need to prove discriminatory intent but must demonstrate that the creditor’s policy caused the statistical disparities and that the policy does not serve a valid business goal.
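One common first screen for this kind of statistical disparity is the "four-fifths" (80%) rule of thumb, originally borrowed from employment testing: if the protected group's approval rate is less than 80% of the control group's, the gap warrants closer review. A minimal sketch, using invented approval counts purely for illustration:

```python
# Hypothetical illustration of a disparate-impact screen using the
# "four-fifths" (80%) rule of thumb. All counts below are invented.

def approval_rate(approved, applicants):
    """Fraction of applicants approved."""
    return approved / applicants

def adverse_impact_ratio(protected_rate, control_rate):
    """Ratio of the protected group's approval rate to the control
    group's. Values below 0.8 are commonly treated as a red flag
    warranting further statistical review."""
    return protected_rate / control_rate

control = approval_rate(720, 1000)     # 72% approval for control group
protected = approval_rate(540, 1000)   # 54% approval for protected group
ratio = adverse_impact_ratio(protected, control)
print(f"Adverse impact ratio: {ratio:.2f}")  # 0.54 / 0.72 = 0.75, below 0.80
```

A ratio below 0.8 does not by itself establish a violation; it is a screening heuristic that typically triggers the deeper regression-based analysis regulators rely on.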
Collecting Data is Key
Any financial institution trying to comply with these laws and defend against fair lending claims must start by collecting demographic data regarding its borrowers. In other words, to ensure that it does not run afoul of fair lending laws, an institution is now expected by the CFPB to monitor its own fair lending compliance by collecting personal data about its consumers.
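Where demographic data has not been collected directly, the CFPB's proxy methodology, Bayesian Improved Surname Geocoding (BISG), estimates group membership by combining surname-based and geography-based probabilities via Bayes' rule. A simplified sketch of that combination step, with invented probabilities and generic group labels (real BISG draws on Census surname lists and block-group data):

```python
# Simplified sketch of a BISG-style proxy: combine a surname-based
# probability with a geography-based probability via Bayes' rule.
# All probabilities and group names are invented for illustration.

def combine_proxies(surname_probs, geo_probs):
    """Multiply the two probability sources for each group, then
    renormalize so the posterior probabilities sum to 1."""
    joint = {g: surname_probs[g] * geo_probs[g] for g in surname_probs}
    total = sum(joint.values())
    return {g: p / total for g, p in joint.items()}

surname = {"group_a": 0.70, "group_b": 0.30}  # P(group | surname)
geo = {"group_a": 0.40, "group_b": 0.60}      # P(group | census geography)
posterior = combine_proxies(surname, geo)
# group_a: 0.28 / 0.46 ~= 0.61; group_b: 0.18 / 0.46 ~= 0.39
```

The resulting probabilities, not certainties, are what proxy-based fair lending analysis feeds into its statistical comparisons, which is one reason the article's closing caution about error applies.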
For example, this may now come into play when a financial institution seeks to market and solicit consumers regarding its products. Many financial institutions rely on direct marketing of special promotions. But these marketing and advertising activities must comply with fair lending laws, and a lender does not want its marketing to exclude consumers of a particular protected class or discourage them from applying for credit.
The focus on statistical data is also driving lenders to use automated decision systems in order to meet Regulation B’s standard for an “empirically derived, demonstrably and statistically sound” credit scoring system. Those striving to be “community lenders” that exercise personal judgment about whom they lend to in their community should keep in mind that judgmental systems may subject them to greater fair lending scrutiny. To be “empirically derived,” a credit scoring system must be based on rigorous statistical analysis, must have been developed and validated using generally accepted statistical practices and methodologies, and must be demonstrably supported by legitimate business purposes. A process in which a lender relies on “judgment” and uses a manual process to underwrite or price its product is riskier: not only must the lender demonstrate that its system does not condone or encourage deliberate discrimination, it must also avoid inadvertent discrimination. To the extent that the lender makes any exceptions or overrides a criterion it is using, the lender will need to maintain data and documentation to support the reasons for its credit decisions.
Servicing and Collections
What about servicing accounts? If a call center services a financial institution’s accounts, compliance with fair lending means making sure that its scripts, telephone-call transcripts, and other policies and procedures do not vary the quality of customer service on a prohibited basis. Training must be implemented to ensure that customer service representatives do not treat consumers differently based on a prohibited characteristic.
If a customer has trouble paying or has defaulted, a lender’s collection processes and strategies should also be reviewed to ensure that they are applied consistently to similarly situated borrowers. Fees and penalties, payment plans and debt forgiveness, loss mitigation, and collection are now all areas for which disparate-impact analysis must be done.
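A consistency review of this kind can be sketched as a comparison of the fees charged to similarly situated borrowers (for example, the same days-past-due bucket) across proxied demographic groups. All account data and group labels below are invented for illustration:

```python
# Hypothetical consistency check on collections fees: for borrowers in
# the same delinquency bucket, compare average late fees across proxied
# demographic groups. All rows below are invented sample data.
from statistics import mean

accounts = [
    # (proxied_group, days_past_due_bucket, late_fee_charged)
    ("group_a", "30-59", 25.0),
    ("group_a", "30-59", 25.0),
    ("group_b", "30-59", 25.0),
    ("group_b", "30-59", 35.0),  # fee exception with no documented reason
]

def avg_fee_by_group(rows, bucket):
    """Average late fee per proxied group within one delinquency bucket."""
    by_group = {}
    for group, b, fee in rows:
        if b == bucket:
            by_group.setdefault(group, []).append(fee)
    return {g: mean(fees) for g, fees in by_group.items()}

fees = avg_fee_by_group(accounts, "30-59")
# group_a averages 25.0 while group_b averages 30.0 -- the kind of gap
# a disparate-impact review would flag for documented justification
```

In practice such a gap would be tested for statistical significance rather than eyeballed, but the structure of the review is the same: group similarly situated accounts, then compare outcomes across proxied classes.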
Is Big Data the Answer?
Data-mining processes and “big data” are playing an ever-increasing role in modern businesses, and new Fintech companies increasingly tout their technological advantages in this space. For the small to medium-sized community bank with a long-standing business that hinges on personally knowing its customers and lending to the guy next door, this new emphasis on statistical data presents new compliance costs and challenges to its business model. The CFPB’s adoption of statistical analysis to enforce fair lending laws seems to be based on its belief that reliance on algorithmic techniques such as data mining and “big data” can eliminate human biases in decision-making processes. But the “patterns” that the CFPB and other regulators find are inherently subject to the prejudices and beliefs of the decision-maker analyzing the data and are thus subject to error. Disparate-impact analysis may be “empirically derived and statistically sound,” but it is not a flawless method for determining whether the fair lending laws have been violated.