The Consumer Financial Protection Bureau (CFPB) has issued guidance about certain legal requirements that lenders must adhere to when using artificial intelligence (AI) and other complex models. The guidance describes how lenders must use specific and accurate reasons when taking adverse actions against consumers—meaning creditors cannot simply use CFPB sample adverse action forms and checklists if they do not reflect the actual reason for the denial of credit or a change of credit conditions.
The CFPB’s requirement is especially important with the growth of advanced algorithms and personal consumer data in credit underwriting. Explaining the reasons for adverse actions helps improve consumers’ chances for future credit and protects consumers from illegal discrimination.
“Technology marketed as artificial intelligence is expanding the data used for lending decisions, and also growing the list of potential reasons for why credit is denied,” said CFPB Director Rohit Chopra. “Creditors must be able to specifically explain their reasons for denial. There is no special exemption for artificial intelligence.”
Creditors are increasingly using complex algorithms, marketed as AI, and other predictive decision-making technologies in their underwriting models. Creditors often feed these complex algorithms with large datasets, sometimes including data that may be harvested from consumer surveillance. As a result, a consumer may be denied credit for reasons they may not consider particularly relevant to their finances.
Despite the potentially expansive list of reasons for adverse credit actions, some creditors may inappropriately rely on a checklist of reasons provided in CFPB sample forms. However, the Equal Credit Opportunity Act (ECOA) does not allow creditors to simply conduct check-the-box exercises when delivering notices of adverse action if doing so fails to accurately inform consumers why adverse actions were taken.
The CFPB confirmed in a circular from last year that ECOA requires creditors to explain the specific reasons for taking adverse actions. This requirement remains even if those companies use complex algorithms and black-box credit models that make it difficult to identify those reasons.
The CFPB’s latest guidance expands on last year’s circular by explaining that sample adverse action checklists should not be considered exhaustive, nor do they automatically cover a creditor’s legal requirements.
The latest guidance explains that even for adverse decisions made by complex algorithms, creditors must provide accurate and specific reasons. Generally, creditors cannot state the reasons for adverse actions by pointing to a broad bucket. For instance, if a creditor decides to lower the limit on a consumer’s credit line based on behavioral spending data, the explanation would likely need to provide more details about the specific negative behaviors that led to the reduction beyond a general reason like “purchasing history.”
Creditors that simply select the closest factors from the checklist of sample reasons are not in compliance with the law if those reasons do not sufficiently reflect the actual reason for the action taken. Creditors must disclose the specific reasons, even if consumers may be surprised, upset, or angered to learn their credit applications were graded on data that may not intuitively relate to their finances.
The CFPB has also issued an advisory opinion that consumer financial protection law requires lenders to provide adverse action notices to borrowers when changes are made to their existing credit.
The CFPB has made the intersection of fair lending and technology a priority. For instance, as the demand for digital, algorithmic scoring of prospective tenants has increased among corporate landlords, the CFPB reminded landlords that prospective tenants must receive adverse action notices when denied housing. The CFPB also has joined with other federal agencies to issue a proposed rule on automated valuation models (AVMs), and is actively working to ensure that black-box models do not lead to acts of digital redlining in the mortgage market.