Denied Credit By An AI? You're Owed A Full Explanation, Regulator Says


If your application for credit is rejected, the creditor owes you an accurate explanation—even if it was an artificial intelligence system or complex algorithm that made the decision, the government’s consumer watchdog says.

Key Takeaways

  • Companies that deny credit to consumers must provide an accurate explanation of why, even if the decision was made by an inscrutable "black box" algorithm, the CFPB says.
  • The guidance affects the "adverse action" notices that financial companies are required by law to give consumers, a requirement meant to prevent discrimination.
  • Artificial intelligence systems, such as the ones increasingly used to make credit decisions, can be so complex even their creators don't know how they reached their decisions.

Companies that use AI or algorithms to deny credit to someone still must give consumers accurate reasons for the decision, the Consumer Financial Protection Bureau said Tuesday.

In general, when a company denies a credit application, it has to send out an “adverse action” notice explaining the reasons for the rejection. That requirement was created by lawmakers as part of the Equal Credit Opportunity Act to help ensure that lenders had real reasons for rejecting certain applications, and weren’t discriminating against people based on race or other illegitimate criteria.

The CFPB’s new guidance says companies have to provide real and detailed explanations—not just generic boilerplate. For instance, if a company restricts a customer’s line of credit because the customer bought certain items or shopped at certain stores, the company has to say which purchases hurt the customer's credit, rather than offering a vague explanation such as “purchasing history,” the CFPB said.

“Creditors must disclose the specific reasons, even if consumers may be surprised, upset, or angered to learn their credit applications were being graded on data that may not intuitively relate to their finances,” the CFPB said in a press release.

The bureau noted that companies are increasingly turning credit decisions over to AI and algorithms so complex that researchers refer to them as “black boxes.”

The “black box” phenomenon alluded to by the CFPB guidance refers to the fact that AI and machine learning systems don’t work like traditional computer programs, which follow a set of instructions laid out by a programmer.

Instead, they process massive amounts of data and follow a “learning” process, resulting in incredibly complex systems. Even the creators of such “black box” systems often cannot understand exactly how they reached a particular decision, according to computer scientists.

“Technology marketed as artificial intelligence is expanding the data used for lending decisions, and also growing the list of potential reasons for why credit is denied,” CFPB Director Rohit Chopra said in a statement. “Creditors must be able to specifically explain their reasons for denial. There is no special exemption for artificial intelligence.”

Article Sources
  1. Consumer Financial Protection Bureau. "CFPB Issues Guidance on Credit Denials by Lenders Using Artificial Intelligence."

  2. Harvard Data Science Review. "Why Are We Using Black Box Models in AI When We Don’t Need To? A Lesson From an Explainable AI Competition."
