Unfair lending with AI? Don't point just at us, fintechs and online lenders say (2024)

As bank regulators once again question the fairness of artificial intelligence in loan decisions, online lenders and fintechs agree that the technology should be used carefully, transparently and with tests for bias and disparate impact.

Rohit Chopra, director of the Consumer Financial Protection Bureau, warned recently that the use of artificial intelligence in loan decisions could lead to illegal discrimination. Online lenders counter that with the right safeguards in place and good motives, loan decisions made using AI are fairer than those of traditional underwriting systems. Tech-enabled loans, they add, expand access to credit.


The fresh regulatory scrutiny from a Democratic administration could bring new rules and enforcement actions that affect the fintechs and banks that use AI in lending decisions, and the industry is fine with that.

“AI is not perfect,” said Jason Altieri, general counsel and chief compliance officer at the online lender Happy Money and former general counsel at LendingClub. “If you have humans designing something, bias will creep in. It just will. It may be by the designers or it may be the limitations of the data that they're using.”

Yet Altieri says AI is less biased than human loan officers.

“The way it was done for years was you'd walk in, you'd ask your banker for a loan, and they would look at you and with absolute bias say, Do I know you, do you have an account here and do you look like me?” Altieri said. “That's not how we make loans anymore. In 2009, when I joined LendingClub, it was, we don't even see you, we don't even know you, you were just a series of data points. That made it better for everybody.” AI also allows the use of nontraditional data like how current a borrower is on their rent and utility payments, he said.

“I absolutely think this is the way it needs to go,” Altieri said.

Upstart and the CFPB

This debate started five years ago, when Richard Cordray was the CFPB director and online lenders that use AI in their underwriting, like LendingClub, Prosper, OnDeck and Upstart, were growing fast. The bureau worked out an agreement with Upstart through which it gave the company a no-action letter (essentially a promise not to bring enforcement action against the company for its use of automated underwriting) in exchange for quarterly data about the lender’s loan approvals and denials and its handling of fair-lending rules. The no-action letter was renewed in 2020.

“I would say there hasn't been a significant change in [the no-action letter] process across the last three administrations,” said Nat Hoopes, head of public policy and regulatory affairs at Upstart. “It's sharing data — what are the variables in the model, all the information that the CFPB [requests] to poke and prod at it — and getting feedback and being quite transparent on everything we're doing. And then taking their input and iterating to the next phase.”

The CFPB gets a lot of transparency out of that process, “not just from Upstart, but about AI in general, and how much can AI benefit borrowers of color, especially,” Hoopes said.

In 2019, the CFPB compared the results of Upstart’s AI-based loan decision model to a more traditional model and published the results.

“The tested model approves 27% more applicants than the traditional model, and yields 16% lower average [annual percentage rates] for approved loans,” the CFPB wrote. “This reported expansion of credit access reflected in the results provided occurs across all tested race, ethnicity, and sex segments resulting in the tested model increasing acceptance rates by 23% to 29% and decreasing average APRs by 15-17%.”

Under the Chopra-led CFPB, “You have to be prepared for more scrutiny,” Hoopes said. “You really need to be checking your algorithms and making sure that they comply with the Equal Credit Opportunity Act and disparate impact rules and that you're doing all that testing the right way. But I also see an enormous amount of focus [from Chopra] on any legacy provider that has a de facto monopoly, where the consumer isn't getting a good deal.”
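The algorithm checking Hoopes describes often starts with an "adverse impact ratio," which compares each group's approval rate to the most-favored group's rate. A minimal sketch of that test, with invented group labels and approval data (real fair-lending testing is far more involved):

```python
# Illustrative disparate-impact check: compute each group's approval rate
# relative to the best-treated group. A ratio below 0.8 (the familiar
# "four-fifths" rule of thumb) flags possible disparate impact.
# Group names and outcomes below are invented for illustration.

def approval_rate(decisions):
    """Fraction of applications approved (decisions are booleans)."""
    return sum(decisions) / len(decisions)

def adverse_impact_ratios(decisions_by_group):
    """Map each group to its approval rate divided by the highest rate."""
    rates = {g: approval_rate(d) for g, d in decisions_by_group.items()}
    best = max(rates.values())
    return {g: r / best for g, r in rates.items()}

# Illustrative data: True = approved.
groups = {
    "group_a": [True] * 80 + [False] * 20,   # 80% approved
    "group_b": [True] * 60 + [False] * 40,   # 60% approved
}
ratios = adverse_impact_ratios(groups)
flagged = {g for g, r in ratios.items() if r < 0.8}
print(flagged)  # group_b's ratio is about 0.75, below the 0.8 threshold
```

A lender running this kind of test each quarter would investigate any flagged group before deploying or continuing to use the model.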

This time around, regulators are more educated, in part because of the CFPB’s work with Upstart.

“The machine learning analysts that they've engaged with are very familiar with this,” said Teddy Flo, general counsel at Zest AI, a former online lender that now provides AI-based lending software to banks. “They really leveled up their understanding of the technology and are now asking much more granular questions: Are you doing disparate-impact testing? Are you doing disparate-treatment testing? Which explainability method are you using?” Regulators are also looking more closely at the data sources used to train the models, and whether they are compliant, he said.

'Algorithms can never be free of bias'

Chopra has said that algorithms can never be free of bias and that they may result in credit determinations that are unfair to consumers.

AI lending technology vendors agree that there are dangers to using AI in lending decisions.

“Left to their own devices, there's no question that these algorithms will discriminate when they encounter a subpopulation that is not well represented in the data,” said Kareem Saleh, founder and CEO of FairPlay, a developer of software that tests loan decisions for signs of bias, disparate impact and discrimination. People of color, for instance, have historically been excluded from the lending system, and therefore less data is available about how their loans perform.

“When you encounter those folks, having very little data about them, the algorithm will necessarily assume that they're riskier,” Saleh said.

The algorithms have to be tested to see if their outcomes are fair, Saleh said. But an algorithm can also pick up on clues that a disadvantaged applicant resembles good borrowers who were approved.

FairPlay’s software can take a second look at declined applicants using information about underrepresented populations. For instance, if the model encounters someone with inconsistent employment, rather than automatically assuming that they're riskier, it can recognize that women sometimes take career breaks, and that doesn’t make them irresponsible borrowers.

When FairPlay reviews loan decisions, “We find that 25% to 33% of the time, the highest-scoring people of color and women that get declined would have performed as well as the riskiest white men that most lenders approve,” Saleh said.
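FairPlay has not published its methodology, so the following is only a toy sketch of the "second look" idea described above: re-score a declined applicant after neutralizing a feature, such as an employment gap, that can penalize career breaks. Every feature name, weight, and cutoff here is invented:

```python
# Hypothetical "second look" re-scoring. A real second-look model would be
# far richer; this only illustrates the mechanism of neutralizing one
# feature that can proxy for a protected characteristic.

def base_score(applicant):
    """Toy credit score: invented base, weights, and penalty."""
    score = 600
    score += applicant["on_time_payments"] * 2   # reward payment history
    score -= 40 if applicant["employment_gap"] else 0
    return score

def second_look(applicant, cutoff=640):
    """Approve if the applicant clears the cutoff once the
    employment-gap penalty is removed."""
    neutral = dict(applicant, employment_gap=False)
    return base_score(neutral) >= cutoff

declined = {"on_time_payments": 24, "employment_gap": True}
print(base_score(declined))   # 608: declined at a 640 cutoff
print(second_look(declined))  # True: approved on the second look
```

The design point is that the second pass does not lower the bar; it removes one penalty and asks whether the applicant clears the same cutoff without it.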

This all assumes the lenders using AI have worthy intentions, which is not always the case.

“There are actors out there that are not doing it well, that are not explaining their decisions with accurate methods and that are not focused sufficiently on fair lending,” Flo of Zest AI said. That’s why he and others agree with the principles the CFPB is articulating.

“It's not the AI, it's the intentionality in reducing the racial disparity that’s important,” said Chi Chi Wu, staff attorney at the National Consumer Law Center.

Calling AI a black box

Lorelei Salas, assistant director for supervision policy at the CFPB, warned in a recent blog post that the bureau plans “to more closely monitor the use of algorithmic decision tools, given that they are often 'black boxes' with little transparency.”

This is not a new criticism.

Jason Altieri was general counsel at LendingClub in 2014, when the Securities and Exchange Commission asked the company to include its lending algorithms in the S-1 document it was preparing before its initial public offering.

Altieri had two objections: First, no one would understand it. Second, “that's like asking Coke to put the recipe for Coke out into the public. You just can't do that. You’ve got to respect that these companies are there to make credit available on better terms to broader swaths of folks, but they are a business. And so you're going to have to tolerate not knowing how it works, but then trust that the [fair-lending] testing takes care of it.”

Transparency comes from testing individual variables and clusters of variables to make sure that there's no proxy for a protected class, Hoopes said. And all model outputs need to be run through fair-lending tests and disparate-impact tests.
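One simple way to probe for the proxies Hoopes mentions is to measure how strongly a candidate variable predicts protected-class membership. A sketch using a plain Pearson correlation, with illustrative data (real proxy testing uses considerably more sophisticated methods, and the 0.5 threshold here is an assumption for illustration):

```python
# Sketch of a proxy check: a feature that strongly predicts protected-class
# membership may act as a proxy even when the class itself is excluded
# from the model. Data below is invented.

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Illustrative: 1 = protected-class member; feature = e.g. a zip-code index.
protected = [1, 1, 1, 1, 0, 0, 0, 0]
feature   = [9, 8, 9, 7, 2, 3, 1, 2]
r = pearson(feature, protected)
flag = abs(r) > 0.5  # strong correlation sends the feature for human review
print(round(r, 2))
```

A flagged variable is not automatically dropped; analysts would weigh its predictive value against the disparate impact it contributes, as the fair-lending reports described elsewhere in this article do variable by variable.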

“You need to achieve acceptable fair-lending results, but then you also should be trying to expand access to credit,” Hoopes said. "According to the Urban Institute, only 20% of Black borrowers have a credit score over 700, compared to 50% of white borrowers. So if you rely on a traditional underwriting model with a 700 credit score cut-off, you are denying a chance to so many minority borrowers who have never ever defaulted on loans, yet have somewhat lower scores."
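The arithmetic behind Hoopes' point follows directly from the cited Urban Institute shares; the per-1,000 framing and the 0.8 comparison threshold are added here for illustration:

```python
# Back-of-envelope illustration of a hard 700-score cutoff, using the
# cited Urban Institute shares (20% of Black borrowers vs. 50% of white
# borrowers with scores above 700). The per-1,000 framing is invented.

def approved_per_1000(share_above_cutoff):
    """Applicants out of 1,000 who clear a hard 700-score cutoff."""
    return round(1000 * share_above_cutoff)

black_approved = approved_per_1000(0.20)   # 200 of 1,000
white_approved = approved_per_1000(0.50)   # 500 of 1,000

# The resulting approval-rate ratio, 0.4, sits well under the 0.8 level
# commonly treated as a disparate-impact warning sign.
ratio = black_approved / white_approved
print(black_approved, white_approved, ratio)
```

The point of the example is that a facially neutral cutoff can still produce a starkly unequal approval ratio, which is exactly what disparate-impact testing is designed to surface.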

Companies that use AI in lending today believe the real black boxes are the traditional models banks use to calculate credit scores and weigh credit decisions.

When Zest produces a model, Flo said, it comes with a 50- to 100-page report explaining every feature in the model, how it works, and how important it is to every decision. It also provides a 20-page fair-lending report that talks about each variable's contribution to disparate impact.

“It verifies there's no disparate treatment,” Flo said. “That's what responsible machine learning practitioners are putting in the hands of lenders. Whereas FICO gives you a three-digit number and one of 20 codes explaining the decision, with no transparency into how they got there.” FICO did not respond to a request for comment by press time.

One company working to root out bias in its models is Happy Money in Tustin, California, an online lending partner to several banks and credit unions. The company is implementing FairPlay’s software alongside its lending algorithms, and, like its peers, is testing regularly for bias. Most lenders today test their loan decisions for possible fair-lending violations on a quarterly basis, Altieri said.

“I can take a potential model, run it through and be able to see real-time if I inadvertently tripped over something where people of color are being declined, and if so, not roll it out and redesign it,” Altieri said. “That's the key piece here: very fast feedback. Historically, it would take a long time to see what the actual effects of a model were.”


FAQs

What are the problems with AI lending? ›

These models can analyze large amounts of data to make faster and more accurate credit assessments. However, there are also risks around fairness, bias, and explainability of the decisions made by these black-box algorithms.

What is the problem with AI in banking? ›

The use of AI in banking has raised several ethical and legal concerns, including privacy, security, lack of transparency and algorithmic bias. In terms of privacy, AI systems pose challenges concerning how they may process or store personal data without the proper permissions.

Does fair lending law apply to artificial intelligence? ›

Biases inherent in historical data can permeate both established and AI-driven lending models, leading to discriminatory outcomes. Machine learning algorithms, if not carefully designed, monitored, and updated, may inadvertently perpetuate these biases, undermining the principles of fair and inclusive lending.

How does AI affect lending? ›

Role of AI in Lending

AI, along with machine learning (ML) and Gen-AI, helps financial institutions identify borrowing patterns to reduce the risk of default. By utilizing machine learning algorithms, banks can efficiently analyze large amounts of data to evaluate creditworthiness and make real-time lending decisions.

What is the biggest problem with AI? ›

The main issues surrounding AI are data security and privacy since AI systems require large amounts of data for operation and training. To avoid leaks, breaches, and misuse, one must ensure data security, availability, and integrity.

What are the negative effects of AI in finance? ›

Data Security Risks

AI systems heavily rely on data, and any vulnerabilities in data storage or processing can expose sensitive financial information to potential breaches. As a CFO, you must prioritize and implement robust security measures and regularly update your AI systems to prevent potential data breaches.

Can AI be biased? ›

Research shows how AI is deepening the digital divide. Some AI algorithms have bias baked in, from facial recognition that may not recognize Black students to tools that falsely flag essays written by non-native English speakers as AI-generated.

What is an example of discriminatory AI? ›

Arguably the most notable example of AI bias is the COMPAS (Correctional Offender Management Profiling for Alternative Sanctions) algorithm used in US court systems to predict the likelihood that a defendant would become a recidivist.

Will there be laws against AI? ›

Introduced on January 31, 2024, AB2013 would require, on or before January 1, 2026, a developer of an artificial intelligence system or service made available to Californians for use, regardless of whether the terms of that use include compensation, to post on the developer's internet website documentation regarding ...

What is the 7 day rule in a mortgage? ›

Mortgage Closing Waiting Period

The Rule prohibits the lender and consumer from closing or settling on the mortgage loan transaction until 7 business days after the delivery or mailing of the TILA disclosures, including the Good Faith Estimate and disclosure of the final APR.

What are the three types of lending discrimination? ›

Types of Lending Discrimination

• Overt evidence of disparate treatment;
• Comparative evidence of disparate treatment; and
• Evidence of disparate impact.

How AI is affecting fintech? ›

The integration of AI in Fintech has democratized investing through the rise of robo-advisors and algorithmic trading platforms. These AI-powered systems leverage advanced algorithms to analyze market data, assess risk, and execute trades autonomously.

How AI is disrupting the banking industry? ›

Although the concept of hyper-personalization is nothing new, AI is pushing the limits of what's possible. AI platforms for the banking industry have the ability to analyze customer data to develop a deep understanding of customers' needs and enable FIs to design tailored experiences that meet those needs.

What is the negative impact of AI on the economy? ›

Roughly half the exposed jobs may benefit from AI integration, enhancing productivity. For the other half, AI applications may execute key tasks currently performed by humans, which could lower labor demand, leading to lower wages and reduced hiring. In the most extreme cases, some of these jobs may disappear.

What are the risks of digital lending? ›

Fintech lending comes with inherent risks, such as high interest rates, fraud, and cyber threats, that fintech companies must vigilantly address.

What is the future of AI in lending? ›

Thanks to the advances in artificial intelligence (AI) and machine learning, the future of the lending industry is undergoing a major transformation. AI is enabling lenders and financial institutions to automate processes, reduce costs, improve customer experience, and manage risks more effectively.

What are the disadvantages of AI trading? ›

Lack of transparency: The inherent complexity of AI algorithms can render their decision-making processes opaque to traders. This lack of transparency can breed uncertainty, particularly when AI-driven trading systems execute actions that appear counterintuitive or unexplained.
