"The Impact of Artificial Intelligence on Lending" by Sadie Cavazos

Document Type

Student Article

Abstract

The issue of biased lending is longstanding and has been the subject of much legislation over the past few decades. When discrimination in the housing market took center stage in the 1960s, Congress passed multiple acts to combat what became known as “redlining,” the systematic denial of credit to minority groups. Acts such as the Fair Housing Act and the Equal Credit Opportunity Act worked to eliminate this discrimination, but that does not mean bias no longer exists; rather, because of these acts, lending companies can no longer act on their biases. The introduction of algorithmic lending through artificial intelligence systems, however, threatens the decades of work to eliminate lending discrimination. Artificial intelligence systems are trained by humans but often lack human oversight when making actual lending decisions. This human training embeds implicit biases in the algorithms themselves, reviving the problem of discriminatory lending in a new form known as “digital redlining.” To combat this discrimination, there must be regulation governing how algorithms are trained and requiring transparency in their decision-making. Without such regulation, protected classes of people will once again face housing discrimination, much as they did in the 1960s and before.

DOI

10.37419/JPL.V11.I2.4

First Page

331

Last Page

354
