Lauren Jordan

Lauren envisions a world where people have an awareness of themselves, their identities, and the world around them, hope that the world can be improved, and the confidence to be a part of that change. She enacts this vision as a changemaker and entrepreneurship ecosystem practitioner, helping entrepreneurs succeed. She is currently helping to launch the Anacostia Impact Fund as the VP for Borrower Relations, increasing the visibility of and access to community financing options for local entrepreneurs. Her path has included the University of Michigan, Paolo Freire, City Year, New York University, Ashinaga, Robofest, LearnServe International, and 4.0 Schools.

Algorithmic Bias in Credit Scoring: How to Limit Its Effects?

Dec 07, 2021

Credit scoring has the potential to improve access to capital and financial inclusion around the world. As financial institutions gain access to new types of data, the variety of credit scoring mechanisms has increased. One way financial institutions keep track of this alternative data and make sense of it is through algorithms and artificial intelligence. In a world of massive data, algorithms help financial institutions make decisions. Innovative methods for credit scoring have the potential to expand access to capital and other financial services for marginalized communities. However, their creation must be intentional and transparent to account for the biases that may arise from biased data or biased algorithms.

 
Algorithms are the creation of people, and despite relying on computing power, they are still shaped by the culture, perspective, and biases of those who create them. Implicit bias means that even when we do not endorse stereotypes, our thinking is still shaped by the perceptions, attitudes, and stereotypes present in the cultures we were raised in. Algorithms are built to replicate human behavior, and since human behavior may be biased, the algorithms humans create may be biased as well. Algorithmic bias is a systematic error (a mistake not caused by chance) that produces unfair, inaccurate, or unethical outcomes for certain groups of people.

 

Many believe that to remove this bias, algorithms should simply not track attributes like race and gender. The assumption is that withholding these attributes will keep an algorithm from becoming biased; however, this is not the case. Even if race and gender are not explicitly tracked, the algorithm can still pick up on bias because of its structural and systemic nature.
Proxy discrimination happens when a legally prohibited characteristic (like gender or race) can be predicted, intentionally or unintentionally, from other non-suspect data. There are many proxies for gender, such as:
 

  • Occupation; 

  • Places where people shop; 

  • Items that people spend money on; 

  • The types of healthcare people access; 

  • The type of apps that people download; 

  • The type of music that people listen to. 

 

Removing a variable does not eliminate bias. In fact, this blindness often makes it harder for companies to identify and prevent bias, because they can no longer measure it. As long as humans are vulnerable to biases, algorithms will be vulnerable too. The sketch below illustrates how a proxy can leak a protected attribute back into a model.
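
To make this concrete, here is a minimal sketch in Python, assuming entirely synthetic data (the features, coefficients, and correlation strengths are all invented for illustration). A scoring model is trained with the gender column deliberately excluded, yet because an occupation code correlates with gender, the bias baked into the historical approvals resurfaces in the model's predictions.

```python
# Hypothetical sketch: dropping a protected attribute does not remove
# bias when a correlated proxy feature (here, an occupation code) remains.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# Synthetic population: gender (0/1) and an occupation code that is
# strongly correlated with gender -- the proxy.
gender = rng.integers(0, 2, size=n)
occupation = (rng.random(n) < np.where(gender == 1, 0.8, 0.2)).astype(int)
income = rng.normal(5.0, 1.0, size=n)  # income, in tens of thousands

# Biased historical labels: past approvals penalized gender == 1
# independently of income, encoding human bias into the training data.
latent = 0.5 * income - 1.5 * gender + rng.normal(0.0, 1.0, size=n)
approved = (latent > 1.5).astype(int)

# Train WITHOUT the gender column ("fairness through unawareness").
X = np.column_stack([income, occupation])
model = LogisticRegression().fit(X, approved)
predicted = model.predict(X)

# The bias resurfaces through the proxy: approval rates still diverge.
for g in (0, 1):
    print(f"predicted approval rate, gender={g}: "
          f"{predicted[gender == g].mean():.1%}")
```

In this toy setup the model never sees gender, but the occupation proxy carries enough signal for the approval-rate gap to persist.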
 

Champions of innovative scoring have highlighted five strategies to help financial institutions minimize the effects of bias in credit scoring and improve overall algorithmic hygiene. The mnemonic for these strategies is REACT:
 

  • Regulation – Regulatory capacity should be built to ensure that the models financial institutions use are adapted to their customers and that these models enable the rest of the strategic criteria: explainability, accountability, collaboration, and transparency. This way, frameworks can be developed, tested, and then evaluated by an outside source.

  • Explainability – All decisions made by the algorithm should be easily explainable to the potential borrower, and the explanation should be simple and straightforward.

  • Accountability – Financial institutions should hold themselves accountable for minimizing bias in their algorithms by setting financial inclusion key performance indicators (a minimal KPI sketch follows this list).

  • Collaboration – As innovative scoring continues to grow, financial institutions should share knowledge on best practices surrounding financial inclusion.

  • Transparency – Credit service providers should understand the frameworks being used and be able to explain what data is used, where it was obtained, why it is being used, and the basis for the credit scoring decision. Consumers should have access to the steps of the lending process, what is involved in decision-making, and the rationale behind credit decisions.
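
As one illustration of the accountability criterion above, a financial inclusion KPI can be as simple as tracking approval rates by group. The sketch below is hypothetical: the data is invented, and the 0.8 cutoff borrows the "four-fifths rule" heuristic from fairness auditing rather than any lending regulation.

```python
# Hypothetical fairness KPI: ratio of lowest to highest group approval rate.
import numpy as np

def parity_ratio(decisions: np.ndarray, groups: np.ndarray) -> float:
    """Return min(group approval rate) / max(group approval rate).

    A value of 1.0 means identical approval rates across groups; values
    below ~0.8 are a common red flag (the "four-fifths rule" heuristic).
    """
    rates = [decisions[groups == g].mean() for g in np.unique(groups)]
    return min(rates) / max(rates)

# Toy monitoring run with invented decisions and group labels.
decisions = np.array([1, 1, 1, 1, 0, 0, 1, 0, 0, 1])
groups = np.array(["a"] * 5 + ["b"] * 5)
kpi = parity_ratio(decisions, groups)
print(f"approval parity ratio: {kpi:.2f}"
      + ("  -> review for bias" if kpi < 0.8 else ""))
```

An institution could log this ratio for every model release and investigate whenever it drifts downward, which helps operationalize both the accountability and transparency criteria.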
     

There are many champions we can look to who are doing excellent work mitigating gender bias in the financing industry.
 

  • African Academic Network on Internet Policy (AANOIP) – A network for interdisciplinary scholarly engagement and discussion on the state of the Internet and related policies and regulatory regimes in Africa. As a think tank, AANOIP publishes its findings on best practices related to regulation and has championed the need for inclusive algorithmic design.

  • Wakandi – Launched the Credit Association Management System in Africa to digitize the way informal financial groups operate, giving more women an official identification system through which their financial transactions can be tracked. By digitizing informal financial groups, Wakandi has increased the amount of information available to feed into an algorithm, diversifying data and creating access.

  • The Algorithmic Justice League – An organization that combines art and research to illuminate the social implications and harms of artificial intelligence.

  • The Alliance for Inclusive Algorithms – The < A+ > Alliance, founded in 2019, is a global coalition of technologists, activists, and academics who champion Inclusive AI and Affirmative Action for Algorithms, bringing us closer to gender and racial equality in a world where machine learning does not wire an already biased system into our future.

While algorithms have great potential to grow financial inclusion, they are not perfect and may carry the biases of the humans who create them. To minimize bias, financial institutions must therefore avoid over-relying on this or any other technology, performing due diligence through complementary human and computational means while working to improve their algorithmic hygiene.

-------------------------------------

Citations:

Credit Scoring Guideline | World Bank

Women’s Digital Financial Inclusion in Africa | G7 and the Bill and Melinda Gates Foundation
