A new study from Women’s World Banking found that the AI credit-scoring systems employed by global financial service providers are likely to discriminate against women, excluding them from loans and other financial services. The study’s findings suggest that financial technology companies are missing a major opportunity to close the existing $17 billion gender credit gap and help reach the nearly 1 billion women who remain unbanked.
The study, Algorithmic Bias, Financial Inclusion, and Gender, funded by the Visa Foundation, explores the promises and pitfalls of using digital tools to open up new credit to women, both as individuals and as entrepreneurs. Specifically, it examines where biases in AI emerge, how they are amplified, and the extent to which they work against women.
“The financial services industry needs to act immediately to address sexism in credit scoring technology – not only because it’s the right thing to do but also to better equip the industry to take advantage of a $17 billion market opportunity given the gender credit gap,” said Mary Ellen Iskenderian, CEO of Women’s World Banking. “This issue isn’t hypothetical – sexist credit scoring systems pose a real threat to women’s livelihoods, their families, the growth of their businesses, and the health of the economies to which they could contribute.”
The study found that algorithms are often biased because the individuals who create them carry unconscious biases that end up encoded in the algorithms. Biases also emerge from the incomplete, faulty, or prejudicial data sets that companies use to “train” the algorithms. The majority of data sources are vulnerable to gender-based bias.
Data scientists and algorithm developers, who on the whole are U.S.-based, male, and high-income, are not representative of the end customers being scored.
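One way such bias can operate is through proxy variables: a scoring rule that never sees gender can still penalize women if it leans on a feature that is unevenly distributed between men and women. The Python sketch below uses entirely hypothetical applicant data and a made-up scoring rule to illustrate the mechanism; it is not drawn from the study itself.

```python
# Hypothetical illustration: a "gender-blind" credit score that weights
# years of formal employment heavily. In this toy data set women have
# fewer years of formal employment (for example, due to unpaid care work),
# so the rule rejects them more often even though it never looks at gender.

applicants = [
    # (gender, years_formal_employment, on_time_repayment_rate)
    ("F", 2, 0.98), ("F", 3, 0.97), ("F", 1, 0.99), ("F", 6, 0.95),
    ("M", 7, 0.90), ("M", 8, 0.85), ("M", 5, 0.92), ("M", 2, 0.70),
]

def score(years, repay_rate):
    # Made-up weights, standing in for a model fit to historical
    # (mostly male) borrowers.
    return 10 * years + 20 * repay_rate

APPROVE_AT = 50  # arbitrary approval threshold for the sketch

def approval_rate(gender):
    group = [a for a in applicants if a[0] == gender]
    approved = [a for a in group if score(a[1], a[2]) >= APPROVE_AT]
    return len(approved) / len(group)

print(f"women approved: {approval_rate('F'):.2f}")  # 0.25
print(f"men approved:   {approval_rate('M'):.2f}")  # 0.75
```

Note that in this toy data the women actually have the better repayment records, yet the proxy feature reproduces the historical disparity; this is the kind of amplification the study warns about.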
“Women have historically suffered from discrimination in lending decisions – and we can’t allow that to continue into the digital realm. Alternative credit scoring data can be a boon for women entrepreneurs who are often denied credit because of a lack of information. We need AI technologies to help women, not work against them,” Iskenderian added.