OPINION: Fix AI’s racist, sexist bias


Two years ago, my business partner and I were in need of gap financing while we waited for a client to pay us. We approached the main banks but, even with collateral and our education technology company’s healthy financial record, we were deemed too big a risk and turned down.

Luckily, we had an angel investor step in to provide the loan. But little did I know at the time that our record of an unsuccessful application would not only affect me, but all black women entrepreneurs.

South African banks are increasingly investing in artificial intelligence (AI) to make loan decisions. The loan history and demographic data of applicants whom loan officers accept or reject is used to “train” AI to determine new loan applications. Although the race and gender information may not explicitly be captured, studies have shown that AI can use location data and habits to predict race and gender — and inadvertently discriminate against applicants. I am a 32-year-old woman of colour who lived in Bo-Kaap in Cape Town, and this information will probably be used to train an AI program to reject more entrepreneurs who share my demographics.
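A minimal sketch of how location can stand in for race or gender even when those fields are never captured. The suburbs and demographic labels below are invented for illustration; the point is that in a spatially segregated society, a model given only a suburb can recover demographics with high accuracy.

```python
# Hypothetical training records: (suburb, demographic_group).
# The loan model never sees a race field -- only the suburb.
from collections import Counter, defaultdict

history = [
    ("bo_kaap", "group_a"), ("bo_kaap", "group_a"), ("bo_kaap", "group_a"),
    ("bo_kaap", "group_b"),
    ("constantia", "group_b"), ("constantia", "group_b"),
    ("constantia", "group_b"), ("constantia", "group_a"),
]

# "Learn" the majority demographic group per suburb.
by_suburb = defaultdict(Counter)
for suburb, group in history:
    by_suburb[suburb][group] += 1

def infer_group(suburb):
    """Predict the demographic group from location alone."""
    return by_suburb[suburb].most_common(1)[0][0]

print(infer_group("bo_kaap"))      # majority group of that suburb
print(infer_group("constantia"))
```

Anything correlated with the protected attribute — suburb, shopping habits, school attended — can carry the same signal, which is why simply omitting race and gender fields does not prevent discrimination.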

Access to finance for businesses and homes is not equal in South Africa, with race and gender still posing significant barriers. We expect technology to create more fairness, but this will not be the case if decisions made in our unequal society are used to train AI to perpetuate the same financial injustices of our past.

Countries around the world are grappling with the problem of AI bias. The United States justice system, for example, used an AI program to assess the risk of convicts reoffending. The program wrongly flagged black prisoners as likely to reoffend at almost twice the rate of white prisoners. Google came under fire when its photo-tagging algorithm labelled images of African-Americans as “gorillas”. In the banking sector, AI has denied loans to certain groups, violating the US Fair Housing Act.

It’s what computer programmers call “garbage in, garbage out” — biased results caused by biased data.

Globally, the banking sector has been the largest investor in AI outside the tech industry, with an estimated $7.5-billion to be spent by the sector in 2019. South African banks have been part of this trend of investing heavily in AI. In 2017, Capitec spent R348-million acquiring a 40% stake in Creamfinance, which will support the bank’s online lending capabilities and offer advanced credit scoring through AI. Absa has invested more than R350-million in the past three years in automation technologies, specifically robotics and AI.

Although the industry is talking about the job losses these investments will cause in the sector, there have been no conversations about the potential racial and gender bias that could become entrenched as a result of AI.

Artificial intelligence has the ability to process large quantities of information and learn patterns through computer programs. It is being used in a variety of ways in the finance sector, from giving advice and selecting products and services based on customer history to determining credit ratings and deciding whether to grant or deny loans.

But AI is only as smart as the data it has learnt from. If the data initially given to it is biased, then the decisions the AI machine makes will reflect this.
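A “garbage in, garbage out” sketch with invented numbers: a scoring rule fitted to biased historical approvals reproduces the same disparity on new applicants, even when the applicants themselves are identical on merit.

```python
# Historical decisions: (suburb, approved). Assume applicants from both
# suburbs had identical financials; approvals differed only by suburb.
past_decisions = [
    ("suburb_x", True), ("suburb_x", True), ("suburb_x", True), ("suburb_x", False),
    ("suburb_y", True), ("suburb_y", False), ("suburb_y", False), ("suburb_y", False),
]

# "Train": the model's learned feature is simply the historical approval
# rate per suburb -- the biased pattern hidden in the data.
rates = {}
for suburb in {s for s, _ in past_decisions}:
    outcomes = [approved for s, approved in past_decisions if s == suburb]
    rates[suburb] = sum(outcomes) / len(outcomes)

def decide(suburb, threshold=0.5):
    """Approve if the learned suburb score clears the threshold."""
    return rates[suburb] >= threshold

print(decide("suburb_x"))  # approved: historically favoured suburb
print(decide("suburb_y"))  # denied: historically disadvantaged suburb
```

Real credit-scoring models are far more complex, but the mechanism is the same: if the training labels encode past discrimination, the model optimises for reproducing it.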

African Development Bank’s director of gender, women and civil society, Vanessa Moungar, said there is a huge financing gap in Africa that must be bridged. Although black women are the largest self-employed group of the population, their ability to grow their businesses has been limited by not having ready access to credit or security-backed finance.

Recent figures on business-specific loans aren’t available, but in an International Finance Corporation study in 2006, only 2% of black men and women had home loans, compared with a rate of 26% for white women and 32% for white men. This also limits black entrepreneurs’ ability to use property as security for bank loans for other purposes, including business ventures.

In instances where loans are granted, the amount and the interest rates differ. A report by Munro Consulting Actuaries looked at thousands of past loan agreements from FNB. It found that for loans of more than R150 000, 94% were granted to white South Africans and the remainder was split among other race groups. White clients had an interest rate of 1.1% below prime, whereas for other races it was 0.45% below prime. A case against FNB for this lending discrimination is now heading to the Equality Court.

But these historic data records, in which white borrowers are favoured, are part of the bank’s records being used to train AI machines right now.

The banking sector cannot abdicate responsibility by thinking that technology will change the legacy issues involved in financing black women. We are at an important moment as we move from human decision-making to technology-enabled decision-making. We need to rethink the information we are feeding into computer programs. 

I have been running businesses since I was 21 and have employed dozens of people. I, and many other black women entrepreneurs, can contribute to South Africa’s economy. But the financial system makes it harder for us to grow our businesses and thrive because the playing field continues to be skewed.

The loan decisions made now will influence the AI decisions of the future. If we do not do something now, the marginalisation from our apartheid past, which has persisted into our democratic present, will continue to plague us in the future.

Naadiya Moosajee is a serial entrepreneur by passion and a civil engineer by training. She co-founded an ed-tech and advisory firm as well as WomEng, a global social enterprise developing women and girls for the engineering and tech industry. She serves on several local and international boards and is an Aspen New Voices Fellow.

