In December 2012, the EU introduced controversial new regulations banning the pricing of car insurance solely on the basis of gender. The result disadvantaged women who, having made fewer claims over the years, had previously paid less for cover. The ruling was seen as yet another gender-bias box ticked. Such biases are easy to identify, though not all are as easy to fix. The transparency of car-insurance pricing makes the discrimination easy to spot, but what if the prejudice is quirkier and less obvious?
Regulators in New York are investigating a much more subtle, but equally controversial, form of discrimination. Apple is very keen to break into the world of financial services. It already has Apple Pay for money transfers, and this year the company launched Apple Card, a digital credit card. Conditions apply, as you might expect, but the Apple Card promises daily cash back of up to 3% of card spending, and the interest rate it charges on unpaid monthly balances is not penal (though not cheap). To Apple's annoyance it cannot issue the credit card itself, because it does not have a banking licence, so it has partnered with Goldman Sachs, a leading investment bank with many wealthy clients. So far so good! But as the old adage has it, "The devil is in the detail." Accusations of blatant sexism quickly began to emerge.
It appears that a gender-biased algorithm is to blame. Goldman Sachs has begun to receive complaints from couples whose credit card spending limits are skewed by gender – the men receiving much higher limits than the women. This might seem plausible if a woman had a worse credit score or a lower salary, but some of the discrepancies have raised alarms. Blaming a sexist algorithm is glib: a human wrote the software programme (or have the AI robots been granted free rein here?).
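How can an algorithm discriminate by gender when gender is never one of its inputs? A minimal sketch, using entirely invented figures and a made-up scoring rule (we have no knowledge of the actual Apple Card model), shows how a "gender-blind" formula can still skew limits when one of its inputs acts as a proxy – for instance, when a couple's shared income and credit history were historically recorded under one partner's name:

```python
# Hypothetical illustration only: the weights, records and scoring rule
# below are invented, not Goldman Sachs's actual model.

def credit_limit(salary, years_of_credit_history):
    # Toy rule: limit grows with reported salary and length of credit file.
    # Gender is never an input -- yet the inputs can correlate with it.
    return round(0.2 * salary + 500 * years_of_credit_history)

# A couple with fully pooled finances and identical household income.
# Historically, the salary and the credit file sat under partner A's name,
# so the data the model sees differs sharply between the two applicants.
partner_a = {"salary": 90_000, "years_of_credit_history": 15}
partner_b = {"salary": 30_000, "years_of_credit_history": 4}

limit_a = credit_limit(**partner_a)  # 25500
limit_b = credit_limit(**partner_b)  # 8000

# Same household, same real finances -- limits differ by a factor of ~3.
print(limit_a, limit_b)
```

The point of the sketch is that removing the protected attribute from the inputs does not remove the bias: it merely hides it inside whichever features correlate with that attribute, which is exactly why regulators ask to audit outcomes rather than just the feature list.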
Goldman Sachs is clearly embarrassed by the revelations and has promised to address the issue. Even the co-founder of Apple, Steve Wozniak, and his wife have received peculiarly different credit limits! The regulators will surely enjoy raking over this example during the investigation. It seems that such reliance on software programming, without monitoring controls, is wide open to abuse in all kinds of areas: calculation errors, incorrect categorisations and fatal flaws, in addition to deliberate discrimination on the basis of gender, creed or colour.
Nor is this an issue unique to financial services. Software dominates the healthcare system, for example. Are we sure waiting lists are not tampered with to ensure the chronically ill are deferred to a later date in the hope that they die off? Or that medications are not deliberately restricted to cut costs?
It’s a leap of faith for all to accept that computers can be ethically motivated. Fundamentally, there is a human input behind it all. Regulators will have a tough job sorting out the bad apples from the blood oranges.