Credit Cards should not be black

David Pereira
Towards Data Science
2 min read · Nov 21, 2019


“File:Insert Credit Card (61471) — The Noun Project.svg” by Icons8 is marked under CC0 1.0. To view the terms, visit http://creativecommons.org/publicdomain/zero/1.0/deed.en

Credit cards shouldn’t be black. We learnt this fact the hard way in Spain with the Bankia “tarjetas black” scandal, but we have recently found another example of why black cards are not a good idea.

A very well-known software developer, David Heinemeier Hansson, published a tweet earlier this month claiming that the brand-new Apple Card’s credit-scoring algorithm was sexist.

Things got even worse when Apple co-founder Steve Wozniak joined the conversation, saying that he had run into the same problem.

Apple and Goldman Sachs responded to the viral story, stating that the algorithm had been audited for bias by a third party and that gender was not used as an input when calculating credit limits.

The truth is that gender cannot be used in the US as part of any credit decision, as the Federal Reserve has stated:

“The Equal Credit Opportunity Act largely prohibits the use of demographic information, including gender, in credit underwriting, pricing, reporting, and scoring.”

Nevertheless, leaving gender out of the algorithm’s input variables does not make the situation better. In fact, as Wired points out in its coverage, it makes it even worse:

“This explanation is doubly misleading. For one thing, it is entirely possible for algorithms to discriminate on gender, even when they are programmed to be “blind” to that variable. For another, imposing willful blindness to something as critical as gender only makes it harder for a company to detect, prevent, and reverse bias on exactly that variable.”

As Cathy O’Neil points out in an interview regarding this Apple Card story:

“We have to develop a new way of talking about it, because the old way of talking about it doesn’t work anymore. The old way of talking about it was, “Let’s not use race or gender explicitly.” That kind of worked back when we didn’t collect a lot of data about each other and about ourselves, and it wasn’t just available everywhere. But nowadays we can infer race and gender based on all sorts of different indicators. There are proxies for those kinds of classes just up the wazoo for companies that are interested in inferring that kind of thing. Statistically they’re a lot better compared to a random guess. It’s no longer sufficient to say, “Let’s not explicitly use race and gender.””
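
To make the proxy problem concrete, here is a minimal sketch with synthetic data (all feature names and numbers are hypothetical, not Apple’s or anyone’s actual model): a scoring model that never sees gender still produces different approval rates across genders, because a correlated proxy feature lets it reconstruct the bias encoded in the historical training labels.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# Synthetic population: gender is never shown to the model,
# but a proxy feature (say, a spending-category mix) correlates with it.
gender = rng.integers(0, 2, n)            # 0 / 1, hypothetical labels
proxy = gender + rng.normal(0, 0.5, n)    # feature correlated with gender
income = rng.normal(50, 10, n)            # genuinely predictive feature

# Historical decisions encode a past bias: at identical incomes,
# one group was approved less often.
approved = (income + 5 * (1 - gender) + rng.normal(0, 5, n)) > 52

X = np.column_stack([proxy, income])      # note: no gender column
model = LogisticRegression().fit(X, approved)

rates = [model.predict(X[gender == g]).mean() for g in (0, 1)]
print(f"approval rate, group 0: {rates[0]:.2%}")
print(f"approval rate, group 1: {rates[1]:.2%}")
# The gap persists: the model recovers gender through the proxy.
```

The model is “blind” to gender in exactly the sense Apple described, and it still reproduces the disparity.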

This is why, as I argued in my last short article (Computer says no), we need not only algorithmic explainability but also traceability and auditability, moving us from explainable AI to traceable and transparent AI. And that is also why we should not let credit cards be black: black algorithmic boxes.
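
This is also where auditability bites: a disparity like the one above can only be detected if the protected attribute is collected out-of-band for testing, even though it never enters the model. A minimal sketch of one such audit check (the data and the `demographic_parity_gap` helper are purely illustrative):

```python
import numpy as np

def demographic_parity_gap(decisions: np.ndarray, group: np.ndarray) -> float:
    """Difference in positive-decision rates between groups.

    `group` is a protected attribute collected only for auditing;
    it is never an input to the scoring model itself.
    """
    rates = [decisions[group == g].mean() for g in np.unique(group)]
    return max(rates) - min(rates)

# Hypothetical audit log: model decisions plus withheld demographics.
decisions = np.array([1, 1, 0, 1, 0, 0, 1, 0])
group = np.array([0, 0, 0, 0, 1, 1, 1, 1])
print(f"demographic parity gap: {demographic_parity_gap(decisions, group):.2f}")
# An auditor would flag a large, persistent gap even though the
# protected attribute never entered the model.
```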

If you enjoyed reading this piece, please consider a membership to get full access to every story while supporting me and other writers on Medium.
