Deliberate Bias in the Apple Card Algorithm

Nicholas Becker
5 min read · Dec 3, 2019

Apple is constantly updating the technology it puts out to the public: phones, watches, computers, glasses, and now even credit cards. However, after the Apple Card was introduced on March 25, 2019 and went viral, apparently without adequate sampling tests, customers began to notice a fault when requesting increases to their credit lines. The issue first surfaced with a married couple who both held Apple Cards. When the husband asked for a credit increase in the earlier months, Goldman Sachs (the issuing bank) and Apple approved it; when the wife tried to increase her credit line, the algorithm rejected her request, even though the couple share their finances. Apple's algorithm appears to be sexist in this case, and the man who called Apple out, David Heinemeier Hansson, is the husband of the woman the algorithm mistreated.

Furthermore, Apple and Goldman Sachs are dealing with an algorithmic bias involving race, marital status, and gender. Their response surprised many: "we do not know your gender or marital status during the Apple Card process," they stated; "it's just the algorithm." Imagine if other companies tried to use this lousy defense to cover up an algorithmic bias. Apple uses a "black box" algorithm with no capability to produce an explanation, and then abdicated all responsibility for the decision outcomes. Knowing this, you would have thought there would be at least a slight tweak to the algorithm, but with defenses this weak offered to the married couple, there was no apparent intent to even try.

Apple and Goldman Sachs should have taken more responsibility: simply ignoring race, gender, and marital status does not make an algorithm fair. In my view, they assumed that if the algorithm never saw those attributes, it would treat everyone's credit line fairly, but that is clearly not the case. A better explanation from these big-name companies would also help customers feel they are being taken seriously, and the companies' customer service needs to be examined as a whole, rather than arguing with someone who was treated unfairly.
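To make the point concrete, an auditor does not need to see inside the model to detect this kind of bias: comparing approval rates across groups is enough. Below is a minimal sketch of such a check. All data, thresholds, and function names here are hypothetical, invented for illustration; the 0.8 cutoff follows the common "four-fifths" rule of thumb used in disparate-impact analysis.

```python
# Hypothetical disparate-impact check on credit-increase decisions.
# Decisions are booleans: True = request approved.

def approval_rate(decisions):
    """Fraction of requests approved in a list of booleans."""
    return sum(decisions) / len(decisions)

def disparate_impact_ratio(group_a, group_b):
    """Ratio of the lower group's approval rate to the higher one.
    The 'four-fifths rule' flags ratios below 0.8 as potential bias."""
    rate_a = approval_rate(group_a)
    rate_b = approval_rate(group_b)
    return min(rate_a, rate_b) / max(rate_a, rate_b)

# Invented sample outcomes for two groups of applicants.
men = [True, True, True, False, True]      # 80% approved
women = [True, False, False, False, True]  # 40% approved

ratio = disparate_impact_ratio(men, women)
print(round(ratio, 2))  # 0.5 — well below the 0.8 threshold
```

The key design point is that this audit treats the model as a black box: it only needs the decisions, not the internals, which is exactly why "it's just the algorithm" is not a defense against running it.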

A better algorithm, or at least a quick fix to this problem, could come from the same complex technologies that caused it. Something like machine learning could let the system learn from cases like this couple's, taking their race, marital status, and gender into account so that the existing bias is corrected rather than repeated. If this change is made, it could keep the product on the shelves rather than pulled anytime soon; however, if they don't make a move and customers keep having these same problems, Goldman Sachs and Apple could be looking at anything from a lawsuit to public backlash on Twitter over the companies being discriminatory toward their customers.


Possibly the last solution is to look to the government, or to the law, to change policy on algorithms and technologies in order to put big corporations in check for the kinds of artificial intelligence they use. This approach would stop the problem from starting in the first place. It would also add accountability toward big companies' users: if users are affected by a problem, they are in the clear, while the companies have to learn to fix the problems they are shipping to customers. In this case, that means a discriminatory problem based on race, gender, and marital status.

These algorithms are so complex, including Apple's black-box algorithm, that they can be harder to regulate. So it is possible that the reason the government isn't taking the initiative to regulate the problems inside big tech companies' algorithms is that the code is too complex for regulators to figure out how to oversee. This might seem far-fetched for a credit card from Apple, but black-box algorithms are used by other companies as well, including Sony's PlayStation 4 and Microsoft's Xbox One.

Let's explain the "black box." The term has come into general use to describe an invisible process or function about which we do not need to know how it works, only that it does. This means there isn't much to learn about the algorithm itself beyond the fact that it is easy to use and simply works. A "black-box algorithm" refers to a situation where the term may appear in drawings, charts, or manuals even though the actual details of the service supplied are irrelevant to the subject.
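The idea above can be sketched in a few lines of code. This is a toy illustration, not Apple's or Goldman Sachs's actual logic: the scoring rule, threshold, and field names are all invented. It contrasts a black-box decision, where the caller sees only the outcome, with a transparent one that can state its reasons.

```python
# Toy contrast between black-box and transparent decision logic.
# All rules and numbers below are invented for illustration.

def _opaque_score(application):
    # Stand-in for an uninterpretable model (e.g., a deep network).
    return 700 if application.get("income", 0) > 50_000 else 500

def black_box_decision(application):
    # Caller sees only the verdict; the internals stay hidden.
    return "approved" if _opaque_score(application) > 600 else "denied"

def transparent_decision(application):
    # Same outcome, but every rule is stated and can be audited.
    if application.get("income", 0) > 50_000:
        return "approved", ["income above $50,000"]
    return "denied", ["income at or below $50,000"]

print(black_box_decision({"income": 60_000}))    # approved, no reason given
print(transparent_decision({"income": 40_000}))  # denied, with a stated reason
```

A customer denied by the first function has nothing to appeal against; a customer denied by the second can at least see, and dispute, the rule that was applied. That difference is the heart of the Apple Card complaint.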

One place where a "black box" works perfectly is in an aircraft, though in a different sense of the term: there it names a physical device rather than an algorithm. Flight recorders are data-recording devices specifically designed to withstand high-G impacts, fire, water, and pressure, with the purpose of retaining the data stored within and transmitting a signal that may assist in recovering the box after a calamitous event. This is the brighter side of the black box; however, there are some worries for the future to come.

Another potential danger within black-box algorithms is that, as technologies evolve, machine learning could create unsafe computer-controlled airplanes, dangerous power grids around populated areas, and more. This would be bad not only for everyday users of these products, but potentially dangerous for the government that is trying to regulate these algorithms.

For the future, the best thing to do is to keep a close eye on artificial intelligence and to regulate what the government can, in order to make sure the intelligence doesn't learn too much and turn on the human race as a whole. From this point on, though, the future of technology is a bright one, and together as a community we can keep these algorithms regulated and safe for companies to use.

Content inspired by https://medium.com/rand-corporation/did-no-one-audit-the-apple-card-algorithm-979fee7839f9
