A Viral Tweet Accused Apple’s New Credit Card of Being ‘Sexist.’ Now New York State Regulators Are Investigating
New York regulators are investigating Goldman Sachs over Apple’s new credit card after being alerted that it may be violating state laws banning sex discrimination. A discriminatory algorithm may be to blame.
The Apple Card, which Apple announced in March, is issued by Goldman Sachs. After complaints began to circulate around the internet over the past week, the New York State Department of Financial Services (NYSDFS) took interest and launched an investigation into the card’s issuer.
The NYSDFS was first tipped off by a viral Twitter thread from tech entrepreneur David Heinemeier Hansson, begun on Nov. 7. He detailed how his card’s credit limit was 20 times higher than his wife’s, even though she has a higher credit score and they file joint tax returns. Hansson referred to the Apple Card as a “sexist program” and said that its over-reliance on a “biased” algorithm did not excuse discriminatory treatment.
Hansson’s complaints were echoed by Apple co-founder Steve Wozniak, who responded to Hansson’s tweet, saying “the same thing happened to us.” Wozniak said that his credit limit was 10 times higher than his wife’s, even though they did not have any separate assets or accounts. In his view, Apple should “share responsibility” for the problem.
Goldman Sachs has denied wrongdoing, stating unequivocally through company spokesman Andrew Williams that “in all cases, we have not and will not make decisions based on factors like gender.”
Williams said that two family members can “receive significantly different credit decisions” based on their individual income and creditworthiness, which can include personal credit scores and debt levels.
When asked for comment, a spokeswoman for Apple directed TIME to a Goldman Sachs representative.
Superintendent of the NYSDFS Linda Lacewell said Sunday in a statement that state law bans discrimination against protected classes of individuals, “which means an algorithm, as with any other method of determining creditworthiness, cannot result in disparate treatment for individuals based on age, creed, race, color, sex, sexual orientation, national origin or other protected characteristics.”
Lacewell said that New York supports innovation but “new technologies cannot leave certain consumers behind or entrench discrimination.” She added that this “is not just about looking into one algorithm” but also about working with the tech community more broadly to “make sure consumers nationwide can have confidence that the algorithms that increasingly impact their ability to access financial services do not discriminate.”
This isn’t the first time a potentially discriminatory algorithm has come under scrutiny by the NYSDFS. Last week, the agency began investigating an algorithm sold by a UnitedHealth Group subsidiary that allegedly resulted in black patients receiving substandard care compared to white patients. Various algorithms across industries have faced criticism for being racist or sexist.
Published at Mon, 11 Nov 2019 21:54:33 +0000