The claims triage example from a few weeks ago showed how a simple rule can inadvertently result in bias that has a real cost.
The same thing can happen when we’re acquiring customers, offering new services to existing customers, or repricing an existing contract. Here is an example.
When Apple launched its credit card, several high-profile cases emerged where women were offered lower credit limits than men, even among couples with shared assets and similar financial profiles.
The cases attracted serious attention: Steve Wozniak, who co-founded Apple with Steve Jobs, said the same thing happened to him and his wife.
The algorithm may not have used gender explicitly: Goldman Sachs, the issuer, said gender was not an input to the algorithm, and an official investigation by the New York Department of Financial Services “did not produce evidence of deliberate or disparate impact discrimination but showed deficiencies in customer service and transparency”.
So it is possible that the algorithm used other variables that acted as proxies for gender. The report says the specific factors behind the discrimination complaints included “credit score, indebtedness, income, credit utilization, missed payments, and other credit history elements”.
Some women were offered less credit, not necessarily because of their actual risk, but possibly due to bias embedded in the data and model.
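A simple first check, before worrying about which variable is to blame, is whether outcomes differ across groups at all. Below is a minimal sketch in Python, assuming a hypothetical table of past decisions with columns such as gender, offered_limit and approved; it compares average limits and approval rates by group and flags large gaps for review. It is a rough screen, not a statistical test or a legal standard.

```python
# A minimal sketch of a group-level disparity check on past credit offers.
# The CSV file and column names (gender, offered_limit, approved) are
# hypothetical placeholders, not data from the Apple Card case.
import pandas as pd

offers = pd.read_csv("credit_offers.csv")  # hypothetical export of past decisions

# Average offered limit, approval rate, and count per group.
summary = offers.groupby("gender").agg(
    mean_limit=("offered_limit", "mean"),
    approval_rate=("approved", "mean"),
    n=("offered_limit", "size"),
)
print(summary)

# A crude disparate-impact style ratio: values far below 1.0 warrant a closer look.
rates = summary["approval_rate"]
ratio = rates.min() / rates.max()
print(f"Approval-rate ratio (min/max): {ratio:.2f}")
if ratio < 0.8:  # the familiar 80% rule of thumb; a prompt to investigate, not a verdict
    print("Large gap between groups - investigate before drawing conclusions.")
```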
Customers are turned away or under-served, and the business loses out on good lending opportunities.
The bias may not be deliberate: seemingly harmless variables can act as proxies for protected characteristics. We don’t want this, for obvious reasons – it’s the wrong thing to do and it damages our reputation.
But it goes beyond regulatory or moral concerns, directly affecting the bottom line. Good customers leave, we keep more risk, and profits slip away, all because of bias hiding in the rules.
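To see whether such proxies are hiding in your own data, one common audit, sketched below under the same hypothetical data assumptions as above, is to test how well the supposedly neutral inputs predict the protected characteristic itself. The feature names simply echo the factors quoted from the report; the protected attribute is used only for the audit, never as a model input.

```python
# A minimal sketch of a proxy check: can the "neutral" inputs predict the
# protected characteristic? Continues from the hypothetical `offers` table
# above; the feature names are illustrative, not the issuer's actual inputs.
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

features = ["credit_score", "indebtedness", "income",
            "credit_utilization", "missed_payments"]  # hypothetical column names
X = offers[features]
y = (offers["gender"] == "F").astype(int)  # protected attribute, audit use only

model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc").mean()
print(f"AUC for predicting gender from 'neutral' inputs: {auc:.2f}")
# An AUC well above 0.5 means these inputs jointly encode the protected
# characteristic, so a model built on them can discriminate without ever
# seeing gender directly.
```

If the AUC sits close to 0.5 the inputs carry little proxy signal; the higher it climbs, the more closely the resulting decisions need to be monitored.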
Disclaimer: The information in this article does not constitute legal advice. It may not be relevant to your circumstances. It was written for specific algorithmic contexts within banks and insurance companies, may not apply to other contexts, and may not be relevant to other types of organisations.