“Yes, when it comes to sophisticated algorithms and AI (Artificial Intelligence), even some of the world’s most valuable companies can get things wrong. What’s interesting is that one of the selling points of the Apple Card was that it would provide credit to those with little or no credit history.
‘Here’s the thing most people don’t understand: Certain kinds of advanced algorithms, machine-learning algorithms, need a lot of data to train on in order to make predictions. We turn to the past to find that data. But if you’re not careful, the algorithms learn the mistakes of the past. In this case, that would be gender bias. Algorithms that learn from history are doomed to repeat it. That’s the great irony. You have to really work to correct for the mistakes of the past. In life and in algorithms.'”
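The mechanism described in the quote above can be made concrete with a small sketch. Everything here is illustrative: the two groups, the historical approval rates, and the frequency-based "model" are assumptions invented for the example, not anything from the Apple Card case. The point is simply that a model fit to biased historical labels inherits the bias wholesale.

```python
import random

random.seed(0)

# Hypothetical historical data: applicants in both groups are equally
# qualified, but past decisions approved group "A" far more often than
# group "B" -- this gap is the historical bias in the training labels.
def past_decision(group):
    rate = 0.80 if group == "A" else 0.50
    return random.random() < rate

history = [(g, past_decision(g)) for g in ["A", "B"] * 5000]

# A deliberately naive "model" that learns the approval frequency per
# group from the historical labels. Trained on biased history, it
# simply reproduces the biased approval rates as its predictions.
def fit(data):
    counts, approvals = {}, {}
    for group, approved in data:
        counts[group] = counts.get(group, 0) + 1
        approvals[group] = approvals.get(group, 0) + approved
    return {g: approvals[g] / counts[g] for g in counts}

model = fit(history)
print(f"learned approval rate, group A: {model['A']:.2f}")
print(f"learned approval rate, group B: {model['B']:.2f}")
```

Correcting for this, as the quote notes, takes deliberate work: the gap does not disappear on its own, because the data faithfully records the past.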
“Despite concerns around data privacy, a study by Accenture found that 60% of consumers would be willing to share personal data, such as location data and lifestyle information, with financial service providers if it resulted in lower pricing on products or delivered other benefits, such as faster turnaround on loan approvals.”
“According to a recent survey by Dimensional Research, nearly eight out of 10 enterprise organizations currently engaged in AI and ML report that projects have stalled due to issues of data quality and model confidence.
An AI Trust Index is a FICO-like score for algorithmic vulnerability and risk based on five major AI business risks: bias, explainability, robustness, compliance, and data privacy.
Much as the FICO score transformed the financial services industry by accelerating risk rating, an AI Trust Index will help accelerate the development of Trusted AI systems that are free from bias, transparent in their operations, and able to reflect the core values and policies of the business.”
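One way to picture such an index is as a weighted aggregate of the five risk dimensions named above, mapped onto a FICO-like range. This is a minimal sketch under stated assumptions: the dimension names come from the text, but the weights, the 0-to-1 per-dimension scores, and the 300-850 scale are all illustrative inventions, not a published methodology.

```python
# Assumed weights for the five risk dimensions from the text.
# The weights and the FICO-style 300-850 range are illustrative only.
RISK_WEIGHTS = {
    "bias": 0.25,
    "explainability": 0.20,
    "robustness": 0.20,
    "compliance": 0.20,
    "data_privacy": 0.15,
}

def ai_trust_index(scores):
    """Map per-dimension scores in [0, 1] (1 = lowest risk) to 300-850."""
    if set(scores) != set(RISK_WEIGHTS):
        raise ValueError("need a score for each of the five risk areas")
    weighted = sum(RISK_WEIGHTS[k] * scores[k] for k in RISK_WEIGHTS)
    # Linearly rescale the weighted total onto the familiar FICO range.
    return round(300 + weighted * 550)

# Hypothetical assessment of one model across the five dimensions.
example = {"bias": 0.9, "explainability": 0.7, "robustness": 0.8,
           "compliance": 0.95, "data_privacy": 0.85}
print(ai_trust_index(example))  # a single FICO-like trust score
```

A single number like this trades nuance for comparability, which is the same trade-off the FICO score makes; in practice the per-dimension scores would matter as much as the aggregate.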