Senthil Chandramouleesan
The banking industry has been aggressively adopting artificial intelligence (AI) across multiple use cases over the last few years. Intense competition has pushed banks to improve business agility, accelerate decision-making, reduce operational costs, and provide a superior customer experience, and they are increasingly leveraging AI to deliver these outcomes. However, the inability of AI systems to explain their decisions and actions to human users limits their effectiveness. This is where explainable AI (XAI) comes in, offering tools and techniques to build human-interpretable AI models.
Concerns with the ‘black box’ nature of AI
While more financial enterprises are adopting AI algorithms, users are often unable to understand the rationale behind the decisions these algorithms make. There is an ever-increasing concern that the output of AI algorithms is biased, and instances of such bias have been reported in financial services, including bias against minority groups or on the basis of gender in credit card and loan approvals. Regulatory bodies have also raised concerns about the transparency of AI and the pressing need to demystify its ‘black box’ nature. The US Senate has introduced a bill, ‘The Algorithmic Justice and Online Platform Transparency Act,’ prohibiting algorithmic processes on online platforms that discriminate based on protected characteristics. As part of the bill, the Federal Trade Commission is to establish an interagency task force with representatives from the Consumer Financial Protection Bureau to examine discriminatory uses of personal information.
Improving transparency in AI
In commercial lending, banks use AI to proactively identify when existing customers are considering business expansion or related ventures and to prequalify them for a loan. Such an offer deepens the customer relationship and keeps the customer from defecting to another bank for additional lending needs. Banking personnel require explanations from the AI system on how each decision was made and which factors influenced it for every customer qualified for such an offer. For example, a firm proactively reducing its carbon footprint by introducing sustainable practices into its business model, or venturing into green businesses while improving its profit margin, would receive a higher weighting for pre-approved loans, whereas a firm whose debt is rising year on year would be declined additional credit. Here, XAI can be applied to explain and justify these outcomes, thereby improving transparency in AI decisions.
Approaches to improve explainability in AI
Banks can adopt model-agnostic XAI techniques to improve the explainability of AI systems. One family of techniques simplifies the model, for example by extracting rules or fitting interpretable surrogate models around it. The local interpretable model-agnostic explanations (LIME) technique builds locally linear surrogate models around the predictions of an opaque model to explain them. It can answer the most common questions: why was a customer selected for the loan offer, and which features drove the qualification? Banks can also use the SHapley Additive exPlanations (SHAP) technique to rank and measure the influence of features such as credit history, revenue growth, and profit margin on the prediction output.
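As a rough illustration, the sketch below trains a simple gradient-boosted classifier on synthetic loan data and uses LIME to explain a single prequalification decision. The feature names, dataset, and model are hypothetical stand-ins, not a real bank's system; the SHAP-based views follow in the sketches further down.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from lime.lime_tabular import LimeTabularExplainer

# Hypothetical, standardized loan-book features and a synthetic qualification label.
rng = np.random.default_rng(0)
cols = ["credit_history_score", "total_turnover", "profit_margin", "total_debt", "loan_amount"]
X = pd.DataFrame(rng.normal(size=(2000, 5)), columns=cols)
y = (X["credit_history_score"] + 0.5 * X["total_turnover"]
     + X["profit_margin"] - X["total_debt"] > 0).astype(int)

# Opaque model standing in for the bank's prequalification model.
model = GradientBoostingClassifier(random_state=0).fit(X, y)

def predict_fn(rows):
    # LIME passes plain numpy arrays; rebuild the DataFrame so column names match.
    return model.predict_proba(pd.DataFrame(rows, columns=cols))

# LIME: fit a locally linear surrogate around one customer's prediction.
explainer = LimeTabularExplainer(
    X.values,
    feature_names=cols,
    class_names=["declined", "qualified"],
    mode="classification",
)
explanation = explainer.explain_instance(X.iloc[0].values, predict_fn, num_features=5)
print(explanation.as_list())  # per-feature contributions to this single decision
```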
Let us consider the approach of providing explanations at the global, local, and feature-interaction levels. At the global level, XAI helps answer how the data as a whole contributes to decisions. It can measure the influence of critical features such as credit history, total turnover, profit margin, type of industry, total debt, and loan amount, and rank them in order of their importance to the decision.
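A minimal sketch of the global view, continuing with the same hypothetical synthetic setup: SHAP values are averaged in absolute terms across the portfolio to rank features by their overall influence on the prequalification decision.

```python
import numpy as np
import pandas as pd
import shap
from sklearn.ensemble import GradientBoostingClassifier

# Same hypothetical synthetic setup as in the LIME sketch above.
rng = np.random.default_rng(0)
cols = ["credit_history_score", "total_turnover", "profit_margin", "total_debt", "loan_amount"]
X = pd.DataFrame(rng.normal(size=(2000, 5)), columns=cols)
y = (X["credit_history_score"] + 0.5 * X["total_turnover"]
     + X["profit_margin"] - X["total_debt"] > 0).astype(int)
model = GradientBoostingClassifier(random_state=0).fit(X, y)

# SHAP values: per-customer, per-feature contributions to the model's output.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X)  # shape (n_customers, n_features) for this binary GBM

# Global importance: mean absolute contribution of each feature, ranked.
global_importance = pd.Series(np.abs(shap_values).mean(axis=0), index=cols)
print(global_importance.sort_values(ascending=False))
```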
At the local level, XAI facilitates a better understanding of the prediction for an individual customer, for example answering why the AI system disqualified a particular existing customer for a second loan. Poor credit history or weak business performance could be the major influencing factors. XAI can even compare the decisions taken on two customers with nearly identical feature values, such as total turnover and credit history, and explain why the AI system disqualified the loan for one but not the other.
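At this level, the same SHAP values can be read row by row. The sketch below, again using the hypothetical setup from the earlier sketches, puts the per-feature contributions for two customers side by side to show which features pushed one towards qualification and the other towards decline; the row indices are hypothetical.

```python
import numpy as np
import pandas as pd
import shap
from sklearn.ensemble import GradientBoostingClassifier

# Same hypothetical synthetic setup as in the earlier sketches.
rng = np.random.default_rng(0)
cols = ["credit_history_score", "total_turnover", "profit_margin", "total_debt", "loan_amount"]
X = pd.DataFrame(rng.normal(size=(2000, 5)), columns=cols)
y = (X["credit_history_score"] + 0.5 * X["total_turnover"]
     + X["profit_margin"] - X["total_debt"] > 0).astype(int)
model = GradientBoostingClassifier(random_state=0).fit(X, y)

explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X)

# Hypothetical indices of two customers with similar turnover and credit history.
a, b = 10, 42
comparison = pd.DataFrame(
    {"customer_A": shap_values[a], "customer_B": shap_values[b]},
    index=cols,
)
# Positive values push towards qualification, negative values towards decline.
print(comparison)
print("P(qualified):", model.predict_proba(X.iloc[[a, b]])[:, 1])
```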
At the feature-interaction and distribution level, XAI can explain how the model output varies with changes in total turnover while the other attributes remain constant. This can explain the difference in loan offer decisions for two otherwise similar customers.
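One way to sketch this feature-level view, under the same hypothetical setup, is a hand-rolled partial dependence curve: total_turnover is swept over a grid while every other attribute keeps its observed value, and the model's average qualification probability is recorded at each point.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier

# Same hypothetical synthetic setup as in the earlier sketches.
rng = np.random.default_rng(0)
cols = ["credit_history_score", "total_turnover", "profit_margin", "total_debt", "loan_amount"]
X = pd.DataFrame(rng.normal(size=(2000, 5)), columns=cols)
y = (X["credit_history_score"] + 0.5 * X["total_turnover"]
     + X["profit_margin"] - X["total_debt"] > 0).astype(int)
model = GradientBoostingClassifier(random_state=0).fit(X, y)

# Partial dependence of the qualification probability on total_turnover:
# vary it over a grid, hold every other attribute at its observed value,
# and average the predicted probability across the portfolio.
grid = np.linspace(X["total_turnover"].quantile(0.05), X["total_turnover"].quantile(0.95), 10)
for value in grid:
    X_mod = X.copy()
    X_mod["total_turnover"] = value
    avg_prob = model.predict_proba(X_mod)[:, 1].mean()
    print(f"total_turnover={value:7.2f}  average P(qualified)={avg_prob:.3f}")
```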
Final thoughts
With explainability critical for all stakeholders, XAI adoption will accelerate in the coming years. AI algorithms, especially deep learning approaches, will see far greater uptake in the banking industry once XAI techniques can reliably explain their decisions. This will address the concerns of banks and regulators, who expect transparency and explainability as the foundation of trustworthy AI in banking. Used judiciously, AI has incredible potential to help remove historical data bias and democratize access to financial services.