Most companies today still work with what are known as “black box” AI systems. These opaque models make it difficult to understand how decisions are reached; because they rely on data and learn from each interaction, they can rapidly compound poor decision-making if fed corrupt or biased data.
Overcoming the AI Challenge
Temenos Explainable AI (XAI) has the capability to overcome these concerns, while providing reassurance that decisions will be made in an appropriate and non-biased way.
XAI models are highly transparent and explain, in human language, how an AI decision has been made. This explainability not only justifies each decision but also helps users identify and understand underlying issues, so they can trace a problem to its root cause and improve their operational processes. Crucially, these models do not rely solely on data; they can be elevated and augmented by human intelligence. Built around the relationship between cause and effect, they create space for human judgment to verify that the machine learning is representative, comprehensive, and covers all possible scenarios, and, if it doesn’t, to make the necessary changes.
XAI not only supports increased efficiency and automation but, by virtue of being transparent, it provides a model that businesses can trust entirely to support their operations, with full auditability.
Let’s Look at an Example
Recent market volatility has been creating new challenges for fund accounting teams, and many find themselves spending more time managing control breaks and checking false positives in order to finalize their NAV calculations.
Using Temenos XAI, we help fund accounting teams clear exceptions (for example, price variation exceptions) more efficiently by prioritizing breaks according to their XAI scores, or by automatically justifying a break when its probability of being a false positive is high enough. The XAI model score is based on price history, corporate action history, benchmark data, security master data, and historic exception data stored in Temenos Multifonds. Given the data points for each new price variation exception, the model calculates the probability that the break is a real issue or a false positive and, importantly, explains how it reached that decision.
By leveraging this technology, accounting teams can work faster and smarter: false positives are reduced, and exceptions can be prioritized by XAI score, saving valuable time on manual investigation. Our analysis of the price variation control specifically suggests that over 80% of control failures end up being justified as false positives.
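The triage workflow described above can be sketched in code. This is a minimal, illustrative Python example, not the Temenos Multifonds API: the class, field names, and the auto-justification threshold are all assumptions made for the sake of the sketch.

```python
from dataclasses import dataclass

@dataclass
class PriceVariationException:
    security_id: str
    fp_probability: float  # model's probability that the break is a false positive
    explanation: str       # human-readable reasoning returned by the XAI model

AUTO_JUSTIFY_THRESHOLD = 0.95  # illustrative cut-off, not a Temenos default

def triage(exceptions):
    """Auto-justify high-confidence false positives; queue the rest by risk."""
    auto_justified, to_review = [], []
    for exc in exceptions:
        if exc.fp_probability >= AUTO_JUSTIFY_THRESHOLD:
            auto_justified.append(exc)
        else:
            to_review.append(exc)
    # Surface the breaks most likely to be real issues first
    # (lowest false-positive probability at the top of the queue).
    to_review.sort(key=lambda e: e.fp_probability)
    return auto_justified, to_review

# Hypothetical exceptions with scores and explanations from the model.
breaks = [
    PriceVariationException("SEC001", 0.98, "Price move matches benchmark move"),
    PriceVariationException("SEC002", 0.40, "Move exceeds historic volatility"),
    PriceVariationException("SEC003", 0.72, "Similar breaks historically justified"),
]
justified, queue = triage(breaks)
print([e.security_id for e in justified])  # auto-justified false positives
print([e.security_id for e in queue])      # prioritized for manual review
```

The design choice mirrors the article: breaks the model is highly confident are false positives are cleared automatically, while the remainder are ordered so that analysts investigate the likeliest real issues first.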
Key Features of Temenos Multifonds XAI Integration
- Highly transparent and explainable
- Removes human error from the decision-making process
- Creates new efficiencies in the NAV calculation process
- Elevates existing Multifonds automation
- Cloud-native hosting
- Integrated via API frameworks