Artificial Intelligence (AI) is already transforming banking for many of the country’s largest financial services organizations. Fortunately, AI isn’t out of reach for smaller organizations. With the right focus, community banks and credit unions can access AI and compete with their larger rivals…bot-to-bot.
A Whole New Ballgame
AI presents opportunities for financial services organizations of all sizes to reduce costs, mitigate risks, increase revenue and improve customer service. But because this often requires a specialized understanding of data and how to leverage it, many smaller organizations aren’t prioritizing it…yet.
For many, the issue is a lack of resources and skills inside the business. After all, for AI to be effective, it needs high-quality, actionable data. At many community banks and credit unions, data science is so new that few employees understand it, and acquiring those skills is a challenge in itself.
AI as a Service
So what steps can smaller, resource-constrained banks and credit unions take to make a meaningful difference?
First, drill down on the opportunities or problems you are looking to address and determine how AI can help solve them. Whatever the target use cases – fraud and AML, small-business lending decisions, custom offers triggered by customer life-events – the fundamental fuel is good data.
Then seek an experienced partner like FIS, which helps financial services organizations of all sizes launch AI solutions built on large consortium datasets and deep data science skillsets. This approach brings economies of scale, ready-made interfaces and wider ecosystems at an affordable cost, with faster time-to-market and reduced risk.
Where Can I…AI?
AI should address real business opportunities or problems. Following are a few use cases:
- Combat Fraud: Through predictive analytics, AI can identify fraud before it occurs and reduce the number of false positives. Organizations worldwide lose an estimated 5% of revenue each year to fraud – roughly $3.7 trillion. A distinct feature of machine learning is its capability to self-learn: as more data accumulates, the algorithms become more accurate. The FIS SecurLOCK suite uses AI to block suspicious transactions as they occur, maintain a customized monitoring plan that keeps you on top of fraud trends, and adjust alerts as specific threats emerge.
- Improve (Self) Service: Trained on large datasets with machine learning, bots know how to respond to customer questions ranging from onboarding concerns to transaction-specific details. The FIS MyBanker skill for Alexa uses friendly, conversational language to help users navigate different aspects of their banking journeys, from buying a home and applying for a credit card to dealing with card fraud. The skill leverages the power of the cloud to provide these personal banking services and to support multiple platforms, locations and situations.
- Increase Share of Wallet: Robo-advisors are digital platforms that collect information about an individual’s financial goals and the level of risk they’re willing to incur, then input this data into algorithms. The results are used to offer financial advice, allowing the individual to make educated decisions. Often the robo-advisor will fully automate the purchase and management of products. The services provided by these automated advisors have expanded to include dividend reinvestment, portfolio rebalancing and tax-loss harvesting capabilities.
- Improve Compliance Activities: For some financial services organizations, AI can reduce false positives in compliance monitoring by as much as 20-30%. That matters because actions taken on a false positive in AML monitoring can adversely affect highly profitable customer relationships. AI and Robotic Process Automation (RPA) can be applied to regulatory technology in highly manual operational areas such as anti-money laundering to ensure constant, consistent compliance reviews. Automated exception identification lets organizations apply scarce resources more productively to genuine compliance concerns.
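To make the fraud and false-positive trade-off above concrete, here is a minimal, hypothetical sketch – not an FIS product or any real scoring model – of how a predictive fraud score plus a tunable alert threshold works. The features, weights and synthetic data are all made up for illustration:

```python
# Hypothetical sketch: scoring transactions for fraud, then tuning the
# alert threshold to trade missed fraud against false positives.
# All weights and data here are invented for the demo.
import math
import random

random.seed(7)

def fraud_probability(amount_z, is_foreign, velocity_z):
    """Toy logistic scoring model; the weights are assumptions, not real ones."""
    score = 1.2 * amount_z + 0.8 * is_foreign + 1.0 * velocity_z - 2.0
    return 1.0 / (1.0 + math.exp(-score))

# Synthetic transaction history with known fraud labels.
transactions = []
for _ in range(2000):
    amount_z = random.gauss(0, 1)       # standardized transaction amount
    is_foreign = random.random() < 0.1  # cross-border flag
    velocity_z = random.gauss(0, 1)     # standardized spend velocity
    truly_fraud = (1.2 * amount_z + 0.8 * is_foreign + 1.0 * velocity_z
                   + random.gauss(0, 0.5)) > 2.0
    transactions.append((amount_z, is_foreign, velocity_z, truly_fraud))

def alert_stats(threshold):
    """Count fraud caught vs. false positives at a given alert threshold."""
    caught = false_positives = 0
    for amount_z, is_foreign, velocity_z, truly_fraud in transactions:
        if fraud_probability(amount_z, is_foreign, velocity_z) >= threshold:
            if truly_fraud:
                caught += 1
            else:
                false_positives += 1
    return caught, false_positives

# Raising the threshold cuts false alerts at the cost of some caught fraud.
for threshold in (0.5, 0.9):
    print(threshold, alert_stats(threshold))
```

In a real system the hand-set weights would be replaced by a model retrained as new labeled transactions accumulate – that retraining loop is the “self-learning” behavior described above.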
When deciding where to apply AI, financial services organizations will encounter key control and governance considerations as they plug it into their operating environments. How much authority should AI have? What about regulators? And what if things don’t go as planned? For AI technologies to succeed, they should be auditable – able to explain why an algorithm reached a certain decision.
AI software is only as smart as the data used to train it; therefore, human review must oversee the machine’s predictions to ensure fairness and guard against biased decisions. Don’t forget to think through “off-ramps” that steer customers to human backups when needed. And give AI systems the opportunity to learn from the outcomes of those human interactions.