Today, the financial services sector is gradually adopting cloud-native technologies. Capital One, for example, has moved entirely to the public cloud, while a handful of other firms run private clouds and still reap the benefits of modern building blocks such as microservices, service meshes, and containers. Machine learning (ML), a branch of AI, is another technology gaining ground in banking.
Canada’s largest financial institution, Royal Bank of Canada (RBC), was running an on-premises, OpenShift-based cloud in its own data center as far back as 2016, before launching its fintech-focused AI R&D center, Borealis AI. Now the company is in the news again: it has announced that it is working with Nvidia, Red Hat, and others to build a new AI computing platform powered by an Nvidia GPU farm, with hardware designed in-house.
Speaking to Data Center Knowledge (DCK), Mike Tardif, RBC’s senior VP of worldwide technology infrastructure, says the bank started out deploying GPUs for its ML workloads in off-the-shelf GPU servers. After finding that this approach lacked flexibility, RBC decided to buy its AI computing chips directly from Nvidia.
He says, “We felt leaving out the middle person a little bit, going directly there, made sense. Then we could start keeping up on where they were going with their chips, and what they’re building for automation and software.”
Foteini Agrafioti, RBC’s chief science officer who heads Borealis AI, says, “What’s amazing with this new technology is that we can now process things extremely fast. When we’re analyzing client records on our personal-banking side, which is our largest business with millions and millions of client records, you can perform an analysis of a model within 20 minutes or an hour of an entire client base. It would take us weeks to do that using CPUs.”
She says the range of applications the firm is currently developing is broad. One example is part of a natural language processing project that analyzes content from blogs and news articles in real time to extract relevant insights for professionals and financial advisors. The product, she says, has to figure out what would matter to the likes of advisors and equity research analysts who help clients stay on top of their portfolios.
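To make that idea concrete, here is a minimal, hypothetical sketch of the kind of relevance scoring such a system might perform. It is not Borealis AI’s actual implementation; the topics, articles, and threshold below are invented for illustration. It simply ranks incoming news text against a few portfolio-related topics using TF-IDF similarity from scikit-learn.

```python
# Hypothetical sketch (not RBC's/Borealis AI's system): score incoming news
# text for relevance to a set of portfolio-related topics.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Invented topics an advisor might care about; the real system's inputs are not public.
portfolio_topics = [
    "central bank interest rate decision",
    "semiconductor supply chain disruption",
    "quarterly earnings guidance revision",
]

incoming_articles = [
    "The central bank signalled a possible rate cut at its next meeting.",
    "A new smartphone colour was announced at a trade show.",
]

vectorizer = TfidfVectorizer(stop_words="english")
matrix = vectorizer.fit_transform(portfolio_topics + incoming_articles)
topic_vecs = matrix[: len(portfolio_topics)]
article_vecs = matrix[len(portfolio_topics):]

# Flag articles whose best topic-similarity score clears an illustrative threshold.
scores = cosine_similarity(article_vecs, topic_vecs).max(axis=1)
for text, score in zip(incoming_articles, scores):
    label = "RELEVANT" if score > 0.2 else "skip"
    print(f"{label:8s} {score:.2f}  {text}")
```

A production system at the scale Agrafioti describes would rely on far richer language models and GPU-accelerated inference; the sketch only shows the shape of the “is this relevant to an advisor?” question.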
Tushar Katarki, senior manager of OpenShift product management at Red Hat, tells DCK: “I think it’s been a win, win, win. I mean, Red Hat has certainly benefited. Because guess what? If it is applicable for RBC, it’s probably applicable for other banks too, because they’re all regulated in similar ways by the various governments.”
He adds, “When we started thinking about AI as a workload, Red Hat didn’t have a whole lot of exposure to AI and machine learning, and even HPC for that matter, whereas Nvidia had been doing it for many, many more years. So we learned that business and that technology, whereas they learned containers, CI/CD, Kubernetes, cloud, and everything from us, as they didn’t have that kind of background. There was a very symbiotic relationship that continues to thrive even today and going into the future.”
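For readers curious what the Red Hat and Nvidia combination looks like in practice, the sketch below shows one common way a containerized training job requests an Nvidia GPU on Kubernetes (and, by extension, OpenShift), using the official Kubernetes Python client and the standard `nvidia.com/gpu` resource exposed by Nvidia’s device plugin. The image, namespace, and pod names are placeholders; this is a generic illustration, not a description of RBC’s platform.

```python
# Generic illustration (not RBC's setup): submit a containerized ML job that
# asks Kubernetes/OpenShift for one Nvidia GPU via the "nvidia.com/gpu" resource.
from kubernetes import client, config

config.load_kube_config()  # or config.load_incluster_config() when running inside a cluster

container = client.V1Container(
    name="trainer",
    image="nvcr.io/nvidia/pytorch:24.01-py3",  # placeholder training image
    command=["python", "train.py"],            # placeholder entry point
    resources=client.V1ResourceRequirements(
        limits={"nvidia.com/gpu": "1"}         # request a single GPU
    ),
)

pod = client.V1Pod(
    metadata=client.V1ObjectMeta(name="gpu-train-demo"),
    spec=client.V1PodSpec(containers=[container], restart_policy="Never"),
)

# "ml-jobs" is a hypothetical namespace/project.
client.CoreV1Api().create_namespaced_pod(namespace="ml-jobs", body=pod)
```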
Katarki says the AI and ML expertise Red Hat and Nvidia have gained from this project will benefit industries beyond fintech, such as manufacturing, retail, insurance, and government.