GFT’s data platform modernisation accelerates the AI journey
GFT is an international technology service provider with strong expertise in banking, financial services and insurance. It has over 10,000 employees globally, across the UK, Poland, Spain, Canada, Germany, the USA, Vietnam, Brazil and many other countries.
“I would say our differentiator is that we pride ourselves on being precision engineers across a multitude of domains, including platform modernisation, cloud computing and neobanking services, with a strong focus on AI and data,” explains David Tuppen, GFT’s Chief Data Officer.
David Tuppen and GFT’s data team are developing the GFT AI.DA marketplace, which supports and accelerates clients’ AI journeys.
“GFT is maturing the AI landscape across the industry, helping the many organisations that are focusing their investments on AI,” he shares.
Specifically, GFT is doing something a little different: it has been placing a lot of emphasis on its AI.DA marketplace, which allows clients to explore AI use cases and AI and data journeys, through to a modern data platform architecture blueprint.
“We firmly believe that there is a connection between AI and the data that goes with it,” says Tuppen. “From integration and storage to the management of data processing and analytics, building specific end-to-end pipelines is as important as generating the AI itself.”
For Tuppen, data platforms are evolving: they have changed over time and will continue to do so.
“There are various architectural patterns which are needed, and these vary according to each business and client domain,” he says. “There's not one single solution for a modern data platform. So, from the big trend towards data mesh and democratised architectures, to traditional centralised data stores, there needs to be a specific pattern for each business unit and each client.”
This is what GFT provides: a data strategy that fully aligns with an organisation's AI goals and aspirations.
“Data strategy is fully dependent on the business strategy in order to demonstrate true value for the business,” he emphasises.
A solution needs to start with business value; after that come the data lifecycle, data management and governance, data and technical architecture, data science and AI visualisation, and finally, data literacy and change management.
“Each one of these steps is needed to a certain degree, for controlled and secure insights.”
Tuppen sees the data pipeline that precedes any AI project as critically important.
“You need to move from front to back,” he says. “As I mentioned, you start with the business and then move backwards to the IT backend. You need to look at what impact adding intelligent automation has on your business. How will the users view and access those insights from the AI? How is the data stored and processed? How is the data integrated into the platform?”
Tuppen warns that if a customer skips any of those considerations, they may run into data integrity issues, which can cause processing problems and increase both time and cost.
Ensuring that data is clean, relevant and suitable for AI
Along the pipeline of a modern data platform there should be defined data models or, at a minimum, governed data assets (or data products), regardless of the data quality (DQ) tooling or process used.
Smart DQ (also known as intelligent DQ or rule-based DQ) without management and governance can compromise the underlying input data. “One will always have various levels of data quality, and a lot of people will focus on rule-based or Smart DQ but forget about the governance and management that's needed,” Tuppen explains.
On top of that, it’s necessary to control the data being used by the AI models.
“Focusing on the integrity of the input data will mean that your insights from the applied AI will be much more trustworthy.”
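To make that concrete, a minimal Python sketch of rule-based DQ might look like the following. The rules, field names and owners here are invented for illustration; attaching an owner and a severity to each rule is one simple way to carry the governance metadata Tuppen says is often forgotten.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass(frozen=True)
class DQRule:
    name: str          # what the rule checks
    owner: str         # governance: who is accountable for the rule
    severity: str      # governance: how failures should be triaged
    check: Callable[[dict], bool]

# Hypothetical rules over an invented payments record shape.
RULES = [
    DQRule("amount_is_positive", "finance-data-team", "critical",
           lambda r: r.get("amount", 0) > 0),
    DQRule("currency_is_iso", "finance-data-team", "warning",
           lambda r: r.get("currency") in {"GBP", "EUR", "USD"}),
]

def run_dq(records: list[dict]) -> list[tuple[str, DQRule]]:
    """Return (record_id, failed_rule) pairs for downstream triage."""
    failures = []
    for record in records:
        for rule in RULES:
            if not rule.check(record):
                failures.append((record["id"], rule))
    return failures

if __name__ == "__main__":
    sample = [
        {"id": "tx-1", "amount": 120.0, "currency": "GBP"},
        {"id": "tx-2", "amount": -5.0, "currency": "XXX"},  # fails both rules
    ]
    for rec_id, rule in run_dq(sample):
        print(f"{rec_id}: failed {rule.name} "
              f"(owner={rule.owner}, severity={rule.severity})")
```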
As with architecture, Tuppen says that there is no single answer for how to store your data.
“If you want a playground or a sandbox environment where you can run generalised analytics against raw data, that's where you'd have, for instance, a data lake, which may not have been modelled.”
Tuppen advises that if a business wants slice-and-dice and drill-down reporting capabilities, then it will typically need a relational database.
“If you want an event-based architecture, then potentially you'd use something like NoSQL - it really depends on the business case and the client as to which architecture is best to use.”
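As a hypothetical illustration of those storage choices (the payments schema below is invented, not drawn from any GFT engagement), the same events can be modelled relationally for slice-and-dice reporting, or kept as flexible documents for event-based workloads:

```python
import json
import sqlite3

# Invented payment events used to contrast two storage patterns.
events = [
    {"region": "UK", "channel": "card", "amount": 120.0},
    {"region": "UK", "channel": "transfer", "amount": 80.0},
    {"region": "DE", "channel": "card", "amount": 50.0},
]

# Relational store: a modelled schema supports slice-and-dice reporting.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE payments (region TEXT, channel TEXT, amount REAL)")
db.executemany("INSERT INTO payments VALUES (:region, :channel, :amount)", events)

# Drill down: total amount per region and channel.
for row in db.execute(
    "SELECT region, channel, SUM(amount) FROM payments GROUP BY region, channel"
):
    print(row)

# Document/NoSQL style: the same event kept as a flexible JSON document,
# better suited to event-based architectures where the shape may vary.
print(json.dumps(events[0]))
```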
There is a range of challenges and solutions involved in the data integration process, and Tuppen has seen them all.
“Data integration, which is getting the original source data into the new platform, can follow a multitude of paths,” he says. “This is the Extract, Transform and Load (ETL) process. It really does depend on the architecture and whether transformation is required. Does the customer need streaming capabilities, or can they run a batch without considering that option?”
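A minimal batch ETL sketch in Python, with an invented source and schema, shows the three steps Tuppen names:

```python
import csv
import io
import sqlite3

# Hypothetical raw source data; the fields are invented for this sketch.
RAW_CSV = """id,amount,currency
tx-1, 120.00 ,gbp
tx-2, 80.50 ,eur
"""

def extract(raw: str) -> list[dict]:
    """Extract: read rows from the original source (here, an inline CSV)."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list[dict]) -> list[tuple]:
    """Transform: trim whitespace, cast types and normalise currency codes."""
    return [
        (row["id"], float(row["amount"].strip()), row["currency"].strip().upper())
        for row in rows
    ]

def load(rows: list[tuple], db: sqlite3.Connection) -> None:
    """Load: write the cleaned batch into the target platform."""
    db.execute(
        "CREATE TABLE IF NOT EXISTS payments (id TEXT, amount REAL, currency TEXT)"
    )
    db.executemany("INSERT INTO payments VALUES (?, ?, ?)", rows)

if __name__ == "__main__":
    db = sqlite3.connect(":memory:")
    load(transform(extract(RAW_CSV)), db)
    print(db.execute("SELECT * FROM payments").fetchall())
```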
Tuppen encourages businesses to manage their data carefully, as it is likely to explode in volume over time, with data being duplicated and replicated over and over across the business's entire environment.
“Again, this increases both cost and time,” he says. “Maybe having all that data is necessary and maybe you are able to store all that data. But if you expect a lot from the data over time, building a decomposable architecture with, for instance, a layer that has schema-bound APIs will give a certain degree of surety on the data as it moves down your pipeline.”
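One way to picture that schema-bound layer is the sketch below, assuming a hypothetical Payment schema: records can only move down the pipeline once they satisfy it.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Payment:
    """Hypothetical schema; its fields are invented for this sketch."""
    id: str
    amount: float
    currency: str

    def __post_init__(self):
        # The schema rejects malformed records at the boundary,
        # giving downstream consumers a degree of surety about the data.
        if not self.id:
            raise ValueError("id must be non-empty")
        if self.amount <= 0:
            raise ValueError("amount must be positive")
        if len(self.currency) != 3:
            raise ValueError("currency must be a 3-letter code")

def ingest(record: dict) -> Payment:
    """Boundary of the layer: untyped input in, schema-bound object out."""
    return Payment(**record)

if __name__ == "__main__":
    print(ingest({"id": "tx-1", "amount": 120.0, "currency": "GBP"}))
    try:
        ingest({"id": "tx-2", "amount": -5.0, "currency": "GBP"})
    except ValueError as err:
        print(f"rejected at the schema boundary: {err}")
```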
GFT addresses data governance and compliance issues, especially in industries with strict regulatory requirements.
“If you consider each layer of a modern data platform, you will inherently need to adhere to regulatory compliance,” Tuppen explains. “For example, look at BCBS 239 and consider Principles 1 and 4.”
Principle 1: Aggregated data and reporting need to have governance.
Principle 4: Accuracy and integrity of data.
“If you have a data governance model in place and data management in place across your pipeline, you will be following Principle 1 and Principle 4, which includes accuracy and integrity of data,” Tuppen advises.
When an organisation moves its data into defined data models, this inherently forces integrity on the data and helps ensure its accuracy.
He continues: “So as soon as you start building that layered approach in your architecture, you will start becoming regulatory compliant inherently, rather than having to build it from scratch.”
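The sketch below illustrates that layered approach under assumed layer names (raw, curated, reporting; a common convention, not GFT's). Each promotion runs integrity checks, in the spirit of the accuracy-and-integrity principle above, and leaves an auditable lineage trail, in the spirit of the governance principle.

```python
from dataclasses import dataclass, field

@dataclass
class Record:
    payload: dict
    lineage: list[str] = field(default_factory=list)

def promote(record: Record, layer: str, checks: list) -> Record:
    """Move a record into the next layer only if its checks pass."""
    for check in checks:
        if not check(record.payload):
            raise ValueError(f"{layer}: integrity check failed")
    record.lineage.append(layer)  # governance: auditable trail of each hop
    return record

if __name__ == "__main__":
    # The payload fields and checks are invented for this sketch.
    rec = Record({"id": "tx-1", "amount": 120.0})
    promote(rec, "raw", checks=[lambda p: "id" in p])
    promote(rec, "curated", checks=[lambda p: p["amount"] > 0])
    promote(rec, "reporting", checks=[])
    print(rec.lineage)  # ['raw', 'curated', 'reporting']
```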
Data infrastructure has had to scale to handle increasing data volumes. Tuppen saw many companies ‘lift and shift’ their data into the cloud a few years ago, which at the time was not unusual.
“Many businesses originally lifted and shifted all of their data into the cloud, with the ambition of addressing existing challenges later down the line,” Tuppen says. “The challenge (and benefit) with that is that the cloud scales very easily, so you don't have to consider the ever-growing infrastructure behind it.”
As such, GFT has seen data explode over time, which means that data processing time and costs have now multiplied.
“I frequently talk about the data strategy, but having your data strategy up front, which means designing a targeted solution (including archiving and deletion of data, data modelling and sandboxing), is crucial to cover all your bases,” he explains.
With this approach, businesses can process their data faster, producing input data more quickly and allowing them to run AI solutions more efficiently.
GFT's strategic technology alliances and a focus on Gen AI
GFT has a multitude of technology partnerships, including major cloud providers and many independent software vendors (ISVs).
“We are in fact building accelerators with our ISV partners on the cloud,” says Tuppen. “From building large language models (LLMs) and Gen AI accelerators on GCP, through to streaming capabilities on AWS, we spend a lot of time carefully selecting our partners, based on synergies between their capabilities and our clients’ requirements.”
For instance, MongoDB is a leader in the data technology space, and GFT has used its technology for a number of clients.
“We often utilise the streaming capabilities of Confluent, where we advise on the implementation of the technology for clients who are looking for enterprise-grade, trusted real-time processing of data.”
Over the next 12 months, Tuppen anticipates a huge focus on AI and data.
“GFT is investing heavily in its AI.DA marketplace, which includes industry-specific use case libraries, AI and data journeys, blueprint solutions and accelerators, through to modern data platform design patterns, all accessible through the AI.DA marketplace,” Tuppen shares. “We are going to enable our customers to accelerate their AI journey even faster.”
As for future trends in AI and data platforms, Tuppen sees Gen AI as the current hot topic in everyone's search history.
“People want to know - how can we integrate Gen AI with our existing landscape? How can we use LLMs and AI? The reality is that unless you have a trusted data platform in place, this is always going to be a difficult challenge.”
AI is fed by data, so users must have a trusted data source in place.
“If you look at data, the industry is moving towards a democratised, domain-led architecture,” Tuppen suggests. “So, isolating each functional domain into an end-to-end data product is becoming the go-to standard.”
Meanwhile, GFT’s data science team is seeing increased uptake of smaller task-oriented language models.
“ModelOps for governance is a trend that we're beginning to see,” he says. “We are actually seeing prompt engineering becoming less of a trend, due to optimisation frameworks already being built.”
At GFT, the focus continues to be on AI and data: how to accelerate AI solutions and the benefits these bring, whilst always keeping an eye on the ever-evolving data and AI requirements of the future.