
Data and analytics in financial services — a challenge or an opportunity?

Since starting a new role at Teradata with the responsibility of supporting our financial services sales and consulting teams across the globe, I’ve decided now is a good time to document some thoughts on the challenges and opportunities before us. Can a coherent vision and agenda for analytics truly transform capabilities, particularly in legacy financial services environments? Read on and you’ll find out.

To set the scene, what are the major pain points for financial services? That depends on your perspective, but for legacy institutions there are a raft of challenges. Simply listing them is not a hugely value-adding exercise; however, the General Data Protection Regulation (GDPR), the revised Payment Services Directive (PSD2), the requirements for API-driven “open banking”, challengers from the fintech sector, changing customer expectations, and operational risk are just some of the areas likely to keep financial services professionals awake at night. Nor is it all sunshine and roses for the newer players: Atom Bank, for example, recently delayed the launch of a current account in the UK market until possibly 2019, citing the impact of impending regulation as the primary reason.

Atom cites GDPR, PSD2, and the UK-specific Financial Conduct Authority (FCA) review of the current account market as the primary causes for concern. It is perhaps ironic that a startup finds itself grappling with concerns around PSD2, probably for very similar reasons to a legacy organization: the economics of running a current account product in a disintermediated and highly competitive environment may transform the market considerably.

My first data project

Interestingly, the economics of the financial services market as a whole is one of the key challenges and opportunities where intelligent and innovative use of analytics in the broadest sense is likely to make a difference. One of my earliest data projects involved the delivery of a customer value project in a large UK retail bank. Unsurprisingly, when this project was completed, it showed that approximately 10 percent of the customer base were extremely high value, whilst approximately 10 percent of the customer base lost money for the bank. The remaining 80 percent were pretty much identical in terms of value generated.
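For flavour, the core of that value exercise is easy to sketch today. The data below is synthetic and the column names are invented, so this shows the shape of the analysis rather than the bank’s actual method:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(42)
n = 10_000

# Synthetic stand-in for per-customer annual revenue and cost-to-serve.
customers = pd.DataFrame({
    "customer_id": np.arange(n),
    "revenue": rng.lognormal(mean=5.0, sigma=1.0, size=n),
    "cost_to_serve": rng.lognormal(mean=4.8, sigma=0.5, size=n),
})
customers["value"] = customers["revenue"] - customers["cost_to_serve"]

# Rank customers into ten equal-sized value bands (decile 9 = most valuable).
customers["decile"] = pd.qcut(customers["value"], 10, labels=False)

# Value contributed by each band: typically a small top band dominates
# and a small bottom band loses money, as described above.
summary = customers.groupby("decile")["value"].agg(["count", "mean", "sum"])
summary["pct_of_total"] = 100 * summary["sum"] / customers["value"].sum()
print(summary.round(2))
```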

These insights may have been a ‘gut feel’ for management previously, but having real data to prove the intuition is incredibly powerful. A whole series of strategic discussions opens up around understanding customer value and how to manage customer relationships once you have data-informed insights. The questions ranged from whether customer value should be displayed to branch staff (and how), to whether more valuable customers should be prioritized for call handling, and, most interestingly, to whether current value was a reliable predictor of future value.

Over time, this data project led to a decision to consider what I would describe as a ‘proper’ optimization solution for the bank. This was an early attempt to take all the known information about individual customers, as articulated through a series of model scores (value, product propensity, channel propensity, cross-sell propensity, risk and fraud scores, etc.), to support a ‘next best offer’ approach. That doesn’t sound particularly innovative or surprising, but this was more than 15 years ago. The innovative component was to match all customer-level information against a set of marketing goals (product value, product volume, customer value, customer volume) and marketing constraints (budget, channel capacity, available product volume), and to create a series of scenarios from which business managers could choose, for example, whether to campaign with a target of increasing sales volume or a target of increasing sales value. Again, perhaps unsurprisingly, selecting a group of customers for a volume-oriented campaign produces a very different selection from selecting for a value increase.
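The scenario mechanics translate naturally into a constrained optimization. Below is a minimal sketch of the general technique, not a reconstruction of the system we built: the scores, costs, and constraints are all invented, and a linear-programming relaxation stands in for what was, in practice, a much richer formulation.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n = 5_000  # customers

# Invented model scores of the kind described above.
propensity = rng.beta(2, 8, size=n)        # P(response to the campaign)
value = rng.lognormal(4.0, 0.8, size=n)    # expected value if they respond
cost = np.full(n, 0.45)                    # contact cost per customer

budget = 1_200.0    # marketing budget constraint
capacity = 2_000    # channel capacity constraint (max contacts)

def select(objective):
    """LP relaxation: choose contact weights x in [0, 1] that maximise
    the objective subject to budget and channel-capacity constraints."""
    res = linprog(
        c=-objective,                        # linprog minimises, so negate
        A_ub=np.vstack([cost, np.ones(n)]),  # spend <= budget; contacts <= capacity
        b_ub=np.array([budget, capacity]),
        bounds=[(0.0, 1.0)] * n,
        method="highs",
    )
    return res.x > 0.5                       # treat near-1 weights as selected

volume_pick = select(propensity)           # goal: maximise expected responses
value_pick = select(propensity * value)    # goal: maximise expected value
print("picked by both goals:   ", int(np.sum(volume_pick & value_pick)))
print("picked by only one goal:", int(np.sum(volume_pick ^ value_pick)))
```

Running the two objectives side by side makes the trade-off concrete: the volume goal favours high-propensity customers regardless of their worth, whilst the value goal happily spends contacts on lower-propensity, higher-value customers.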

The point of this example is this: Because of advances in technology and the commoditization of what were, at the time, highly advanced analytics techniques, we should perhaps set our sights higher when we consider the opportunities for optimization in financial services. Whilst optimizing at the level of an individual marketing campaign can deliver significant incremental benefits, would attempting to optimize at a profit-and-loss or balance sheet level now be unreasonable or unrealistic? Given current analytical capabilities, possibly not.

Impediments to change

Now clearly, there are a significant number of immovable constraints, many of them regulatory, that would obstruct a purist theoretical approach to optimization at an enterprise level. But the point is that for many large financial services organisations, particularly in the legacy segment, the challenge is usually not identifying and solving individual analytics use cases. It is the ability, and the economics, of implementing the organisational change needed to deliver the benefits that analytics identifies.

I remember, back in the late ’80s when I was an enthusiastic and fresh-faced analyst programmer in a large UK bank, a discussion about a change project proposed in my area of the business. It involved a change to a core operational banking system, and merely estimating the effort for the proposed change was itself a 400 man-day project. That was the standard work effort for estimating any change to that particular system: as a core operational system, all changes, regardless of complexity, required forensic analysis to ensure the integrity and continued operational robustness of the system.

Eventually this became one of my reasons to leave financial services. Despite being part of a team that was generally very successful in generating new ideas and insight, despite identifying significant business cases — sometimes with projected benefits running into the tens of millions of pounds — we hardly ever saw ideas translate into production. We couldn’t change the business through insight and analytics, because it was too expensive to change the business. It was too risky.

The case for developing a business vision based on data and analytics is founded on the belief that analytics can transform the business, and that it must be treated as one of the key drivers of that transformation. The organization has to believe that it has data at its heart, and that failure to maintain a constant agenda of development and innovation is likely to result in the eventual decline of the business.


Strategic vision

That said, what are the barriers to organisations developing, implementing, and benefiting from a strategic vision for analytics? From a theoretical perspective, very few. Compute capability is, by definition, at an all-time high, and whilst Moore’s Law may no longer hold indefinitely, advances in areas other than raw CPU capability continue to extend the art of the possible. Taking Teradata’s own case, 10 years ago you could buy one or more cabinets of Teradata to put on the floor of a data centre, and a ‘virtual machine edition’ existed, limited to 1 terabyte, for research purposes. Today, you can run the same Teradata software on an on-premises cabinet, on commodity hardware, on a virtual machine, in the AWS cloud, or in the Azure cloud. Similarly, the realization that GPUs are ideally suited to analytical workloads opens up huge opportunities for implementing complex analytics at scale.

So if the technology is not a limiting factor, what is? Well, I would contend that the major limiting factor for many organisations is an inability to get the basics right. Since starting in the world of consulting seven years ago, I have been amazed at the number of organisations I’ve seen that have yet to master the arcane and complex world of business intelligence (BI), more commonly known as ‘counting things’. Symptoms include an inability to trace data lineage, a lack of data ownership, absent data quality and governance processes, missing documentation, data being changed without communication, and changes being implemented without considering data impacts. The list goes on.
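Much of this unglamorous groundwork can at least be mechanised. As a minimal sketch (the extract and column names are invented), a routine profiling pass catches several of the symptoms above before they propagate:

```python
import pandas as pd

def profile(df: pd.DataFrame, key: str) -> pd.DataFrame:
    """Null rates, distinct counts and dtypes per column, plus a
    duplicate-key check: the unglamorous 'counting things' layer."""
    dupes = int(df[key].duplicated().sum())
    print(f"{len(df)} rows; {dupes} duplicated values of key {key!r}")
    return pd.DataFrame({
        "null_rate": df.isna().mean().round(3),
        "distinct": df.nunique(),
        "dtype": df.dtypes.astype(str),
    })

# Invented customer extract with deliberately dirty rows.
customers = pd.DataFrame({
    "customer_id": [1, 2, 2, 4],
    "segment": ["retail", None, "retail", "business"],
    "balance": [1200.0, 85.5, 85.5, None],
})
print(profile(customers, key="customer_id"))
```

None of this is sophisticated, which is rather the point: run it on every feed, on every load, and give someone ownership of the results.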

In one large bank I worked with recently, there was no production view of which customers call the call centre. In another bank, the process for investigating the root cause of customer complaints consists of a ‘complaints analyst’ reading a portfolio of approximately 300 complaints to identify common factors. If only there were a way to automate such a process!
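There is, of course. Here is a minimal sketch of one well-worn approach, assuming the complaints exist as plain text (the example complaints are invented): vectorise the documents with TF-IDF, cluster them, and read off each cluster’s dominant terms.

```python
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer

# Invented examples standing in for the ~300-complaint portfolio.
complaints = [
    "charged twice for the same card transaction",
    "card payment taken twice, still waiting for refund",
    "branch closed early and the queue was very long",
    "waited forty minutes in the branch queue",
    "mobile app logs me out every time I open it",
    "app crashes on login since the last update",
]

# Vectorise the free text and group similar complaints together.
vectoriser = TfidfVectorizer(stop_words="english")
X = vectoriser.fit_transform(complaints)
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)

# For each cluster, print the terms nearest its centroid: the
# 'common factors' the analyst reads 300 documents to find.
terms = vectoriser.get_feature_names_out()
for i, centroid in enumerate(kmeans.cluster_centers_):
    top = centroid.argsort()[::-1][:3]
    print(f"cluster {i}: {', '.join(terms[j] for j in top)}")
```

On a real portfolio this will not replace the analyst’s judgement, but it turns ‘read everything’ into ‘name the clusters and read the outliers’.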

So against this depressing backdrop, how are financial services organisations dealing with (and, more importantly, benefiting from) the rise of AI? Well, if you subscribe to a few of the popular market information sources such as Finextra, Computing, or The Banker, you could be forgiven for thinking everything is rosy. A day doesn’t pass without some reference to a financial services institution implementing AI, bots, or advanced analytics to improve some process or other. From a customer perspective, though, the experience remains largely unchanged. If the ultimate outcome is to shave a percent or two off costs, then as a customer I’m probably not interested.

Perhaps of more concern in the long run are the implications of sophisticated analytical processes being built on shaky foundations. As the rise of advanced techniques continues, legislators have (almost) managed to keep up with the pace of change. For example, the imminent GDPR in the EU states that:

“Individuals have the right not to be subject to a decision when:

  • it is based on automated processing and
  • it produces a legal effect or a similarly significant effect on the individual.

You must ensure that individuals are able to:

  • obtain human intervention,
  • express their point of view, and
  • obtain an explanation of the decision and challenge it.”

There are (thankfully) exemptions to this requirement, notably if the process:

  • is necessary for entering into or performance of a contract between you and the individual,
  • is authorised by law (e.g., for the purposes of fraud or tax evasion prevention), or
  • is based on explicit consent.
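In engineering terms, this is a routing rule in the decision pipeline. The sketch below is a deliberately simplified reading of the provisions quoted above (the GDPR Article 22 rule), with invented field names; it is an illustration, not legal advice:

```python
from dataclasses import dataclass

@dataclass
class Decision:
    automated: bool            # produced with no meaningful human involvement
    significant_effect: bool   # legal or similarly significant effect
    contract_necessity: bool   # exemption: necessary for a contract
    legally_authorised: bool   # exemption: authorised by law
    explicit_consent: bool     # exemption: explicit consent obtained

def requires_human_review(d: Decision) -> bool:
    """Queue the decision for human intervention when the rule applies
    and none of the listed exemptions do. Simplified: even exempt
    decisions must still offer safeguards on request."""
    in_scope = d.automated and d.significant_effect
    exempt = d.contract_necessity or d.legally_authorised or d.explicit_consent
    return in_scope and not exempt

# An automated credit decline with no applicable exemption must be reviewable.
loan = Decision(automated=True, significant_effect=True,
                contract_necessity=False, legally_authorised=False,
                explicit_consent=False)
print(requires_human_review(loan))  # True
```

The hard part, of course, is not the gate itself but being able to reconstruct and explain the decision behind it, which is precisely where shaky data foundations bite.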

However, if we step back and think about the process of using AI to provide customer service, it relies on a foundation of robust data. Take this recent clipping from Finextra, for example:

“Nova has been introduced initially to answer basic queries and refer customers onto human colleagues for more complex questions. Over time, however, the bot is expected to become more intuitive as it learns on the job, building more knowledge about how to interpret and respond to customers’ queries.

Ultimately, it is expected that Nova will be able to guide customers to a secure environment with log-in, enabling them to perform tasks and/or obtain advice there.”

Implications for financial services

The implications of my story so far are twofold. To reap the benefits of advanced analytics, banks first need to dramatically improve end-to-end data governance. Second, they need to formulate an analytics vision that provides a strong and stable data framework, one within which all the requirements of regulation and prudent operation can be demonstrably met (i.e., with auditable evidence of compliance), whilst still providing an agile environment for advanced analytics experimentation and implementation.

Given the tone of my thoughts so far, is this feasible? Actually, yes, I think it is. However, it requires different thinking and much higher priority, especially within legacy banks. Building an infrastructure and culture that supports rapid experimentation with new data sources is relatively straightforward (debates about appropriate technologies aside). If banks can overcome the operationalization challenges, the foundations are there for rapid exploitation of new analytical techniques, providing greater insight into the individual needs of customers and supporting much more personalized and targeted interactions with them.

What is interesting is that even with some of the challenges outlined above, legacy banks are making significant improvements in their ability to engage effectively with customers. One UK retail bank that has significantly transformed its approach to customers over the past 10 years has moved completely away from the concept of sales and is now entirely focused on meeting customer needs. Gone are the days (admittedly in the ’80s) when banks had sales staff and area sales managers, and given the huge costs of mis-selling scandals and the accompanying public and regulatory backlash, that can only be a good thing.


About the author

Mark Perrett

Mark Perrett is Head of Financial Services Consulting for Teradata International. Mark is an experienced practitioner in all areas of CRM development, implementation, and management, including customer value measurement, multi-channel direct marketing, event-based marketing, and advanced data analytics. He has extensive business consultancy experience covering CRM, customer insight, analytics, digital marketing, big data, and social media/social intelligence across industry sectors.

Mark has 33 years’ experience in IT and analytics-related roles, has a passion for driving business value by working cleverly with data, and strives to be a trusted advisor.

In addition to being a member of the Editorial Board of the Journal of Direct, Data and Digital Marketing Practice, Mark has deep knowledge of the financial services and utilities sectors and has led high-performing analytics and marketing teams.

Mark holds a BSc (Hons) in Psychology from Lancaster University and an MBA from Henley Management College, and enjoys travelling and photography in his spare time.

