Fast Data
“I want it now!”

Fast Data “I want it now!”

In 1989, Queen released a very successful single called “I Want It All”. The opening bars of the song repeat the title (twice) before the lyrics change subtly to “and I want it now!” This could be a battle cry for today’s fast-moving society.

We live in a permanent state of impatience. We demand instant gratification and are intolerant of any obstacle in this headlong – somewhat hedonistic – pursuit of speed.

Keen to raise their game and their understanding of consumers, companies have responded by collecting huge quantities of customer information. The minutiae of what we buy, when we buy, how we buy and where we buy are recorded with meticulous care. But here’s the irony: this data delivers precious little value…

The problem is that data lakes are overflowing with unstructured, historical information. In a world that’s engaged in real-time conversations with companies – via mobile apps, PCs or even the trusty landline – we need to capture consumer insights and act upon them in real time. Companies that rely upon history will quickly become history.

So why, you may ask, have companies been so slow to grasp this reality? Well, here’s the blunt truth: it’s beyond the ability of most company systems. They are simply incapable of acting upon data collected in real-time. This is a far from trivial task and often involves a lot of tinkering with fragile legacy systems.

Fortunately, though, there is a solution in sight. It’s called Fast Data.

Fast Data is the application of big data analytics to smaller data sets. Working in near-real or in actual real-time, it allows companies to solve a problem or create business value whilst engaging with a customer or another computer. Admittedly, this is not a new idea but embracing Fast Data is becoming hugely important as we move ever closer to Edge Computing (indeed, my colleague Andrew Simmonds has just written a fascinating article on this very subject).

A Fast Data Architecture
What high-level requirements must a Fast Data architecture satisfy? They form a triad:

  1. Reliable data ingestion.
  2. Flexible storage and query options.
  3. Sophisticated analytics tools.

The components that meet these requirements must also be ‘reactive’ – displaying four key qualities:

First, they must be elastically and instantly scalable – adaptable enough to meet the inevitable rises and falls in demand.

Second, they must be resilient and remain unaffected by the equally inevitable failures that happen in large distributed systems (for example, any failure in an autonomous car could be catastrophic so the Fast Data component must be utterly resilient and reliable).

Third, they must always respond to service requests even if failures limit the ability to deliver services.

Fourth, they must be driven by messages and events from the world around them.

The following chart shows an emerging architecture that can meet these requirements.
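The four reactive qualities can be illustrated with a minimal sketch. Here an in-memory queue stands in for a durable ingestion log (such as Kafka), a dictionary for the storage layer, and a simple handler for the analytics layer – all names and components are illustrative, not a prescription for any particular product:

```python
import queue
import threading

events = queue.Queue()   # 1. reliable ingestion (in-memory stand-in)
store = {}               # 2. storage/query layer (stand-in)

def analytics(event):
    # 3. analytics: react to each message as it arrives
    store[event["customer"]] = event["amount"]

def consumer(stop):
    # Message-driven: the component does nothing until an event arrives,
    # and keeps responding even when individual events fail.
    while not stop.is_set():
        try:
            event = events.get(timeout=0.1)
        except queue.Empty:
            continue
        try:
            analytics(event)
        except Exception:
            pass  # resilience: one bad event must not kill the stream
        events.task_done()

stop = threading.Event()
worker = threading.Thread(target=consumer, args=(stop,))
worker.start()
events.put({"customer": "C1", "amount": 120.0})
events.join()   # wait until the event has been processed
stop.set()
worker.join()
print(store)    # {'C1': 120.0}
```

In a production system the queue would be a replicated, persistent log so that ingestion survives node failures – the point here is simply the shape of a message-driven, resilient, always-responsive component.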

And the really good news is that you can graft this kind of architecture on top of an unstable legacy system. And that’s not my hypothetical assertion, it’s a reality that ING bank has very successfully achieved…

Unlocking valuable intelligence

Back in the halcyon days, banks were very close to their customers. They knew them intimately and treated them personally. With the proliferation of customers, products and channels, though, this intimacy has been lost. ING wanted to recapture the ‘golden era’ with a global strategy to make the bank more customer focused, ‘mobile first’ and altogether more helpful and proactive.

A typical bank these days captures and processes billions of customer requests, instructions and transactions. In doing so, they capture and store vast amounts of customer data – but, and here’s the startling truth, few (if any) of the major banks use this data effectively for the benefit of their customers.

Bas Geerdink of ING was hired to address this problem. His broad international remit is to create a truly customer-friendly, omni-channel experience. And to kick-start this process, he turned his attention to their vast but disparate data stores as he was convinced they could unlock valuable intelligence. Historical data can often reveal customer behaviours and trends that are crucial to predictive analytics. For example, past data can be used to plot future pressure points on personal finances – e.g. key payment events can be anticipated and mitigated with predictive analytics.
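As a toy illustration of that idea – anticipating a payment pressure point from historical data – the sketch below projects the next due date of a roughly monthly payment and flags it when the projected balance cannot cover it. The dates, amounts and logic are invented for illustration and bear no relation to ING’s actual models:

```python
from datetime import date, timedelta

def predict_next(payment_dates):
    # Average the gaps between past occurrences and project forward.
    gaps = [(b - a).days for a, b in zip(payment_dates, payment_dates[1:])]
    avg_gap = sum(gaps) // len(gaps)
    return payment_dates[-1] + timedelta(days=avg_gap)

def pressure_point(payment_dates, amount, projected_balance):
    # Flag the predicted payment date if the balance won't cover it.
    nxt = predict_next(payment_dates)
    return nxt, projected_balance < amount

dates = [date(2023, 1, 1), date(2023, 2, 1), date(2023, 3, 1)]
nxt, at_risk = pressure_point(dates, amount=250.0, projected_balance=180.0)
print(nxt, at_risk)  # 2023-03-30 True
```

A real predictive model would of course weigh far richer features, but even this crude projection shows how past transactions can be turned into a forward-looking alert.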

However, mining this data presents major challenges. Most banks are hampered by disparate and disconnected legacy applications that cannot operate in real time. Confronted with this dysfunctional problem, ING made some fundamental decisions:

  1. Create a single, secure data lake.
  2. Employ a variety of open source technologies (along the lines of those shown in the previous chart). These technologies were used to build the over-arching notifications platform to enable data to be captured and acted upon in real-time.
  3. Work with the legacy application teams to ensure that critical events (during a customer’s ‘moment of truth’) are notified to this Fast Data platform.
  4. Trigger two vital platform responses:
    1. Instantly contact the customer to establish whether help is urgently needed (for example, to complete a rapid loan application).
    2. Run predictive analytics to decide whether the customer needs to be alerted.
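The two platform responses in step 4 can be sketched as a single event handler. All function and field names below are illustrative assumptions, not ING’s actual API – the predictive rule is deliberately trivial:

```python
def contact_customer(event):
    # 4a: instantly reach out, e.g. to help complete a loan application
    return f"Contacting {event['customer']} about {event['kind']}"

def needs_alert(event, history):
    # 4b: a toy "predictive analytics" rule – alert if the pending amount
    # exceeds three times the customer's average past transaction
    past = history.get(event["customer"], [])
    if not past:
        return False
    return event["amount"] > 3 * (sum(past) / len(past))

def on_critical_event(event, history):
    # A critical event (a customer's 'moment of truth') triggers both
    # responses: immediate contact, plus an alert if the model says so.
    actions = [contact_customer(event)]
    if needs_alert(event, history):
        actions.append(f"Alerting {event['customer']}")
    return actions

history = {"C1": [100.0, 120.0, 80.0]}
event = {"customer": "C1", "kind": "loan application", "amount": 500.0}
print(on_critical_event(event, history))
```

The essential design point is that the legacy systems only need to emit the event; deciding how to respond lives entirely in the Fast Data platform.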

The future role of banks

Partly in response to the Open Banking directive, the bank is now opening up its data to third parties who have been authorised by customers to process certain transactions on their behalf (e.g. paying bills). This is a fascinating development with potentially far-reaching implications. It raises a question about the future role of banks. For example, would the rise of nimble, tech-driven third parties reduce banks to mere processing utilities?

ING are determined not to be marginalised, which is why they have invested in this Fast Data platform and are building real-time predictive apps – both on their own and with third parties (such as Yolt). It is a bold and very radical strategy – and, not surprisingly, it raises some searching questions…

What types of customer would most welcome this kind of service and is there any risk of alienating less technology-literate customers?

The bank doesn’t yet have definitive answers to these questions. However, ING is adamant that all technology-driven initiatives must have universal appeal and that is why they are introducing change on a very gradual, phased basis…

In the first instance, they are testing these services on employees of the bank and then on ‘beta’ test groups of (external) customers. To date, feedback has been extremely positive and this has encouraged the bank to keep investing. However, Bas did emphasise the need to appreciate customer sensitivities and preferences. For example, there is a fine line between providing a valued service and becoming intrusive – that is why the bank specifically considers factors such as the best, most receptive time of day to make interventions (if at all).

Fraud detection is another intriguing development where Fast Data is having a significant impact. At the moment, traditional fraud detection systems often lack finesse. When a credit card transaction is flagged as suspicious, it turns out to be a ‘false positive’ 90% (or even more) of the time. This can be inconvenient – not to mention highly embarrassing – both for the bank and especially for the customer. ING are hopeful that their Fast Data platform will radically reduce the level of false positives as well as the level of fraud.
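One way a real-time platform can cut false positives is by scoring each transaction against the customer’s own recent history, rather than applying a blanket rule. The sketch below contrasts the two approaches; the thresholds and fields are invented for illustration and are not ING’s actual rules:

```python
def blanket_rule(txn):
    # Crude legacy-style rule: flag every transaction abroad.
    return txn["country"] != "GB"

def contextual_rule(txn, history):
    # Contextual rule: only flag if the country is new to this customer
    # AND the amount is unusually large for them.
    seen = {t["country"] for t in history}
    amounts = [t["amount"] for t in history]
    typical = sum(amounts) / len(amounts)
    return txn["country"] not in seen and txn["amount"] > 2 * typical

history = [
    {"country": "GB", "amount": 40.0},
    {"country": "FR", "amount": 60.0},
    {"country": "FR", "amount": 55.0},
]
txn = {"country": "FR", "amount": 70.0}   # a regular trip to France
print(blanket_rule(txn), contextual_rule(txn, history))  # True False
```

The blanket rule would embarrass this customer on every trip to France; the contextual rule, fed by fast access to their history, lets the transaction through – the kind of finesse the text describes.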

Other applications of Fast Data

I am also aware that Capital One have deployed a Fast Data service and are now able to authorise a car loan in seconds – instant on-the-line confirmation that massively improves the customer experience.

In the police and military worlds – where data from a variety of disparate sources has to be captured and actioned in real time and often in hazardous situations – there are obvious applications. For example, Fast Data analytics can be used to predict when supplies of critical ammunition need to be replenished and to trigger immediate air-drops to front-line troops.

It’s all very clever stuff. But, unfortunately, not all scenarios are quite so quick-witted. We also know of instances where data is anything but fast…

Take the Lloyd’s insurance market for example. Currently, full risk assessments are completed two weeks after prices have been quoted and often accepted – a fortnight in which things can go expensively wrong. Quite clearly, this is a risk too far!

So, for the ‘I want it all – and I want it now!’ generation, the promise of instant gratification is becoming ever more real-time. Fast Data is delivering on Freddie Mercury’s enduring anthem.

© 2023 Clustre, The Innovation Brokers All rights reserved.