Are Chatbots the future interface for global business?

January 15th Clustre Breakfast Briefing – Executive Summary

Chatbots are positively transforming the way HSBC supports its people and customers. We were therefore delighted to welcome Gareth Butler, Senior Programme Manager / Director, Global Risk Transformation and Innovation at HSBC, as the guest speaker for our first Breakfast Briefing of 2020.

Gareth brings more than 25 years’ experience in customer-facing roles to the management of risk transformation and innovation. His remit covers a broad spectrum, including the application of AI, chatbots, data analysis, machine learning and behavioural analysis to the operational roles of the people in HSBC’s Risk organisation. 

Gareth shared HSBC’s experience of harnessing chatbots to enable colleagues around the globe to get quicker answers to key policy questions and allow policy experts to work on higher value tasks. He also explained how HSBC Risk Transformation has fostered a culture of experimentation, where value is measured in terms of a wide range of benefits.  

HSBC has around 200 petabytes of data, 235,000 employees in 39,000 offices, 40 million customers, 90,000 servers and data centres in 21 countries. It also has over 200 Chatbots in development or production, running on a variety of infrastructures from on-premise architecture through to cloud-based solutions, in both staff-facing and customer-facing capacities.

HSBC operates in a broad range of regulated markets, and this scale and complexity brings challenges but also opportunities, creating a rich seedbed for experimentation and innovation.

Developing conversational interfaces for HSBC’s risk function

Conversational interfaces are not new to the market; they have been around, in one form or another, for many years, and they’re here to stay: from early MIT-based bots in the 1960s to the present day, with Alexa operating on more than 100 million devices and Siri on more than 500 million. Market commentators such as Gartner have previously predicted that in 2020 the average person may speak more with chatbots than with their spouse or partner, and others have predicted that chatbots will become a primary mechanism for interfacing with customers across a wide range of industries.

Robo-advice is establishing itself as one of the leading disruptive technologies in the business sector and there are several factors driving this trend:

  • Humans can be naturally inconsistent in the answers they give to similar questions – different people may hold different opinions, or the same question may be asked in different ways. This can be more prevalent when there are lots of people who may ask or answer the same question, or where the answer to the question is complex or open to wide interpretation.
  • Chatbots provide consistent advice and can be trained in a controlled fashion to get smarter over time. They never sleep and operate 24/7/365 
  • Chatbots also generate business intelligence and analytics which can be used to take decisions, for example by highlighting what it is people are asking about, and where, and enabling the root cause underlying the question to be addressed more directly (thereby reducing the need for the question to be asked in the first place) 
  • And, of course, Chatbots can allow for the reprioritisation of work by reducing the workload of subject matter experts, freeing them up to work on more complicated and higher value problems and opportunities. 

When considering the merits of a use-case for a Chatbot initiative, it’s key to narrow down the broad range of possible use-cases to those which offer the greatest potential benefits. A sophisticated cloud architecture design for conversational AI needs to be supported by a robust business case. For instance, targeting areas which receive tens of thousands of repeat queries every year may be more appropriate for the deployment of a complex architectural solution than a use-case with fewer than 1,000 queries per year.

A variety of architectures and products are used within HSBC Global Risk for Chatbot Technology, including AWS and Google (on the Google Cloud Platform – GCP). Through Google, Gareth was introduced to Filament, who are experts in applied AI. They employ a team of 30+ data scientists and machine learning engineers who deliver scalable AI solutions for enterprise clients. 

Within Global Risk at HSBC there is an established Innovation platform which encourages ideation, experimentation through proofs of concept, and the learning of new technologies. There is also a network of cloud sandbox and development environments within which to progress ideas from experiment to production. These environments have been put to good use to support the evolution of conversational AI solutions.

Several of these experiments focused on the use of chatbots to ingest internal policies and quickly surface pertinent aspects of them to employees, in response to specific questions posed through the Chatbot user interface. Gareth’s team worked closely with internal and external data science and DevOps teams to get the best out of the tools available and tune their algorithms to deliver results in the most effective way.

For example, rewriting a document to enable a Chatbot to work with it, or simply having the Chatbot return a large section of a policy and ask the user to self-serve in response to a question, is not particularly helpful if the employee then has to trawl through the policy to find the specific answer to their specific question. So, HSBC developed a solution which ingested raw policy data and enabled the Chatbot to not just return the policy, but to draw out the actual clause relating to the specific question. 
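As an illustration of that clause-level retrieval idea (not HSBC’s actual implementation), a minimal sketch might score each clause of a policy against the question and return only the closest match. The clause texts, the question and the use of scikit-learn’s TF-IDF similarity below are assumptions made purely for the example:

```python
# Hypothetical sketch: return the single most relevant policy clause for a
# question, rather than the whole policy document. The clauses and question
# are invented examples; similarity is plain TF-IDF via scikit-learn.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

clauses = [
    "4.1 All suppliers must be screened against the sanctions list before onboarding.",
    "4.2 Screening results must be retained for a minimum of seven years.",
    "5.3 Exceptions require written approval from the regional risk officer.",
]

def best_clause(question: str) -> str:
    """Score every clause against the question and return the closest match."""
    vectoriser = TfidfVectorizer(stop_words="english")
    matrix = vectoriser.fit_transform(clauses + [question])
    scores = cosine_similarity(matrix[-1], matrix[:-1]).flatten()
    return clauses[scores.argmax()]

print(best_clause("How long do I need to keep supplier screening records?"))
# -> "4.2 Screening results must be retained for a minimum of seven years."
```

In practice the answer would be drawn from the full policy corpus and tuned with the data science teams mentioned above, but the principle is the same: respond with the clause, not the document.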

Selecting good candidate use-cases for Chatbot implementation

In selecting suitable use-cases for Chatbot implementation, the first thing Gareth focused on was looking for areas that generated relatively high volumes of enquiries and were going to do so for the foreseeable future (there’s no point in going to a lot of effort to develop a Chatbot to answer questions on a policy or procedure that is shortly to be replaced or potentially eliminated). 

The second question was whether the use-case would fix an actual problem and deliver some real benefit (e.g. capacity savings and value-added service). The ROI of the project needs to support the effort involved to create the architectural solution and to secure the necessary approvals from cyber security and data teams to host data on the external cloud. 

Other considerations, such as the classification of the data and the firm’s risk appetite, need to be weighed in the context of cloud hosting, in addition to the presence of any sensitive information. For instance, equipping a Chatbot which uses external cloud hosting to know who it is talking to, and where they are, may require it to hold personally identifiable information (PII) about the person asking the question and the location in which they work.

In assessing the potential benefits of a specific Chatbot deployment, Gareth focused on four questions (a rough sizing sketch follows the list):

  • How much time are people currently spending looking for answers in this area?
  • How much time are subject matter experts spending providing these answers?
  • How much value is attached to getting these answers right (e.g. in terms of risk)?
  • Are there any factors on the horizon which could negate the need for the Chatbot, i.e. how future-proof is the use-case?
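The first three questions lend themselves to a rough sizing calculation. A minimal sketch, using purely illustrative figures rather than HSBC data:

```python
# Hypothetical back-of-the-envelope sizing for a chatbot use-case.
# Every figure below is an illustrative assumption, not HSBC data.
queries_per_year = 20_000        # repeat enquiries the bot could absorb
minutes_per_query_asker = 15     # time an employee spends hunting for an answer
minutes_per_query_expert = 10    # time a subject matter expert spends replying
deflection_rate = 0.6            # share of enquiries the bot is expected to handle

hours_saved = (queries_per_year * deflection_rate
               * (minutes_per_query_asker + minutes_per_query_expert) / 60)

print(f"Estimated capacity released: {hours_saved:,.0f} hours per year")
# -> Estimated capacity released: 5,000 hours per year
```

That capacity figure is then weighed against the build effort, the approval overhead and the fourth question: how long the use-case will remain relevant.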

Creating the “bot of bots” and enterprise bot management

Gareth highlighted that as larger organisations deploy more and more conversational AI solutions and see traffic flow through these solutions, the benefits start to materialise. However, new challenges can also emerge and new innovative solutions are required to meet them. For instance, the existence of multiple chatbots available to staff within a large global organisation provides benefits but also raises questions such as ‘where can I find a Chatbot’ and ‘how do I select the one I actually need?’ This is especially the case where companies use a variety of differing Chatbot architectures.  

What was needed was a “bot of bots” to create an ecosystem of bots and help to guide the employee easily to the right Chatbot:

  • The solution would need the ability to orchestrate effectively across multiple underlying SME Chatbots in a secure fashion 
  • It would also need to be able to communicate across a variety of different Chatbot architectures from AWS to GCP to On Premise Chatbot solutions
  • Finally, the solution needed to be performant and compliant with HSBC’s stringent cyber-security requirements. 

To address this requirement for a “bot-of-bots” and a safe, consistent way to deploy bots operationally in HSBC, Gareth worked with Google Cloud Platform and with Filament and their Enterprise Bot Manager – an enterprise-grade bot management platform which uses Google Kubernetes Engine to orchestrate and connect with multiple underlying bot architectures.
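Conceptually, the routing problem looks something like the sketch below. This is not Filament’s Enterprise Bot Manager; it is a simplified illustration in which keyword matching stands in for the intent classification a production orchestrator would use, and the topics and stub bots are invented for the example:

```python
# Illustrative "bot of bots" router: classify an incoming question and hand it
# to the right underlying SME bot, whatever platform that bot runs on.
from typing import Callable, Dict

# Each entry maps a topic to a callable that forwards the question to an
# underlying bot (e.g. an agent on GCP, on AWS, or on-premise). Stubs here.
BOT_REGISTRY: Dict[str, Callable[[str], str]] = {
    "sanctions": lambda q: f"[sanctions bot] answer to: {q}",
    "credit":    lambda q: f"[credit risk bot] answer to: {q}",
    "conduct":   lambda q: f"[conduct bot] answer to: {q}",
}

def route(question: str) -> str:
    """Pick the SME bot whose topic keywords best match the question."""
    keywords = {
        "sanctions": {"sanction", "embargo", "screening"},
        "credit":    {"credit", "exposure", "counterparty"},
        "conduct":   {"conduct", "gifts", "hospitality"},
    }
    words = set(question.lower().split())
    topic = max(keywords, key=lambda t: len(keywords[t] & words))
    return BOT_REGISTRY[topic](question)

print(route("What is the approval limit for gifts and hospitality?"))
```

The real platform adds the pieces the sketch leaves out: secure connectivity to each underlying architecture, authentication, and the performance and cyber-security controls listed above.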

Lessons learned from AiDA – HSBC’s customer facing Chatbot in the US

Filament had gained separate experience working with HSBC on AiDA (Automated Immediate Digital Assistant), the Chatbot which provides the first level of customer interaction on HSBC’s US website. 

Claire Fletcher-Hobbs from Filament added that AiDA has delivered great results so far, cutting the average enquiry handling time in half, increasing the percentage of enquiries resolved in the first interaction and delivering comparable levels of customer satisfaction.

Some of the lessons learned from the development of AiDA are to:

  • Start with the basic things – questions that are frequently asked and relatively straightforward to resolve. These are the easiest on which to train the Chatbot and they provide immediate productivity benefits. 
  • Spend enough time on the conversational interface design and don’t be afraid of using buttons to guide customers through the conversation. 
  • Ensure, where the Chatbot cannot understand or adequately respond, it fails gracefully and hands over seamlessly to a human agent (e.g. “let me pass you to the right person to answer your question”).
  • Anticipate that the bot will be in its infancy when first deployed, so make sure the resources are in place to support and train the bot, otherwise it may be overwhelmed by the unexpected.
  • Start measuring the benefits (e.g. productivity improvement) immediately but remember that these benefits may be low to begin with and grow over time, as the Chatbot is trained and becomes more effective in the answers it provides. 
  • Understand that, although the initial implementation effort can be significant (especially given the need for extensive pre-deployment vetting) the eventual running costs can be very low indeed, as they leverage the economics of the cloud service providers. 

Looking ahead, how can even more value be delivered by chatbots?

The early benefits from Chatbot deployment in the Risk function have been targeted at productivity improvement and the creation of capacity among the subject matter expert community. Chatbots never sleep, provide business intelligence and enable the reprioritisation of work. 

These benefits are further increased by the positive feedback loop created by using machine learning to leverage the wealth of data created by the bots themselves; each interaction is fed into a data-lake and this data is analysed to identify areas where processes can be changed or better information provided to reduce or eliminate repeated enquiries. 
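A minimal sketch of that analysis step, assuming interaction logs tagged with an intent (the records below are invented, and a real pipeline would run over the data lake with far richer fields):

```python
# Illustrative analysis of chatbot interaction logs: find the intents raised
# most often, as candidates for fixing the underlying process or documentation.
from collections import Counter

interaction_log = [
    {"intent": "retention_period", "resolved": True},
    {"intent": "retention_period", "resolved": True},
    {"intent": "approval_limits", "resolved": False},
    {"intent": "retention_period", "resolved": True},
    {"intent": "sanctions_screening", "resolved": True},
]

# The most frequent intents point to the questions whose root cause is worth
# addressing directly, so they no longer need to be asked at all.
for intent, count in Counter(e["intent"] for e in interaction_log).most_common(3):
    print(f"{intent}: {count} enquiries")
```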

Chatbot technology is already capable of using APIs to interface with other systems, either to get the information needed to answer a question or to trigger an action in response to a request. The next step may be to go beyond “here’s the answer to your question” to “here’s what I’ve done for you”. In other words, integrating chatbots with robotic process automation. 
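A conceptual sketch of that shift, with a hypothetical intent, endpoint and payload (not an HSBC or Filament API):

```python
# Conceptual sketch of "here's what I've done for you": an intent handler that
# triggers an action through an API call rather than only returning an answer.
import requests

def handle_intent(intent: str, parameters: dict) -> str:
    if intent == "request_system_access":
        # Raise the request on the user's behalf via a hypothetical internal API.
        response = requests.post(
            "https://internal.example.com/api/access-requests",
            json={"system": parameters["system"], "user": parameters["user"]},
            timeout=10,
        )
        response.raise_for_status()
        ticket = response.json()["ticket_id"]
        return f"I've raised access request {ticket} for {parameters['system']}."
    return "Here's the answer to your question: ..."
```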

Another significant advantage of chatbots over, say, intranet knowledge sharing, is the level of employee engagement they generate. This point led to a significant discussion of the future direction of conversational interfaces:

  • Will conversational interfaces make increasing use of voice interaction rather than text and, if so, will this simply be via speech-to-text converters or will it be more sophisticated? 
  • Will customers change their behaviour, knowing they’re dealing with a bot, and will technology like facial recognition be used by companies to augment the bot’s capabilities (e.g. the ability to detect what mood the customer is in)?

Whatever the answers to these questions, it’s clear that chatbots are here to stay and they are going to do a lot more than just answer our questions. 

FUTURE EVENTS

Please visit www.clustre.net/category/events/ to register for future events:

  • MARCH 11TH, 2020: Breakfast Briefing on how LV= learnt to love Agile, with guest speaker Richard Warner, LV= COO, at the Delaunay Private Dining Room, 55 Aldwych, London WC2B 4BB. For further details and to register: https://www.clustre.net/how-lv-learnt-to-love-agile/
