Spixii Blog

The future of voice for insurance

Written by The Spixii Marketing Team | May 30, 2018 7:59:00 AM

3 min read

"On the cusp of a revolutionary way of conversing with machines, comparable to the web and mobile evolution, we are changing the way we access information" - Max Amordeluso, EU Lead Evangelist for Amazon Alexa


"Alexa - play some relaxing music."

Do you have an Alexa, Google Home or similar personal assistant at home?

Voice is set to unlock a whole new range of opportunities for insurers to reach their customers in compelling new ways.

Last month, there were two huge breakthroughs in voice-powered digital assistants. First, Google Duplex: the stunning demo of Assistant making a phone call. And second, the keynote of Dr Ali Parsa, CEO of Babylon Health, at the AWS Summit.


First, 'that demo': Google Duplex

While we were pitching Spixii in Startup Central at the AWS Summit, many people asked us about voice and 'that demo' of Google Duplex.

The artificial intelligence (AI) sounded natural, adding conversational fillers like "mmhmm", and appeared to understand the nuances of human conversation.

We have written before on the differences between machine and human communication. Machines operate rationally using set rules, while humans are more irrational in their speech patterns. We um and er, anticipate and accept some misunderstanding, use in-jokes and a wide variety of words to say the same thing.

This particular demo shone in its ability to react intelligently even when the conversation did not go as planned. As explained in the demo, "the technology is directed towards completing specific tasks, such as scheduling certain types of appointment".

However, it was not long before Google Duplex hit controversy.

Some argued it breached two-party consent laws, which require everyone in a conversation to agree to being recorded. Others raised concerns over its lack of transparency. When we talk to an artificial intelligence of any kind, we have the right to know that we are - particularly in a climate of distrust towards the tech companies monopolising the industry and their use of private data.

Nevertheless, these discussions underline the necessity of putting human ethics at the start rather than the end, as Alberto Chierici discusses with FinTech Talents here. As David Meyer, a writer for Fortune, puts it succinctly: "Human ethics - not just human simulation - need to be baked into these systems from the start, not as a reactive afterthought".

Photo by Andres Urena on Unsplash


Dr Ali Parsa, CEO of Babylon Health, at the AWS Summit Keynote

During the AWS Summit, Dr Ali Parsa of Babylon Health showed a similarly incredible demo of Babylon Health on Alexa. 

Parsa showed how people can describe their symptoms and have health conditions assessed on Alexa before those conditions get any worse. According to studies in the UK and abroad, only 1 in 10 patients needed to see a doctor in person, taking pressure off doctors and the NHS. Babylon Health has an impressive 95% Net Promoter Score (NPS) in Britain. And in Rwanda, with support from the government and the Bill & Melinda Gates Foundation, people signed up in their millions: 70% of the population, to be exact.

This is indicative of a wider trend: on the whole, people are comfortable talking to machines. Indeed, as Max Amordeluso - EU Lead Evangelist for Amazon Alexa - pointed out, we have a whole generation of children asking machines to get stuff done.

And it makes sense. Voice is one of the most natural, and oldest, forms of communication. According to Amordeluso, Alexa is used by two or more family members in over 80% of households surveyed.


So what might this mean for insurance?

Babylon Health's AI brain is trained much like a doctor would be: it is given a significant amount of medical knowledge in order to build a knowledge base. A similar approach can be applied to insurance, training an artificial agent on data sets covering risk profiling or insurance policies.
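
As a purely illustrative sketch of that idea - our own assumption, not Babylon Health's or any insurer's actual method - the snippet below trains a toy intent classifier on a handful of made-up insurance questions using scikit-learn. A production agent would be trained on far larger, labelled data sets of policy wordings and customer conversations.

    # Toy sketch only: the utterances, intents and model choice are
    # illustrative assumptions, not a real insurer's training data.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    # Hypothetical customer questions mapped to simple intents.
    utterances = [
        "How much is my car insurance premium?",
        "What does my home policy cover?",
        "I want to report a broken window",
        "My car was stolen last night",
    ]
    intents = ["quote", "coverage", "claim", "claim"]

    # TF-IDF features plus logistic regression stand in for the far richer
    # data sets (risk profiles, policy wordings) a production agent would need.
    model = make_pipeline(TfidfVectorizer(), LogisticRegression())
    model.fit(utterances, intents)

    print(model.predict(["My bike was stolen last night, can I claim?"]))  # -> ['claim']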

One potential use of voice is to rethink the relationship between insurers and their customers. Just like chatbot technology, it could help insurers, brokers, agents and reinsurers have two-way conversations with their customers, using scripts designed with empathy and compassion.

Currently, insurance has two highly emotional points of contact: firstly when buying an insurance product, and secondly when filing a claim. Voice and chat could help fill in the middle and develop more lasting relationships between insurers and their customers.

Often, we think of benefits in the short term rather than the long term. It is all too easy to baulk at the idea of paying for a risk we have not yet imagined. Voice and chat could help customers understand the value of insurance in the long run while arming them with knowledge on how to better protect against risk in the short term.


Key takeaways from Max Amordeluso on Alexa:

  • Train Alexa to understand context.
  • Give a clear purpose. Design it to do one thing extremely well (see the sketch after this list).
  • Always look at it from a human angle.
  • Start by writing out the script on paper with 2-3 people. Remember, 'for human beings by human beings'. This blog post by our UX designer, Matteo, might come in handy.
  • "Write for the ears, not the eyes. Eyes expect uniformity, ears expect variety."


"Alexa - how can I improve my car insurance premium?"

We would love to see insurance get talked about more. As children grow up talking to machines, can they also grow up hearing more about insurance? Understanding its value and technical terms could help us create a 100% insured society.

The more we talk about it, the more we can do without worrying about the risk. 

Let's make insurance chatty again.