
Nuance - Blog Page 2


Why Punxsutawney Phil should not be the norm for prediction in customer service

Punxsutawney, a small town in Pennsylvania, draws thousands of visitors every year on February 2nd, when Phil the Groundhog predicts the weather. February 2nd, 2018 will mark the 132nd such prediction. According to legend, if Punxsutawney Phil sees his shadow, there will be six more weeks of winter weather; if he does not, there will be an early spring. The legend originates in German-speaking regions, where a badger was the forecasting animal.

But no matter which animal is used, those predictions are rarely accurate. In fact, Phil is only right 39% of the time. To predict the weather, or anything else for that matter, you need data, and the only data animals have is how much longer they want to sleep. That's why actual weather forecasts use historical data to predict upcoming changes in wind direction, the likelihood of rain, and so on.

The same is true for brands that want to use predictive targeting in their customer engagement. Without historical data there is nothing on which a prediction can be based. Before thinking about adding a prediction engine to customer service, brands have to take a close look at the data they have available: transcribed call recordings, chat transcripts, customer journey data, and so on. The more the better, because more data allows the prediction to become more accurate over time.

If there is not much historical data available, brands can use current information from their customer engagement tools. For example, implementing a virtual assistant and a live chat in several digital channels allows the brand to gather new data and insights. These can then be leveraged to improve the prediction over time.

The best way to create a great customer engagement experience is to continually gather customer data. Every bit of information received during conversations can be turned into valuable, meaningful insights, which feed an optimization process that lets the brand predict customer behavior better and better.

This information loop can be augmented by humans who help analyze the data, put it into the right context and teach the prediction engine and its underlying machine learning algorithm what to look for. The combination of automation and humans drives higher accuracy and a better experience for the user.
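The loop described above can be sketched in a few lines. This is a toy illustration, not Nuance's engine: the transcripts, intent labels and word-count scoring are all invented, but it shows the idea of a prediction model that retrains as new conversations arrive.

```python
from collections import Counter, defaultdict

class IntentPredictor:
    """Toy word-count intent model: retrain as new transcripts arrive."""

    def __init__(self):
        self.word_counts = defaultdict(Counter)  # intent -> word frequencies

    def train(self, transcripts):
        # transcripts: list of (text, intent) pairs from calls, chats, etc.
        for text, intent in transcripts:
            self.word_counts[intent].update(text.lower().split())

    def predict(self, text):
        # Score each known intent by word overlap with the utterance.
        words = text.lower().split()
        scores = {intent: sum(counts[w] for w in words)
                  for intent, counts in self.word_counts.items()}
        return max(scores, key=scores.get)

# Historical data first...
model = IntentPredictor()
model.train([("my bill is too high", "billing"),
             ("question about my latest bill", "billing"),
             ("upgrade my roaming plan", "upgrade")])
print(model.predict("why is my bill so high"))      # billing

# ...then keep feeding in new conversations as they happen.
model.train([("add roaming for my trip", "upgrade")])
print(model.predict("roaming for my europe trip"))  # upgrade
```

A real engine would use far richer features and models, but the feedback loop is the same: every new labeled conversation nudges future predictions.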

This said, I'll still be hoping that Phil doesn't see his shadow.


Publish Date: February 2, 2018 5:00 AM

Become the psychic for your customer service

Working in marketing requires us to read about what's going on in the market. That can be challenging at times, sifting through massive volumes of content, especially when you deal with a technology as hot as AI. But every now and then you stumble upon a great piece, like an article by VentureBeat's Blair Hanley Frank, who states, "People don't go and buy two quarts of AI. They buy a product to solve a problem […]"

We couldn't agree more. The problem with delivering a great customer experience has always been the same, AI or no AI: understanding how technology addresses customer needs while solving a business problem. Unfortunately, technology can be blinding. It promises so much but, used incorrectly, can bring a lot of sorrow. One risk is that your customers may not like it and thus won't use it, which will most likely send them looking for alternatives from you or, worse, from your competitors.

Instead of running down the rabbit hole, let’s take a step back and think about the actual business problem. What do your customers want? Most likely they want a fast answer if they have a problem. They also want efficient customer service. No matter if they want to buy something new, add a new feature to their plan or ask a question about the latest bill, they want it done in the easiest way possible.

Taking the right steps

The first step to addressing these needs is to ensure that connecting with you is easy; making your customer hunt too long for a phone number or an email address won't help.

Step two is making sure that existing data about your customers is fully utilized. For example, if you know that your customer called you about the same question three weeks ago, don’t make them repeat the question. Instead delight them by using that knowledge to streamline their engagement. And if you have to transfer them, for example from the IVR to an agent, also transfer the context. Strong integration with your CRM system (or any other system that you use to store customer data) is a must across all your automated or human-assisted interaction channels.
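As a hedged sketch of that hand-off (the field names and payload shape below are invented for illustration, not any particular CRM's API), the transferred context is just a structured bundle of what the IVR already knows, serialized and passed along with the transfer:

```python
import json

def build_transfer_context(customer_id, channel_history, open_issue):
    """Bundle what the IVR already knows so the agent never re-asks it."""
    return {
        "customer_id": customer_id,
        "recent_contacts": channel_history,   # e.g. pulled from the CRM
        "open_issue": open_issue,             # why the customer is calling
        "transferred_from": "ivr",
    }

# IVR hands off to a live agent, context included.
context = build_transfer_context(
    customer_id="C-1042",
    channel_history=[{"channel": "phone", "topic": "billing", "weeks_ago": 3}],
    open_issue="billing question (repeat contact)",
)
payload = json.dumps(context)      # sent alongside the transfer
agent_view = json.loads(payload)   # what the agent's desktop receives
print(agent_view["open_issue"])
```

Whatever the transport, the design point is the same: the customer's history travels with the conversation instead of being re-gathered at every step.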

Enter the psychic

Now comes the fun part, the one that you’ve probably heard before: using artificial intelligence to improve the customer’s experience. One of the most common scenarios is predicting a customer’s intent. It’s like having a personal assistant that tells you exactly what you need in the moment it is needed. Let’s say you receive a notification telling you about a roaming upgrade to your phone plan (because the system realized that you are going to Europe next month). You call the number that is displayed in the outbound notification, and the IVR greets you with:

“Hi Chris, are you calling about upgrading your plan for your Europe trip next month?”

“Uhm, yes.”

“Great! How long are you staying in Europe?”

“About three weeks.”

“We can add the roaming option for you and automatically remove it once you’re back. Do you want to add the option with a start date of February 4th and end date February 25th?”

“Yeah, that would be great.”

“You’re all set, Chris. Enjoy the trip.”

Several things will change as soon as this technology is implemented. First, your customers will view your customer service as both fast and efficient. No need for them to remember to call you - you will proactively reach out ahead of time. Kudos, for sure. In addition, it will help streamline your contact center operations, as callers won't need to spend time working through IVR menus or being transferred to other departments. Better still, they may not need to call at all. Both mean lower costs for you. Finally, your own CRM system will become smarter by learning what does and doesn't work with customers, driving even further speed and efficiency improvements in the future.

Does this all sound like something from the far future? It’s not as far away as you think. The technology exists today to put these solutions into action. Let us show you how we can use AI to improve customer service, streamline your contact center, and create more efficient digital channels. And, of course, become the psychic your customers will love.


Publish Date: January 19, 2018 5:00 AM

Can cyber criminals “compromise speech recognition systems with ease”?

A recent Finnish university study on voice biometrics has been making headlines – and most of those news stories have been inaccurately summarizing the results with concerns as in our title above, leading many to believe that cyber crooks can compromise even the best speech recognition systems.

Before commenting on the article and the study, I feel it is important to highlight that Nuance’s voice biometrics solutions have secured over five billion transactions to date, and not once has an impersonation attack been reported. We have conducted several voice impersonation attacks with famous voice impersonators in the US and the UK, and none proved successful.

So why are the news stories missing the mark? What's the real story? Let's start with the study's conclusion:

"The results indicate a slight increase in the equal error rate (EER). Speaker-by-speaker analysis suggests that the impersonation scores increase towards some of the target speakers, but that effect is not consistent."

So how could the researchers write that “Voice impersonators can fool speaker recognition systems”? To understand that, you need to dig deeper into the study. Here are the actual data points:

So what does this data mean? Let’s start with some definitions.

Text independent - passive voice biometrics, where a voiceprint is created by listening in on a normal conversation and compared to a voiceprint on file.

Same text - active voice biometrics, where the user is given a specific phrase to repeat (often "My voice is my password"). Once enrolled, the user is asked to speak the phrase, and this new voiceprint is compared to the voiceprint on file.

False accept rate - the percentage of times a system incorrectly matches an individual to another individual's existing biometric. Example: a fraudster calls claiming to be a customer and is authenticated.

False reject rate - the percentage of times the system fails to match an individual to his or her own existing biometric template. Example: a genuine customer is rejected.

Equal error rate (EER) - the point on the performance curve where the false accept rate and false reject rate are equal. In general, the lower the equal error rate, the higher the accuracy of the biometric system. Note, however, that most operational systems are not set to operate at the equal error rate, so the measure's usefulness is limited to comparing biometric systems.

GMM-UBM, i-vector Cosine and i-vector PLDA - three different algorithmic approaches to voice biometrics. Notice that the latest technology, deep neural networks, is not tested.
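To make these definitions concrete, here is a small illustration with made-up match scores (not data from the study): sweep a decision threshold over genuine and impostor scores, and the EER sits where the false accept and false reject rates balance.

```python
def far_frr(genuine, impostor, threshold):
    """Accept when score >= threshold.
    FAR: fraction of impostor scores accepted.
    FRR: fraction of genuine scores rejected."""
    far = sum(s >= threshold for s in impostor) / len(impostor)
    frr = sum(s < threshold for s in genuine) / len(genuine)
    return far, frr

def equal_error_rate(genuine, impostor):
    # Sweep candidate thresholds; the EER is where |FAR - FRR| is smallest.
    best = None
    for t in sorted(set(genuine + impostor)):
        far, frr = far_frr(genuine, impostor, t)
        gap = abs(far - frr)
        if best is None or gap < best[0]:
            best = (gap, (far + frr) / 2, t)
    return best[1], best[2]

# Made-up match scores: higher = more similar to the enrolled voiceprint.
genuine = [0.91, 0.84, 0.88, 0.79, 0.95, 0.73]
impostor = [0.41, 0.55, 0.38, 0.62, 0.30, 0.76]

eer, threshold = equal_error_rate(genuine, impostor)
print(f"EER = {eer:.2%} at threshold {threshold}")
```

Raising the threshold trades false accepts for false rejects, which is why deployed systems rarely run at the EER point itself.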

Now that we have that, the data showcases the following:

  1. In one instance (text-independent GMM-UBM) the EER is decreased with impersonation – meaning that the imposters were less successful at generating a false accept than a random individual not attempting any voice mimicry.
  2. In another instance (same text i-vector PLDA) the EER is virtually identical between the impersonation testing and random attacks. In other words, imposters have the same performance via mimicry as a random individual not attempting to modify their voice.
  3. In four instances, there is an increase in the EER, but given the small sample size (60 voices) the results are not statistically significant. In other words, a test performed with a larger sample might show the opposite result.

Finally, and maybe most importantly, the researchers did not perform the tests with Nuance voice biometric technology. This is evident from the very high EER rates the study reports as "baseline" results, ranging from 4.26% to 10.83%. No tests were conducted on deep-neural-network-based voice biometric algorithms, the technology used by Nuance and deployed by scores of enterprises worldwide.

In conclusion, although this topic merits additional research, Nuance will continue to focus on addressing actual fraud attack vectors - brute-force attacks, voice imposters and recording attacks - while continuously improving the voiceprint itself and developing mitigation strategies for the attack vectors we believe fraudsters will eventually use, such as synthetic speech.

Contact us if you would like to learn more about the great strides Nuance has made in Voice Biometrics.


Publish Date: January 9, 2018 5:00 AM

2018 predictions: Five ways AI will make you love customer service this year

1. Your voice will be your password

2017 was a record year for hacks of personal customer details. These breaches give fraudsters access to our identities including the answers to those annoying security questions. One thing the fraudsters can’t do much with? Voice data. And that is why banks and telcos are increasingly replacing security questions with biometrics.

With a few words of speech, voice biometrics can confirm you are who you say you are, at accuracy and security levels better than PINs, passwords and security questions. And it can distinguish recordings from real, live speech - rendering the data useless to fraudsters in the case of a breach.

2. You will use a virtual assistant (VA) for customer service, and it will work.

Conversational AI breakthroughs have led to a new generation of VAs specific to your bank, your telco and your pizza ordering, all providing personalized, concierge-like service. In 2018, this generation of VAs will be made even more effective, through technology called HAVA (Human Assisted Virtual Assistant). HAVA adds a human-in-the-loop capability, first to help answer new questions the VA may not know, but more importantly to provide a learning loop that updates the VA’s “brain” in real time.

3. You will add a brand as your messaging “Friend” – and you will mean it.

In 2017, Facebook Messenger, Line, Kik and others added capabilities for their users to "friend" organizations and companies, and late in the year Apple announced Apple Business Chat, which will do the same for Apple Messages. In 2018 you will start engaging brands the same way you talk to friends - in your messaging app, through SMS and even inside your banking and telco apps. And AI will allow each brand's VA engine to respond to you in a personalized way, referencing past engagements you have had across other channels.

4. Prediction will let brands anticipate your needs

Customer service creates a ton of data. In 2018 this data will be harnessed more than ever to fuel new AI engines. Predictive customer service will let brands anticipate what you need or may do, before you even know, by analyzing and detecting the patterns of billions of customer engagements over time.

5. The “800” number will enter early retirement

Digital customer engagement combined with mobile devices, tablets and data lines will lead to fewer calls. A lot fewer. In 2018 you will engage with a virtual assistant, and if it can't resolve an issue, you will be seamlessly texting with a live contact center agent. If the issue is really complicated and can't be resolved through messaging, you still won't call the 800 number. In 2018, that step will be integrated through advanced technologies like WebRTC and IVR-to-digital, allowing the contact center agent to connect with you by voice or video within the app, on your laptop, even through your TV screen or smart speaker.


Publish Date: January 5, 2018 5:00 AM

Over and Out – Moving beyond the walkie-talkie voice interface: Part II of “What’s left to tackle in voice technology”

"Over" was short for "over to you," indicating that it's your turn to talk on a shortwave radio or walkie-talkie (or any half-duplex comm tech, for you nerds out there). Smart speakers are super cool and a step forward in voice - but they're still half-duplex, clunky, unnatural voice interfaces. We'll all look back one day and remember how quaint today's smart speakers were, like we remember Morse code, tape players and VCRs. Try imposing strict turn-taking on a face-to-face conversation or conference call sometime and you'll get a feel for what smart speakers, and all voice interfaces for that matter, are missing out on. There's a whole field of study, called pragmatics, around the protocols and rules of human conversation - how humans interact one-to-one, one-to-many and many-to-many.

For example – I’ll say, “Alexa, play ‘Fool in the Rain’ by Led Zeppelin on Spotify,” and wait the requisite 3 seconds of silence so Alexa knows I’m done talking (might be easier to just say “over”). Then Alexa says, “I’m sorry, I can’t find ‘Fool in the Rain’ by Led Zeppelin on Spotify.” I’ll remember I cancelled Spotify and try to correct myself by speaking over Alexa “No, play it on Amazon Music.” It’s natural to do this – a person wouldn’t miss a beat having the same conversation.

In addition to the half-duplex limitation, Alexa also can't understand multiple speakers. Even the best user interfaces today employ turn-taking to manage the conversation and don't work at all with more than two speakers. For example, if my children interrupt Alexa while she's playing 'Fool in the Rain' and ask her to play "Space Unicorn", a song that can make you insane after the 400th hearing, I typically respond by shouting "Laa Laa Laa Laa!!" to confuse Alexa and keep her playing good music.

Managing turn-taking in a conversation with multiple speakers is no simple task. It requires that you listen while you talk and respond to visual cues (in a face-to-face conversation). For example, Japanese speakers often produce backchannel expressions such as un while their partner is speaking. They also tend to mark the end of their own utterances with sentence-final particles, and produce vertical head movements near the end of their partner's utterances. See Turn-taking - Wikipedia for a longer description of the complexity. The listen-and-talk problem gets exponentially worse as you add more speakers. A bot will need to know whether it's having a friendly conversation and should wait until the person is done talking, or whether it's arguing and should cut into the rant. For more detail on that complexity, read this article.

Recognizing these shortcomings is the first step in overcoming them. Nuance R&D is working on these problems and others to transform the way people interface with technology.

Stay tuned for parts 3 and 4 as we catalog the technical problems when telling your customer to “talk to the IVR like you would a human”.

  • Part 1 – Buffalo buffalo Buffalo buffalo buffalo buffalo Buffalo buffalo: Part 1 of What’s left to tackle in voice technology
  • Part 3 - Sentiment and Emotion in Voice - “Your customer seems angry - ummm - now what?”
  • Part 4 - "I heard you talking about the train schedule - can I help?" - voice interfaces as proactive participants in conversations (without being rude)


Publish Date: January 3, 2018 5:00 AM

A better user experience leads to better business results

Today, many (if not most) companies face a wide range of challenges in managing IT infrastructure, including systems, applications and hardware. This is especially true when it comes to their fleets of printers, scanners and multifunction printers (MFPs), as well as any related software or workflow solutions.

To be more specific, these challenges can include:

  • Inconsistent user interfaces: We have all probably experienced this firsthand – the case where we are forced to adapt to a poorly designed, inconsistent or non-intuitive user interface (UI). It can be a significant issue: Bad UI experiences can lead to low adoption rates, more reliance on training services, decreased productivity and even a disengaged workforce.


  • Different technologies, inefficient workflows: Large companies today tend to have many different brands of printers and MFPs, each with their own operations, interfaces and ways to use them. This alone can make it difficult for users to perform important tasks, such as document capture and workflow, where an employee might copy an important document and automatically send it to a central repository. As a result, the entire organization misses a significant opportunity to increase productivity, efficiency and users’ morale.


  • Security and compliance risks: Printers actually pose a significant security risk, and now, companies of all sizes are under new pressures to comply with stringent data privacy regulations, such as GDPR. At best, it’s a difficult, time-consuming process. At worst, security breaches can lead to fines, lawsuits and long-term damages to the company’s reputation.


And if companies are still relying on outdated technology - especially older printers and MFPs - they may have a much harder time managing and securing their entire IT environment. As a result, they may be subjecting themselves to even more security risks and potential compliance issues.

A whole new (user) experience

The good news is that there are extremely effective ways to overcome all of these challenges, and in doing so, provide better user experiences, workflows, and security.

For example, external terminals, such as the new Nuance® Edge™ for Copitrak terminal, already provide a much better UI and enhance the speed, functionality and quality of related processes, such as scanning.

It's an important advantage, especially when you consider that terminals like this unify an overall experience that could otherwise differ - confusingly - from device to device. According to recent research, 83 percent of users report that "a seamless experience across all devices" is extremely important to them.

By giving employees a unified – and much better – UI, external terminals no longer “force” users to adapt to the different screens and steps they’re sure to find in a mixed-MFP environment. This alone helps employees work much faster, smarter and more effectively.

A better, more intuitive UI can also help employees with critical work tasks such as scanning. For example, today's external terminals provide powerful tools to fine-tune resolution, DPI, contrast, brightness, auto-color correction and more. These features help minimize time lost in steps like post-image processing, freeing employees to become much more productive.

And when it comes to security, external terminals such as the Nuance Edge come installed with the latest Windows 10 operating system and other security tools. This helps any organization administer the latest network security policies to improve overall security and compliance efforts.


Publish Date: November 29, 2017 5:00 AM

Buffalo buffalo Buffalo buffalo buffalo buffalo Buffalo buffalo: Part 1 of What’s left to tackle in voice technology

Me: “Alexa - what’s the temperature going to be today?”

Alexa: "Right now the temperature is 56 degrees with cloudy skies. Today you can expect clouds and showers with a high of 60 degrees and a low of 44 degrees."

Me: “What about tomorrow?”

Alexa: [blank stare]

Me:  “Ugh - Alexa - what will the temperature be tomorrow?”

Voice as a computer interface has come a long way, but it’s still clunky and nothing like talking to another person. Our amazement with how far the technology has come since voice recognition in IVRs came on the scene in the 1980s can make us forget the remaining problems we have to tackle to get to human-level interactions. In this blog series, I’m going to take each remaining hurdle and talk about where we are today, where we’re going and how Nuance is leading the way.

Part 1: Automatically generating dialog for conversations is a complex problem to solve.

"Buffalo buffalo Buffalo buffalo buffalo buffalo Buffalo buffalo." Believe it or not, this is a grammatically correct sentence, and it illustrates why automating natural language processing and conversation is hard. If you're wondering what the Buffalo sentence means, you can click the link and read about it (helpful tip: take an Advil). The tl;dr (too long; didn't read) version is that the word "buffalo" can be a proper noun, a noun or a verb, so the sentence translates to something about how buffalo from Buffalo bully (a.k.a. buffalo) other buffalo, etc.

This is obviously an extreme example, but it just goes to show that there is plenty of meaning and “nuance” hidden in the words people choose that computers haven’t been “taught” to understand yet.
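A small sketch of why this sentence explodes: with even a toy lexicon (invented here, far simpler than a real tagger's) in which "buffalo" can be a noun or a verb and "Buffalo" a proper noun, the candidate part-of-speech readings multiply with every ambiguous word, and a parser must consider all of them before grammar rules out the nonsensical ones.

```python
from itertools import product

# Toy lexicon: each surface form maps to its possible parts of speech.
LEXICON = {
    "Buffalo": {"PROPN"},          # the city
    "buffalo": {"NOUN", "VERB"},   # the animal, or "to bully"
}

sentence = "Buffalo buffalo Buffalo buffalo buffalo buffalo Buffalo buffalo".split()

# Every combination of per-word tags is a candidate reading.
tag_options = [sorted(LEXICON[w]) for w in sentence]
readings = list(product(*tag_options))
print(len(readings))  # 2 ** 5 = 32 candidate tag sequences
```

Five lowercase "buffalo" tokens with two possible tags each already yield 32 readings; real vocabularies and longer sentences make the space vastly larger, which is part of what makes natural language understanding hard.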

Here’s an example that may resonate more with English speakers:

SHE never told him that she loved him. (but someone else did)

She NEVER told him that she loved him. (zero times in their entire relationship)

She never TOLD him that she loved him. (she showed it but never said it out loud)

She never told HIM that she loved him. (but told everybody else)

She never told him that SHE loved him. (but that someone else did)

She never told him that she LOVED him. (only that she liked him and thought he was funny)

She never told him that she loved HIM. (she said she loved someone else)

As a live, English-speaking human, you would catch the subtle changes in meaning just by placing inflection on different words. However, artificial intelligence would have to be taught that kind of nuance.

Another great illustration of the complexity of language can be seen in a video of physicist Richard Feynman, apparently being condescending to his interviewer: Richard Feynman Magnets - YouTube. The interviewer is simply asking Dr. Feynman to explain magnetism to him, and Dr. Feynman refuses and dismisses the question, saying that the interviewer won’t understand. The net of the video is that Dr. Feynman can’t explain magnetism in a meaningful way without a shared frame of reference – and he and the interviewer don’t share one. The interviewer doesn’t have the degrees that Dr. Feynman has, so he equates it to explaining to an alien why his wife is in the hospital with a broken leg. Well, she slipped and fell. Why did she slip and fall? Well, she was walking on ice. Why is ice slippery? …etc., on down into deeper and deeper levels of complexity – for seven minutes – and never answers the magnetism question. (One viewer posted, “This is why no one talks to you at parties.”)

This complexity is at the core of the problem we need to solve for computers to "learn" how to converse with humans. Nuance is making great advances in automating conversation. Currently, the state of the art in this area is still simple question answering (essentially enterprise search front-ended with natural language understanding). See Paul Tepper's post on advances in automating conversation. Nuance is working internally and with research partners on encoding the general knowledge that computers need in order to decipher the buffalo sentence and to have a frame of reference to converse with humans.

So, just in case you didn’t have a frame of reference when reading this blog post, go back and read the Wikipedia entry on the buffalo sentence and watch the Dr. Feynman video. Then you’ll understand the monstrous task we have in bringing voice technology up to human-level interactions.

Next time: Part 2: Sentiment and Emotion in Voice – “Your customer seems angry – umm – now what?”


Publish Date: November 21, 2017 5:00 AM

Hear ye! Hear Ye! Speech delivers!

It seems intuitive that an IVR that features a user-friendly, speech-enabled menu would deliver improved performance and customer experience over antiquated touch-tone systems. Well now we have the research to prove it.

In the past year Nuance hired a third-party research firm to evaluate the IVR customer experience offered by 50 leading companies in the Fortune 250 to see how well their IVRs perform. The results are surprising and unsurprising. Spoiler alert: the unsurprising part is how well speech-enabled IVRs worked compared to touch-tone.

The research criteria

Using a rating scale of 1–5, third-party researchers evaluated each of the 50 companies across six key criteria to assess the state of their IVR and their ability to help customers resolve issues quickly and painlessly. The six criteria were:

  1. Ease of use – how easy was the experience?
  2. Speed – how quickly was the call resolved?
  3. Speech recognition – if speech enabled, how well did the IVR understand the caller?
  4. Conversational dialogue – does the IVR engage in personal, back-and-forth dialogue?
  5. Caller intent – how well did the IVR determine why someone is calling?
  6. Audio quality – were the menus clear and easy to hear?

Researchers compiled all the results across these six criteria and generated an average score for each IVR.
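The scoring method reads, in code, as a plain average of averages. The company names and ratings below are invented for illustration; only the six criteria come from the research described above.

```python
CRITERIA = ["ease_of_use", "speed", "speech_recognition",
            "conversational_dialogue", "caller_intent", "audio_quality"]

def ivr_score(ratings):
    """Average a single company's 1-5 ratings across the six criteria."""
    return sum(ratings[c] for c in CRITERIA) / len(CRITERIA)

# Invented example ratings for two hypothetical companies.
companies = {
    "Acme Telco":  {"ease_of_use": 4, "speed": 3, "speech_recognition": 4,
                    "conversational_dialogue": 3, "caller_intent": 4,
                    "audio_quality": 4},
    "Widget Bank": {"ease_of_use": 2, "speed": 2, "speech_recognition": 1,
                    "conversational_dialogue": 1, "caller_intent": 2,
                    "audio_quality": 3},
}

scores = {name: ivr_score(r) for name, r in companies.items()}
overall = sum(scores.values()) / len(scores)
print(scores)
print(f"overall average: {overall:.2f}")
```

With these made-up numbers the hypothetical speech-enabled "Acme Telco" averages well above the touch-tone "Widget Bank", mirroring the gap the research found.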

The results

Across all 50 IVRs evaluated the average score was 2.3, indicating that there is much room for improvement in how IVRs support callers.

The unsurprising results

No surprise to us is that the IVRs in the top five performing industries below scored a whopping 35% higher than the bottom five. Is your company in one of these industries?

What makes the leaders stand out? No surprise: they invested in speech-enabling their IVRs at a much higher rate - 67% of the top-performing companies adopted speech-enabled IVRs.

Speech-enabled IVRs — whether standalone or combined with Dual-tone Multi-frequency signaling (DTMF) — provide higher quality experiences than DTMF systems alone. As shown in the graphic below, companies with speech IVRs had significantly higher average scores:

The surprising results

The data above was no surprise at all. But what was surprising? Two things stand out:

First, 53% of the companies still employ an old-fashioned touch-tone IVR. Yes, DTMF and "Press 1 for Service" still live on in the majority of companies we called. That's great news for anyone who loves the '80s, but not so great for everyone looking to move into the future.

Second, industries that are heavily reliant on their IVR and contact center fell into the lower-performing category. Companies in insurance (financial services), healthcare and health insurance all scored significantly below the average. Given how important the phone is for engaging customers in these industries, it is curious to see them perform below average.

Is your company in the bottom tier? Still rely on DTMF? Then please read on!

Say “Yes” to speech

With the rise of voice-activated smart assistants in our phones, cars, and homes, the power of voice is on the rise with no sign of slowing down. So why are your customers greeted with technology from 1988? Your IVR is one of your most important channels, and it makes sense to start the move to speech today. Today’s modern, conversational IVRs use powerful speech recognition and natural language so callers can engage the IVR and simply say whatever they’d like – in their own words – and be directed to the right resource. Imagine your customers’ delight when they can stop pushing buttons and start using their own words.

Check out the full research infographic to review the results in more depth, and then contact Nuance to see how we can help you be a top performer.


Publish Date: November 14, 2017 5:00 AM

Preparing for the “next one”: The importance of emergency outbound communications

It was a month like we’d never seen before. As we watched Hurricanes Harvey, Irma, Jose and Maria impact the US and Caribbean, and a massive earthquake hit Mexico City, a series of questions may have run through our heads. How can we help those people? How do they rebuild? How can we better prepare? As a society, we’ve talked a lot in recent years about upgrading our infrastructure, and that goes beyond roads, bridges and power grids. It’s likely owing to my profession, but I believe that modernizing the communications we can leverage during disasters like these can literally save lives in threatened communities.

The timing seems right. What used to be unsophisticated outbound technologies like “robo calls” are now going through a renaissance: more advanced vendors orchestrate multiple proactive engagement channels like text messaging, push notifications, email and automated voice, coordinate them with IVR and digital through an omni-channel fabric, and improve ease of use through cloud platforms. Using outbound notifications before, during and after an emergency like a tornado or flood should be seen as the first line of defense for local governments. Often, the first thing that happens as regions gird themselves for a disaster is a massive increase in inbound calls to customer service lines. Citizens demand timely answers about what they should do, and call centers can quickly become overwhelmed as wait times grow.

By combining voice, text and other channels in an integrated fashion, residents get the information they need through the channels they prefer, extending the reach of critical messages like incident preparedness, evacuation routes and shelter locations.

We all know this type of communication is important – and may become increasingly vital – so, what should we look for in an outbound communications platform?

  1. Security and compliance: Security must be the number one priority. Be sure your vendor is protecting sensitive data and staying in regulatory compliance with PCI Level 1, ISO 27001 certification and HIPAA-compliant data centers.
  2. Self-service messaging: You’ll need an intuitive user interface so you can easily record a message, use text-to-speech or choose from pre-recorded messages.
  3. Omni-channel contact strategy: It’s critical that your solution works across channels and supports voice, email, text and smartphone push notification. The ability to transfer to a representative or call center directly from a voice message is a bonus.
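The omni-channel contact strategy in point 3 amounts to a simple fallback loop: try each channel a resident prefers, in order, and move to the next when delivery fails. The sketch below is a hypothetical illustration of that idea only; the `Resident` class, `notify` function, and channel names are invented here and are not any vendor’s real API.

```python
from dataclasses import dataclass, field

@dataclass
class Resident:
    name: str
    # Channels in the resident's stated order of preference.
    preferred_channels: list = field(
        default_factory=lambda: ["push", "sms", "email", "voice"]
    )

def send(channel, resident, message, reachable):
    """Simulated delivery: succeeds only if the channel is currently reachable."""
    return channel in reachable

def notify(resident, message, reachable):
    """Try each preferred channel in order; return the one that delivered."""
    for channel in resident.preferred_channels:
        if send(channel, resident, message, reachable):
            return channel
    return None  # nothing delivered: escalate to a live outreach team

resident = Resident("A. Lee")
# Suppose the push service is down but SMS and email are reachable:
print(notify(resident, "Evacuation route 9 is open; shelter at Main St. gym.",
             {"sms", "email"}))  # falls back to sms
```

A real platform would layer delivery receipts, retries, and opt-in records on top of this loop, but the ordering-with-fallback shape is the core of extending a message’s reach across channels.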

We don’t know when or where “the next one” is coming, but we do have concrete steps we can take to limit loss of life and property. Now is the time to take that first step.


Publish Date: November 7, 2017 5:00 AM

Boo! Scary customer service practices that make us scream

Halloween is a time of frights and scares. Zombies, goblins, witches, and monsters are let loose on the public to scare and haunt them. And a good scare is tons of fun this time of year and makes us scream in delight. Good scares get our adrenaline going. But on the flipside, bad experiences that cause us to scream with anger get our blood boiling. That’s never good, but unfortunately it happens every day when consumers receive frustrating and ‘scarily bad’ customer service experiences.

Read on, if you dare, for three of the scariest customer service experiences we believe are guaranteed to make any customer scream.

“Push 1 for mortgage enquiries…”

There it is. Popping out of your phone like a monster bursting from behind a wall. The outdated IVR menu. (Cue the Jamie Lee Curtis “Halloween” scream!) In a world of cool new voice-enabled applications and assistants, the old-fashioned IVR terrifies your customers.

Nobody wants to wade through endless mazes of touch-tone options and push buttons like it’s 1978. Customers will scream in frustration. Surprisingly, many enterprises are still using this old-fashioned technology today. Our research into 50 of the Fortune 250 IVRs shows that a scarily high 53% of the companies still employ an old-fashioned touch-tone IVR. Hard to believe, and yet so easy to fix.

Today’s modern conversational IVRs delight callers with powerful speech recognition and natural language so callers can simply say whatever they’d like – in their own words – to get directed to the right resource. Satisfaction goes up, and frustrating screams go down.

What was your second-grade teacher’s dog’s name?

Nothing sets me off quite like the random challenge questions to prove I am who I say I am. Most of the time they ask me something I answered five years ago and promptly forgot, or worse, something that is not hard to find out, like a mother’s maiden name. Of course, now that I am on the phone with an agent, I am the one who looks stupid. “I don’t know… Spot?”

Fortunately for everyone, PINs, passwords and challenge questions are on their way out. Call centers, IVRs and virtual assistants all over the world are adding secure biometrics to ensure the person is who they claim to be. With secure voice biometrics, customers can simply state a passphrase they don’t have to remember, or even be recognized from a normal conversation. In addition, new biometric modalities enable people to use their face, fingerprint, iris and even unique behaviors to prove their identity, all without having to memorize anything! “I just remembered. Rover!”

“What’s happening with my flight/package/credit card?”

Too many times a customer must proactively call a company to enquire about an issue they are having. And nothing causes greater frustration and a maddening scream like a customer service agent acknowledging, “Oh yes. I see your flight is delayed.” Huh? So they knew about it? Well, then why didn’t they let the customer know in advance and prevent the phone call?

It doesn’t need to be this way. We live in a world of powerful push notifications through multiple channels, where sending a text or email costs a fraction of a penny. Why don’t more companies get on board with proactive outbound communications? Many do, but only for limited scenarios like overdue bills or appointment reminders. They fail to connect the whole customer experience due to siloed service channels.

A proactive outbound platform connected to the inbound IVR platform ensures customers are notified in advance of issues like flight delays or suspicious charges on their credit card. A well-timed text or email ensures the right outcome and also increases customer satisfaction by preventing them from calling your contact center, which reduces operational expenses. Today’s consumers want to be notified proactively; they opt in for communications that help reduce their effort. New technology allows organizations to both notify consumers and engage in a two-way conversational text dialogue using smart, natural language understanding.


Beams, not screams

Being scared and having a good scream is fun – in the right situation. Calling service channels should not elicit a response best reserved for a Friday night horror flick. With the right investments and planning, organizations can offer their customers a service experience that leaves them beaming, not screaming.


Publish Date: October 31, 2017 5:00 AM

As you wish. An inconceivable way of serving customers.

As you wish. That’s the catchphrase that resounds with Princess Bride fans as the movie’s 30th anniversary has recently passed and TCM plans a special showing in theatres on October 15. One of the reasons this silly romantic comedy has become such a cult classic is that it is littered with phrases like “Inconceivable!” and “Wuv, tru wuv” that find a resting place in the back of our minds.

One of my favorites is “As you wish” – that statement of devotion and tru wuv that Westley proclaimed to Princess Buttercup. Wouldn’t it be refreshing to get customer service that said, “As you wish”? There are a small handful of establishments that follow that mantra when it comes to how they treat their customers, but this philosophy is pretty hard to find with voice or online customer service. Customers are underwhelmed by the digital experiences most brands deliver. Only 7% of brands are exceeding customer expectations, and part of the reason is that their queries are not being answered or solved. To many customers, an “As you wish” customer service is “inconceivable!”


Here are four ways to make your brand’s customer service say, “As you wish.”

  1. Conversational IVR gives human-like interactions that allow customers to self-serve and successfully resolve issues within the IVR. Natural Language Understanding enables the IVR to understand customers’ speech and intent, and delivers an intuitive service experience that anticipates the caller’s needs.
  2. Virtual assistance provides immediate, personalized self-serve assistance across various channels. It enables an intelligent, human-like dialogue between consumers and your brand, either by typing or speaking, yet frees up live agents to assist customers with more complex questions. A virtual assistant becomes smarter over time by learning from past and current interactions happening in your digital channels to constantly optimize the behavior and improve the accuracy of responses.
  3. Live chat provides prompt, live, human interaction with customers online. Live chat technology with skills-based routing will direct them to the right agent group where they can get their issues resolved with expert knowledge and efficiency – as you wish!
  4. All three working together can succeed in meeting customer expectations of immediacy, self-service, and the human touch. Connecting an IVR experience to a VA experience or to live chat, or connecting a VA engagement to a live chat experience, will add to your brand’s message of “As you wish” by making sure customers are getting the information they need in the way they want. Routing the customer in a seamless way in which your brand’s look, feel, and voice is consistent throughout, and in which the customer does not have to repeat themselves, shows them that your brand is devoted to providing excellent service.
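The hand-off in point 4 depends on one thing above all: carrying the customer’s context along as they move between IVR, virtual assistant, and live chat, so they never have to repeat themselves. Below is a minimal, hypothetical sketch of such a shared session; the `Session` class and its field names are invented for illustration, not a real product interface.

```python
class Session:
    """A shared record that follows one customer across service channels."""

    def __init__(self, customer_id):
        self.customer_id = customer_id
        self.context = {}   # facts gathered so far (intent, account, amounts...)
        self.history = []   # (channel, facts) pairs, in order of collection

    def record(self, channel, **facts):
        """Merge facts learned on one channel into the shared context."""
        self.context.update(facts)
        self.history.append((channel, dict(facts)))

session = Session("cust-42")
session.record("ivr", intent="billing_dispute", account="1234")
session.record("virtual_assistant", disputed_amount="59.99")
# When a live chat agent picks up, everything gathered so far is already there:
print(session.context)
```

The design point is simply that each channel writes into one shared context rather than keeping its own silo; that is what lets the next channel pick up mid-conversation instead of starting over.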

Is an “As you wish” customer service “inconceivable” to your customers? If you want to discover more about transforming your brand’s customer experience to meet consumer expectations, contact us today!


Publish Date: October 13, 2017 5:00 AM

Rules of customer engagement: 5 tips on how to create a meaningful customer experience

Some companies are a natural when it comes to communicating with their customers. They’re attractive, pleasant, interesting… but are they memorable? They may be the life of the party, but are they gaining customers that trust and value them? Are they making more than just acquaintances - but, rather, loyal customers?

A meaningful customer experience can be achieved by acquiring people skills that individuals must use in real life to create lasting relationships. Below is a simple list of engagement rules that can be applied not only to our personal relationships, but to enterprises that want to build a solid customer foundation. They can utilize these rules not only in their live chat programs, but also in virtual assistance and outbound communications.

  1. Use their love language. Relationships are much more successful when each other’s love language is spoken. Each party receives communication in a way they understand and appreciate. Customer relationships should operate on the same principle. Understand how your customers prefer to engage with you, whether via self-service guides, virtual assistance, live chat, mobile, or a combination. Furthermore, within that engagement, study the nuances that are specific to the channel - for example, customers engaging through mobile chat will be using short, simple sentences to communicate because of the small typing space, and would appreciate the same in response due to less screen space for reading on the go.
  2. Always remember a face. Don’t you feel important when someone remembers your name and something specific about you? When the customer comes back for a subsequent purchase, let them know that you remember them by offering deals that are relevant to the customer’s history. Showing them generic ads they have not shown interest in can turn them away. A good memory (made possible with customer data) makes your personalization efforts more meaningful, even in outbound communications.
  3. Talk about the weather. OK, maybe not. But real-time, location-based customer data should give you a good picture of where that customer is, and you can market to them accordingly. If the customer is in Boston in the middle of a snow storm, perhaps an apparel company would offer a special on scarves, or a telemedia company would proactively offer a free streaming of the Lord of the Rings trilogy for a snowed-in family hunkered by the fireplace.
  4. Be a good listener. People know you value their words when you ask them polite, probing questions, listen to their answers, and respond appropriately. In customer relationships, agents should actively listen to what’s at the core of their customer’s issue or search. And whether interacting through live chat, virtual assistance or self-serve guides, customers should be able to voice their opinion through surveys, customer forums, or social media. Pay attention and take the opportunity to improve.
  5. Be consistent. Nothing is more irritating in a real-life conversation than when a person contradicts what they said in a previous conversation – unless that conversation is between a customer and your company! Customers don’t care if one conversation was on their laptop and another is on their smartphone; they consider both to be part of the same interaction. Your customers’ omni-channel worldview demands the same engagement across all the channels they use for a particular purchase. Eliminate those silos that don’t communicate with each other!

What kind of people skills does your company have? How effective is your customer engagement strategy at making your customers feel valued? Applying these engagement rules can help in creating meaningful interactions, thereby building loyal customers.


Publish Date: October 12, 2017 5:00 AM

Nuance customers around the globe score big with Intelligent Assistants awards

Remember when you were a kid in school and the teacher would put a gold star by your name on the good work chart? There’s something about seeing that shiny little sticker that fills you with pride in the work you did and determination to be even better. It also shows your peers that you’ve got brains! It’s too bad as adults that we can’t receive gold stars every time we succeed at something. Or can we…

As a provider of customer engagement solutions and services, Nuance and our customers around the world receive “gold stars” when leading research/analyst firms recognize the innovative, customer-focused work we do.

Case in point: A leading delivery service – and a Nuance customer – was just named winner of the 2017 Opus Research Intelligent Assistant Award. The delivery brand uses our AI-powered Virtual Assistant Nina to provide a high level of personalization across more than 79 countries in 15 languages. Nina lets customers get answers to their questions quickly and easily through the digital channel. The award-winning enterprise and Nuance were honored for delivering an engaging customer experience using natural language understanding, machine learning, and artificial intelligence through the virtual assistant on the brand’s website.

Using intelligent automation and conversational interaction, the Nuance Nina-powered virtual assistant can field frequent shipping questions from customers. In just a year and a half of deployment, in North America alone, the virtual assistant is yielding impressive results, including:

  • 6.7 million total interactions with customers;
  • 300k+ conversations with customers per month, on average; and
  • an 80-81% first contact resolution rate and a 50%+ deflection rate.

Why is it so rewarding for Nuance to have a customer receive such an honor? The Opus Research Intelligent Assistant Awards recognize leading brands who are utilizing virtual assistants to redefine digital commerce and customer care. That’s right. We’re redefining digital customer care!

At the same awards ceremony, another Nuance customer was recognized with an Intelligent Assistants Award: Australian Government agency IP Australia was honored for their integrated digital strategy, using their virtual assistant “Alex,” deployed in partnership with Datacom.

Launched in May 2016, ‘Alex’ leverages Nuance Nina to engage customers directly on IP Australia’s website and Facebook page, providing answers to questions and continuously learning from customer queries. As the Australian Government’s first integrated Intelligent Assistant and web-chat digital experience, Alex has had a significant impact on IP Australia’s digital engagement strategy. In 2013, only 12% of the agency’s 800,000 customer interactions a year utilized digital channels and this has grown to its current level of 99.6% digital adoption. To date, Alex has supported over 50,000 customer interactions and assisted in maintaining IP Australia’s customer service satisfaction ratings at over 84%.

Further optimizations to Alex include the introduction of Nuance Nina Coach in July 2017, a first for Asia Pacific. Nina Coach moves Alex into the next generation of Human-Assisted Virtual Assistants powered by Artificial Intelligence, enabling Alex to seamlessly bring in a live agent to assist with tricky questions. This action is recorded, analyzed, and folded back into Nina’s semantic brain, making the NLU technology smarter and more accurate over time, so the virtual assistant knows the answer on its own moving forward.

But wait! There’s more!

Nuance won an award ourselves! At the AI Summit San Francisco, we received the 2017 AIconics Award for Best Intelligent Assistant Innovation. The AIconics are the world’s only independently judged awards for practical applications of AI in business. The awards recognize the achievements and advances of the firms pushing these burgeoning technologies forward, offering a level playing field on which Silicon Valley giants and cutting-edge start-ups alike can showcase their work from the last year.

So… with three awards that recognize our work in redefining digital customer care, what can this tech company do? Well, we give ourselves three gold stars!


Publish Date: October 5, 2017 5:00 AM

Much ado about bots: How to choose the right enterprise chatbot

When the question for your enterprise business is no longer “To bot, or not to bot” but instead is “Which bot?”, where do you start to find the answer?

First, you must understand what large enterprises require in a chatbot. Consider these 7 guidelines for choosing a chatbot for your enterprise brand.

  1. Enterprise chatbots (or virtual assistants) should have secure multifactor authentication through biometrics to ensure your customers are who they say they are without ever leaving the virtual agent conversation. Enterprise chatbots must also meet the stringent data privacy and security standards of large enterprises.
  2. The chatbot must be implemented across the whole of a robust digital engagement platform - web, mobile web, mobile app, SMS, Facebook Messenger, IVR, social, and IoT. Chatbots can be instrumental in deflecting calls from IVR to digital channels, or in fielding common questions on digital channels, leaving the more complex customer tasks to a live chat agent. The VA should also have visibility into the full user journey, giving it context for the customer’s inquiry and enabling it to resolve the issue more quickly.
  3. Enterprise chatbots must be connected to live assistance. Nothing is more frustrating to a customer than for a virtual assistant to provide irrelevant responses to their inquiries. You must provide customers with real human contact via live chat when issues are more intricate than a frequently asked question. Plus, the live chat agent can be available to help the chatbot behind the scenes. At Nuance, this is called Nina Coach.
  4. The chatbot must be continuously learning, not only from its own interactions, but more importantly, from live chat conversations. Chat transcripts contain person-to-person conversations that provide valuable insights from which the VA can learn and grow.
  5. Enterprise chatbots should come with full access to professional services from experts who understand the inner workings of virtual assistance in an omni-channel environment. A virtual assistant should never be a set-it-and-forget-it product when it comes to the many variables of your enterprise customer engagement.
  6. An enterprise chatbot must be powered by Natural Language Understanding, enabling conversations between humans and computers. On an enterprise scale, NLU needs to have capacity for global languages, grammar accuracy checking, content variations, and should be scalable to accommodate the vastness of global communications.
  7. Enterprise chatbots should integrate analytics to view the big picture and seize optimization opportunities. Rich data, reporting and analytics provide powerful insights that are crucial to performance improvements.
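Guidelines 3 and 6 (NLU-driven answers backed by a live-agent safety net) can be sketched as a confidence threshold: the bot answers only when it is reasonably sure of the intent, and otherwise hands the conversation to a human along with the customer’s own words. The keyword-overlap score below is a toy stand-in for a real NLU confidence, and every name here (`FAQ`, `answer`, the intents) is hypothetical.

```python
# Toy intents: each maps to keywords; the overlap ratio stands in for NLU confidence.
FAQ = {
    "track_package": {"track", "my", "package", "where"},
    "reset_password": {"reset", "password", "forgot"},
}

def answer(utterance, faq=FAQ, threshold=0.5):
    """Return ('bot', intent) when confident, else ('live_agent', utterance)."""
    words = set(utterance.lower().split())
    best_intent, best_score = None, 0.0
    for intent, keywords in faq.items():
        score = len(words & keywords) / len(keywords)
        if score > best_score:
            best_intent, best_score = intent, score
    if best_score >= threshold:
        return ("bot", best_intent)
    # Hand off to a human, preserving the customer's own words as context.
    return ("live_agent", utterance)

print(answer("where is my package"))           # confident: the bot handles it
print(answer("the weasel flies at midnight"))  # low confidence: escalate to a live agent
```

Guideline 4’s continuous learning would then feed the escalated transcripts back into the model so the bot can answer those questions on its own next time; here that feedback loop is left out for brevity.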

With customer loyalty and revenue at stake, selecting the right chatbot for your organization can make or break your customer service success. Using the above guidelines can get you off on the right foot towards your selection. If you want to learn more and dive deeper into this process, join us for the upcoming webinar, Key Considerations for Selecting the Right Chatbot for Enterprise Customer Service on October 3, at 11am ET/8am PT. Register here!


Publish Date: September 28, 2017 5:00 AM

Waiting on hold will soon become a thing of the past

Recently, Paul Tepper, Head of Nuance Communications’ Cognitive Innovation Group (CIG), was interviewed by AI Business, a content portal for the latest news deciphering the impact of Artificial Intelligence (AI) in business. Paul sheds light on how AI is transforming the way businesses interact with and understand customers, while providing insight into the opportunities and challenges the industry faces moving forward.

Here are the highlights of this very informative interview:

“AI is the greatest tool for unlocking the vast and unprecedented pools of unstructured data. … It has the potential to remove the friction we see today across a wide array of customer experiences.”

“AI can bridge the gap between increasing consumer demands and a strained customer service model,” and waiting on hold for an agent will soon become a “thing of the past.”

“Predictive AI will enable us to know what a customer is calling about before they even say anything. … Conversational AI will maintain context across multiple interactions and channels.”

Paul sums up the power of conversational AI. “Speech enables people to talk to devices hands-free, without needing a screen. This is especially helpful when your hands are busy, but in general, it enables people to talk to devices the way they talk to each other in the most natural, human way. Today, Automated Speech Recognition (ASR) systems are as accurate as humans or beyond human accuracy.”

Paul shared some thoughts on an important area of discussion – the need to safeguard, regulate, and control AI. Paul believes that much of the public fear today is overblown: “Again, we are still far away from ‘general AI’ achieving human-level intelligence as AI today and for the foreseeable future will be great at focused tasks.”

He stresses that we must take measures to keep secure the large volumes of data on which AI is trained.

Paul also reveals the power of the Nuance Omni-Channel Platform and highlights Nuance Dragon Drive and Nina as AI examples.

The article was written in anticipation of the AI Summit San Francisco, September 27-28. Yann Motte, Vice President, Strategy and Business Development, Cognitive Innovation Group, will be presenting at the AI Summit on the topic of “Making AI for Consumer Engagement Real.”

Stay tuned to What’s Next to get Yann’s insights and takeaways from the conference!


Publish Date: September 18, 2017 5:00 AM
