What Microsoft’s Acquisition of Nuance Could Mean for the Future of Workplace AI

This post was originally published by Zachary Comeau at My Tech Decisions

Microsoft’s recent announcement that it is acquiring healthcare artificial intelligence and voice recognition company Nuance could signal a new era of voice-enabled technologies in the enterprise.

Nuance’s speech recognition technology for medical dictation is currently used in 77% of U.S. hospitals, and Microsoft plans to integrate those technologies with its Microsoft Cloud for Healthcare offering that was introduced last year.

However, the purchase price of $19.7 billion indicates that Microsoft has plans to bring more voice recognition technology to other vertical markets aside from healthcare.

We sat down with Igor Jablokov, founder and CEO of augmented AI company Pryon and an early pioneer of automated cloud platforms for voice recognition that helped invent the technology that led to Amazon’s virtual assistant Alexa, to talk about Microsoft’s move and how intelligent voice technology could impact the workplace.

What do you make of Microsoft’s acquisition of Nuance?

So look, it’s going to be a popular thing to talk about moves in healthcare, especially as we’re still in the throes of this pandemic, and most of us, I’m sure, had a challenging 2020. So that’s a great way to frame the acquisition, given Nuance’s medical dictation and other types of projects that they inserted into the healthcare workflow. That makes sense. But would anybody actually pay that much for something just for healthcare? I would imagine Microsoft could have had as big an impact, if not larger, going directly for one of those EHR companies like Epic. So that’s why I’m like, “All right, healthcare, that’s good.” Is it going to be a roll-up where they go after Epic and places like that, where there’s already lots of stored content, and then vertically integrate the whole thing? That’s the next play that I would see: they’re gunning to own that workflow. So that’s that piece. On the other hand, I see it as a broader play in employee productivity, because whenever Microsoft really opens up their pocketbook like they did here – this was their second-largest acquisition – it’s typically to reinforce the place where they’re the strongest and where their cash cow is, and that’s employee productivity.

Microsoft has never been solely focused on healthcare. Their bread and butter is the enterprise. So how can the same technologies be applied to the enterprise?

You’re exactly right. Now, why do we have special knowledge of the Nuance stuff? Well, the team here at Pryon actually developed many of the engines inside of Nuance. Many years ago, Nuance felt that their engines were weak and that IBM’s were ahead of the curve, if you will. I believe around the 2008 downturn, they came in to acquire the majority of IBM’s speech assets and related AI technologies. And my current chief technology officer was assigned to that project, collaborating with them to integrate it into their work for half a decade. So that’s the plot twist here. It is true that these engines were behind Siri and all these other experiences, but in reality it wasn’t Nuance engines – it was IBM engines, acquired through Nuance, that ended up getting placed there, because of how highly accurate and flexible they were.

So let’s start with something like Microsoft Teams. To continue bolstering Teams with things like live transcriptions – to put a little AI system inside of Teams that has access to the enterprise’s knowledge as people are discussing things – it may not even be any new product. It could just be all the things that Microsoft is doing already, but they needed more hands on deck; this is a massive acqui-hire in terms of having more scientists and engineers working on applied AI. So I would say a third of it is that they need more help with things they’re already doing. A third of it is a healthcare play, though I would watch for other moves toward vertical integration there. And then a third is for new capabilities that we haven’t experienced yet on the employee productivity side of Microsoft.


Microsoft already has their version of Siri and Alexa: Cortana. What do you think about Cortana and how it can be improved?

They attempted to make it their thing everywhere. They just pulled it off the shelves – or the proverbial shelves – on mobile, so it no longer exists as consumer tech. The only place it lives now is on Windows desktops, and that’s not a great entry point. Then they tried doing the mashup, where Cortana could be called via Alexa and vice versa. But when I talked to the folks at Amazon, I said, “Look, you’re not going to allow them to really do what they want to do, because they’re not going to allow you to do what you want to do on those desktops.” So it almost ends up being this weird thing, like calling into a contact center and being transferred to another contact center. That’s what it felt like. In this case, Alexa got the drop on them, which is strange and sorrowful in some ways.

Other AI assistants like Alexa are much further along than Cortana, but why aren’t we seeing much adoption in the enterprise?

There are multiple reasons for that. There’s the reason of accuracy. And accuracy isn’t just that you say something and you get an answer – where do you get it from? It has to be tied into enterprise data sources, because most enterprises are not like what we have at home, where we buy into the Apple ecosystem, the Amazon ecosystem, the Google ecosystem. They’re heterogeneous environments with bits and pieces from every vendor. The next piece is latency – getting quick results that are accurate at scale. And the last thing is security. There are certainly things that Alexa developers do not get access to, and that’s not going to fly in the enterprise space. One of the things we hear from enterprises, in pilots and in production, is that what they’re putting into these AIs is starting to be their crown jewels – the most sensitive things they’ve got. And if you actually read the terms and conditions from a lot of the big tech companies that are leveraging AI, they’re very nebulous about where the information goes. Does it get transcribed or not? Are people eyeballing this stuff or not? So most enterprises are like, “Hold on a second – you want us to put in our secrets? How we make these microchips? The M&A deals we’re about to do?” They’re uncomfortable with that. It’s just a different ball of wax. And that’s why I think it’s going to be purpose-built companies that will be developing enterprise AIs.

I think there will be a greater demand for bringing some of these virtual assistants we all know to the enterprise – especially since we’ve been at home for over a year and using them in our homes.

Your intuition is spot on. It’s not even so much people coming from home into work environments – it’s a whole generation that has been reared with Alexa and Siri and these things. Look at the majority of user experiences at work – Concur or SAP or Dynamics or Salesforce, or any of these types of systems. That generation is going to toss grenades at this stuff over time, especially as they elevate in authority and expand their influence over the course of their careers. I think there’s going to be a new generation of enterprise software purpose-built for these folks who are going to be taking over business. That’s basically the chink in the armor for any of these traditional enterprise companies. If you look at Oracle, IBM, HP, Dell – any one of them – I don’t know where they go, at least on the software side. When a kid has grown up with Alexa, and there they are at 26 years old, they’re like, “No, I’m not going to use that.” Why? At home I can just blurt something out and get an instant answer, but here I am running a region of Baskin-Robbins, and I can’t say, “How many ice cream cones did we sell when it was 73 degrees out?” and get an answer one second later. So that’s what’s going to happen. As a company, since our inception, we’ve been architected not for the current world but for this future world. Elements of this are already in production, as we announced with Georgia-Pacific in late January, and we’re working through it. And I have to say, one of the biggest compliments I get, whether I’m showing this to big enterprises or government agencies, is that fundamentally they’re like, “Holy smokes, this doesn’t feel like anything else that we use.”
But behind the scenes, not only are we using top-flight UX folks to develop this, we’re also working with behavioral scientists and the like, because we want people to want to use our software, not have to use our software. Most enterprise software gets chosen by the CIO, the CTO, the CISO, and people like that, and most of them are thinking about checking off boxes on functionality. And most enterprise developers crank out their blue-and-white interface, get the feature functionality in there, and call it a day. I think they’re missing real opportunities by not finishing the work.
