While contact center, video, and UCaaS were popular topics at Enterprise Connect 2019, none was more pervasive than artificial intelligence. AI was everywhere, throughout the exhibit hall booths, in many of the sessions, and in every keynote.
Sometimes AI technology itself was the topic, but for the most part the conversations were about AI as the enabler… although in some cases, what we heard was just marketing fluff. (We’ve seen before how new terms or waves get exaggerated -- cloud, WebRTC, and digital transformation, to name a few.)
AI-fueled communications and collaboration doesn’t look particularly different from what we see today. In fact, most AI solutions endeavor to be nearly invisible. The general goal is to reduce friction and improve experiences with predictive and/or contextual behaviors. That will likely change as more mature technologies enable us to reimagine solutions.
AI is one of those topics that means different things to different people. Some are sticklers about definitions. In a broad sense, AI at EC19 was about computers performing human-like behaviors as well as learning from tagged data and outcomes. The main branch of AI fueling many enterprise communications solutions is known as natural language processing (NLP), technology that analyzes, understands, and generates human written and spoken languages. The modern popularity of voice control and assistants (Siri, Alexa, etc.) is based on NLP successfully interpreting natural language.
A speech UI is fine for setting timers, but in terms of communications and collaboration, it seems more like a solution to a problem that didn’t exist. Of course, there are major exceptions -- particularly when accommodating those with special needs. However, it’s been about four years since Amazon invited external developers to write apps for Alexa, yet a killer application has yet to break out.
The industry has evolved for years to develop a single click/press to join a meeting, so replacing that with “Alexa [or “whoever”], start my meeting” is a dubious upgrade.
Transcription, on the other hand, unlocks many new features and capabilities. It’s faster to read than listen, and transcriptions create a searchable and translatable record of a conversation. Deeper analytics can reveal additional details such as sentiment and education level. Transcription services were easily found at EC19. Many of the larger providers, including Microsoft, Google, and Avaya, are bundling services, while numerous companies, such as Amazon, Twilio, Otter.ai, and Panopto, offer a la carte options.
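To give a sense of how lightweight these a la carte options can be, here’s a minimal sketch of submitting a recording to Amazon Transcribe via boto3. The bucket path and job name are placeholders, and other providers such as Twilio and Otter.ai expose comparable APIs of their own.

```python
# Minimal sketch: asynchronous transcription with Amazon Transcribe (boto3).
# The S3 URI and job name are placeholders; assumes AWS credentials are configured.
import time
import boto3

transcribe = boto3.client("transcribe", region_name="us-east-1")

transcribe.start_transcription_job(
    TranscriptionJobName="ec19-demo-call",               # hypothetical job name
    Media={"MediaFileUri": "s3://my-bucket/call.mp3"},   # hypothetical recording
    MediaFormat="mp3",
    LanguageCode="en-US",
)

# Poll until the job completes, then print the transcript location.
while True:
    job = transcribe.get_transcription_job(TranscriptionJobName="ec19-demo-call")
    status = job["TranscriptionJob"]["TranscriptionJobStatus"]
    if status in ("COMPLETED", "FAILED"):
        break
    time.sleep(10)

if status == "COMPLETED":
    print(job["TranscriptionJob"]["Transcript"]["TranscriptFileUri"])
```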
The video vendors also had some new AI-powered features at EC19. The benefits of transcription described above apply to video as well. Both Microsoft and Zoom demonstrated subtitles and translation services within their video meeting solutions. The picture, too, is improved thanks to AI. Cisco demonstrated a clever aspect of its cognitive collaboration approach with name tags. Here, Cisco is combining facial recognition technology with corporate databases to generate onscreen, virtual name tags.
Now Microsoft, along with Highfive and Dolby, can bring a whiteboard into focus, enhance its image, and turn whoever is writing on it -- i.e., blocking the whiteboard -- transparent so the writing is visible to remote viewers, as shown below. But my favorite AI feature on video is auto-framing. Here the camera always seems to have the right pan, tilt, and zoom regardless of who or how many participants are in the meeting room. Many vendors, including Cisco, Logitech, and Poly, support this capability on their room systems.
Green screen trickery certainly isn’t new, but now AI is enabling us to do it without a green screen. Both Microsoft and Zoom offer a virtual green screen effect that allows users to replace their backgrounds with a photo. Backgrounds can violate personal privacy or be distracting, so Microsoft’s related approach of background blur is equally compelling.
Contact center solutions are also benefiting from AI in several ways. Chatbots can improve the effectiveness of self-service options, but the bigger play is the augmented agent. The general idea is that a bot silently monitors a conversation and offers the agent suggested links or ideas on how to resolve the issue at hand. Increased customer satisfaction is nice, but so too are the operational benefits, which include faster handling times and reduced agent training. Amazon Web Services demonstrated this in its keynote, but most of the contact center vendors are embracing the concept.
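To make the augmented-agent idea concrete, here’s a hypothetical sketch -- not any vendor’s actual implementation -- in which a bot watches the live transcript for trigger phrases and surfaces knowledge-base links to the human agent. The keyword map, URLs, and transcript lines are illustrative only.

```python
# Hypothetical sketch of an "augmented agent" assist bot: it watches the live
# transcript and surfaces knowledge-base articles to the human agent.
# The keyword map, URLs, and transcript lines are illustrative only.
KNOWLEDGE_BASE = {
    "refund": "https://help.example.com/kb/refund-policy",
    "reset password": "https://help.example.com/kb/password-reset",
    "shipping": "https://help.example.com/kb/shipping-status",
}

def suggest_articles(utterance: str) -> list[str]:
    """Return KB links whose trigger phrases appear in the customer's utterance."""
    text = utterance.lower()
    return [url for phrase, url in KNOWLEDGE_BASE.items() if phrase in text]

# Simulated transcript stream; in practice this would come from a real-time
# transcription service feeding the agent desktop.
transcript = [
    "Hi, I never received my order and I'd like a refund.",
    "Also, I can't reset password on your site.",
]

for line in transcript:
    for link in suggest_articles(line):
        print(f"Suggested for agent: {link}")
```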
The contact center will also see AI-enabled improvements in workforce management (WFM). TalkdeskIQ was the central differentiator of Talkdesk’s expansion into WFM, announced at EC. The company says its scheduling tool, for example, has automated and predictive capabilities that continue to improve. It also introduced the concept of surge pay to incentivize agents to stay on during busy periods.
Cisco intends to bring its cognitive collaboration skills to contact center routing. It’s common practice for contact centers to prioritize customers by past behavior, sometimes known as the lifetime value of a customer. Think of how airlines prioritize services and experiences for their elite customers. The problem with this approach is that it’s inherently rear-facing, based on past business behaviors. Cisco intends to expand this lifetime value option with a predictive capability that recognizes potentially valuable customers.
Of course, I would be remiss not to mention my Innovation Showcase, which featured six AI-powered solutions. Two examples here are Prodoscore, which evaluates the effectiveness of sales staff by aggregating and analyzing spoken and written communications, and TechSee, which offers augmented video services to assist interactions with customers and prospects.
AI is changing the industry and it’s an invitation to rethink what’s possible. Today, systems running our communications and collaboration have human-like senses. They can see, hear, and speak better than ever before. Plus, they can sense or interpret things that humans can’t -- radio signals, barcodes, ultrasonic sounds, patterns, and such.
We are at the beginning of a sweeping change. Most of these solutions are various forms of automation and prediction applied to existing processes and workflows. We are still trying to force-fit AI into our non-AI systems. For example, the meeting bot Google demonstrated still works around free/busy times instead of rescheduling based on meeting importance. A smarter future is coming.
I don’t need AI to predict that next year’s Enterprise Connect will also be a lot about AI.
Dave Michels is a contributing editor and analyst at TalkingPointz.