
AI-Based Meeting Assistants: Do They Make an Impact?


Image: Kelcor
The idea of creating artificial intelligence (AI)-based meeting assistants has been around for a long time. I first encountered it professionally when meeting with a small group of cognitive scientists, professors, and their students in the early 2000s. This group was trying to figure out how to create a tool that would summarize meeting content. This attempt took place in the early days and way before the excellent speech-to-text transcription capabilities we now have, thanks to machine-learning and faster computer processing units (CPUs).
 
Many of us may recall some brilliant personal AI assistants from movies or books, including the notable J.A.R.V.I.S. (Just A Rather Very Intelligent System), Tony Stark’s personal assistant in the Iron Man series, and Winston, the quantum-computer assistant from Dan Brown’s “Origin” thriller. While these assistants could seemingly think and reason on a multitude of topics, today’s meeting assistants have far more modest capabilities.
 
A Google search on “virtual meeting assistant” returns a plethora of content, most of which relates to people who provide meeting scheduling and calendaring assistance remotely. To find the type of digital AI-based meeting assistant addressed here, search instead for “AI meeting assistant”; you’ll get the right hits.
 
So, what does an AI-based meeting assistant do?
 
First, let’s admit that there are a lot of nuances when one discusses AI-based assistants. Personal assistants, for example, are broad, general-purpose bots like Siri (Apple), Cortana (Microsoft), Alexa (Amazon), and Google Assistant. Each can respond to questions about a wide range of topics such as the weather, currency exchange, directions, math functions, web lookups, and more. With integrations, they can also control Internet of Things (IoT) devices such as thermostats, lighting, doorbells, locks, etc. Some of them can also schedule events on a calendar.
 
Meeting assistants are much narrower in scope. For example, Hendrix.ai, Fireflies.ai, and Avoma provide meeting speech-to-text transcription, meeting recording, and some level of meeting analysis, typically based on keywords or phrases. These are third-party applications that one can add to existing meeting platforms like Zoom and Webex, and sometimes even to regular audio conference calls.
 
These tools make meeting transcripts searchable by keyword. They also make the corresponding snippet of the meeting recording containing that keyword re-playable on demand. Some of the more advanced systems try to highlight text identifying important ideas, action items, decisions, and so forth. Below I’ve highlighted two meeting assistants and the capabilities each has to offer.
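 
To make the keyword-search idea concrete, here is a minimal sketch in Python of a transcript indexed by speaker and timestamps, with a simple search that returns the segments (and therefore the recording offsets) where a keyword was spoken. The data structure and names are my own illustration, not any vendor's actual implementation.

from dataclasses import dataclass

@dataclass
class TranscriptSegment:
    speaker: str      # who said it
    start_sec: float  # offset into the meeting recording
    end_sec: float
    text: str         # speech-to-text output for this segment

def find_keyword(transcript, keyword):
    """Return segments containing the keyword so the matching slice
    of the recording can be replayed on demand."""
    keyword = keyword.lower()
    return [seg for seg in transcript if keyword in seg.text.lower()]

# Hypothetical example: locate where "budget" was discussed.
transcript = [
    TranscriptSegment("Ana", 12.0, 17.5, "Let's review the budget first."),
    TranscriptSegment("Raj", 17.5, 24.0, "Action item: I will update the forecast."),
]
for seg in find_keyword(transcript, "budget"):
    print(f"{seg.speaker} at {seg.start_sec}s: {seg.text}")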
 
The Webex Assistant
Given the recent WebexOne event in which Cisco made several announcements around Webex, let’s focus on Webex Assistant and the capabilities it offers.
 
Webex Assistant has been in development for quite a while. Some years ago, Cisco showed demos of an intelligent assistant that could schedule meetings, reschedule meetings, and let people know they would be late to a meeting.
 
Webex Assistant has expanded the focus to include additional valuable use cases:
 
1. Transcribing, captioning, and highlighting meeting conversations in real time. Webex Assistant allows Webex users on any device running Webex to start speech-to-text transcription in which the words spoken by each user are captured and tagged with the name of the person who said them. Optionally, a closed caption of the transcription can appear on the user’s screen in real time. Cisco announced that language translation is coming in 2021, with the closed captioning appearing in the user’s native language.
 
Webex Assistant supports three highlight groups: Notes, Action Items, and Decisions. During a meeting, highlighting important words or phrases can occur in two different ways:
 
a.) Listens for “trigger words.” The natural language understanding engine in Webex Assistant triggers highlights based on words and phrases like “action item,” “can you schedule,” “follow up,” “the decision is,” “we agree,” “take a note,” and “capture that point.” When these kinds of words and phrases are used during the meeting, Webex Assistant automatically creates a highlight. Webex Assistant uses the labels “Action Item,” “Note,” “Summary,” and “Decision” by default, but a person can change these labels after the meeting while reviewing the transcript. (A simplified sketch of this kind of trigger matching appears after this list.)
b.) Uses a “wake word.” The wake word for Webex is “OK Webex.” If a user says, “OK Webex,” the Webex Assistant listens for a command. A person can say, “OK Webex, highlight that,” and the most recent phrase will be highlighted. Or a person can say, “OK Webex, take this action to such and such,” and an action item highlight will be created. And finally, saying “OK Webex, the decision is this or that” will create a decision highlight.
2. Review a meeting after it ends. Once the meeting finishes, a person can review the transcript and listen to the audio. Post-meeting reviews require the meeting to be recorded for the transcript to be available. Webex supports keyword search that brings a person back to the place in the meeting where a specific word or phrase was said. The reviewer can listen to a snippet of audio to hear how someone pronounced the keyword. Optionally, reviewers can highlight words and phrases they find important, unhighlight unnecessary sections, or change the text label for a particular highlighted section to anything they wish. After highlighting and reviewing the transcript, Webex Assistant allows reviewers to share specific highlights via email to enable fast follow-up.
3. Control meetings in rooms with Webex Assistant for Webex Rooms systems using voice. Capabilities include reserving a room, starting or joining a meeting, adding someone to a meeting, dialing out to someone else, recording a meeting, changing volume, or ending the meeting.
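 
As a simplified illustration of the trigger-phrase idea in item 1a above, the Python sketch below matches utterances against a small table of trigger phrases and assigns a highlight label. Webex Assistant's natural language understanding engine is certainly more sophisticated than string matching; the phrases and labels here are illustrative only.

# Simplified sketch of trigger-phrase highlighting; illustrative only.
TRIGGERS = {
    "action item": "Action Item",
    "follow up": "Action Item",
    "the decision is": "Decision",
    "we agree": "Decision",
    "take a note": "Note",
    "capture that point": "Note",
}

def highlight(utterance):
    """Return (label, utterance) if the utterance contains a trigger phrase."""
    lowered = utterance.lower()
    for phrase, label in TRIGGERS.items():
        if phrase in lowered:
            return label, utterance
    return None

print(highlight("Take a note that the launch moves to Q3."))
# -> ('Note', 'Take a note that the launch moves to Q3.')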
 
Cortana in Microsoft 365
Cortana is Microsoft’s voice assistant, used in Microsoft Teams but only on iOS and Android mobile devices or Microsoft Teams Displays. It presently doesn’t work on Teams running on Windows 10 or Mac (at least I can’t see how to make it work on these devices).
Cortana is still more of an individual assistant than a meeting assistant. It allows one to:
  • Reply to Teams chat messages using voice
  • Check a calendar
  • Share a document
  • Join a meeting
  • Add a person to a meeting
  • Present a file during a meeting
It doesn’t handle many of the “in-meeting” controls like highlighting text or identifying action items. That said, if or when Microsoft puts its mind toward building a meeting assistant using Cortana, it could do an awesome integration with other parts of Office 365. This could include creating highlights and emailing them out to people, producing tasks and automatically making them part of a person’s Tasks app, and scheduling follow-up meetings that automatically populate on everyone’s calendar.
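 
To give a sense of what such an integration might look like, here is a hypothetical sketch that pushes a highlighted action item into Microsoft To Do through the Microsoft Graph REST API. The token and list ID are placeholders, and this does not represent any existing Cortana meeting-assistant feature.

# Hypothetical sketch: turning a meeting highlight into a Microsoft To Do task
# via the Microsoft Graph REST API. ACCESS_TOKEN and TASK_LIST_ID are
# placeholders; this is not an existing Cortana meeting-assistant feature.
import requests

ACCESS_TOKEN = "<OAuth token with Tasks.ReadWrite scope>"
TASK_LIST_ID = "<id of the user's To Do task list>"

def create_task_from_highlight(highlight_text):
    url = f"https://graph.microsoft.com/v1.0/me/todo/lists/{TASK_LIST_ID}/tasks"
    resp = requests.post(
        url,
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        json={"title": highlight_text},
    )
    resp.raise_for_status()
    return resp.json()

# e.g., create_task_from_highlight("Follow up with finance on the Q3 forecast")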
 
Where Is the Future for Meeting Assistants Headed?
Looking into my crystal ball, I predict meeting assistants will become far more capable, something like sentiment analysis on steroids. For example, given the advancements in AI-based facial detection, I imagine a future where a meeting assistant can examine facial expressions and other visual cues to gauge whether people are paying attention during a meeting, whether they are bored, whether they agree or disagree, and whether they seem to understand the content.
 
I also visualize the day when real-time “meeting assist” is a reality. It’s kind of like “agent assist” in the contact center, but in this scenario, an AI-based meeting assistant listens in to a meeting, follows the dialog using natural language understanding, and exposes documents or salient information in real time to help the individual participant. Perhaps it might even suggest what someone in the meeting could say next, sort of like how an AI assistant can prompt a contact center agent on what to say. In this case, these AI-based meeting agents become much more like J.A.R.V.I.S. and Winston. They will be primitive at first, but over time, they’ll get so good that those who don’t carry one into a meeting will be at a serious disadvantage.
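 
As a toy illustration of the “meeting assist” idea, the sketch below matches streaming transcript text against a small document index and surfaces relevant items. The index, the keyword-overlap scoring, and the document names are all placeholders; a real system would rely on natural language understanding rather than simple word matching.

# Toy sketch of "meeting assist": surface documents relevant to what was
# just said. Everything here is a placeholder; a real system would rely on
# natural language understanding, not keyword overlap.
import re

DOCUMENT_INDEX = {
    "2021 budget spreadsheet": {"budget", "forecast", "spend"},
    "Product roadmap deck": {"roadmap", "launch", "q3"},
}

def suggest_documents(utterance, max_hits=2):
    words = set(re.findall(r"[a-z0-9]+", utterance.lower()))
    scored = [
        (len(words & keywords), name)
        for name, keywords in DOCUMENT_INDEX.items()
    ]
    return [name for score, name in sorted(scored, reverse=True) if score > 0][:max_hits]

print(suggest_documents("Can we revisit the Q3 launch on the roadmap?"))
# -> ['Product roadmap deck']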

