IBM and Avaya, and Cisco, Oh My!
Michelle, ma belle
these are words that go together well
- Lennon and McCartney
On a recent episode of the public radio program Marketplace, host Kai Ryssdal asked his guest to describe her job in five words or less. I can't remember what her response was, but it made me think about what mine might be. My first reaction was something akin to "Waiting until retirement day," but after some serious pondering, I finally came to, "I make cool things." While my days certainly have their moments of boredom, unnecessary processes, and tedious tasks, I am lucky enough to have the freedom to spend much of my work week exploring new technologies as I hone my programming skills.
Case in point is my recent endeavor described in the article, "Cisco, Meet Avaya. Avaya, Meet Cisco." In case you missed it, I wrote of how I connected Arrow Connect IoT devices to Webex Teams via Avaya Breeze. I showed how a sensor event (e.g. temperature > 90 degrees) can dynamically create a Webex Teams space, add participants to the space, and periodically send telemetry updates to the entire team. This bridging of disparate technologies created a workspace that allowed the right people to be immediately and visually notified of critical problems. It also gave them tools to help diagnose and solve those problems in real time.
As cool as that was, however, it was a one-sided conversation. Having an IoT device "talk" to people is pretty darn amazing, but wouldn't it be even more astounding if those same people could talk back to the device? Better yet, how about letting them speak as they do with another human being? So, instead of coming up with a strict lexicon of commands and phrases, I want folks to be able to use the same words and sentence structures they employ with every other person on their virtual team.
We call this human-to-machine interaction Natural Language Processing, and it's revolutionizing the way we interface with artificial intelligence (AI) platforms. You find it in consumer voice assistant products such as Apple's Siri and Amazon's Alexa, and in the business world, it's at the heart of IBM's Watson and Google's Dialogflow. It's being added to contact centers, help desks, and a slew of other business applications.
This leads me to my latest "cool thing." I began with the software I wrote for my Cisco/Avaya/IoT platform, added a few enhancements, and then surrounded everything with an AI wrapper. In the end, I had a bidirectional solution that allowed sensors to reach out to people and for people to reach back to those same sensors. This essentially adds an AI bot into a conversation as the really smart coworker who has all the answers, but never asks for a day off.
Without getting too geeky on you, allow me to explain the flow of what I am calling an AI Bot for Webex Teams. Step by step, the flow looks like this:
- A Webex Teams user enters a message into a Webex Teams space.
- Webex Teams forwards the message to a listener (known as a webhook) attached to the space. The listener is looking for messages that begin with a particular keyword. Similar to how Amazon uses the keyword "Alexa" to begin processing a message, in this case I programmed the listener to look for the name, "Michelle." If that keyword is encountered, the listener forwards the message to a Breeze workflow.
- The Breeze workflow uses IBM Watson Assistant to determine the intent of the message. An intent is exactly what it sounds like -- the intention or purpose of an utterance of human speech. My Watson Assistant has been trained to monitor, control, and report on IoT sensors.
- The Breeze workflow uses the intent to determine what action should take place. For example, the intent "incident" instructs the workflow to create a ServiceNow incident report. The report is populated with the current telemetry values from the IoT sensor associated with the workflow.
- Breeze sends the appropriate message back to Webex Teams. For instance, "The current temperature is 74.23 degrees," or "LED 1 has been turned off."
- Webex Teams displays the message to the space and all its team members.
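The flow above can be sketched in a few dozen lines of Python. This is only an illustration of the logic, not the actual Breeze workflow: the `classify_intent` function is a stand-in for the call to IBM Watson Assistant, and the reply strings stand in for the messages Breeze posts back through the Webex Teams REST API. All function names and the sample telemetry values are my own placeholders.

```python
# Sketch of the bot's message-handling flow. Watson Assistant and the
# Webex Teams / ServiceNow calls are stubbed out; only the control flow
# from the article is shown.

KEYWORD = "Michelle"  # the bot's wake word, per the article

def addressed_to_bot(text):
    """Return True when a space message begins with the bot's keyword."""
    return text.strip().lower().startswith(KEYWORD.lower())

def classify_intent(text):
    """Stand-in for IBM Watson Assistant: map an utterance to an intent."""
    lowered = text.lower()
    if "temperature" in lowered:
        return "telemetry"
    if "incident" in lowered or "ticket" in lowered:
        return "incident"
    if "led" in lowered:
        return "control"
    return "unknown"

def handle_message(text, telemetry):
    """Run one message through the flow: keyword gate, intent, action, reply."""
    if not addressed_to_bot(text):
        return None  # the bot doesn't speak until spoken to
    intent = classify_intent(text)
    if intent == "telemetry":
        return f"The current temperature is {telemetry['temperature']} degrees."
    if intent == "incident":
        # In the real workflow, Breeze opens a ServiceNow incident here,
        # populated with the sensor's current telemetry values.
        return "A ServiceNow incident has been opened with the current telemetry."
    if intent == "control":
        return "LED 1 has been turned off."
    return "Sorry, I didn't understand that."

if __name__ == "__main__":
    reading = {"temperature": 74.23}
    print(handle_message("Michelle, what is the temperature?", reading))
```

Notice that a message not addressed to "Michelle" simply falls through and produces no reply, which is exactly the "doesn't speak until spoken to" behavior described below.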
Here is a closer look at the Webex Teams application. Notice how the user asks for information using regular speech. Notice also how "Michelle" doesn't speak until spoken to.
Here's a snippet of the Breeze workflow that ties everything together. Note the IoT, ServiceNow, IBM Watson, and Webex Teams tasks. This is where the bot logic lives.
The addition of AI and workflow technology into a team collaboration platform (be it Cisco, Microsoft, Slack, Avaya CPaaS, etc.) is pretty darn amazing. It's like having a personal butler that waits patiently until instructed to do something.
While the workflow described above is geared towards IoT, that's only one example of what's possible. Off the top of my head, I can envision bots for healthcare, retail, transportation, energy production, and manufacturing.
As I wrote in my previous article, these solutions are made possible because of open interfaces. Arrow Connect, Avaya Breeze, IBM Watson, ServiceNow, and Webex Teams all expose RESTful web service APIs that allow developers like me to write software that glues these products together into something new and completely different. It's like a Reese's peanut butter cup, without being limited to only peanut butter and chocolate. The combinations are endless.
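To make "open interfaces" concrete, here is a hedged sketch of the kind of glue code involved: registering a webhook with the Webex Teams REST API so that new messages in a space are forwarded to a listener. The endpoint shape follows Webex's published webhooks API, but the URL, room ID, and token values below are placeholders of my own, not values from this project.

```python
# Registering a Webex Teams webhook that forwards new messages in a space
# to a listener URL. Token, target URL, and room ID are placeholders.
import json
import urllib.request

WEBEX_WEBHOOKS_URL = "https://webexapis.com/v1/webhooks"

def build_webhook_payload(name, target_url, room_id):
    """Payload asking Webex Teams to POST each new message to target_url."""
    return {
        "name": name,
        "targetUrl": target_url,
        "resource": "messages",   # fire on message activity...
        "event": "created",       # ...specifically, new messages
        "filter": f"roomId={room_id}",  # only for this space
    }

def register_webhook(token, payload):
    """POST the registration to the Webex Teams REST API."""
    req = urllib.request.Request(
        WEBEX_WEBHOOKS_URL,
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

if __name__ == "__main__":
    payload = build_webhook_payload(
        "michelle-listener", "https://example.com/breeze/hook", "ROOM_ID")
    print(payload["resource"], payload["event"])
```

Every product named above offers some variation of this pattern: an authenticated HTTP endpoint, a JSON payload, and a documented event model, which is what lets a single workflow stitch them all together.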
And best of all, it allows me to enthusiastically describe my job as: "I make cool things."