We’re less than two weeks away from the November 1 general availability release of the much-anticipated Microsoft Copilot AI personal assistant, so this week’s Enterprise Connect virtual summit on Microsoft Teams was perfectly timed to help enterprise decision-makers understand where these capabilities stand and what they could mean for their end users.
In his keynote talk for the summit, Kevin Kieller of EnableUC offered a deliberately provocative formulation: “AI doesn’t matter,” he said.
What he meant, of course, is that the technology itself isn’t what’s important; it’s how you use it. “What matters is improved outcomes,” Kieller said. Does it help users complete tasks faster, produce better-quality output, and generally improve the experience?
His final advice to attendees came in the form of a series of questions to ask yourself about AI functionality:
- What does success look like?
- How will you know collaboration AI is benefiting your organization?
- Who has access to your confidential data when using AI features?
- Are you sending confidential data (or confidential customer data) outside your existing security boundary?
- Do your users understand that LLMs [large language models] can “hallucinate”?
- Does AI expose your organization to “oversharing”? [i.e., data that was shared in places where it shouldn’t have been is now accessible to AI]
- Are there bias or copyright concerns associated with the LLM you are using?
- Does the value exceed the costs? (Who needs this capability?)
That last point is one that a lot of the industry will be watching as Microsoft and Google both roll out AI assistants priced at $30 per user per month (by contrast, Zoom’s AI assistant is added on at no additional cost to customers).
Everyone wants to deploy the AI they’ve been hearing about in the news and the vendor hype: the one that can do whatever you need, perfectly, the first time, saving you hours of drudgery. Needless to say, that AI doesn’t actually exist, so there are many lessons still to be learned about how cost-effective these AI-based personal assistants really can be.
We’ll gain some of these lessons between now and Enterprise Connect 2024 next March in Orlando, and I’m excited that Kevin Kieller will be delivering a talk at EC24 about the role of LLMs and AI in collaboration platforms. He’ll be partnering with Brent Kelly of Omdia for the presentation, and together they’ll explore how some of the key vendors are implementing LLMs and AI-based features in their collaboration platforms. They’ll offer details on what you need to know about the AI technology your collaboration vendor uses, and they’ll address a point I believe will become more important as these capabilities roll out: The opportunities and challenges of using generative AI in a multi-vendor environment.
So we’ve reached the moment when we start to find out what AI really can do for productivity when (or if?) it’s deployed at scale. Hold on; it should be a wild ride.