
Copilot Can Augment, Not Replace, the Work We Do

Microsoft 365 Copilot goes into general availability on Wednesday, November 1, 2023. To get an idea of the value to individual users and to the enterprise, and whether the assistant is worth the monthly cost, No Jitter talked to Omdia's Digital Workplace team leader, Tim Banting. We touched on what workplace challenges Copilot is meant to solve, whether it can transcend its training data, and how an enterprise can determine ROI for any AI assistant.

This conversation has been condensed and edited for clarity.

*

NJ: What are the workplace problems that Copilot is meant to solve for somebody?

Banting: I think the problem it's trying to solve is being in multiple places at once. It's very difficult to concentrate on what matters, why it matters, and who it matters to.

And I also think that we aren't very sensitive to how we work. We attend meetings that don't really have agendas and don't necessarily have outcomes, either. We can't be present in a meeting and also take notes effectively. There are a number of ways in which Copilot can help with meeting notes intelligently; it may even get to a point where it can decide whether we should have had a meeting at all. For example, we could have taken a vote, or we could have done a video clip or recorded something, because there were fifty people on a town hall and there were only three minutes for questions, and it wasn't the best use of everyone's time.

There are other uses like, how can we use Copilot to intelligently summarize documents or content from a reporting perspective? For example, I use Grammarly and that's very good at improving my writing. As I notice which of its suggestions work, it essentially helps train me to be more engaging or more convincing. Those are various ways Copilot could be effective, too.

[I’d also say that] there's nothing artificially intelligent about it – it's all about augmented insights or augmented capabilities.

NJ: When you say "augmented" … is it effectively an external prioritizer and taskmaster for the worker?

Banting: Yeah, I think that's a very good way to describe it. And then if you look at somewhere like the contact center side of things, it could be giving you insights on customer behavior, customer journey, customer experience. And I think what we will find is that the augmented insights or the intelligent insights you get on the customer contact side of the house will also map to the employee experience in the organizational back office.

NJ: I'm glad you mentioned Grammarly earlier – I use it too, and I reject suggestions. And when autocorrect came out, you had blogs devoted to the hilarious mistakes autocorrect would make when it anticipated your typing patterns. This brings up one of the key parameters of any assistive tool like this: how accurate it has to be to be useful. How good does an augmented tool like Copilot have to be for people to accept it as part of their workflow instead of making fun of its current capacities?

Banting: There's the problem – I think we refer to those [glitches] as hallucinations. I don't think we can blindly trust AI. I always look at AI with a healthy degree of skepticism, in the same way that I wouldn't trust driving to a self-driving car. I need to be there because there are some things it does really well, but there are other things where you sort of think, I better just check through this to make sure it's got the right context. I don't really see Copilot as being so transformational that it will replace everything and we can blindly trust it and completely rely on it.

NJ: AI is only as good as training data. Copilot's going to iterate based on your workplace interactions, so how do you handle the possibility that the training data just reinforces uncomfortable or unfair social patterns in meetings?

Banting: It's the way we are as humans. If you look at the Bay of Pigs invasion and the way in which [President John F.] Kennedy got into this groupthink mode, you need to have that diversity of thought in a meeting. I'm not sure how AI currently covers that, and it's an interesting question. What's the AI alternative to groupthink, and how do you bring diversity into the equation? Is the training data all it's geared up to be? All sorts of confirmation biases, groupthink – it gets very tricky, very quickly. It does require some way of redressing the balance.

AI needs an awful lot of training data to make sense of an environment, and I think that a lot of companies underestimate that. They think, great, we'll just install AI and it will make sense of our environment. But AI requires a huge amount of data to train it effectively, and even then that data could lead to groupthink and some very wrong decisions based on the data you have, because everyone talks to customers, but they don't talk to noncustomers…they don't talk to prospects. So if everyone buys this solution based on the traits you know, how do you get AI to look at the people who aren't buying it, and what are their traits?

NJ: It sounds like you've identified that it's kind of a global problem across AI in general – it's great at reinforcing patterns, but it may not prod you or give you the insight to look outside the pattern for opportunities.

Banting: Definitely, absolutely.

NJ: Some have argued summarization is going to be the killer feature in Copilot. But we just talked about the difficulty of getting a more holistic and contextual view of outcomes. I wanted to get your sense of how easy it is to accurately summarize a conversation or a meeting.

Banting: It does get things wrong, and there are things where you need domain-specific training. For example, if I talk about "UC," Copilot will probably translate that as "you see." But in our context, it's unified communications, so there's very specific domain expertise that Copilot needs to be trained on. It really does need an awful lot of training, and I don't know that AI is in a place where it understands the context well enough to know whether that's an acronym or an actual word.

My concern is that people will use AI to show up in a meeting and [the summarization tool] might just go wrong and miss something essential that should be a task and a priority. I think once you start looking at some of the meeting summaries and things that Copilot can and can't do, there's going to be a slap of realization across the face and people will be thinking, Do I really want to pay $30 per user per month for this in its current iteration?

NJ: Especially when Zoom just announced that their AI assistant will be offered free to paying customers. What I'm curious about is your assessment of whether or not that pricing model is sustainable or if we're going to see it change over the next few years.

Banting: Well, my thoughts are that Microsoft has topped out in terms of growth, and the only way it can recoup some of the $15 billion or so investment it made in OpenAI is to start charging for AI in the way it has done, and [one] way it can appease the shareholders is to increase the annual revenue per user. Microsoft, I think, will have to go to great lengths to show what the return on investment on $30 per user per month is.

NJ: You raise a good point about paying for AI because right now a lot of the AI-powered assistive tools that we take for granted are already built in. But developing these features and supporting them and changing them and getting training data, that costs money. Is Copilot at the beginning of vendors trying to habituate us to the idea that AI is something that customers certainly have to pay for?

Banting: In the contact center you can justify the return on investment in AI because you can turn around and say, well, if this saves 10 minutes of post-call processing, it means that agents can make [or take] more calls, and if it gives me an intelligent summary, they're not busy typing notes. It can reduce the workload in the contact center by doing some real transactional stuff very quickly.

It's a little more complicated looking at the enterprise as a whole, because of course AI might have saved me some time in a meeting, but will that time then be spent by me having a coffee or browsing the Web? Trying to demonstrate the ROI is really quite nebulous in the back office, but easy in the front office.