The eagle has landed. The hype bomb has detonated. The future is here. Microsoft Copilot, “a digital companion for your whole life,” as the company calls it, is live.

Following three quarters of fanfare around Microsoft’s AI features — including Copilot generative AI tools for Microsoft 365 productivity apps — one would think customers would break the Internet clamoring to buy seat licenses, sort of like when Taylor Swift tickets go on sale. Or perhaps a good old-fashioned Black Friday fracas, where everyone’s either searching for the cheapest flat-screen TV or streaming Netflix. So far, nothing like that has been reported.

“Knowledge [and front-line] workers are able to rid ourselves of drudgery,” promised Microsoft CEO Satya Nadella in a September preview in advance of today’s go-live for Microsoft 365 Copilot. “Business leaders finally have a new tool to be able to reimagine, simplify [and] optimize business processes, whether it’s sales, marketing, finance, customer service, what have you.”

Microsoft will likely reveal more details at its Ignite user conference Nov. 14-17. But CIOs and other IT decision-makers have enough information to think about the feasibility of piloting Copilot in their organizations. Copilot AI is priced at $30 per user, per month — a 60% increase over the $50 per user, per month they already pay for the premium license for the productivity apps. Testing it will reveal whether Copilot is really worth the cost, maintenance, troubleshooting, monitoring and user education it will require.

Much potential upside

“General availability” for Microsoft 365 Copilot AI tools means you can now buy all you want. But that doesn’t mean everyone will rush out to buy it, a la the latest Apple bauble. In fact, it will take some time for users to wrap their heads around how Copilot will commingle with their human-led processes.

But they will come, said Daniel Newman, Futurum Research principal analyst.

“I tend to believe that Copilot adoption will be strong, as medium and large enterprises see this as a way to quickly implement GenAI with critical business applications, seeking greater productivity and efficiency at a relatively low expense,” Newman said. “I do, however, see rollouts being incremental, rather than everyone all at once.”

At the enterprise level, where everyday documents number into the millions and billions, Copilot's benefits — like those that other vendors such as Box and OpenText have released or previewed — are very clear: summarizing documents, extracting content, pinpointing search results, suggesting calendar organization. Who doesn't spend part of their day running down a stitch of copy, a piece of contact information or a refresher on the next project? We all do.

We also waste our time explaining what happened in the meeting to people who were late or couldn’t make it. We add people to a team and need to summarize progress on the project so far. Customer service agents need to clip pieces of giant help documents to solve a customer’s problems — that is, if they can find the information before the customer tires of being on hold or leaves a chat. These tasks grind away at our focus, our energy, our loyalty to our jobs.

Give all that stuff over to the bots. Sure. No-brainer. Maybe it won’t give us so much extra time that we will dream up hitherto unthinkable wonderful products and services our company will create next, as OpenAI founder Sam Altman hopes it will, but maybe it will keep us from getting as many headaches from squinting at our monitors.

Deep Analysis founder Alan Pelz-Sharpe said humans couldn't possibly assign metadata to the sheer volume of unstructured, uncurated documents many enterprises manage — the work it would take to make them findable in the sea of files. There just aren't enough humans to do it, even if you could get a CFO to sign off on such a boondoggle. That's a perfect job for AI.

So what’s not to like?

CIOs have to consider policies and procedures to monitor and detect problems that might come with the adoption of this new class of generative AI productivity tools for document managers, IT workers, data analysts and front-line workers.

A few of them come to mind for data security expert Tal Zamir, CTO at Perception Point, an Israeli security software company. Threat actors have long used “CEO imitation” email phishing to hack into companies. They have been successful even without Copilot’s “sound like me” feature, which auto-generates text in a writer’s tone and syntax. And you don’t even have to speak the CEO’s native language, because the AI can — or soon will — translate it for you.

Suppose a company has data loss prevention policies to restrict the sharing of sensitive information, such as human resources files or intellectual property. The company might even be able to prevent users from cutting and pasting sensitive content or emailing it.

But what happens when a generative AI tool allows someone with the right privileges to summarize a sensitive document? IT teams will have to determine how to tag that content and keep it locked down, lest their security protocols spring a leak.

Then there’s the credibility of content. When generative AI hallucinates — or even just comes up with wordings or descriptions that color outside the lines of a company’s values, tone, branding, style, culture or other parameters important to a particular workplace — it poses a problem. Copilot implementation teams had better plan for this in advance.

Lastly, Microsoft and Adobe indemnify users against copyright claims in the content their AI tools generate. But let the buyer beware: Even if Microsoft claims to stand behind you if its generative AI gets you in trouble, do you really want to risk besmirching your company’s good name by being accused of plagiarism and having to defend yourself in court?

Despite all these risks — and more — that may come with Microsoft 365 Copilot, IT buyers can take solace in one facet of its deployment: Knowing what employees use to do their jobs is better than not knowing.

“Using something like Copilot is definitely better than employees going around and using random generative AI technologies that are public, free offerings — sometimes legit free offerings from Google or OpenAI,” Zamir said. “But sometimes, it’s shady cybercriminals taking advantage of the hype to create fake generative AI models baked into AI applications so that they get sensitive data.

“The introduction of Microsoft Copilot in the enterprise gives employees a way to use this technology in a safer way, because it is on Microsoft’s hosted servers that are disconnected from the public OpenAI models,” Zamir added.

So, enjoy the fireworks and enthusiasm around your new helpful AI office coworkers. But keep a close eye on them, lest they run amok and get you in trouble.

This article first appeared in TechTarget, written by Don Fluckinger on November 2, 2023.