Monday, December 25, 2023

We’re the good guys so let us look over your shoulder…

https://richmond.com/zzstyling/column/microsoft-365-copilot-is-here-what-are-the-legal-risks-of-using-it/article_9004342e-9f9c-11ee-9b82-df3d9f4cc1df.html

Microsoft 365 Copilot is here. What are the legal risks of using it?

Copilot adds generative AI capability to core Microsoft Office applications such as Word, Outlook, Excel, Teams, and PowerPoint. It can be used to create, summarize, and analyze content within those applications.

The biggest concern is confidentiality. With many generally available generative AIs, such as ChatGPT, anything you put in a prompt may be used to train the AI, creating a risk that your input could appear in someone else’s output. The AI provider can also see your inputs and outputs.

Microsoft promises that, with Copilot, your inputs and outputs are kept confidential. It says it will not use your inputs or outputs to train its AI for other customers, and your input will not show up in the output of other Copilot users (at least not users outside your company).

But there is a major catch: Microsoft says it captures and may access your Copilot prompts and outputs for 30 days. It operates an abuse monitoring system that reviews that material for violations of its code of conduct and “other applicable product terms.” Microsoft says customers whose prompts contain sensitive, confidential, or legally regulated data can apply to Microsoft for an exemption from this abuse monitoring.


