When deploying Microsoft Copilot, how do we prevent people from accessing confidential information?

Giving everyone in the business access to enterprise large language models and answer engines like Microsoft Copilot can deliver significant productivity gains, when the tools are used well.

With that deployment come concerns about access to personal or confidential information such as performance reviews, contracts, salaries, or disputes.

An employee might not have access to the folder on the shared drive that contains that information. But if they use an AI tool with enterprise search capability that can reach those folders, could the wrong information fall into the wrong hands?

Short answer: yes, unless you have thought about permissions and access controls from the start.

When deploying Microsoft Copilot across an enterprise, preventing unauthorized access to confidential information requires a combination of technical controls, user permissions, and data governance strategies.

Here are some key steps to consider:

1. Implement Appropriate Access Controls

Ensure that users have access only to the information they need to perform their job functions. Microsoft Copilot respects user-specific permissions on any content it retrieves, generating responses based only on information the user already has permission to access.
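To sanity-check access before rollout, it helps to inspect who actually holds permissions on a sensitive file. Here is a minimal Python sketch against the Microsoft Graph permissions endpoint; the token acquisition is assumed (you would use MSAL in practice), and DRIVE_ID and ITEM_ID are placeholders you would supply from your own tenant.

```python
# Minimal sketch: list who has access to a sensitive file via Microsoft Graph.
# Assumes an access token with Files.Read.All or Sites.Read.All;
# DRIVE_ID and ITEM_ID are placeholders for the document library and file.
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
TOKEN = "<access-token>"   # acquire via MSAL in a real script
DRIVE_ID = "<drive-id>"    # placeholder
ITEM_ID = "<item-id>"      # placeholder

resp = requests.get(
    f"{GRAPH}/drives/{DRIVE_ID}/items/{ITEM_ID}/permissions",
    headers={"Authorization": f"Bearer {TOKEN}"},
    timeout=30,
)
resp.raise_for_status()

# Each permission entry carries the roles granted and who holds them.
for perm in resp.json().get("value", []):
    roles = ", ".join(perm.get("roles", []))
    granted = perm.get("grantedToV2", {}).get("user", {}).get("displayName", "link/group")
    print(f"{granted}: {roles}")
```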

2. Use Microsoft 365 Permission Models

Use the permission models available in Microsoft 365 services, such as SharePoint, to ensure the right users or groups have the right access to the right content within your organization. This includes permissions granted to users outside your organization through inter-tenant collaboration features, such as shared channels in Microsoft Teams.
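A common failure mode here is oversharing via links rather than direct permissions: an "anyone" or organization-wide sharing link makes content discoverable well beyond its intended audience, which is exactly how Copilot can surface it to the wrong people. The hedged sketch below reuses the same Graph permissions endpoint to flag broad sharing links on a file; the token and IDs are placeholders as before.

```python
# Minimal sketch: flag broad sharing links ("anyone" or org-wide) on a file.
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
TOKEN = "<access-token>"
DRIVE_ID = "<drive-id>"
ITEM_ID = "<item-id>"

resp = requests.get(
    f"{GRAPH}/drives/{DRIVE_ID}/items/{ITEM_ID}/permissions",
    headers={"Authorization": f"Bearer {TOKEN}"},
    timeout=30,
)
resp.raise_for_status()

for perm in resp.json().get("value", []):
    link = perm.get("link")
    # Sharing-link permissions carry a "link" facet whose scope can be
    # "anonymous" (anyone with the link) or "organization" (everyone in tenant).
    if link and link.get("scope") in ("anonymous", "organization"):
        print(f"Broad link found: scope={link['scope']}, type={link.get('type')}")
```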

3. Apply Sensitivity Labels and Encryption

Use Microsoft Purview sensitivity labels to classify and protect documents and emails across your organization. Encryption can be applied by sensitivity labels or by restricted permissions in Microsoft 365 apps using Information Rights Management (IRM) to prevent unauthorized access to sensitive content.
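If you want to apply labels programmatically, for example to bulk-label an HR folder before rollout, Microsoft Graph exposes an assignSensitivityLabel action on drive items. The sketch below assumes that action (a metered API requiring application permissions at the time of writing) and a placeholder label GUID; treat it as a sketch under those assumptions, not a definitive implementation.

```python
# Minimal sketch, assuming the Graph assignSensitivityLabel driveItem action
# (metered, application permissions) and a placeholder sensitivity label GUID.
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
TOKEN = "<app-access-token>"           # requires Files.ReadWrite.All (application)
DRIVE_ID = "<drive-id>"
ITEM_ID = "<item-id>"
LABEL_ID = "<sensitivity-label-guid>"  # placeholder: your "Confidential" label's ID

resp = requests.post(
    f"{GRAPH}/drives/{DRIVE_ID}/items/{ITEM_ID}/assignSensitivityLabel",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={
        "sensitivityLabelId": LABEL_ID,
        "assignmentMethod": "standard",
        "justificationText": "Bulk-labelling HR folder before Copilot rollout",
    },
    timeout=30,
)
# The action runs asynchronously; a 202 response carries a monitor URL header.
print(resp.status_code, resp.headers.get("Location"))
```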

4. Monitor and Control Plug-in Use

When using plug-ins to enhance Microsoft Copilot's functionality, check the privacy statement and terms of use of the plug-in to determine how it handles your organization’s data. Ensure that plug-ins do not compromise the security or privacy of sensitive information.

5. Configure Data Loss Prevention (DLP) Policies

Implement DLP policies to identify, monitor, and protect sensitive information across Microsoft 365 services. DLP can help prevent the accidental sharing of sensitive information.
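Purview DLP itself is configured in the compliance portal rather than in code, but the core idea is pattern-based detection of sensitive content. The toy Python sketch below illustrates that concept only; the patterns (including the IRD number format) are assumptions for illustration and are far cruder than Purview's built-in sensitive information types.

```python
# Toy illustration of what a DLP rule does conceptually: scan text for
# sensitive-information patterns. This is NOT the Purview API; real DLP is
# configured in the Microsoft Purview compliance portal.
import re

PATTERNS = {
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "nz_ird_number": re.compile(r"\b\d{2,3}-\d{3}-\d{3}\b"),  # assumed format, illustrative
    "salary_keyword": re.compile(r"\b(salary|remuneration|performance review)\b", re.I),
}

def scan(text: str) -> list[str]:
    """Return the names of sensitive-information patterns found in the text."""
    return [name for name, rx in PATTERNS.items() if rx.search(text)]

if __name__ == "__main__":
    sample = "Draft: performance review notes, card 4111 1111 1111 1111."
    print(scan(sample))  # ['credit_card', 'salary_keyword']
```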

6. Educate Users

Train users on the importance of data classification and the risks associated with mishandling sensitive information. Encourage them to be vigilant about the types of prompts they enter into Microsoft Copilot and the potential sensitivity of the information they are working with.

7. Monitor Usage and Abuse

Abuse monitoring for Microsoft Copilot occurs in real time, without giving Microsoft any standing access to customer data. Ensure that your organization also has its own mechanisms in place to monitor for misuse and take appropriate action when necessary.
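Copilot activity is recorded in the Microsoft 365 unified audit log, which can be pulled programmatically. The sketch below assumes the Office 365 Management Activity API's Audit.General content feed and filters on an assumed "copilot" substring in the Operation field; inspect your own tenant's audit records to confirm the exact record types and operation names.

```python
# Minimal sketch, assuming the Office 365 Management Activity API
# (application permissions against https://manage.office.com required).
import requests

TENANT_ID = "<tenant-id>"
TOKEN = "<access-token>"   # token scoped to https://manage.office.com
BASE = f"https://manage.office.com/api/v1.0/{TENANT_ID}/activity/feed"
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

# List available content blobs for the Audit.General feed.
resp = requests.get(
    f"{BASE}/subscriptions/content",
    params={"contentType": "Audit.General"},
    headers=HEADERS,
    timeout=30,
)
resp.raise_for_status()

for blob in resp.json():
    events = requests.get(blob["contentUri"], headers=HEADERS, timeout=30).json()
    for event in events:
        # Filter for Copilot-related operations. The substring match is an
        # assumption; check your own audit records for the exact Operation value.
        if "copilot" in event.get("Operation", "").lower():
            print(event.get("UserId"), event.get("Operation"), event.get("CreationTime"))
```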

8. Adhere to Compliance Requirements

Microsoft Copilot for Microsoft 365 adheres to Microsoft's existing privacy, security, and compliance commitments to Microsoft 365 commercial customers. Verify that those commitments satisfy your own obligations, stay informed about regulatory requirements, and adapt your deployment strategy accordingly.

9. Disable Certain Connected Experiences

If necessary, turn off connected experiences that analyze content for Microsoft 365 Apps on Windows or Mac devices in your organization's policy settings. This can prevent Copilot from using certain types of data.
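On Windows, the documented Office privacy policy controlling "connected experiences that analyze content" maps to a registry value, as in the sketch below. This is a testing-only sketch assuming the current-user policy hive; in production you would deploy the setting via Group Policy or the Microsoft 365 Cloud Policy service rather than a script.

```python
# Minimal sketch for Windows, assuming the documented Office privacy policy key.
import winreg

KEY_PATH = r"Software\Policies\Microsoft\office\16.0\common\privacy"

# usercontentdisabled = 2 disables connected experiences that analyze content.
with winreg.CreateKeyEx(winreg.HKEY_CURRENT_USER, KEY_PATH) as key:
    winreg.SetValueEx(key, "usercontentdisabled", 0, winreg.REG_DWORD, 2)

print("Connected experiences that analyze content: disabled for current user")
```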

By combining these strategies, organizations can deploy Microsoft Copilot while maintaining control over sensitive information and ensuring that confidentiality is preserved.

Justin Flitter

Founder of NewZealand.AI.

http://unrivaled.co.nz