How Secure is Copilot for Microsoft 365?
With anything new, there are concerns, especially in the technology world. So when Microsoft announced the rollout of Microsoft Copilot for M365, naturally, not only did every ear perk up at the promised productivity increases, but the cybersecurity world immediately took notice. We did the same with ChatGPT, just as we did with the introduction of the cloud back in the mid-2010s. Because with any newfangled technology toy, the first thing that goes through our minds (and probably yours) is: how safe is this to use in my day-to-day business?
(By the way, if you're totally lost on what Copilot even is, here are two resources to get you started: 3 Key Features of Copilot for M365 and The Difference Between Copilot and Copilot for M365.)
Security Concerns for Microsoft Copilot
Before you dig deeper into the tool itself, let's start with a couple of concerns you should always be aware of when thinking about integrating something new into your infrastructure.
Data Access and Storage Capabilities
Who has access to what information should always be a top priority. Additionally, you need to understand where your information will be stored, so that anyone who has access to it knows where that data lives and doesn't share it with people they shouldn't.
One thing to consider is where Microsoft Copilot will be gathering information from. You will have the option to toggle "Work" or "Web" (more information below), but for the most part, Copilot for M365 will access information from your Microsoft Graph. All information in your Microsoft Graph is unique to you and your company (think Microsoft Office, Teams, SharePoint, OneDrive, Exchange/Email). That means anyone who has access to your Graph can add, delete, or change things. Taking it one step further, anyone who shares information from that Graph is handing data about your organization to outside sources. This is a major concern when thinking about integrating Copilot into your M365 applications.
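To make that concrete, here is a minimal, hypothetical sketch (our own example, not an official Microsoft sample) that uses the Microsoft Graph REST API to list the files other people have shared with a signed-in user. Copilot's "Work" grounding can only surface content the user's existing Graph permissions already allow, so a review like this is one way to see what that scope actually includes. It assumes you have already obtained a delegated access token (for example, through MSAL) with the Files.Read.All permission; the ACCESS_TOKEN value below is only a placeholder.

```python
# Sketch: list drive items that others have shared with the signed-in user.
# Assumes a valid delegated Microsoft Graph access token with Files.Read.All.

import requests

GRAPH_BASE = "https://graph.microsoft.com/v1.0"
ACCESS_TOKEN = "<paste a delegated access token here>"  # placeholder, not a real token


def list_shared_with_me(token: str) -> list[dict]:
    """Return the drive items other people have shared with the signed-in user."""
    resp = requests.get(
        f"{GRAPH_BASE}/me/drive/sharedWithMe",
        headers={"Authorization": f"Bearer {token}"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json().get("value", [])


if __name__ == "__main__":
    for item in list_shared_with_me(ACCESS_TOKEN):
        # Shared items carry a remoteItem facet pointing back at the source drive.
        remote = item.get("remoteItem", {})
        print(item.get("name"), "-", remote.get("webUrl", "no URL"))
```

Running an audit like this per user (or reviewing sharing reports in the admin center) gives you a baseline of what Copilot could draw on before you ever turn it on.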
Learning Curve
Copilot for M365 has a LOT of features. Because there is so much you can do, there is just as much you must learn to do. This learning curve can cause security concerns because you really have no control over what people are putting into your Microsoft Graph and what they're sharing across it. We recommend a slow rollout to a few employees who will become your "solution masters" and can then help others in their learning process. Yes, it increases the onboarding time, but would you rather take a year to fully understand and integrate the tool, or be part of the $1.1 billion paid out to hackers in 2023?
Either way, a complex application that isn't fully understood will only increase your security concerns.
Is Copilot for Microsoft 365 safe?
The short answer: yes. Microsoft goes above and beyond to make sure all data supplied to their new AI tool is safe. Microsoft assures customers that "Microsoft Copilot for Microsoft 365 is compliant with our existing privacy, security, and compliance commitments to Microsoft 365 commercial customers, including the General Data Protection Regulation (GDPR) and European Union (EU) Data Boundary." To get more information on the ins and outs of their privacy statement, check out their webpage here.
For now, let's break it down to see just how secure Copilot for M365 is.
- Copilot for M365 Only Pulls From Your Microsoft Graph
This is probably one of the biggest safety reassurances while simultaneously being the biggest safety concern. Consider the learning curve mentioned above. If your employees are struggling to understand what goes into the Graph and what can leave it, then you're going to have a problem.
Conversely, because Copilot accesses information only from your personalized Graph, the information it provides is not only correct but incredibly secure, as Copilot will never give your information to someone who does not have access to it. The only data concern here would be user error, which, admittedly, is a big concern across all data security solutions. Your information is safe in Copilot for M365; just be careful who you're giving permissions to.
As a final encouragement, Copilot's large language model (LLM) is fed only by you. The prompts, responses, and data accessed through your Microsoft Graph aren't used to train Microsoft's LLMs, including those used by Microsoft Copilot for Microsoft 365. This includes content such as emails, chats, and documents that you have permission to access in your Microsoft Graph. Your Copilot for Microsoft 365 experience is personalized to your information and yours only, including the Microsoft apps you use every day like PowerPoint, Excel, and Teams.
- Work vs. Web Function
Copilot will only use the data you provide it with UNLESS you ask it to pull from the web. So as long as you've vetted the people who have access to your Graph and the things they've included in it, your "Work" toggle is completely safe from outside threats. You can still use the "Web" function; just be careful what prompts you're providing it with. Additionally, Microsoft cannot access your encrypted information, so if it's safe from Microsoft, it's safe from Copilot.
This is a good time to reiterate the importance of Zero Trust policies. Set up your systems so you do not trust or give access to people outside your organization by default. That way you've laid the groundwork for eliminating the user error of oversharing information. Even when users have the "Web" toggle on, this helps keep the internet from getting access to the information you've provided.
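As a rough illustration of that groundwork, the sketch below (again our own hypothetical example, with placeholder values) walks the top level of a user's OneDrive through the Microsoft Graph API and flags items carrying "anyone with the link" or organization-wide sharing links, the kind of overly broad access a Zero Trust review would want to catch before a Copilot rollout. It assumes a delegated token with Files.Read.All and only checks the drive's root folder.

```python
# Sketch: flag OneDrive items in the root folder that have broad sharing links.
# Assumes a valid delegated Microsoft Graph access token with Files.Read.All.

import requests

GRAPH_BASE = "https://graph.microsoft.com/v1.0"
ACCESS_TOKEN = "<paste a delegated access token here>"  # placeholder


def get_json(token: str, url: str) -> dict:
    """Issue an authenticated GET against Microsoft Graph and return the JSON body."""
    resp = requests.get(url, headers={"Authorization": f"Bearer {token}"}, timeout=30)
    resp.raise_for_status()
    return resp.json()


def flag_broad_sharing(token: str) -> None:
    """Print items in the user's OneDrive root that carry broad sharing links."""
    items = get_json(token, f"{GRAPH_BASE}/me/drive/root/children").get("value", [])
    for item in items:
        perms = get_json(
            token, f"{GRAPH_BASE}/me/drive/items/{item['id']}/permissions"
        ).get("value", [])
        for perm in perms:
            scope = perm.get("link", {}).get("scope")
            # "anonymous" = anyone with the link; "organization" = everyone in the tenant.
            if scope in ("anonymous", "organization"):
                print(f"{item.get('name')}: sharing link scope = {scope}")


if __name__ == "__main__":
    flag_broad_sharing(ACCESS_TOKEN)
```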
- Microsoft Protects Customer Data
Microsoft has a policy in place to protect customer data regardless, and this applies to Copilot for M365 as well. The following is taken from their Copilot privacy FAQ, detailing how they protect customer data:
- Multiple forms of protection to safeguard organizational data. Service-side technologies are utilized to encrypt customer content both at rest and in transit, ensuring robust security measures. For comprehensive information on encryption protocols, go to Encryption in the Microsoft Cloud. Connections are safeguarded using Transport Layer Security (TLS). The transmission of data from Dynamics 365 and Power Platform to Azure OpenAI Service is facilitated through the Microsoft backbone network to ensure the reliability and safety of the transfer.
- Architected to protect tenant, group, and individual data. We know data leakage is a concern for customers. Large Language Models (LLMs) are not further trained on, or learn from, your tenant data or your prompts. Within your tenant, our permissions model provides safeguards and enterprise-grade security as seen in our Azure offerings. On an individual level, Copilot presents data that only you can access using the same technology that we've been using for years to secure customer data.
- Built on Microsoft's comprehensive approach to security, compliance, and privacy. Copilot is integrated into Microsoft services like Dynamics 365 and Power Platform and inherits these products' security, compliance, and privacy policies and processes. Multi-factor authentication, compliance boundaries, privacy protections, and more make Copilot the AI solution you can trust.
Still Concerned About Security?
There's still a lot to learn, but as we do, we're understanding more and more about the ways we can provide the most secure solutions for your business. If you're still unsure, talk to one of our security experts about how you can be confident in the tools you're implementing. Whether it's general inquiries or tool-specific questions, we've got your back.
Contact us today with the good, the bad, and the ugly. We'll make sure you get the answers you need.
About the Author
Creative content writer and producer for Centre Technologies. I joined Centre after 5 years in Education where I fostered my great love for making learning easier for everyone. While my background may not be in IT, I am driven to engage with others and build lasting relationships on multiple fronts. My greatest passions are helping and showing others that with commitment and a little spark, you can understand foundational concepts and grasp complex ideas no matter their application (because I get to do it every day!). I am a lifelong learner with a genuine zeal to educate, inspire, and motivate all I engage with. I value transparency and community so lean in with me—it’s a good day to start learning something new! Learn more about Emily Kirk »