Data, Privacy, and Security for Microsoft 365 Copilot in Viva Engage

Microsoft 365 Copilot in Viva Engage is your partner and coach for posting on Viva Engage in ways that support your professional goals. Copilot uses large language model (LLM) technology, which performs language-based tasks that you can request in much the same way you would ask a person.

Copilot capabilities

Copilot learns from your communities, campaigns, and interests to make personalized suggestions of what you might like to post and where you might benefit from engaging.

Here are some examples of how Copilot can help you:

  • Brainstorm ideas of what to post, where to post, and points to include in your post.

  • Provide a template, draft content for a post, and make edits to improve writing quality and value.

  • Provide feedback and advice about your post.

How to use Copilot in Viva Engage

Copilot suggests relevant content and helps users communicate more effectively. In this way, Copilot enhances your daily use of Viva Engage.

  1. Suggest what and where to post. To reach the optimal audience, Copilot helps you determine good post content and the right places to post it. Copilot identifies content, people, and groups that align with your interests, identity, and objectives. For example, Copilot identifies ongoing conversations that contain themes relevant to your interests. It also highlights ongoing campaigns that are sponsored by leaders or otherwise relevant to your interests and workplace connections.

  2. Create valuable, engaging, authentic communications. Copilot helps you produce high-quality posts that are valuable, engaging, and authentic. It helps you decide what points to include, compose the post if needed, and edit for tone, length, and keywords. Copilot can also offer feedback on your post. You can attach images to enhance your posts, include calls to action (CTAs) to promote engagement, and tag relevant individuals in your posts.

Copilot in Viva Engage empowers you with the information and collaboration you need to achieve your professional goals.

Copilot performance metrics

We measure the performance of Copilot, which is powered by OpenAI GPT-4, using the following metrics:

  • Precision and Recall: Evaluate the quality of suggestions. Precision quantifies how many AI-generated suggestions are relevant, while recall determines how many of the relevant suggestions were retrieved.

  • User Satisfaction: Gauge how well Copilot assists users. Microsoft conducts user surveys and collects feedback to assess satisfaction with the AI system's assistance.

  • Generalizability: Assess system results across different use cases. Microsoft tests Copilot on a diverse set of data and tasks to evaluate system performance on a range of scenarios and domains that aren't part of the initial training data.
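To illustrate the first metric, precision and recall can be computed from counts of suggested and relevant items. This is only a sketch with hypothetical suggestion sets, not Microsoft's actual evaluation pipeline or data:

```python
def precision_recall(suggested, relevant):
    """Compute precision and recall for a set of AI-generated suggestions.

    precision = |suggested ∩ relevant| / |suggested|
    recall    = |suggested ∩ relevant| / |relevant|
    """
    suggested, relevant = set(suggested), set(relevant)
    hits = suggested & relevant
    precision = len(hits) / len(suggested) if suggested else 0.0
    recall = len(hits) / len(relevant) if relevant else 0.0
    return precision, recall

# Hypothetical example: Copilot made four suggestions, three of which were
# relevant, out of six relevant items available in total.
suggested = ["post1", "post2", "post3", "post4"]
relevant = ["post1", "post2", "post3", "post5", "post6", "post7"]
p, r = precision_recall(suggested, relevant)
print(f"precision={p:.2f} recall={r:.2f}")  # precision=0.75 recall=0.50
```

The two metrics trade off against each other: suggesting everything maximizes recall at the cost of precision, which is why both are tracked together.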

We conduct red teaming exercises, inviting external experts and testers to find vulnerabilities or biases in the system. This process helps us identify potential issues and improve the system's robustness.

Our evaluation process is ongoing, with regular updates and improvements based on user feedback. By combining internal evaluation, user feedback, and external testing, we aim to ensure the generalizability, accuracy, and fairness of Copilot powered by GPT-4 and its related models.

Limitations of Copilot

Copilot uses a robust filter system that proactively blocks offensive language and prevents generating suggestions in sensitive contexts. Our commitment to continuous improvement drives us to enhance this filter system to detect and remove offensive content generated by Copilot, and address biased, discriminatory, or abusive outputs. We encourage you to report any offensive suggestions you encounter while using Copilot.
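Conceptually, a proactive filter of the kind described above acts as a gate before any suggestion is generated. The sketch below is purely illustrative, with a made-up blocklist and context labels; it is not Microsoft's actual filter system:

```python
# Placeholder lists for illustration only; a production filter would use
# trained classifiers, not simple keyword matching.
BLOCKED_TERMS = {"offensiveword1", "offensiveword2"}
SENSITIVE_CONTEXTS = {"layoffs", "medical-records"}

def should_generate(prompt: str, context: str) -> bool:
    """Return False if the prompt contains blocked language or the
    conversation context is marked sensitive, so no suggestion is produced."""
    words = {w.strip(".,!?").lower() for w in prompt.split()}
    if words & BLOCKED_TERMS:
        return False  # block offensive language proactively
    if context in SENSITIVE_CONTEXTS:
        return False  # don't generate suggestions in sensitive contexts
    return True

print(should_generate("Draft a post about our hackathon", "general"))  # True
print(should_generate("Draft a post", "layoffs"))                      # False
```

The key design point is that the check runs before generation, so blocked requests never reach the model at all.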

The underlying model powering Copilot is trained on data through June 2024. Copilot won’t provide relevant responses to questions that require knowledge of events after that date.

High-risk use cases that we aim to avoid:

  • Privacy Concerns: If the AI feature is not adequately secured, it can pose a risk of exposing user data to unauthorized parties, including private trending themes and personal information. Our top priority is to ensure the highest level of privacy and security for you. For instance, we take precautions to avoid exposing summaries of communities you're not a part of, or displaying top campaigns that you're not eligible to view. Microsoft implements extensive measures and checks throughout your Copilot journey, from user access to end-to-end processes, to prevent such incidents.

  • Bias: The fairness and impartiality of AI systems like Copilot depend on the quality and bias of the data they're trained on. If the training data contains biases, the AI feature can unintentionally generate content that reflects those biases, potentially causing harm or offense. We're dedicated to addressing bias in AI systems and continuously work towards providing more equitable and inclusive outputs.

By actively addressing high-risk scenarios and working collaboratively with our user community, we commit to delivering a safer, more responsible, and ethically sound Copilot AI experience.

We do not observe or look at your content. Microsoft uses automated systems that are trained to detect bias and abuse and that don't process any request or content that might be considered inappropriate.

Copilot data usage for prompts and responses

Microsoft 365 Copilot in Viva Engage only accesses existing data within Viva Engage. Specifically, Copilot accesses the data within the communities or groups you belong to in Viva Engage, including relevant user data and the data of the group, community, or tenant you're associated with.

However, in a multitenant organization, Copilot's access is limited to your specific tenant. It doesn't extend to data in other tenants of the organization.
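Conceptually, this tenant and group scoping works like an access check applied before any data is read. The types and field names below are hypothetical, chosen only to illustrate the boundary, not Microsoft's implementation:

```python
from dataclasses import dataclass

@dataclass
class User:
    id: str
    tenant_id: str
    group_ids: set  # groups/communities the user belongs to

@dataclass
class Resource:
    id: str
    tenant_id: str
    group_id: str

def copilot_can_access(user: User, resource: Resource) -> bool:
    """Illustrative scope check: data is readable only within the user's
    own tenant, and only from groups or communities the user belongs to."""
    if resource.tenant_id != user.tenant_id:
        return False  # never cross the tenant boundary
    return resource.group_id in user.group_ids

alice = User("alice", "tenant-A", {"g1", "g2"})
print(copilot_can_access(alice, Resource("r1", "tenant-A", "g1")))  # True
print(copilot_can_access(alice, Resource("r2", "tenant-B", "g1")))  # False
```

The tenant check comes first and is absolute: even membership in a same-named group in another tenant grants no access.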

Privacy policy and data compliance

Our privacy policy is publicly shared and available as part of the Microsoft Online Services Privacy Statement. Copilot follows data compliance currently in place for Viva Engage.

Data residency, data subject rights, and data deletion

Copilot follows the same data residency protocols that are in place for Viva Engage. For details, see Overview of Security and Compliance in Viva Engage.

Copilot, Microsoft Graph, and Microsoft 365 services

Copilot doesn’t interact in any way with Microsoft 365 Copilot, Microsoft Graph, or other Microsoft 365 services. Engage Copilot is a separate entity contained within Viva Engage and relies solely on Viva Engage's own integrations.

GDPR and privacy regulations

Copilot follows the same GDPR and privacy regulations as Viva Engage. For details, see Manage GDPR data subject requests in Viva Engage and Understand how privacy works in Microsoft Viva | Microsoft Learn.