You are a marketing assistant preparing for a budget meeting with your manager.
You need to evaluate key spending and performance trends from the last year to understand which marketing channels deliver the best return on investment (ROI).
What should you use in Microsoft 365 Copilot to achieve the goal? More than one answer choice may achieve the goal. Select the BEST answer.
Answer: A
Evaluating spend and performance trends and determining ROI is a structured analytics task. In Microsoft 365 Copilot, the Analyst agent is designed for quantitative analysis workflows: it can interpret tables, analyze datasets, identify trends over time, calculate metrics (such as ROI), and present results clearly (often including charts or summarized insights). This makes it the best fit when your goal is to compare marketing channels and understand which ones are delivering the strongest return based on last year's data.
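As a minimal illustration of the kind of per-channel ROI comparison the Analyst agent automates, here is a sketch using entirely hypothetical spend and revenue figures (the channel names and numbers are invented for the example):

```python
# Hypothetical sample data: spend and attributed revenue per channel.
channels = {
    "Email":       {"spend": 10_000, "revenue": 42_000},
    "Paid search": {"spend": 25_000, "revenue": 60_000},
    "Social ads":  {"spend": 15_000, "revenue": 27_000},
}

def roi(spend: float, revenue: float) -> float:
    """ROI = (revenue - spend) / spend, expressed as a ratio."""
    return (revenue - spend) / spend

# Rank channels from highest to lowest ROI.
ranked = sorted(
    channels.items(),
    key=lambda kv: roi(kv[1]["spend"], kv[1]["revenue"]),
    reverse=True,
)

for name, figures in ranked:
    print(f"{name}: ROI = {roi(figures['spend'], figures['revenue']):.0%}")
```

With these sample numbers, Email ranks first (320% ROI), ahead of Paid search (140%) and Social ads (80%), which is the trend-and-metric comparison the question describes.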
Chat (Option B) can help you brainstorm questions or explain results, but it is not purpose-built for deep numeric analysis. The Researcher agent (Option C) is optimized for gathering and synthesizing information from sources and producing research-style outputs, not performing ROI calculations on internal performance data. A notebook (Option D) is useful for organizing files and keeping shared reference material across related conversations, but it does not itself perform the analysis---you would still need an analysis-capable agent.
==================================================
You are a merchandiser who is planning for the upcoming season.
You prompt Microsoft 365 Copilot to suggest which products to stock based on historical sales data. Without reviewing the suggestions or checking current market trends, you place a large order based solely on the output of Copilot.
What is this an example of?
Answer: B
This scenario is an example of overreliance. Microsoft's AI guidance highlights that Copilot is an assistive tool that can accelerate analysis and drafting, but users remain responsible for validating outputs---especially for high-impact business decisions. Overreliance occurs when someone treats AI output as authoritative and acts on it without applying judgment, cross-checking sources, or validating against current conditions.
Here, the merchandiser uses Copilot to generate stocking recommendations from historical sales data, then places a large order without reviewing the suggestions or incorporating current market trends. Even if the historical data is accurate, demand can shift due to seasonality changes, competitor actions, pricing, supply constraints, macroeconomic factors, or new product launches. Copilot's recommendations should be treated as a starting point for decision-making, not the final decision.
Option A (verification) is the opposite behavior---checking accuracy before acting. Option C (fabrication) would involve Copilot inventing facts; the prompt doesn't indicate fictional data, only unvalidated reliance. Option D (prompt injection) involves malicious instructions embedded in content to manipulate the model, which is not described here.
==================================================
You are a medical researcher.
You need to ensure that Microsoft 365 Copilot responds to you in a professional manner that is appropriate to your field. What is the best approach to achieve the goal?
Answer: B
Custom instructions allow users to define persistent preferences for tone, style, and response behavior across conversations. Microsoft documentation explains that Custom instructions help tailor Copilot responses to professional roles, communication styles, and domain expectations.
For a medical researcher requiring consistently professional, field-appropriate responses, configuring Custom instructions in Settings ensures that Copilot maintains the desired tone and terminology in future interactions.
Attaching examples influences only a single conversation. Saving prompts to the Prompt Gallery aids reuse but does not enforce tone globally. Copilot Memory stores factual preferences but is not intended for style configuration.
Therefore, the best and most comprehensive approach is to configure Custom instructions in Settings.
==================================================
You use Microsoft 365 Copilot.
You discover that you had a conversation that used a knowledge source that contains confidential information.
You need to delete the conversation data without requiring administrative approval. You must retain your other conversations, if possible.
What should you use?
Answer: D
Microsoft 365 Copilot follows Microsoft's enterprise data governance and privacy principles, which allow users to manage their own conversation history where appropriate. According to Microsoft AI Business Professional guidance, users can review and delete individual Copilot conversation histories directly from their personal account settings without requiring administrative intervention.
Option D is correct because the My Account portal in Microsoft 365 Copilot allows individual users to manage their activity history, including deleting specific conversations. This enables targeted removal of sensitive interactions while retaining other conversation data.
Option A is incorrect because the Copilot app itself does not provide full account-level activity management capabilities. Options B and C are administrative tools used for tenant-wide governance, compliance, retention policies, and eDiscovery. These portals typically require administrative privileges and are not intended for individual user self-service deletion of specific conversations.
Therefore, to delete a single confidential Copilot conversation without affecting other chats and without requiring administrator approval, the correct tool is the My Account portal in Microsoft 365 Copilot.
==================================================
You sign in to the Microsoft 365 Copilot app by using your work account as shown in the exhibit.
A colleague tells you that when they open the Microsoft 365 Copilot app, they have access to the Researcher agent. You need to access the Researcher agent.
What should you do?
Answer: B
In Microsoft 365 Copilot, agents such as Researcher are accessed through the Agents experience within the Copilot app. If the user interface does not immediately display a specific agent, the correct action is to browse or search the agent catalog. The exhibit shows the left navigation pane with an Explore agents option.
According to Microsoft AI Business Professional guidance, built-in and custom agents can be discovered and enabled through the Explore agents section. If the user already has the appropriate Copilot license and is signed in with their work account, there is no need to switch accounts or request another license.
Signing in through a browser does not change feature availability, and using a personal account would remove access to organizational features. Therefore, to access the Researcher agent, you should select Explore agents and search for Researcher.
==================================================
==================================================
You use Microsoft 365 Copilot to generate a training plan.
You need to check if there are any existing training plans in your organization that are similar to the new training plan. What should you use in Copilot?
Answer: A
Microsoft 365 Copilot integrates with Microsoft Search to help users discover relevant content across their organization's Microsoft 365 data estate, including SharePoint, OneDrive, Teams, and Exchange. When the objective is to determine whether similar training plans already exist, the appropriate action is to perform a search across organizational content.
Using Search allows Copilot to query indexed enterprise documents and return files, plans, or related materials that the user has permission to access. This supports content reuse, avoids duplication of work, and aligns with Microsoft's guidance on leveraging organizational knowledge efficiently.
Designer is focused on visual content creation, Apps provides access to Microsoft 365 applications, and Pages is used for creating and organizing content within Copilot. None of these options are intended for discovering existing documents across the tenant.
Therefore, to identify similar existing training plans within your organization, the correct tool to use in Copilot is Search.
==================================================
You run a saved prompt and receive the following response:
"You asked for a summary of File.docx. However, the file appears to be either empty, corrupted, or in a format that I cannot process."
What is a possible cause of the response?
Answer: D
When Microsoft 365 Copilot cannot access a referenced file, it may return a message indicating that the file is empty or corrupted, or is in a format that cannot be processed. In many cases, this message appears when the user does not have sufficient permissions to open the file in SharePoint, OneDrive, or another Microsoft 365 location.
Copilot operates within the Microsoft Graph security boundary and strictly respects user permissions. If a saved prompt references File.docx but the user no longer has access to it---due to permission changes, file relocation, or removal---Copilot cannot retrieve the content for grounding. As a result, it cannot analyze or summarize the file and returns a processing-related error.
Not scheduling the prompt is unrelated to file processing. Running from a web or desktop app does not affect file readability. Using a different agent does not typically cause a file-format processing error.
Therefore, the most likely cause is that you do not have access to the file.