Microsoft Implementing Analytics Solutions Using Microsoft Fabric DP-600 Exam Questions

Page: 1 / 14
Total 166 questions
Question 1

You have a Fabric tenant that contains two workspaces named Workspace1 and Workspace2 and a user named User1.

You need to ensure that User1 can perform the following tasks:

Create a new domain.

Create two subdomains named subdomain1 and subdomain2.

Assign Workspace1 to subdomain1.

Assign Workspace2 to subdomain2.

The solution must follow the principle of least privilege.

Which role should you assign to User1?



Answer : A

User1 must be able to:

Create a new domain.

Create two subdomains.

Assign Workspace1 to subdomain1.

Assign Workspace2 to subdomain2.

Key Role Definitions in Fabric:

Domain admin: Can manage domain settings, add/remove workspaces, manage contributors.

Domain contributor: Can contribute to an existing domain but cannot create a domain.

Workspace admin: Only controls permissions/settings inside a workspace, not domains.

Fabric admin: Tenant-wide admin, broader than required.

Requirement: Follow least privilege.

Creating a new domain requires the Domain admin role.

Assigning workspaces to subdomains is within Domain admin rights.

Fabric admin is excessive and breaks least privilege.

Correct Answer: A. Domain admin


Question 2

You need to ensure the data loading activities in the AnalyticsPOC workspace are executed in the appropriate sequence. The solution must meet the technical requirements.

What should you do?



Answer : A

To meet the technical requirement that data loading activities must ensure the raw and cleansed data is updated completely before populating the dimensional model, you would need a mechanism that allows for ordered execution. A pipeline in Microsoft Fabric with dependencies set between activities can ensure that activities are executed in a specific sequence. Once set up, the pipeline can be scheduled to run at the required intervals (hourly or daily depending on the data source).


Question 3

You need to implement the date dimension in the data store. The solution must meet the technical requirements.

What are two ways to achieve the goal? Each correct answer presents a complete solution.

NOTE: Each correct selection is worth one point.



Answer : A, B

Both a dataflow (A) and a Stored procedure activity in a pipeline (B) are capable of creating and populating a date dimension table. A dataflow can perform the transformation needed to create the date dimension, and it aligns with the preference for using low-code tools for data ingestion when possible. A Stored procedure could be written to generate the necessary date dimension data and executed within a pipeline, which also adheres to the technical requirements for the PoC.
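As a sketch of the stored-procedure approach (B), a T-SQL procedure like the one below could generate and load a date dimension; the table and procedure names (dbo.DimDate, dbo.LoadDimDate) and the column set are illustrative assumptions, not taken from the case study.

```sql
-- Illustrative sketch: object and column names are assumptions.
CREATE TABLE dbo.DimDate
(
    DateKey     INT         NOT NULL,  -- e.g. 20240131
    [Date]      DATE        NOT NULL,
    [Year]      SMALLINT    NOT NULL,
    [Month]     TINYINT     NOT NULL,
    MonthName   VARCHAR(20) NOT NULL,
    DayOfWeek   TINYINT     NOT NULL
);
GO

CREATE PROCEDURE dbo.LoadDimDate
    @StartDate DATE,
    @EndDate   DATE
AS
BEGIN
    -- Insert one row per calendar day in the requested range.
    WHILE @StartDate <= @EndDate
    BEGIN
        INSERT INTO dbo.DimDate (DateKey, [Date], [Year], [Month], MonthName, DayOfWeek)
        VALUES
        (
            YEAR(@StartDate) * 10000 + MONTH(@StartDate) * 100 + DAY(@StartDate),
            @StartDate,
            YEAR(@StartDate),
            MONTH(@StartDate),
            DATENAME(MONTH, @StartDate),
            DATEPART(WEEKDAY, @StartDate)
        );
        SET @StartDate = DATEADD(DAY, 1, @StartDate);
    END
END;
```

A Stored procedure activity in a pipeline could then invoke it, for example with `EXEC dbo.LoadDimDate '2020-01-01', '2030-12-31';`, which is what makes option B a complete solution.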


Question 4

You have a Fabric warehouse named Warehouse1 that contains a table named Table1. Table1 contains customer data.

You need to implement row-level security (RLS) for Table1. The solution must ensure that users can see only their respective data.

Which two objects should you create? Each correct answer presents part of the solution.

NOTE: Each correct selection is worth one point.



Answer : A, C
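RLS in a Fabric warehouse is implemented with two T-SQL objects: an inline table-valued function that acts as the filter predicate, and a security policy that binds that function to the table. A minimal sketch follows; the filter column name (UserEmail) and object names are assumptions for illustration.

```sql
-- Object 1: an inline table-valued function used as the filter predicate.
-- Column name UserEmail is an assumption about Table1's schema.
CREATE FUNCTION dbo.fn_SecurityPredicate(@UserEmail AS VARCHAR(256))
RETURNS TABLE
WITH SCHEMABINDING
AS
RETURN
    SELECT 1 AS fn_result
    WHERE @UserEmail = USER_NAME();  -- row is visible only to the matching user
GO

-- Object 2: a security policy that applies the predicate to Table1.
CREATE SECURITY POLICY CustomerFilter
ADD FILTER PREDICATE dbo.fn_SecurityPredicate(UserEmail)
ON dbo.Table1
WITH (STATE = ON);
```

With the policy enabled, every query against Table1 is silently filtered so each user sees only the rows whose UserEmail matches their identity.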


Question 5

You have a Fabric tenant that contains a workspace named Workspace1 and a user named User1. Workspace1 contains a warehouse named DW1.

You share DW1 with User1 and assign User1 the default permissions for DW1.

What can User1 do?



Answer : A

Comprehensive Detailed Explanation

Step 1: Default permissions when sharing a warehouse in Fabric

When you share a Fabric warehouse (DW1) with a user and assign them default permissions, the user:

Gets Build permission by default.

Build permission allows the user to:

Use the default dataset automatically created for the warehouse.

Build reports and dashboards against that dataset.

It does not grant direct query permissions to the underlying tables or files.

Step 2: Analyze the options

A. Build reports by using the default dataset.

Correct. Sharing a warehouse gives Build permission on the warehouse's default dataset.

User1 can create Power BI reports but cannot directly query the warehouse via SQL.

B. Read the underlying Parquet files from OneLake.

Incorrect. Access to underlying OneLake files requires Direct Lake storage access or shortcut permissions, not provided by default sharing.

C. Connect to DW1 via the TDS endpoint.

Incorrect. Although a warehouse does expose a TDS (SQL) endpoint, querying DW1 through it requires SQL permissions that default sharing does not grant.

D. Read data from the tables in DW1.

Incorrect. Default permissions do not allow table-level SQL access. They allow using the dataset for reporting.

Step 3: Correct Answer

The only capability User1 has by default is:

A. Build reports by using the default dataset.

Reference

Fabric warehouses -- permissions

Build permission in Power BI


Question 6

You have a Fabric tenant that contains a workspace named Workspace1. You plan to deploy a semantic model named Model1 by using the XMLA endpoint.

You need to optimize the deployment of Model1. The solution must minimize how long it takes to deploy Model1. What should you do in Workspace1?



Answer : C


Question 7

You have a Fabric workspace named Workspace1 that is assigned to a newly created Fabric capacity named Capacity1.

You create a semantic model named Model1 and deploy Model1 to Workspace1.

You need to publish changes to Model1 directly from Tabular Editor.

What should you do?



Answer : D

In Microsoft Fabric, semantic models (formerly datasets) can be managed using external tools like Tabular Editor. To push or publish changes directly to a Fabric workspace from Tabular Editor, the workspace must support XMLA endpoint connectivity with read-write access.

Explanation of Each Option

A. For Workspace1, enable Git integration.

Git integration allows source control and versioning of items in a Fabric workspace. It is intended for lifecycle management of artifacts and does not provide the connectivity required for Tabular Editor to publish semantic model changes.

B. For Workspace1, create a managed private endpoint.

A managed private endpoint provides secure connectivity between Fabric and external data sources. It is not related to the ability to publish semantic models through XMLA.

C. For Model1, enable external sharing.

External sharing governs who can access the model outside the organization. This does not affect publishing workflows from developer tools.

D. For Capacity1, set XMLA Endpoint to Read Write.

This is the correct choice. The XMLA endpoint provides a programmatic connection point for external tools like Tabular Editor. By default, the XMLA endpoint is enabled only in read-only mode. Changing the setting to Read Write allows developers to push changes, update metadata, and deploy semantic models directly from Tabular Editor into the Fabric workspace.

Summary

To publish changes to a Fabric semantic model using Tabular Editor, the XMLA endpoint of the assigned capacity must be set to Read Write. Without this configuration, Tabular Editor can only connect in read-only mode and cannot deploy updates.

Reference

Use the XMLA endpoint in Microsoft Fabric

Manage capacities in Microsoft Fabric

