Microsoft Implementing an Azure Data Solution (DP-200) Exam Practice Test

Total 243 questions
Question 1

On which data store do you configure TDE to meet the technical requirements?



Answer : B

Scenario: Transparent data encryption (TDE) must be enabled on all data stores, whenever possible.

The database for Mechanical Workflow must be moved to Azure SQL Data Warehouse.
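On Azure SQL Data Warehouse, TDE can be enabled with T-SQL run against the logical server's master database. A minimal sketch, assuming the migrated database is named MechanicalWorkflowDW (a hypothetical name):

    -- Run against the master database of the logical server.
    -- MechanicalWorkflowDW is a hypothetical name for the migrated database.
    ALTER DATABASE MechanicalWorkflowDW SET ENCRYPTION ON;

    -- Verify: is_encrypted = 1 once encryption has completed.
    SELECT name, is_encrypted FROM sys.databases;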


Question 2

You have an Azure Stream Analytics job that receives clickstream data from an Azure event hub.

You need to define a query in the Stream Analytics job. The query must meet the following requirements:

Count the number of clicks within each 10-second window based on the country of a visitor.

Ensure that each click is NOT counted more than once.

How should you define the query?



Answer : A

Tumbling window functions are used to segment a data stream into distinct time segments and perform a function against them, such as the example below. The key differentiators of a tumbling window are that it repeats, does not overlap, and an event cannot belong to more than one tumbling window.

Example:
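A minimal sketch of such a query for this scenario, assuming the event hub input is named ClickStream and each event carries a Country column and an event-time column CreatedAt (all three names are assumptions):

    -- Count clicks per country in contiguous, non-overlapping 10-second windows.
    -- ClickStream, Country, and CreatedAt are assumed names for this sketch.
    SELECT
        Country,
        COUNT(*) AS ClickCount
    FROM ClickStream TIMESTAMP BY CreatedAt
    GROUP BY
        Country,
        TumblingWindow(second, 10)

Because tumbling windows never overlap, each click falls into exactly one window, which is why a hopping or sliding window would not satisfy the requirement that no click is counted more than once.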


References:

https://docs.microsoft.com/en-us/azure/stream-analytics/stream-analytics-window-functions

Question 3

You develop data engineering solutions for a company. The company has on-premises Microsoft SQL Server databases at multiple locations.

The company must integrate data with Microsoft Power BI and Microsoft Azure Logic Apps. The solution must avoid single points of failure during connection and transfer to the cloud. The solution must also minimize latency.

You need to secure the transfer of data between on-premises databases and Microsoft Azure.

What should you do?



Answer : D

You can create high availability clusters of On-premises data gateway installations, to ensure your organization can access on-premises data resources used in Power BI reports and dashboards. Such clusters allow gateway administrators to group gateways to avoid single points of failure in accessing on-premises data resources. The Power BI service always uses the primary gateway in the cluster, unless it's not available. In that case, the service switches to the next gateway in the cluster, and so on.

References:

https://docs.microsoft.com/en-us/power-bi/service-gateway-high-availability-clusters


Question 4

You need to configure a disaster recovery solution for SALESDB to meet the technical requirements.

What should you configure in the backup policy?



Answer : C

Scenario: SALESDB must be restorable to any given minute within the past three weeks.

The Azure SQL Database service protects all databases with an automated backup system. These backups are retained for 7 days for Basic, 35 days for Standard, and 35 days for Premium. Point-in-time restore is a self-service capability, allowing customers to restore a Basic, Standard, or Premium database from these backups to any point within the retention period. A 35-day retention period therefore covers the required three-week restore window for SALESDB.

References:

https://azure.microsoft.com/en-us/blog/azure-sql-database-point-in-time-restore/


Question 5

You need to ensure that the missing indexes for REPORTINGDB are added.

What should you use?



Answer : D

Automatic tuning options include CREATE INDEX, which identifies indexes that may improve the performance of your workload, creates them, and automatically verifies that query performance has improved.
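For context, the CREATE INDEX option can be switched on per database with T-SQL. A minimal sketch, assuming it is applied to REPORTINGDB after its migration to Azure SQL Database:

    -- Enable automatic index creation on the current database (REPORTINGDB here).
    ALTER DATABASE CURRENT
    SET AUTOMATIC_TUNING (CREATE_INDEX = ON);

    -- Inspect the recommendations that automatic tuning generates and applies.
    SELECT name, type, state, details
    FROM sys.dm_db_tuning_recommendations;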

Scenario:

REPORTINGDB stores reporting data and contains several columnstore indexes.

Migrate SALESDB and REPORTINGDB to an Azure SQL database.

References:

https://docs.microsoft.com/en-us/azure/sql-database/sql-database-automatic-tuning


Question 6

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.

After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.

A company uses Azure Data Lake Storage Gen1 to store big data related to consumer behavior.

You need to implement logging.

Solution: Use information stored in Azure Active Directory reports.

Does the solution meet the goal?



Answer : B

Instead, configure Azure Data Lake Storage diagnostics to store logs and metrics in a storage account.

References:

https://docs.microsoft.com/en-us/azure/data-lake-store/data-lake-store-diagnostic-logs


Question 7

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution. Determine whether the solution meets the stated goals.

You develop a data ingestion process that will import data to a Microsoft Azure SQL Data Warehouse. The data to be ingested resides in Parquet files stored in an Azure Data Lake Gen 2 storage account.

You need to load the data from the Azure Data Lake Gen 2 storage account into the Azure SQL Data Warehouse.

Solution:

1. Create a remote service binding pointing to the Azure Data Lake Gen 2 storage account

2. Create an external file format and external table using the external data source

3. Load the data using the CREATE TABLE AS SELECT statement

Does the solution meet the goal?



Answer : B

You need to create an external file format and an external table from an external data source, not from a remote service binding.
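For contrast, the working load path in the referenced article builds on an external data source rather than any service binding. A minimal sketch, assuming a storage account named contosoadls with a filesystem named data and a hypothetical two-column Parquet schema:

    -- One-time setup: a master key protects the scoped credential.
    CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<strong-password>';

    -- Credential holding the storage account key (placeholder secret).
    CREATE DATABASE SCOPED CREDENTIAL AdlsGen2Credential
    WITH IDENTITY = 'user', SECRET = '<storage-account-key>';

    -- External data source pointing at the Data Lake Gen 2 filesystem.
    CREATE EXTERNAL DATA SOURCE AdlsGen2
    WITH (
        TYPE = HADOOP,
        LOCATION = 'abfss://data@contosoadls.dfs.core.windows.net',
        CREDENTIAL = AdlsGen2Credential
    );

    -- External file format matching the Parquet source files.
    CREATE EXTERNAL FILE FORMAT ParquetFormat
    WITH (FORMAT_TYPE = PARQUET);

    -- External table over the files (hypothetical schema and path).
    CREATE EXTERNAL TABLE dbo.SalesStage (
        SaleId INT,
        Amount DECIMAL(18, 2)
    )
    WITH (
        LOCATION = '/sales/',
        DATA_SOURCE = AdlsGen2,
        FILE_FORMAT = ParquetFormat
    );

    -- Load into the warehouse with CREATE TABLE AS SELECT.
    CREATE TABLE dbo.Sales
    WITH (DISTRIBUTION = ROUND_ROBIN)
    AS
    SELECT * FROM dbo.SalesStage;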

References:

https://docs.microsoft.com/en-us/azure/sql-data-warehouse/sql-data-warehouse-load-from-azure-data-lake-store

