
Free and Premium Microsoft DP-700 Dumps: Questions and Answers


Implementing Data Engineering Solutions Using Microsoft Fabric Questions and Answers

Question 1

You need to recommend a method to populate the POS1 data to the lakehouse medallion layers.

What should you recommend for each layer? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.

Options:

Question 2

You need to ensure that the data analysts can access the gold layer lakehouse.

What should you do?

Options:

A. Add the DataAnalyst group to the Viewer role for WorkspaceA.
B. Share the lakehouse with the DataAnalysts group and grant the Build reports on the default semantic model permission.
C. Share the lakehouse with the DataAnalysts group and grant the Read all SQL Endpoint data permission.
D. Share the lakehouse with the DataAnalysts group and grant the Read all Apache Spark permission.

Question 3

You need to ensure that usage of the data in the Amazon S3 bucket meets the technical requirements.

What should you do?

Options:

A. Create a workspace identity and enable high concurrency for the notebooks.
B. Create a shortcut and ensure that caching is disabled for the workspace.
C. Create a workspace identity and use the identity in a data pipeline.
D. Create a shortcut and ensure that caching is enabled for the workspace.

Question 4

You need to populate the MAR1 data in the bronze layer.

Which two types of activities should you include in the pipeline? Each correct answer presents part of the solution.

NOTE: Each correct selection is worth one point.

Options:

A. ForEach
B. Copy data
C. WebHook
D. Stored procedure

Question 5

You need to create the product dimension.

How should you complete the Apache Spark SQL code? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.

Options:
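
For orientation, a minimal Spark SQL sketch of loading a product dimension table in a lakehouse is shown below. The schema, table, and column names (silver.products, gold.dim_product, ProductID, and so on) are illustrative assumptions, not the objects defined in the case study.

-- Illustrative sketch only; schema, table, and column names are assumed.
CREATE TABLE IF NOT EXISTS gold.dim_product (
    ProductKey  BIGINT,
    ProductID   STRING,
    ProductName STRING,
    Category    STRING
) USING DELTA;

-- Deduplicate the source rows and assign a surrogate key while loading.
INSERT OVERWRITE gold.dim_product
SELECT
    row_number() OVER (ORDER BY ProductID) AS ProductKey,
    ProductID,
    ProductName,
    Category
FROM (
    SELECT DISTINCT ProductID, ProductName, Category
    FROM silver.products
);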

Question 6

You have two Fabric workspaces named Workspace1 and Workspace2.

You have a Fabric deployment pipeline named deployPipeline1 that deploys items from Workspace1 to Workspace2. DeployPipeline1 contains all the items in Workspace1.

You recently modified the items in Workspace1.

The workspaces currently contain the items shown in the following table.

Items in Workspace1 that have the same name as items in Workspace2 are currently paired.

You need to ensure that the items in Workspace1 overwrite the corresponding items in Workspace2. The solution must minimize effort.

What should you do?

Options:

A. Delete all the items in Workspace2, and then run deployPipeline1.
B. Rename each item in Workspace2 to have the same name as the items in Workspace1.
C. Back up the items in Workspace2, and then run deployPipeline1.
D. Run deployPipeline1 without modifying the items in Workspace2.

Question 7

You have a Fabric workspace that contains an eventstream named EventStream1. EventStream1 outputs events to a table named Table1 in a lakehouse. The streaming data is sourced from motorway sensors and represents the speed of cars.

You need to add a transformation to EventStream1 to average the car speeds. The speeds must be grouped by non-overlapping and contiguous time intervals of one minute. Each event must belong to exactly one window.

Which windowing function should you use?

Options:

A. sliding
B. hopping
C. tumbling
D. session
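
Background on the terms above: a tumbling window is fixed-length, non-overlapping, and contiguous, so each event belongs to exactly one window, whereas hopping and sliding windows overlap and session windows depend on gaps in activity. Eventstream transformations are configured in the eventstream editor rather than written as code, but the equivalent one-minute tumbling aggregation expressed in Spark SQL (the table and column names sensor_events, event_time, and speed are assumed) looks like this:

-- Illustration of tumbling-window semantics only; names are assumed.
SELECT
    window(event_time, '1 minute') AS speed_window,  -- fixed, non-overlapping 1-minute buckets
    AVG(speed) AS avg_speed
FROM sensor_events
GROUP BY window(event_time, '1 minute');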

Question 8

You have a Fabric workspace that contains a warehouse named Warehouse1.

While monitoring Warehouse1, you discover that query performance has degraded during the last 60 minutes.

You need to isolate all the queries that were run during the last 60 minutes. The results must include the usernames of the users who submitted the queries and the query statements.

What should you use?

Options:

A. the Microsoft Fabric Capacity Metrics app
B. views from the queryinsights schema
C. Query activity
D. the sys.dm_exec_requests dynamic management view
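
For context, a Fabric warehouse exposes recent query history through views in the queryinsights schema, which can be filtered by time and include the submitting user and the statement text. A rough T-SQL sketch follows; the view and column names (exec_requests_history, login_name, command, start_time) are assumptions to verify against the current documentation.

-- Sketch only; verify view and column names against the queryinsights documentation.
SELECT
    login_name,               -- user who submitted the query (assumed column name)
    command,                  -- query statement text (assumed column name)
    start_time,
    total_elapsed_time_ms
FROM queryinsights.exec_requests_history
WHERE start_time >= DATEADD(MINUTE, -60, GETUTCDATE())
ORDER BY start_time DESC;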

Question 9

You have a Fabric warehouse named DW1 that loads data by using a data pipeline named Pipeline1. Pipeline1 uses a Copy data activity with a dynamic SQL source. Pipeline1 is scheduled to run every 15 minutes.

You discover that Pipeline1 keeps failing.

You need to identify which SQL query was executed when the pipeline failed.

What should you do?

Options:

A. From Monitoring hub, select the latest failed run of Pipeline1, and then view the output JSON.
B. From Monitoring hub, select the latest failed run of Pipeline1, and then view the input JSON.
C. From Real-time hub, select Fabric events, and then review the details of Microsoft.Fabric.ItemReadFailed.
D. From Real-time hub, select Fabric events, and then review the details of Microsoft.Fabric.ItemUpdateFailed.

Question 10

HOTSPOT

You have a Fabric workspace named Workspace1_DEV that contains the following items:

10 reports

Four notebooks

Three lakehouses

Two data pipelines

Two Dataflow Gen1 dataflows

Three Dataflow Gen2 dataflows

Five semantic models, each with a scheduled refresh policy

You create a deployment pipeline named Pipeline1 to move items from Workspace1_DEV to a new workspace named Workspace1_TEST.

You deploy all the items from Workspace1_DEV to Workspace1_TEST.

For each of the following statements, select Yes if the statement is true. Otherwise, select No.

NOTE: Each correct selection is worth one point.

Options:

Question 11

What should you do to optimize the query experience for the business users?

Options:

A. Enable V-Order.
B. Create and update statistics.
C. Run the VACUUM command.
D. Introduce primary keys.
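
As a reference point for option B above, statistics in a Fabric warehouse or SQL analytics endpoint are created and refreshed with standard T-SQL. The table, column, and statistics names below (dbo.Sales, OrderDate, stats_sales_orderdate) are assumed for illustration.

-- Illustrative only; table, column, and statistics names are assumed.
CREATE STATISTICS stats_sales_orderdate
ON dbo.Sales (OrderDate);

-- Refresh the statistics after large data loads.
UPDATE STATISTICS dbo.Sales (stats_sales_orderdate);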

Question 12

You need to implement the solution for the book reviews.

What should you do?

Options:

A. Create a Dataflow Gen2 dataflow.
B. Create a shortcut.
C. Enable external data sharing.
D. Create a data pipeline.

Question 13

HOTSPOT

You need to troubleshoot the ad-hoc query issue.

How should you complete the statement? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.

Options:

Question 14

You need to ensure that the authors can see only their respective sales data.

How should you complete the statement? To answer, drag the appropriate values to the correct targets. Each value may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.

NOTE: Each correct selection is worth one point.

Options:
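
For background, restricting rows so that each author sees only their own sales data is typically implemented as row-level security: an inline table-valued predicate function plus a security policy. A hedged T-SQL sketch follows; the object and column names (dbo.Sales, AuthorEmail, and the function and policy names) are assumptions, not the case-study objects.

-- Illustrative row-level security sketch; object and column names are assumed.
CREATE FUNCTION dbo.fn_author_filter (@AuthorEmail AS VARCHAR(256))
RETURNS TABLE
WITH SCHEMABINDING
AS
RETURN
    SELECT 1 AS fn_result
    WHERE @AuthorEmail = USER_NAME();   -- row is visible only to the matching author
GO

CREATE SECURITY POLICY dbo.AuthorSalesPolicy
ADD FILTER PREDICATE dbo.fn_author_filter(AuthorEmail) ON dbo.Sales
WITH (STATE = ON);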

Question 15

You need to resolve the sales data issue. The solution must minimize the amount of data transferred.

What should you do?

Options:

A. Split the dataflow into two dataflows.
B. Configure scheduled refresh for the dataflow.
C. Configure incremental refresh for the dataflow. Set Store rows from the past to 1 Month.
D. Configure incremental refresh for the dataflow. Set Refresh rows from the past to 1 Year.
E. Configure incremental refresh for the dataflow. Set Refresh rows from the past to 1 Month.
