
DP-200 Sample Questions and Answers

Question 4

You need to replace the SSIS process by using Data Factory.

Which four actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.

Options:

Question 5

Which counter should you monitor for real-time processing to meet the technical requirements?

Options:

A. Concurrent users
B. SU% Utilization
C. Data Conversion Errors
Question 6

You need to implement the encryption for SALESDB.

Which three actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.

Options:

Question 7

How should you monitor SALESDB to meet the technical requirements?

Options:

A. Query the sys.resource_stats dynamic management view.
B. Review the Query Performance Insights for SALESDB.
C. Query the sys.dm_os_wait_stats dynamic management view.
D. Review the auditing information of SALESDB.
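
For reference, both DMV options can be queried directly with T-SQL. A minimal sketch (the TOP clause and column choices are illustrative; sys.dm_os_wait_stats reports cumulative instance-level waits, while sys.resource_stats reports hourly resource usage and is queried from the master database):

    -- Cumulative waits for the instance, highest wait time first
    SELECT TOP (10) wait_type, wait_time_ms, waiting_tasks_count
    FROM sys.dm_os_wait_stats
    ORDER BY wait_time_ms DESC;

    -- Hourly resource usage for SALESDB (run in the master database)
    SELECT start_time, end_time, avg_cpu_percent, avg_data_io_percent
    FROM sys.resource_stats
    WHERE database_name = 'SALESDB';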

Question 8

You need to implement event processing by using Stream Analytics to produce consistent JSON documents.

Which three actions should you perform? Each correct answer presents part of the solution.

NOTE: Each correct selection is worth one point.

Options:

A. Define an output to Cosmos DB.
B. Define a query that contains a JavaScript user-defined aggregate (UDA) function.
C. Define a reference input.
D. Define a transformation query.
E. Define an output to Azure Data Lake Storage Gen2.
F. Define a stream input.
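
For orientation, the transformation query in a Stream Analytics job is written in the service's SQL-like language and routes rows from a named input to a named output; a Cosmos DB output persists each row as a JSON document. A minimal sketch, assuming a stream input aliased EventsIn and a Cosmos DB output aliased CosmosOut (the aliases and field names are hypothetical):

    -- Shape each event into a consistent document and route it to Cosmos DB
    SELECT
        deviceId,
        eventType,
        System.Timestamp() AS processedAt
    INTO CosmosOut
    FROM EventsIn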

Question 9

You have an Azure Cosmos DB database that uses the SQL API.

You need to delete stale data from the database automatically.

What should you use?

Options:

A. soft delete
B. Low Latency Analytical Processing (LLAP)
C. schema on read
D. Time to Live (TTL)

Question 10

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.

After you answer a question in this scenario, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.

You have an Azure SQL database named DB1 that contains a table named Table1. Table1 has a field named Customer_ID that is varchar(22).

You need to implement masking for the Customer_ID field to meet the following requirements:

  • The first two prefix characters must be exposed.
  • The last four characters must be exposed.
  • All other characters must be masked.

Solution: You implement data masking and use a credit card function mask.

Does this meet the goal?

Options:

A. Yes
B. No
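
For comparison, the credit card mask exposes only the last four digits in a fixed xxxx-xxxx-xxxx-1234 layout, so it cannot expose the first two characters of Customer_ID. The partial() function covers both requirements; a minimal T-SQL sketch (the dbo schema and the padding string are assumptions):

    -- partial(prefix, padding, suffix): expose the first 2 and the last 4
    -- characters of Customer_ID and mask everything in between
    ALTER TABLE dbo.Table1
    ALTER COLUMN Customer_ID
    ADD MASKED WITH (FUNCTION = 'partial(2,"XXXXXXXX",4)');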

Question 11

You manage a financial computation data analysis process. Microsoft Azure virtual machines (VMs) run the process in daily jobs and store the results in virtual hard drives (VHDs).

The VMs produce results using data from the previous day and store the results in a snapshot of the VHD. When a new month begins, a process creates a new VHD.

You must implement the following data retention requirements:

  • Daily results must be kept for 90 days.
  • Data for the current year must be available for weekly reports.
  • Data from the previous 10 years must be stored for auditing purposes.
  • Data required for an audit must be produced within 10 days of a request.

You need to enforce the data retention requirements while minimizing cost.

How should you configure the lifecycle policy? To answer, drag the appropriate JSON segments to the correct locations. Each JSON segment may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.

NOTE: Each correct selection is worth one point.

Options:

Question 12

You have an Azure Stream Analytics job.

You need to ensure that the job has enough streaming units provisioned.

You configure monitoring of the SU% Utilization metric.

Which two additional metrics should you monitor? Each correct answer presents part of the solution.

NOTE: Each correct selection is worth one point.

Options:

A. Watermark Delay
B. Late Input Events
C. Out of order Events
D. Backlogged Input Events
E. Function Events

Question 13

You have an Azure Blob storage account.

Developers report that an HTTP 403 (Forbidden) error is generated when a client application attempts to access the storage account. You cannot see the error messages in Azure Monitor.

What is a possible cause of the error?

Options:

A. The client application is using an expired shared access signature (SAS) when it sends a storage request.
B. The client application deleted, and then immediately recreated a blob container that has the same name.
C. The client application attempted to use a shared access signature (SAS) that did not have the necessary permissions.
D. The client application attempted to use a blob that does not exist in the storage service.

Question 14

You have a data warehouse in Azure Synapse Analytics.

You need to ensure that the data is encrypted at rest.

What should you enable?

Options:

A. Transparent Data Encryption (TDE)
B. Secure transfer required
C. Always Encrypted for all columns
D. Advanced Data Security for this database
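
For reference, Transparent Data Encryption encrypts the database files at rest and can be toggled per database with T-SQL. A minimal sketch (the warehouse name is hypothetical; the statement runs against the master database of a dedicated SQL pool):

    -- Enable encryption at rest for the data warehouse
    ALTER DATABASE [ContosoDW] SET ENCRYPTION ON;

    -- Verify: is_encrypted = 1 once encryption completes
    SELECT name, is_encrypted FROM sys.databases;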

Question 15

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution. Determine whether the solution meets the stated goals.

You develop a data ingestion process that will import data to a Microsoft Azure SQL Data Warehouse. The data to be ingested resides in parquet files stored in an Azure Data Lake Gen 2 storage account.

You need to load the data from the Azure Data Lake Gen 2 storage account into the Azure SQL Data Warehouse.

Solution:

1. Create a remote service binding pointing to the Azure Data Lake Gen 2 storage account

2. Create an external file format and external table using the external data source

3. Load the data using the CREATE TABLE AS SELECT statement

Does the solution meet the goal?

Options:

A. Yes
B. No
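
For contrast with step 1 of the proposed solution, the conventional PolyBase sequence starts from an external data source rather than a remote service binding. A hedged T-SQL sketch of the usual pattern (all object names, the storage path, and the column list are hypothetical, and the database-scoped credential is assumed to exist):

    -- 1. External data source pointing at the Data Lake Storage Gen2 account
    CREATE EXTERNAL DATA SOURCE LakeSource
    WITH (
        TYPE = HADOOP,
        LOCATION = 'abfss://data@contosolake.dfs.core.windows.net',
        CREDENTIAL = LakeCredential
    );

    -- 2. File format describing the Parquet files
    CREATE EXTERNAL FILE FORMAT ParquetFormat
    WITH (FORMAT_TYPE = PARQUET);

    -- 3. External table over the files
    CREATE EXTERNAL TABLE ext.Sales (SaleId int, Amount decimal(18, 2))
    WITH (LOCATION = '/sales/', DATA_SOURCE = LakeSource, FILE_FORMAT = ParquetFormat);

    -- 4. Load into the warehouse with CREATE TABLE AS SELECT
    CREATE TABLE dbo.Sales
    WITH (DISTRIBUTION = ROUND_ROBIN)
    AS SELECT * FROM ext.Sales;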

Question 16

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.

After you answer a question in this scenario, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.

You are developing a solution that will use Azure Stream Analytics. The solution will accept an Azure Blob storage file named Customers. The file will contain both in-store and online customer details. The online customers will provide a mailing address.

You have a file in Blob storage named LocationIncomes that contains median incomes based on location. The file rarely changes.

You need to use an address to look up a median income based on location. You must output the data to Azure SQL Database for immediate use and to Azure Data Lake Storage Gen2 for long-term retention.

Solution: You implement a Stream Analytics job that has one streaming input, one reference input, two queries, and four outputs.

Does this meet the goal?

Options:

A. Yes
B. No
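
To picture the moving parts the solution counts, a Stream Analytics job can name one stream input (Customers), one reference input (LocationIncomes), and fan the same enriched rows out to more than one output. A minimal two-output sketch in the job's SQL-like language (the join key, columns, and output aliases are hypothetical):

    -- Enrich each customer with the median income for their location
    WITH Enriched AS (
        SELECT c.CustomerId, c.MailingAddress, li.MedianIncome
        FROM Customers c
        JOIN LocationIncomes li ON c.PostalCode = li.PostalCode
    )
    -- Query 1: immediate use in Azure SQL Database
    SELECT * INTO SqlDbOutput FROM Enriched
    -- Query 2: long-term retention in Data Lake Storage Gen2
    SELECT * INTO DataLakeOutput FROM Enriched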

Question 17

You need to ensure that phone-based polling data can be analyzed in the PollingData database.

Which three actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.

Options:

Question 18

You need to ensure that Azure Data Factory pipelines can be deployed. How should you configure authentication and authorization for deployments? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.

Options:

Question 19

You need to ensure polling data security requirements are met.

Which security technologies should you use? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.

Options:

Question 20

You need to ensure phone-based polling data upload reliability requirements are met. How should you configure monitoring? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.

Options:

Question 21

You need to provision the polling data storage account.

How should you configure the storage account? To answer, drag the appropriate Configuration Value to the correct Setting. Each Configuration Value may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.

NOTE: Each correct selection is worth one point.

Options:

Question 22

You need to ensure that phone-based polling data can be analyzed in the PollingData database.

How should you configure Azure Data Factory?

Options:

A. Use a tumbling window trigger
B. Use an event-based trigger
C. Use a schedule trigger
D. Use manual execution

Question 23

You need to process and query ingested Tier 9 data.

Which two options should you use? Each correct answer presents part of the solution.

NOTE: Each correct selection is worth one point.

Options:

A. Azure Notification Hub
B. Transact-SQL statements
C. Azure Cache for Redis
D. Apache Kafka statements
E. Azure Event Grid
F. Azure Stream Analytics

Question 24

You need to set up access to Azure SQL Database for Tier 7 and Tier 8 partners.

Which three actions should you perform in sequence? To answer, move the appropriate three actions from the list of actions to the answer area and arrange them in the correct order.

Options:

Question 25

Which masking functions should you implement for each column to meet the data masking requirements? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.

Options:

Question 26

On which data store should you configure TDE to meet the technical requirements?

Options:

A. Cosmos DB
B. SQL Data Warehouse
C. SQL Database

Question 27

What should you include in the Data Factory pipeline for Race Central?

Options:

A. a copy activity that uses a stored procedure as a source
B. a copy activity that contains schema mappings
C. a delete activity that has logging enabled
D. a filter activity that has a condition

Question 28

You are monitoring the Data Factory pipeline that runs from Cosmos DB to SQL Database for Race Central.

You discover that the job takes 45 minutes to run.

What should you do to improve the performance of the job?

Options:

A. Decrease parallelism for the copy activities.
B. Increase the data integration units.
C. Configure the copy activities to use staged copy.
D. Configure the copy activities to perform compression.

Question 29

You need to build a solution to collect the telemetry data for Race Control.

What should you use? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.

Options:

Question 30

Which masking functions should you implement for each column to meet the data masking requirements? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.

Options:

Question 31

What should you implement to optimize SQL Database for Race Central to meet the technical requirements?

Options:

A. the sp_update stored procedure
B. automatic tuning
C. Query Store
D. the DBCC CHECKDB command
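
For reference, both of the database-level candidates here are switched on with ALTER DATABASE. A minimal T-SQL sketch (Query Store is already on by default in Azure SQL Database; FORCE_LAST_GOOD_PLAN is the plan-regression correction option of automatic tuning):

    -- Capture query texts, plans, and runtime statistics
    ALTER DATABASE CURRENT SET QUERY_STORE = ON;

    -- Let the engine automatically force the last known good plan
    ALTER DATABASE CURRENT SET AUTOMATIC_TUNING (FORCE_LAST_GOOD_PLAN = ON);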

Question 32

You are building the data store solution for Mechanical Workflow.

How should you configure Table1? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.

Options:

Status: Expired
Exam Code: DP-200
Exam Name: Implementing an Azure Data Solution
Last Update: Apr 14, 2023
Questions: 243