Are you preparing for the Microsoft Azure DP-200 exam and don't know where to start? This article is a DP-200 study guide (with links) to the DP-200 exam questions and answers at https://www.pass4itsure.com/dp-200.html. I hope it helps you prepare for the DP-200 certification exam. Please also share it with others in your circle who are preparing for the exam.

DP-200 Dumps (PDF)

Free Microsoft DP-200 Dumps (PDF) https://drive.google.com/open?id=1MdsDECN7dqml2OMJ0XQ-5LaNMQ__ZwXm

To view other Azure Certificate Study Guides, click here

Download Microsoft DP-100 Dumps Free Updates For DP-100 Exam Questions

DP-200 Practice Tests

184+ Microsoft DP-200 practice exam questions. It is recommended that you verify your understanding through practice questions.

QUESTION 1
You need to ensure that Azure Data Factory pipelines can be deployed. How should you configure authentication and
authorization for deployments? To answer, select the appropriate options in the answer choices. NOTE: Each correct
selection is worth one point.
Hot Area:

[Exhibit: dp-200 exam questions-q1]

Correct Answer:

[Exhibit: dp-200 exam questions-q1-2]

 

Explanation/Reference:
The way you control access to resources using RBAC is to create role assignments; this is a key concept to understand, because role assignments are how permissions are enforced.

QUESTION 2
You develop data engineering solutions for a company.
A project requires analysis of real-time Twitter feeds. Posts that contain specific keywords must be stored and
processed on Microsoft Azure and then displayed by using Microsoft Power BI. You need to implement the solution.
Which five actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to
the answer area and arrange them in the correct order.
Select and Place:

[Exhibit: dp-200 exam questions-q2]
Correct Answer:

[Exhibit: dp-200 exam questions-q2-2]

Step 1: Create an HDInsight cluster with the Spark cluster type
Step 2: Create a Jupyter Notebook
Step 3: Create a table
The Jupyter Notebook that you created in the previous step includes code to create an hvac table.
Step 4: Run a job that uses the Spark Streaming API to ingest data from Twitter
Step 5: Load the hvac table into Power BI Desktop
You use Power BI to create visualizations, reports, and dashboards from the Spark cluster data.
References:
https://acadgild.com/blog/streaming-twitter-data-using-spark
https://docs.microsoft.com/en-us/azure/hdinsight/spark/apache-spark-use-with-data-lake-store
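As a rough illustration of steps 3 and 5, once the notebook has created the hvac table you can query it with Spark SQL before loading the results into Power BI Desktop. This is only a sketch; the column names follow the HDInsight hvac sample data and are illustrative:

-- Spark SQL, run from the Jupyter Notebook on the HDInsight Spark cluster.
-- Column names (buildingID, targettemp, actualtemp) come from the hvac sample.
SELECT buildingID,
       AVG(targettemp - actualtemp) AS avg_temp_difference
FROM hvac
GROUP BY buildingID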

QUESTION 3
What should you include in the Data Factory pipeline for Race Central?
A. a copy activity that uses a stored procedure as a source
B. a copy activity that contains schema mappings
C. a delete activity that has logging enabled
D. a filter activity that has a condition
Correct Answer: B
Scenario:
An Azure Data Factory pipeline must be used to move data from Cosmos DB to SQL Database for Race Central. If the
data load takes longer than 20 minutes, configuration changes must be made to Data Factory.
The telemetry data is sent to a MongoDB database. A custom application then moves the data to databases in SQL
Server 2017. The telemetry data in MongoDB has more than 500 attributes. The application changes the attribute
names
when the data is moved to SQL Server 2017.
You can copy data to or from Azure Cosmos DB (SQL API) by using an Azure Data Factory pipeline.
Column mapping applies when copying data from source to sink. By default, the copy activity maps source data to the
sink by column names. You can specify an explicit mapping to customize the column mapping based on your needs. More
specifically, the copy activity:
1. Reads the data from the source and determines the source schema.
2. Uses default column mapping to map columns by name, or applies the explicit column mapping if specified.
3. Writes the data to the sink.
References:
https://docs.microsoft.com/en-us/azure/data-factory/copy-activity-schema-and-type-mapping

QUESTION 4
You plan to use Microsoft Azure SQL Database instances with strict user access control. A user object must:
Move with the database if it is run elsewhere
Be able to create additional users
You need to create the user object with the correct permissions.
Which two Transact-SQL commands should you run? Each correct answer presents part of the solution. NOTE: Each
correct selection is worth one point.
A. ALTER LOGIN Mary WITH PASSWORD = 'strong_password';
B. CREATE LOGIN Mary WITH PASSWORD = 'strong_password';
C. ALTER ROLE db_owner ADD MEMBER Mary;
D. CREATE USER Mary WITH PASSWORD = 'strong_password';
E. GRANT ALTER ANY USER TO Mary;
Correct Answer: CD
C: ALTER ROLE adds or removes members to or from a database role, or changes the name of a user-defined
database role.
Members of the db_owner fixed database role can perform all configuration and maintenance activities on the database,
and can also drop the database in SQL Server.
D: CREATE USER adds a user to the current database.
Note: Logins are created at the server level, while users are created at the database level. In other words, a login allows
you to connect to the SQL Server service (also called an instance), and permissions inside the database are granted to
the database users, not the logins. The logins will be assigned to server roles (for example, serveradmin) and the
database users will be assigned to roles within that database (for example, db_datareader and db_backupoperator).
References:
https://docs.microsoft.com/en-us/sql/t-sql/statements/alter-role-transact-sql
https://docs.microsoft.com/en-us/sql/t-sql/statements/create-user-transact-sql
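Putting the two correct statements together gives a short T-SQL sequence, run inside the target user database; Mary and the password are the placeholders from the question:

-- Run in the user database (not master): create a contained database user
-- that travels with the database.
CREATE USER Mary WITH PASSWORD = 'strong_password';

-- Allow Mary to manage the database, including creating additional users.
ALTER ROLE db_owner ADD MEMBER Mary;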

QUESTION 5
You have an Azure SQL database named DB1 that contains a table named Table1. Table1 has a field named
Customer_ID that is varchar(22).
You need to implement masking for the Customer_ID field to meet the following requirements:
The first two prefix characters must be exposed.
The last four characters must be exposed.
All other characters must be masked.
Solution: You implement data masking and use an email function mask.
Does this meet the goal?
A. Yes
B. No
Correct Answer: B
You must use the Custom Text data mask, which exposes the first and last characters and adds a custom padding string in
the middle.
References: https://docs.microsoft.com/en-us/azure/sql-database/sql-database-dynamic-data-masking-get-started
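For comparison, a Custom Text (partial) mask that meets the requirements could look like the sketch below; the table and column names come from the question, and the padding string is an arbitrary placeholder:

-- Expose the first 2 and the last 4 characters of Customer_ID; mask the rest.
ALTER TABLE Table1
ALTER COLUMN Customer_ID ADD MASKED WITH (FUNCTION = 'partial(2,"XXXX",4)');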

QUESTION 6
You need to process and query ingested Tier 9 data.
Which two options should you use? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.
A. Azure Notification Hub
B. Transact-SQL statements
C. Azure Cache for Redis
D. Apache Kafka statements
E. Azure Event Grid
F. Azure Stream Analytics
Correct Answer: EF
Explanation:
Event Hubs provides a Kafka endpoint that can be used by your existing Kafka based applications as an alternative to
running your own Kafka cluster.
You can stream data into Kafka-enabled Event Hubs and process it with Azure Stream Analytics in the following steps:
1. Create a Kafka-enabled Event Hubs namespace.
2. Create a Kafka client that sends messages to the event hub.
3. Create a Stream Analytics job that copies data from the event hub into Azure Blob storage.
Scenario:

[Exhibit: dp-200 exam questions-q6]

Tier 9 reporting must be moved to Event Hubs, queried, and persisted in the same Azure region as the company's main office.
References: https://docs.microsoft.com/en-us/azure/event-hubs/event-hubs-kafka-stream-analytics
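Step 3 of that flow is just a Stream Analytics query. A minimal sketch is shown below; the input and output aliases (EventHubInput, BlobOutput) are assumptions you would define on the Stream Analytics job:

-- Stream Analytics query language: copy every event from the
-- Event Hubs input into the Blob storage output.
SELECT *
INTO BlobOutput
FROM EventHubInput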

QUESTION 7
You are designing a new Lambda architecture on Microsoft Azure.
The real-time processing layer must meet the following requirements:
Ingestion:
1. Receive millions of events per second
2. Act as a fully managed Platform-as-a-Service (PaaS) solution
3. Integrate with Azure Functions
Stream processing:
1. Process on a per-job basis
2. Provide seamless connectivity with Azure services
3. Use a SQL-based query language
Analytical data store:
1. Act as a managed service
2. Use a document store
3. Provide data encryption at rest
You need to identify the correct technologies to build the Lambda architecture using minimal effort. Which technologies
should you use? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Hot Area:

[Exhibit: dp-200 exam questions-q7]

Correct Answer:

[Exhibit: dp-200 exam questions-q7-2]

Box 1: Azure Event Hubs
This portion of a streaming architecture is often referred to as stream buffering. Options include Azure Event Hubs,
Azure IoT Hub, and Kafka.
Incorrect Answers: Not HDInsight Kafka
Azure Functions need a trigger defined in order to run. There is a limited set of supported trigger types, and Kafka is not
one of them.
Box 2: Azure Stream Analytics
Azure Stream Analytics provides a managed stream processing service based on perpetually running SQL queries that
operate on unbounded streams.
You can also use open source Apache streaming technologies like Storm and Spark Streaming in an HDInsight cluster.
Box 3: Azure SQL Data Warehouse
Azure SQL Data Warehouse provides a managed service for large-scale, cloud-based data warehousing. HDInsight
supports Interactive Hive, HBase, and Spark SQL, which can also be used to serve data for analysis.
References:
https://docs.microsoft.com/en-us/azure/architecture/data-guide/big-data/
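To give a concrete feel for the "perpetually running SQL queries" point in the stream processing layer, here is a hedged Stream Analytics sketch with a tumbling window; the input/output aliases, field names, and window size are all assumptions:

-- Stream Analytics query: count events per device over 30-second tumbling
-- windows and write the aggregates to the analytical store output.
SELECT deviceId,
       System.Timestamp() AS windowEnd,
       COUNT(*) AS eventCount
INTO AnalyticsStoreOutput
FROM EventHubInput TIMESTAMP BY eventTime
GROUP BY deviceId, TumblingWindow(second, 30)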

QUESTION 8
A company plans to analyze a continuous flow of data from a social media platform by using Microsoft Azure Stream
Analytics. The incoming data is formatted as one record per row.
You need to create the input stream.
How should you complete the REST API segment? To answer, select the appropriate configuration in the answer area.
NOTE: Each correct selection is worth one point.
Hot Area:

[Exhibit: dp-200 exam questions-q8]

Correct Answer:

[Exhibit: dp-200 exam questions-q8-2]

QUESTION 9
You develop data engineering solutions for a company.
A project requires an in-memory batch data processing solution. You need to provision an HDInsight cluster for batch processing of data on Microsoft Azure.
How should you complete the PowerShell segment? To answer, select the appropriate option in the answer area.
NOTE: Each correct selection is worth one point.
Hot Area:

[Exhibit: dp-200 exam questions-q9]

Correct Answer:

[Exhibit: dp-200 exam questions-q9-2]

QUESTION 10
Your company has an on-premises Microsoft SQL Server instance.
The data engineering team plans to implement a process that copies data from the SQL Server instance to Azure Blob
storage. The process must orchestrate and manage the data lifecycle.
You need to configure Azure Data Factory to connect to the SQL Server instance.
Which three actions should you perform in sequence? To answer, move the appropriate actions from the list of actions
to the answer area and arrange them in the correct order.
Select and Place:

[Exhibit: dp-200 exam questions-q10]

Correct Answer:

[Exhibit: dp-200 exam questions-q10-2]

QUESTION 11
You have an Azure SQL data warehouse.
Using PolyBase, you create a table named [Ext].[Items] to query Parquet files stored in Azure Data Lake Storage Gen2
without importing the data to the data warehouse.
The external table has three columns.
You discover that the Parquet files have a fourth column named ItemID.
Which command should you run to add the ItemID column to the external table?

[Exhibit: dp-200 exam questions-q11]

A. Option A
B. Option B
C. Option C
D. Option D
Correct Answer: A
Incorrect Answers:
B, D: Only these Data Definition Language (DDL) statements are allowed on external tables:
CREATE TABLE and DROP TABLE
CREATE STATISTICS and DROP STATISTICS
CREATE VIEW and DROP VIEW
References: https://docs.microsoft.com/en-us/sql/t-sql/statements/create-external-table-transact-sql
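Because ALTER TABLE is not in that list, the only way to pick up the new ItemID column is to drop and recreate the external table definition. A sketch under assumed names follows; the existing column list, data source, and file format are placeholders, since the real ones appear only in the exhibit:

DROP EXTERNAL TABLE [Ext].[Items];

-- Recreate the definition with the extra ItemID column.
CREATE EXTERNAL TABLE [Ext].[Items]
(
    [ItemName]  nvarchar(50),
    [ItemType]  nvarchar(20),
    [ItemPrice] decimal(10, 2),
    [ItemID]    int
)
WITH
(
    LOCATION = '/Items/',
    DATA_SOURCE = AzureDataLakeStorage,
    FILE_FORMAT = ParquetFileFormat
);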

QUESTION 12
An application will use Microsoft Azure Cosmos DB as its data solution. The application will use the Cassandra API to
support a column-based database type that uses containers to store items.
You need to provision Azure Cosmos DB. Which container name and item name should you use? Each correct answer
presents part of the solution.
NOTE: Each correct selection is worth one point.
A. collection
B. rows
C. graph
D. entities
E. table
Correct Answer: BE
B: Depending on the choice of the API, an Azure Cosmos item can represent either a document in a collection, a row in
a table or a node/edge in a graph. The following table shows the mapping between API-specific entities to an Azure
Cosmos item:

[Exhibit: dp-200 exam questions-q12]

References: https://docs.microsoft.com/en-us/azure/cosmos-db/databases-containers-items
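In other words, with the Cassandra API the container you provision is a table and every item written to it is a row. A small CQL sketch, with made-up keyspace, table, and column names:

-- CQL against the Cosmos DB Cassandra API: the table is the container,
-- and every row inserted into it is an item.
CREATE TABLE social.posts (
    post_id uuid PRIMARY KEY,
    author text,
    posted_at timestamp,
    body text
);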

QUESTION 13
You manage security for a database that supports a line-of-business application. Private and personal data stored in the database must be protected and encrypted.
You need to configure the database to use Transparent Data Encryption (TDE).
Which five actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to
the answer area and arrange them in the correct order.
Select and Place:

[Exhibit: dp-200 exam questions-q13]

Correct Answer:

[Exhibit: dp-200 exam questions-q13-2]
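The answer exhibit is not reproduced here, but the standard SQL Server T-SQL walkthrough for enabling TDE follows the same pattern; the database, certificate, and password names below are placeholders:

USE master;
-- 1. Create a master key in the master database.
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<strong_password>';
-- 2. Create (or obtain) a certificate protected by the master key.
CREATE CERTIFICATE TDECert WITH SUBJECT = 'TDE certificate';

USE LobDatabase;
-- 3. Create a database encryption key protected by the certificate.
CREATE DATABASE ENCRYPTION KEY
WITH ALGORITHM = AES_256
ENCRYPTION BY SERVER CERTIFICATE TDECert;
-- 4. Turn Transparent Data Encryption on for the database.
ALTER DATABASE LobDatabase SET ENCRYPTION ON;

Note that for Azure SQL Database itself, service-managed TDE is enabled per database (and is on by default for new databases) rather than through this certificate sequence.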

 

DP-200 Exam Video Learning

DP-200 Exam Voucher

Pass4itsure discount code 2020

Related Links (Exam DP-200)

https://docs.microsoft.com/en-us/learn/certifications/exams/dp-200

https://docs.microsoft.com/en-us/azure/machine-learning/how-to-manage-workspace

https://docs.microsoft.com/en-us/azure/machine-learning/how-to-data-ingest-adf

https://docs.microsoft.com/en-us/azure/machine-learning/studio/azure-ml-customer-churn-scenario

Free Microsoft DP-200 Dumps (PDF) https://drive.google.com/open?id=1MdsDECN7dqml2OMJ0XQ-5LaNMQ__ZwXm

Get free Microsoft Azure DP-200 practice tests, the DP-200 exam video, and DP-200 PDF dumps here! For more questions, please visit https://www.pass4itsure.com/dp-200.html (Q&As: 184).