
Introductory Info Case study -
This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However, there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions included on this exam in the time provided.
To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is independent of the other questions in this case study.
At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to the next section of the exam. After you begin a new section, you cannot return to this section.
To start the case study -
To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem statements. If the case study has an
All Information tab, note that the information displayed is identical to the information displayed on the subsequent tabs. When you are ready to answer a question, click the Question button to return to the question.
Overview -
ADatum Corporation is a retailer that sells products through two sales channels: retail stores and a website.
Existing Environment -
ADatum has one database server that has Microsoft SQL Server 2016 installed. The server hosts three mission-critical databases named SALESDB, DOCDB, and REPORTINGDB.
SALESDB collects data from the stores and the website.
DOCDB stores documents that connect to the sales data in SALESDB. The documents are stored in two different JSON formats based on the sales channel.
REPORTINGDB stores reporting data and contains several columnstore indexes. A daily process creates reporting data in REPORTINGDB from the data in
SALESDB. The process is implemented as a SQL Server Integration Services (SSIS) package that runs a stored procedure from SALESDB.
Requirements -
Planned Changes -
ADatum plans to move the current data infrastructure to Azure. The new infrastructure has the following requirements:
Migrate SALESDB and REPORTINGDB to an Azure SQL database.
Migrate DOCDB to Azure Cosmos DB.
The sales data, including the documents in JSON format, must be gathered as it arrives and analyzed online by using Azure Stream Analytics. The analytic process will perform aggregations that must be done continuously, without gaps, and without overlapping.
As they arrive, all the sales documents in JSON format must be transformed into one consistent format (a sketch of such a normalization step follows this list).
Azure Data Factory will replace the SSIS process of copying the data from SALESDB to REPORTINGDB.
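The case study does not show the two channel-specific document formats, so every field name in the sketch below (storeId, saleId, order, total, and so on) is invented purely for illustration. It shows only the general shape of a normalization step that maps both JSON variants onto one consistent schema as each document arrives:

import json

def normalize_sales_doc(raw: str) -> dict:
    """Map either hypothetical channel format onto one schema.
    Field names are invented; the case study does not show the formats."""
    doc = json.loads(raw)
    if "storeId" in doc:  # assumed retail-store shape
        return {"channel": "store", "sale_id": doc["saleId"],
                "amount": doc["total"]}
    # assumed website shape
    return {"channel": "web", "sale_id": doc["order"]["id"],
            "amount": doc["order"]["amount"]}

print(normalize_sales_doc('{"storeId": 7, "saleId": "S-1", "total": 19.9}'))

In the planned architecture, such a mapping would presumably sit in the streaming path before Stream Analytics performs its aggregations.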
Technical Requirements -
The new Azure data infrastructure must meet the following technical requirements:
Data in SALESDB must be encrypted by using Transparent Data Encryption (TDE). The encryption must use your own key.
SALESDB must be restorable to any given minute within the past three weeks.
Real-time processing must be monitored to ensure that workloads are sized properly based on actual usage patterns.
Missing indexes must be created automatically for REPORTINGDB.
Disk IO, CPU, and memory usage must be monitored for SALESDB.

Question
You need to configure a disaster recovery solution for SALESDB to meet the technical requirements.
What should you configure in the backup policy?
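Restoring SALESDB to any given minute within the past three weeks maps to the point-in-time restore (short-term) backup retention setting of Azure SQL Database, configured here as 21 days. A minimal sketch using the azure-mgmt-sql Python SDK; the resource names are placeholders, and the exact operations-group and model names can vary between SDK versions:

from azure.identity import DefaultAzureCredential
from azure.mgmt.sql import SqlManagementClient
from azure.mgmt.sql.models import BackupShortTermRetentionPolicy

client = SqlManagementClient(DefaultAzureCredential(), "<subscription-id>")

# Keep automated backups long enough for minute-level restore
# over three weeks. Group and server names are hypothetical.
poller = client.backup_short_term_retention_policies.begin_create_or_update(
    resource_group_name="adatum-rg",   # hypothetical
    server_name="adatum-sql",          # hypothetical
    database_name="SALESDB",
    policy_name="default",
    parameters=BackupShortTermRetentionPolicy(retention_days=21),
)
poller.result()  # 21 days covers the required three weeks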

Introductory Info Background -
Proseware, Inc. develops and manages a product named Poll Taker. The product is used for delivering public opinion polling and analysis.
Polling data comes from a variety of sources, including online surveys, house-to-house interviews, and booths at public events.
Polling data -
Polling data is stored in one of two locations:
An on-premises Microsoft SQL Server 2019 database named PollingData
Azure Data Lake Gen 2
Data in Data Lake is queried by using PolyBase
Poll metadata -
Each poll has associated metadata with information about the poll, including the date and the number of respondents. The metadata is stored as JSON.
Phone-based polling -
Security -
Phone-based poll data must only be uploaded by authorized users from authorized devices
Contractors must not have access to any polling data other than their own
Access to polling data must be set on a per-Active Directory user basis
Data migration and loading -
All data migration processes must use Azure Data Factory
All data migrations must run automatically during non-business hours
Data migrations must be reliable and retry when needed
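In Data Factory itself, "retry when needed" is normally expressed through the retry settings on pipeline activities rather than hand-written code. Purely as a conceptual illustration of the behavior (not the Data Factory API), a retry loop with exponential backoff looks like this; run_with_retries is a hypothetical helper:

import random
import time

def run_with_retries(action, max_attempts=4, base_delay=5.0):
    """Conceptual sketch: rerun a flaky migration step with
    exponential backoff plus jitter. Hypothetical helper, not ADF's API."""
    for attempt in range(1, max_attempts + 1):
        try:
            return action()
        except Exception:
            if attempt == max_attempts:
                raise
            # back off 5s, 10s, 20s, ... plus jitter before retrying
            time.sleep(base_delay * 2 ** (attempt - 1) + random.uniform(0, 1))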
Performance -
After six months, raw polling data should be moved to a storage account. The storage must be available in the event of a regional disaster. The solution must minimize costs.
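One hedged way to model "move raw data to cheaper storage after six months" is a blob lifecycle-management rule that tiers blobs to Cool 180 days after their last write, on an account whose replication survives a regional disaster. A sketch with the azure-mgmt-storage Python SDK; the names and the inline rule schema are assumptions to verify against your SDK version:

from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient

client = StorageManagementClient(DefaultAzureCredential(), "<subscription-id>")

# Tier raw polling blobs to Cool 180 days (~six months) after last write.
# Account, group, and prefix names are hypothetical.
client.management_policies.create_or_update(
    resource_group_name="proseware-rg",
    account_name="prosewarepolls",
    management_policy_name="default",
    properties={
        "policy": {
            "rules": [{
                "name": "raw-polls-to-cool",
                "enabled": True,
                "type": "Lifecycle",
                "definition": {
                    "filters": {"blobTypes": ["blockBlob"],
                                "prefixMatch": ["raw/"]},
                    "actions": {"baseBlob": {
                        "tierToCool": {"daysAfterModificationGreaterThan": 180}
                    }},
                },
            }]
        }
    },
)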
Deployments -
All deployments must be performed by using Azure DevOps. Deployments must use templates used in multiple environments. No credentials or secrets should be used during deployments.
Reliability -
All services and processes must be resilient to a regional Azure outage.
Monitoring -
All Azure services must be monitored by using Azure Monitor. On-premises SQL Server performance must be monitored.

Question
You need to ensure that phone-based polling data can be analyzed in the PollingData database.
How should you configure Azure Data Factory?
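Because the PollingData database is on-premises, the Data Factory element that usually matters in this scenario is an integration runtime that can reach on-premises SQL Server, that is, a self-hosted integration runtime. A sketch using the azure-mgmt-datafactory Python SDK; the names are placeholders, and the runtime node still has to be installed and registered on an on-premises machine:

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    IntegrationRuntimeResource,
    SelfHostedIntegrationRuntime,
)

client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

# Register a self-hosted integration runtime definition in the factory.
# Factory and resource-group names are hypothetical.
client.integration_runtimes.create_or_update(
    resource_group_name="proseware-rg",
    factory_name="proseware-adf",
    integration_runtime_name="onprem-sql-ir",
    integration_runtime=IntegrationRuntimeResource(
        properties=SelfHostedIntegrationRuntime(
            description="Reaches the on-premises PollingData database"
        )
    ),
)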

Introductory Info Case study -
This question uses the same ADatum Corporation case study as the first case study above; the Overview, Existing Environment, and Requirements sections are identical.

Question
Which windowing function should you use to perform the streaming aggregation of the sales data?
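Aggregation that is continuous, without gaps, and without overlapping describes a tumbling window. As a plain-Python illustration of the concept (not the Stream Analytics query language), every event falls into exactly one fixed-size, non-overlapping bucket:

from collections import defaultdict

def tumbling_window_sums(events, window_seconds=60):
    """Conceptual tumbling-window aggregation: each event lands in
    exactly one fixed-size bucket, so windows never gap or overlap."""
    sums = defaultdict(float)
    for ts, amount in events:  # ts = epoch seconds, amount = sale value
        window_start = ts - (ts % window_seconds)
        sums[window_start] += amount
    return dict(sums)

# Events at t=10 and t=59 share the window starting at 0; t=61 starts a new one.
print(tumbling_window_sums([(10, 5.0), (59, 2.0), (61, 1.0)]))
# {0: 7.0, 60: 1.0}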

Introductory Info Case Study -
Overview -
General Overview -
Litware, Inc. is an international car racing and manufacturing company that has 1,000 employees. Most employees are located in Europe. The company supports racing teams that compete in a worldwide racing series.
Physical Locations -
Litware has two main locations: a main office in London, England, and a manufacturing plant in Berlin, Germany.
During each race weekend, 100 engineers set up a remote portable office by using a VPN to connect to the datacenter in the London office. The portable office is set up and torn down in approximately 20 different countries each year.
Existing environment -
Race Central -
During race weekends, Litware uses a primary application named Race Central. Each car has several sensors that send real-time telemetry data to the London datacenter. The data is used for real-time tracking of the cars.
Race Central also sends batch updates to an application named Mechanical Workflow by using Microsoft SQL Server Integration Services (SSIS).
The telemetry data is sent to a MongoDB database. A custom application then moves the data to databases in SQL Server 2017. The telemetry data in MongoDB has more than 500 attributes. The application changes the attribute names when the data is moved to SQL Server 2017.
The database structure contains both OLAP and OLTP databases.
Mechanical Workflow -
Mechanical Workflow is used to track changes and improvements made to the cars during their lifetime.
Currently, Mechanical Workflow runs on SQL Server 2017 as an OLAP system.
Mechanical Workflow has a table named Table1 that is 1 TB. Large aggregations are performed on a single column of Table1.
Requirements -
Planned Changes -
Litware is in the process of rearchitecting its data estate to be hosted in Azure. The company plans to decommission the London datacenter and move all its applications to an Azure datacenter.
Technical Requirements -
Litware identifies the following technical requirements:
Data collection for Race Central must be moved to Azure Cosmos DB and Azure SQL Database. The data must be written to the Azure datacenter closest to each race and must converge in the least amount of time.
The query performance of Race Central must be stable, and the administrative time it takes to perform optimizations must be minimized.
The database for Mechanical Workflow must be moved to Azure Synapse Analytics.
Transparent data encryption (TDE) must be enabled on all data stores, whenever possible.
An Azure Data Factory pipeline must be used to move data from Cosmos DB to SQL Database for Race Central. If the data load takes longer than 20 minutes, configuration changes must be made to Data Factory.
The telemetry data must migrate toward a solution that is native to Azure.
The telemetry data must be monitored for performance issues. You must adjust the Cosmos DB Request Units per second (RU/s) to maintain a performance SLA while minimizing the cost of the RU/s.
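Reading and adjusting provisioned RU/s can be done programmatically. A hedged sketch with the azure-cosmos Python SDK (v4-style API); the endpoint, key, database, container, and threshold values are placeholders:

from azure.cosmos import CosmosClient

# Endpoint, key, and names are hypothetical placeholders.
client = CosmosClient("https://<account>.documents.azure.com:443/", "<key>")
container = client.get_database_client("telemetry").get_container_client("laps")

offer = container.get_throughput()
print("current RU/s:", offer.offer_throughput)

# Scale toward the SLA floor to minimize cost (step size is illustrative).
container.replace_throughput(max(400, offer.offer_throughput - 100))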
Data Masking Requirements -
During race weekends, visitors will be able to enter the remote portable offices. Litware is concerned that some proprietary information might be exposed. The company identifies the following data masking requirements for the Race Central data that will be stored in SQL Database:
Only show the last four digits of the values in a column named SuspensionSprings.
Only show a zero value for the values in a column named ShockOilWeight.

Question
What should you implement to optimize SQL Database for Race Central to meet the technical requirements?
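The requirement behind this question is stable query performance with minimal administrative effort, which points at Azure SQL Database automatic tuning. A sketch that enables it with T-SQL issued through pyodbc; the connection string is a placeholder:

import pyodbc

# Placeholder connection string for the Race Central database.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=<server>.database.windows.net;DATABASE=RaceCentral;"
    "UID=<user>;PWD=<password>",
    autocommit=True,
)

# FORCE_LAST_GOOD_PLAN keeps query performance stable by reverting plan
# regressions; CREATE_INDEX automates index maintenance as well.
conn.execute(
    "ALTER DATABASE CURRENT "
    "SET AUTOMATIC_TUNING (FORCE_LAST_GOOD_PLAN = ON, CREATE_INDEX = ON)"
)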
Case study -
Overview -
XYZ is an online training provider. They regularly conduct polls on the effectiveness of their training material. The polling data comes from various sources, such as online surveys and public events.
The polling data would be stored in one of two locations:
• An on-premises Microsoft SQL Server database
• Azure Data Lake Gen 2 storage
The data in the data lake would be queried using PolyBase.
Each poll data also has associated metadata. The metadata is stored as JSON. The metadata has the date and number of people who have taken the poll.
Polling data is also taken via phone calls. Below are the security requirements for polling data taken via phone calls:
• The poll data must be uploaded by authorized users from authorized devices
• External contractors can only access their own polling data
• The access to the polling data would be given to users on a per-Active Directory user basis
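Per-Active Directory-user access to Data Lake Gen 2 paths is typically granted through POSIX-style ACLs tied to Azure AD object IDs. A sketch with the azure-storage-file-datalake Python SDK; the account, filesystem, folder, and object ID are placeholders, and since set_access_control replaces the whole ACL, the base entries are included:

from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

service = DataLakeServiceClient(
    account_url="https://<account>.dfs.core.windows.net",
    credential=DefaultAzureCredential(),
)

# Grant one Azure AD user read+execute on a contractor's folder only.
# Filesystem, path, and object ID are hypothetical placeholders.
directory = service.get_file_system_client("polls").get_directory_client(
    "contractors/contractor-a"
)
directory.set_access_control(
    acl="user::rwx,group::r-x,other::---,user:<aad-object-id>:r-x"
)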
Other requirements:
• All data migration processes must be carried out using Azure Data Factory
• All of the data migrations must run automatically and be carried out during non-business hours
• All services and processes must be resilient to regional Azure outages
• All services must be monitored using Azure Monitor
• The performance of the on-premises SQL Server must also be monitored
• After 3 months all polling data must be moved to low-cost storage
• All deployments must be performed using Azure DevOps
• Deployments must make use of templates
• No credentials or secrets of any kind must be exposed during deployments
You have to create the storage account that would be used to store the polling data.
Which of the following would you use as the Account type?
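Data Lake Storage Gen 2 sits on a general-purpose v2 (StorageV2) account with the hierarchical namespace enabled, and read-access geo-redundant replication addresses the regional-outage requirement. A hedged sketch with the azure-mgmt-storage Python SDK; all names are placeholders:

from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient
from azure.mgmt.storage.models import Kind, Sku, StorageAccountCreateParameters

client = StorageManagementClient(DefaultAzureCredential(), "<subscription-id>")

# StorageV2 + hierarchical namespace = Data Lake Storage Gen 2;
# RA-GRS keeps the data readable during a regional outage.
poller = client.storage_accounts.begin_create(
    resource_group_name="xyz-rg",    # hypothetical
    account_name="xyzpollingdata",   # hypothetical
    parameters=StorageAccountCreateParameters(
        location="westeurope",
        kind=Kind.STORAGE_V2,
        sku=Sku(name="Standard_RAGRS"),
        is_hns_enabled=True,
    ),
)
account = poller.result()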