A company plans to use Apache Spark analytics to analyze intrusion detection data. You need to recommend a solution to analyze network and system activity data for malicious activities and policy violations. The solution must minimize administrative efforts. What should you recommend?
Correct Answer: C
Azure HDInsight offers pre-made monitoring dashboards in the form of solutions that can be used to monitor the workloads running on your clusters. Solutions are available in the Azure Marketplace for Apache Spark, Hadoop, Apache Kafka, Interactive Query (LLAP), Apache HBase, and Apache Storm. Note: With Azure HDInsight, you can set up Azure Monitor alerts that trigger when the value of a metric or the result of a query meets certain conditions. You can alert on a query returning a record with a value that is greater than or less than a certain threshold, or on the number of results returned by a query. For example, you could create an alert to send an email if a Spark job fails or if Kafka disk usage exceeds 90 percent. Reference: https://azure.microsoft.com/en-us/blog/monitoring-on-azure-hdinsight-part-4-workload-metrics-and-logs/
Question 142
You have an Azure Synapse Analytics Apache Spark pool named Pool1. You plan to load JSON files from an Azure Data Lake Storage Gen2 container into the tables in Pool1. The structure and data types vary by file. You need to load the files into the tables. The solution must maintain the source data types. What should you do?
Correct Answer: B
Serverless SQL pool can automatically synchronize metadata from Apache Spark. A serverless SQL pool database will be created for each database existing in serverless Apache Spark pools. Serverless SQL pool enables you to query data in your data lake. It offers a T-SQL query surface area that accommodates semi-structured and unstructured data queries. To support a smooth experience for in-place querying of data located in Azure Storage files, serverless SQL pool uses the OPENROWSET function with additional capabilities. The easiest way to see the content of your JSON file is to provide the file URL to the OPENROWSET function and specify the CSV FORMAT. Reference: https://docs.microsoft.com/en-us/azure/synapse-analytics/sql/query-json-files https://docs.microsoft.com/en-us/azure/synapse-analytics/sql/query-data-storage
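For illustration only (not part of the referenced answer), the following T-SQL sketch shows how a serverless SQL pool can read a line-delimited JSON file with OPENROWSET by using the CSV parser with 0x0b as the field terminator and field quote, so that each JSON document lands in a single NVARCHAR(MAX) column that can then be parsed with JSON_VALUE. The storage URL, file name, and JSON properties are hypothetical.

-- Read each line of the JSON file into a single column named doc (hypothetical URL and properties).
SELECT
    JSON_VALUE(doc, '$.deviceId') AS device_id,
    CAST(JSON_VALUE(doc, '$.reading') AS float) AS reading
FROM OPENROWSET(
        BULK 'https://contosodatalake.dfs.core.windows.net/raw/telemetry/devices.jsonl',
        FORMAT = 'CSV',
        FIELDQUOTE = '0x0b',
        FIELDTERMINATOR = '0x0b'
    ) WITH (doc NVARCHAR(MAX)) AS rows;

Keeping the whole document as text in the WITH clause and casting values explicitly afterward is what lets the query preserve the data types found in the source files.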
Question 143
DRAG DROP You have SQL Server on an Azure virtual machine that contains a database named DB1. DB1 is 30 TB and has a 1-GB daily rate of change. You back up the database by using a Microsoft SQL Server Agent job that runs Transact-SQL commands. You perform a weekly full backup on Sunday, daily differential backups at 01:00 in the morning, and transaction log backups every five minutes. The database fails on Wednesday at 10:00 in the morning. Which three backups should you restore in sequence? To answer, move the appropriate backups from the list of backups to the answer area and arrange them in the correct order. Select and Place:
Correct Answer:
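The drag-and-drop answer image is not reproduced here, but for this schedule the standard recovery sequence is the most recent full backup (Sunday), the most recent differential backup (Wednesday at 01:00), and then the transaction log backups taken after that differential. A minimal T-SQL sketch of that order follows; the backup file paths are hypothetical.

-- 1. Restore the most recent full backup (Sunday); leave the database ready for further restores.
RESTORE DATABASE DB1
    FROM DISK = N'E:\Backups\DB1_Full_Sun.bak'
    WITH NORECOVERY;

-- 2. Restore the most recent differential backup (Wednesday 01:00), still WITH NORECOVERY.
RESTORE DATABASE DB1
    FROM DISK = N'E:\Backups\DB1_Diff_Wed_0100.bak'
    WITH NORECOVERY;

-- 3. Restore each transaction log backup taken after the differential, in order;
--    only the final restore uses WITH RECOVERY to bring DB1 back online.
RESTORE LOG DB1
    FROM DISK = N'E:\Backups\DB1_Log_Wed_0955.trn'
    WITH RECOVERY;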
Question 144
You have an Azure virtual machine named VM1 on a virtual network named VNet1. Outbound traffic from VM1 to the internet is blocked. You have an Azure SQL database named SqlDb1 on a logical server named SqlSrv1. You need to implement connectivity between VM1 and SqlDb1 to meet the following requirements: * Ensure that all traffic to the public endpoint of SqlSrv1 is blocked. * Minimize the possibility of VM1 exfiltrating data stored in SqlDb1. What should you create on VNet1?
Correct Answer: C
Azure Private Link enables you to access Azure PaaS services (for example, Azure Storage and SQL Database) and Azure-hosted customer-owned/partner services over a private endpoint in your virtual network. Traffic between your virtual network and the service travels over the Microsoft backbone network, so exposing your service to the public internet is no longer necessary. Reference: https://docs.microsoft.com/en-us/azure/private-link/private-link-overview
Monitor and Optimize Operational Resources Testlet 1
This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However, there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions included on this exam in the time provided. To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is independent of the other questions in this case study. At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to the next section of the exam. After you begin a new section, you cannot return to this section.
To start the case study
To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem statements. If the case study has an All Information tab, note that the information displayed is identical to the information displayed on the subsequent tabs. When you are ready to answer a question, click the Question button to return to the question.
Existing Environment
Network Environment
The manufacturing and research datacenters connect to the primary datacenter by using a VPN. The primary datacenter has an ExpressRoute connection that uses both Microsoft peering and private peering. The private peering connects to an Azure virtual network named HubVNet.
Identity Environment
Litware has a hybrid Azure Active Directory (Azure AD) deployment that uses a domain named litwareinc.com. All Azure subscriptions are associated to the litwareinc.com Azure AD tenant.
Database Environment
The sales department has the following database workload:
* An on-premises server named SERVER1 hosts an instance of Microsoft SQL Server 2012 and two 1-TB databases.
* A logical server named SalesSrv01A contains a geo-replicated Azure SQL database named SalesSQLDb1. SalesSQLDb1 is in an elastic pool named SalesSQLDb1Pool. SalesSQLDb1 uses database firewall rules and contained database users.
* An application named SalesSQLDb1App1 uses SalesSQLDb1.
The manufacturing office contains two on-premises SQL Server 2016 servers named SERVER2 and SERVER3. The servers are nodes in the same Always On availability group. The availability group contains a database named ManufacturingSQLDb1.
Database administrators have two Azure virtual machines in HubVNet named VM1 and VM2 that run Windows Server 2019 and are used to manage all the Azure databases.
Licensing Agreement
Litware is a Microsoft Volume Licensing customer that has License Mobility through Software Assurance.
Current Problems
SalesSQLDb1 experiences performance issues that are likely due to out-of-date statistics and frequent blocking queries.
Requirements
Planned Changes
Litware plans to implement the following changes:
* Implement 30 new databases in Azure, which will be used by time-sensitive manufacturing apps that have varying usage patterns. Each database will be approximately 20 GB.
* Create a new Azure SQL database named ResearchDB1 on a logical server named ResearchSrv01. ResearchDB1 will contain Personally Identifiable Information (PII) data.
* Develop an app named ResearchApp1 that will be used by the research department to populate and access ResearchDB1.
* Migrate ManufacturingSQLDb1 to the Azure virtual machine platform.
* Migrate the SERVER1 databases to the Azure SQL Database platform.
Technical Requirements
Litware identifies the following technical requirements:
* Maintenance tasks must be automated.
* The 30 new databases must scale automatically.
* The use of an on-premises infrastructure must be minimized.
* Azure Hybrid Use Benefits must be leveraged for Azure SQL Database deployments.
* All SQL Server and Azure SQL Database metrics related to CPU and storage usage and limits must be analyzed by using Azure built-in functionality.
Security and Compliance Requirements
Litware identifies the following security and compliance requirements:
* Store encryption keys in Azure Key Vault.
* Retain backups of the PII data for two months.
* Encrypt the PII data at rest, in transit, and in use.
* Use the principle of least privilege whenever possible.
* Authenticate database users by using Active Directory credentials.
* Protect Azure SQL Database instances by using database-level firewall rules.
* Ensure that all databases hosted in Azure are accessible from VM1 and VM2 without relying on public endpoints.
Business Requirements
Litware identifies the following business requirements:
* Meet an SLA of 99.99% availability for all Azure deployments.
* Minimize downtime during the migration of the SERVER1 databases.
* Use the Azure Hybrid Use Benefits when migrating workloads to Azure.
* Once all requirements are met, minimize costs whenever possible.
Monitor and Optimize Operational Resources Question Set 2
Question 145
You need to design a data retention solution for the Twitter feed data records. The solution must meet the customer sentiment analytics requirements. Which Azure Storage functionality should you include in the solution?
Correct Answer: C
The lifecycle management policy lets you delete blobs, blob versions, and blob snapshots at the end of their lifecycles. Reference: https://docs.microsoft.com/en-us/azure/storage/blobs/storage-lifecycle-management-concepts