
[November-2020] Braindump2go AZ-204 VCE and PDF Dumps AZ-204 291Q Free Offer [Q75-Q89]


2020/November Latest Braindump2go AZ-204 Exam Dumps with PDF and VCE, Free and Updated Today! The following are some new AZ-204 real exam questions.

QUESTION 75
You develop a website. You plan to host the website in Azure. You expect the website to experience high traffic volumes after it is published.
You must ensure that the website remains available and responsive while minimizing cost.
You need to deploy the website.
What should you do?

A. Deploy the website to a virtual machine.
Configure the virtual machine to automatically scale when the CPU load is high.
B. Deploy the website to an App Service that uses the Shared service tier.
Configure the App Service plan to automatically scale when the CPU load is high.
C. Deploy the website to a virtual machine.
Configure a Scale Set to increase the virtual machine instance count when the CPU load is high.
D. Deploy the website to an App Service that uses the Standard service tier.
Configure the App Service plan to automatically scale when the CPU load is high.

Answer: D
Explanation:
Windows Azure Web Sites (WAWS) offers 3 modes: Standard, Free, and Shared.
Standard mode carries an enterprise-grade SLA (Service Level Agreement) of 99.9% monthly, even for sites with just one instance.
Standard mode runs on dedicated instances, making it different from the other ways to buy Windows Azure Web Sites.
Incorrect Answers:
B: Shared and Free modes do not offer the scaling flexibility of Standard, and they have some important limits.
Shared mode, as the name states, runs on shared compute resources and has a CPU quota. Because of these limits, neither Free nor Shared is a good choice for a production site that must remain available and responsive under high traffic.
A, C: Hosting the site on a virtual machine or a scale set can also scale, but it adds infrastructure management overhead and does not minimize cost as well as an App Service plan does.

QUESTION 76
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You develop an HTTP triggered Azure Function app to process Azure Storage blob data. The app is triggered using an output binding on the blob.
The app continues to time out after four minutes. The app must process the blob data.
You need to ensure the app does not time out and processes the blob data.
Solution: Use the Durable Function async pattern to process the blob data.
Does the solution meet the goal?

A. Yes
B. No

Answer: B
Explanation:
Instead, pass the HTTP trigger payload into an Azure Service Bus queue to be processed by a queue trigger function, and return an immediate HTTP success response.
Note: Large, long-running functions can cause unexpected timeout issues. General best practices include:
Whenever possible, refactor large functions into smaller function sets that work together and return responses fast. For example, a webhook or HTTP trigger function might require an acknowledgment response within a certain time limit; it's common for webhooks to require an immediate response. You can pass the HTTP trigger payload into a queue to be processed by a queue trigger function. This approach lets you defer the actual work and return an immediate response.
Reference:
https://docs.microsoft.com/en-us/azure/azure-functions/functions-best-practices

QUESTION 77
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You develop an HTTP triggered Azure Function app to process Azure Storage blob data. The app is triggered using an output binding on the blob.
The app continues to time out after four minutes. The app must process the blob data.
You need to ensure the app does not time out and processes the blob data.
Solution: Pass the HTTP trigger payload into an Azure Service Bus queue to be processed by a queue trigger function and return an immediate HTTP success response.
Does the solution meet the goal?

A. Yes
B. No

Answer: A
Explanation:
Large, long-running functions can cause unexpected timeout issues. General best practices include:
Whenever possible, refactor large functions into smaller function sets that work together and return responses fast. For example, a webhook or HTTP trigger function might require an acknowledgment response within a certain time limit; it's common for webhooks to require an immediate response. You can pass the HTTP trigger payload into a queue to be processed by a queue trigger function. This approach lets you defer the actual work and return an immediate response.
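The pattern described in this solution can be sketched with the Azure Functions Python v2 programming model. This is a minimal sketch under assumptions, not the reference implementation: the route process-blob, the queue name blob-tasks, and the connection setting ServiceBusConnection are illustrative placeholders.

import azure.functions as func

app = func.FunctionApp()

# HTTP trigger: hand the payload to a Service Bus queue and return immediately,
# so the HTTP call never reaches the timeout.
@app.route(route="process-blob", methods=[func.HttpMethod.POST])
@app.service_bus_queue_output(arg_name="msg",
                              queue_name="blob-tasks",            # placeholder queue name
                              connection="ServiceBusConnection")  # placeholder app setting
def accept_request(req: func.HttpRequest, msg: func.Out[str]) -> func.HttpResponse:
    msg.set(req.get_body().decode("utf-8"))
    return func.HttpResponse("Accepted for processing", status_code=202)

# Service Bus queue trigger: performs the long-running blob processing asynchronously,
# under its own timeout rather than the HTTP request's.
@app.service_bus_queue_trigger(arg_name="item",
                               queue_name="blob-tasks",
                               connection="ServiceBusConnection")
def process_blob(item: func.ServiceBusMessage) -> None:
    payload = item.get_body().decode("utf-8")
    # ...long-running blob processing goes here...

The HTTP function returns 202 right away; the queue-triggered function then does the heavy work in the background.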
Reference:
https://docs.microsoft.com/en-us/azure/azure-functions/functions-best-practices

QUESTION 78
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You develop an HTTP triggered Azure Function app to process Azure Storage blob data. The app is triggered using an output binding on the blob.
The app continues to time out after four minutes. The app must process the blob data.
You need to ensure the app does not time out and processes the blob data.
Solution: Configure the app to use an App Service hosting plan and enable the Always On setting.
Does the solution meet the goal?

A. Yes
B. No

Answer: B
Explanation:
Configuring an App Service plan with Always On keeps the function app loaded, but it does not remove the HTTP request timeout, so the request still fails. Instead, pass the HTTP trigger payload into an Azure Service Bus queue to be processed by a queue trigger function, and return an immediate HTTP success response.
Note: Large, long-running functions can cause unexpected timeout issues. General best practices include:
Whenever possible, refactor large functions into smaller function sets that work together and return responses fast. For example, a webhook or HTTP trigger function might require an acknowledgment response within a certain time limit; it's common for webhooks to require an immediate response. You can pass the HTTP trigger payload into a queue to be processed by a queue trigger function. This approach lets you defer the actual work and return an immediate response.
Reference:
https://docs.microsoft.com/en-us/azure/azure-functions/functions-best-practices

QUESTION 79
You are developing an Azure Cosmos DB solution by using the Azure Cosmos DB SQL API. The data includes millions of documents. Each document may contain hundreds of properties.
The properties of the documents do not contain distinct values for partitioning. Azure Cosmos DB must scale individual containers in the database to meet the performance needs of the application by spreading the workload evenly across all partitions over time.
You need to select a partition key.
Which two partition keys can you use? Each correct answer presents a complete solution.
NOTE: Each correct selection is worth one point.

A. a single property value that does not appear frequently in the documents
B. a value containing the collection name
C. a single property value that appears frequently in the documents
D. a concatenation of multiple property values with a random suffix appended
E. a hash suffix appended to a property value

Answer: DE
Explanation:
You can form a partition key by concatenating multiple property values into a single artificial partitionKey property. These keys are referred to as synthetic keys.
Another possible strategy to distribute the workload more evenly is to append a random number at the end of the partition key value. When you distribute items in this way, you can perform parallel write operations across partitions.
Note: It's the best practice to have a partition key with many distinct values, such as hundreds or thousands. The goal is to distribute your data and workload evenly across the items associated with these partition key values. If such a property doesn't exist in your data, you can construct a synthetic partition key.
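As a rough illustration of options D and E, the synthetic key is built in client code before the item is written, for example with the Azure Cosmos DB Python SDK. The property names deviceId and date, the suffix sizes, and the commented container call are assumptions for this sketch.

import hashlib
import random
import uuid

def random_suffix_key(doc: dict, buckets: int = 10) -> str:
    # Option D: concatenate multiple property values and append a random suffix
    # so that writes are spread across more logical partitions.
    base = f"{doc['deviceId']}-{doc['date']}"                # hypothetical properties
    return f"{base}-{random.randint(0, buckets - 1)}"

def hash_suffix_key(doc: dict) -> str:
    # Option E: append a hash suffix derived from a property value; the suffix is
    # deterministic, so it can be recomputed when reading the item back.
    value = doc["deviceId"]
    return f"{value}-{hashlib.sha256(value.encode('utf-8')).hexdigest()[:4]}"

doc = {"id": str(uuid.uuid4()), "deviceId": "sensor-42", "date": "2020-11-12"}
doc["partitionKey"] = random_suffix_key(doc)
# container.create_item(body=doc)   # container: an azure.cosmos ContainerProxy

A random suffix spreads writes evenly but forces reads to fan out across all possible suffixes, whereas a deterministic hash suffix can be recomputed at read time; the reference below discusses this trade-off.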
Reference:
https://docs.microsoft.com/en-us/azure/cosmos-db/synthetic-partition-keys

QUESTION 80
You are building a website that uses Azure Blob storage for data storage. You configure Azure Blob storage lifecycle to move all blobs to the archive tier after 30 days.
Customers have requested a service-level agreement (SLA) for viewing data older than 30 days.
You need to document the minimum SLA for data recovery.
Which SLA should you use?

A. at least two days
B. between one and 15 hours
C. at least one day
D. between zero and 60 minutes

Answer: B
Explanation:
The archive access tier has the lowest storage cost, but it has higher data retrieval costs compared to the hot and cool tiers. Data in the archive tier can take several hours to retrieve depending on the priority of the rehydration: with standard priority, rehydration may take up to 15 hours, while a high-priority rehydration of a small object may complete in under one hour.
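For context, rehydrating an archived blob back to an online tier with the Azure Storage Blob SDK for Python looks roughly like the sketch below. The connection-string environment variable, container, and blob names are placeholders, and plain strings are passed for the tier and priority values.

import os
from azure.storage.blob import BlobServiceClient

# Placeholder connection: any storage account that holds the archived blob.
service = BlobServiceClient.from_connection_string(os.environ["AZURE_STORAGE_CONNECTION_STRING"])
blob = service.get_blob_client(container="weather-data", blob="2020/october.csv")

# Move the blob back to the hot tier. Standard-priority rehydration can take up to
# 15 hours; high priority may finish in under an hour for small blobs, at a higher cost.
blob.set_standard_blob_tier("Hot", rehydrate_priority="High")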
Reference:
https://docs.microsoft.com/en-us/azure/storage/blobs/storage-blob-storage-tiers?tabs=azure-portal

QUESTION 81
You develop an app that allows users to upload photos and videos to Azure storage. The app uses a storage REST API call to upload the media to a blob storage account named Account1. You have blob storage containers named Container1 and Container2.
Uploading of videos occurs on an irregular basis.
You need to copy specific blobs from Container1 to Container2 when a new video is uploaded.
What should you do?

A. Copy blobs to Container2 by using the Put Blob operation of the Blob Service REST API
B. Create an Event Grid topic that uses the Start-AzureStorageBlobCopy cmdlet
C. Use AzCopy with the Snapshot switch to copy blobs to Container2
D. Download the blob to a virtual machine and then upload the blob to Container2

Answer: B
Explanation:
The Start-AzureStorageBlobCopy cmdlet starts to copy a blob.
Example 1: Copy a named blob
PS C:\> Start-AzureStorageBlobCopy -SrcBlob "ContosoPlanning2015" -DestContainer "ContosoArchives" -SrcContainer "ContosoUploads"
This command starts the copy operation of the blob named ContosoPlanning2015 from the container named ContosoUploads to the container named ContosoArchives.
Reference:
https://docs.microsoft.com/en-us/powershell/module/azure.storage/start-azurestorageblobcopy?view=azurermps-6.13.0

QUESTION 82
You are developing an ASP.NET Core website that uses Azure Front Door. The website is used to build custom weather data sets for researchers. Data sets are downloaded by users as comma-separated values (CSV) files. The data is refreshed every 10 hours.
Specific files must be purged from the Front Door cache based upon Response Header values.
You need to purge individual assets from the Front Door cache.
Which type of cache purge should you use?

A. single path
B. wildcard
C. root domain

Answer: A
Explanation:
These formats are supported in the lists of paths to purge:
Single path purge: Purge individual assets by specifying the full path of the asset (without the protocol and domain), including the file extension, for example, /pictures/strasbourg.png.
Wildcard purge: An asterisk (*) may be used as a wildcard. Purge all folders, subfolders, and files under an endpoint with /* in the path, or purge all subfolders and files under a specific folder by specifying the folder followed by /*, for example, /pictures/*.
Root domain purge: Purge the root of the endpoint with "/" in the path.
Reference:
https://docs.microsoft.com/en-us/azure/frontdoor/front-door-caching

QUESTION 83
You are developing a Java application that uses Cassandra to store key and value data. You plan to use a new Azure Cosmos DB resource and the Cassandra API in the application. You create an Azure Active Directory (Azure AD) group named Cosmos DB Creators to enable provisioning of Azure Cosmos accounts, databases, and containers.
The Azure AD group must not be able to access the keys that are required to access the data.
You need to restrict access to the Azure AD group.
Which role-based access control should you use?

A. DocumentDB Accounts Contributor
B. Cosmos Backup Operator
C. Cosmos DB Operator
D. Cosmos DB Account Reader

Answer: C
Explanation:
Azure Cosmos DB now provides a new RBAC role, Cosmos DB Operator. This new role lets you provision Azure Cosmos accounts, databases, and containers, but can't access the keys that are required to access the data. This role is intended for use in scenarios where the ability to grant access to Azure Active Directory service principals to manage deployment operations for Cosmos DB is needed, including the account, database, and containers.
Reference:
https://azure.microsoft.com/en-us/updates/azure-cosmos-db-operator-role-for-role-based-access-control-rbac-is-now-available/

QUESTION 84
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You are developing a website that will run as an Azure Web App. Users will authenticate by using their Azure Active Directory (Azure AD) credentials.
You plan to assign users one of the following permission levels for the website: admin, normal, and reader. A user's Azure AD group membership must be used to determine the permission level.
You need to configure authorization.
Solution: Configure the Azure Web App for the website to allow only authenticated requests and require Azure AD log on.
Does the solution meet the goal?

A. Yes
B. No

Answer: B
Explanation:
Requiring authenticated Azure AD sign-in only confirms who the user is; it does not expose the user's group membership to the application. Instead, in the Azure AD application's manifest, set the value of the groupMembershipClaims option to All.
Reference:
https://blogs.msdn.microsoft.com/waws/2017/03/13/azure-app-service-authentication-aad-groups/

QUESTION 85
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You are developing a website that will run as an Azure Web App. Users will authenticate by using their Azure Active Directory (Azure AD) credentials.
You plan to assign users one of the following permission levels for the website: admin, normal, and reader. A user's Azure AD group membership must be used to determine the permission level.
You need to configure authorization.
Solution: Create a new Azure AD application. In the application's manifest, set the value of the groupMembershipClaims option to All.
In the website, use the value of the groups claim from the JWT for the user to determine permissions.
Does the solution meet the goal?

A. Yes
B. No

Answer: A
Explanation:
To configure the manifest to include group claims in the auth token:
1. Go to Azure Active Directory to configure the manifest. Click on Azure Active Directory, and go to App registrations to find your application.
2. Click on your application (or search for it if you have a lot of apps) and edit the manifest by clicking on it.
3. Locate the "groupMembershipClaims" setting. Set its value to either "SecurityGroup" or "All". To help you decide which:
"SecurityGroup" - the groups claim will contain the identifiers of all security groups of which the user is a member.
"All" - the groups claim will contain the identifiers of all security groups and all distribution lists of which the user is a member.
Tokens issued for the application will now include group claims, which your code can use to determine the user's permission level.
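Once the groups claim is present in the token, the website can map group object IDs to the admin, normal, and reader permission levels. A minimal sketch, assuming PyJWT is available and that App Service authentication has already validated the token upstream (so signature verification is skipped here); the group IDs are hypothetical placeholders.

import jwt  # PyJWT

# Hypothetical Azure AD group object IDs mapped to the site's permission levels.
GROUP_TO_PERMISSION = {
    "11111111-1111-1111-1111-111111111111": "admin",
    "22222222-2222-2222-2222-222222222222": "normal",
    "33333333-3333-3333-3333-333333333333": "reader",
}

def permission_level(id_token: str) -> str:
    # App Service authentication has already validated the token, so the claims
    # are only decoded here, not re-verified.
    claims = jwt.decode(id_token, options={"verify_signature": False})
    for group_id in claims.get("groups", []):
        if group_id in GROUP_TO_PERMISSION:
            return GROUP_TO_PERMISSION[group_id]
    return "reader"  # default to the least-privileged level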
Reference:
https://blogs.msdn.microsoft.com/waws/2017/03/13/azure-app-service-authentication-aad-groups/

QUESTION 86
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You are developing a website that will run as an Azure Web App. Users will authenticate by using their Azure Active Directory (Azure AD) credentials.
You plan to assign users one of the following permission levels for the website: admin, normal, and reader. A user's Azure AD group membership must be used to determine the permission level.
You need to configure authorization.
Solution:
Create a new Azure AD application. In the application's manifest, define application roles that match the required permission levels for the application.
Assign the appropriate Azure AD group to each role. In the website, use the value of the roles claim from the JWT for the user to determine permissions.
Does the solution meet the goal?

A. Yes
B. No

Answer: B
Explanation:
To configure the manifest to include group claims in the auth token:
1. Go to Azure Active Directory to configure the manifest. Click on Azure Active Directory, and go to App registrations to find your application.
2. Click on your application (or search for it if you have a lot of apps) and edit the manifest by clicking on it.
3. Locate the "groupMembershipClaims" setting. Set its value to either "SecurityGroup" or "All". To help you decide which:
"SecurityGroup" - the groups claim will contain the identifiers of all security groups of which the user is a member.
"All" - the groups claim will contain the identifiers of all security groups and all distribution lists of which the user is a member.
Tokens issued for the application will now include group claims, which your code can use to determine the user's permission level.
Reference:
https://blogs.msdn.microsoft.com/waws/2017/03/13/azure-app-service-authentication-aad-groups/

QUESTION 87
You develop and deploy an ASP.NET web app to Azure App Service. You use Application Insights telemetry to monitor the app.
You must test the app to ensure that the app is available and responsive from various points around the world and at regular intervals. If the app is not responding, you must send an alert to support staff.
You need to configure a test for the web app.
Which two test types can you use? Each correct answer presents a complete solution.
NOTE: Each correct selection is worth one point.

A. integration
B. multi-step web
C. URL ping
D. unit
E. load

Answer: BC
Explanation:
There are three types of availability tests:
URL ping test: a simple test that you can create in the Azure portal.
Multi-step web test: A recording of a sequence of web requests, which can be played back to test more complex scenarios. Multi-step web tests are created in Visual Studio Enterprise and uploaded to the portal for execution.
Custom Track Availability Tests: If you decide to create a custom application to run availability tests, the TrackAvailability() method can be used to send the results to Application Insights.
Reference:
https://docs.microsoft.com/en-us/azure/azure-monitor/app/monitor-web-app-availability

QUESTION 88
You are developing an e-commerce solution that uses a microservice architecture.
You need to design a communication backplane for communicating transactional messages between various parts of the solution. Messages must be communicated in first-in-first-out (FIFO) order.
What should you use?

A. Azure Storage Queue
B. Azure Event Hub
C. Azure Service Bus
D. Azure Event Grid

Answer: C
Explanation:
As a solution architect/developer, you should consider using Service Bus queues when:
Your solution requires the queue to provide guaranteed first-in-first-out (FIFO) ordered delivery. Azure Storage queues do not guarantee FIFO ordering; Azure Service Bus queues do, through message sessions.
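The FIFO guarantee in Service Bus is realized with message sessions: messages that share a session ID are delivered in order to a single receiver. A minimal sketch with the azure-servicebus Python SDK, assuming a session-enabled queue named orders and a connection string in an environment variable.

import os
from azure.servicebus import ServiceBusClient, ServiceBusMessage

CONN_STR = os.environ["SERVICE_BUS_CONNECTION_STRING"]  # placeholder app setting
QUEUE = "orders"                                        # placeholder, session-enabled queue

with ServiceBusClient.from_connection_string(CONN_STR) as client:
    # Send three messages in the same session so they are delivered in order.
    with client.get_queue_sender(QUEUE) as sender:
        for step in ("reserve-stock", "charge-card", "confirm-order"):
            sender.send_messages(ServiceBusMessage(step, session_id="order-1001"))

    # Receive from that session; the messages arrive first-in-first-out.
    with client.get_queue_receiver(QUEUE, session_id="order-1001") as receiver:
        for msg in receiver.receive_messages(max_message_count=3, max_wait_time=5):
            print(str(msg))
            receiver.complete_message(msg)

Without a session ID, ordering in Service Bus is best effort only; sessions are what provide the guarantee, and Azure Storage queues offer no equivalent.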
Reference:
https://docs.microsoft.com/en-us/azure/service-bus-messaging/service-bus-azure-and-service-bus-queues-compared-contrasted

QUESTION 89
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You are developing an Azure Service application that processes queue data when it receives a message from a mobile application. Messages may not be sent to the service consistently.
You have the following requirements:
- Queue size must not grow larger than 80 gigabytes (GB).
- Use first-in-first-out (FIFO) ordering of messages.
- Minimize Azure costs.
You need to implement the messaging solution.
Solution: Use the .NET API to add a message to an Azure Storage Queue from the mobile application. Create an Azure Function App that uses an Azure Storage Queue trigger.
Does the solution meet the goal?

A. Yes
B. No

Answer: B
Explanation:
Azure Storage queues do not guarantee FIFO ordering. Instead, use an Azure Service Bus queue and create an Azure Function App that uses an Azure Service Bus queue trigger.
Reference:
https://docs.microsoft.com/en-us/azure/azure-functions/functions-create-storage-queue-triggered-function


Resources From:

1. 2020 Latest Braindump2go AZ-204 Exam Dumps (PDF & VCE) Free Share:
https://www.braindump2go.com/az-204.html

2. 2020 Latest Braindump2go AZ-204 PDF and AZ-204 VCE Dumps Free Share:
https://drive.google.com/drive/folders/1qSt-4_6x_lmYQoqET8lOEQmDK4tWxrUP?usp=sharing

3. 2020 Free Braindump2go AZ-204 PDF Download:
https://www.braindump2go.com/free-online-pdf/AZ-204-PDF(84-94).pdf
https://www.braindump2go.com/free-online-pdf/AZ-204-PDF-Dumps(67-72).pdf
https://www.braindump2go.com/free-online-pdf/AZ-204-VCE(95-108).pdf
https://www.braindump2go.com/free-online-pdf/AZ-204-VCE-Dumps(73-83).pdf

Free resources from Braindump2go. We are devoted to helping you pass all exams!
