Exam

AWS Certified Developer Associate 4: Multiple Choice Questions and Correct Answers with Rationale, 100% Verified

Pages: 42
Grade: A+
Uploaded: 04-02-2024
Written in: 2023/2024

You are a developer for a company. You have been asked to deploy an application for development purposes onto an Elastic Beanstalk environment. You need to ensure that custom software is installed on the backend Linux servers that are launched as part of the Elastic Beanstalk environment. Which of the following can be used to achieve this? Choose 2 answers from the options given below.
A. Create an XML file with the required package names to be installed on the server
B. Create a YAML file with the required package names to be installed on the server
C. Place the file in the .ebextensions folder in your Application Source Bundle
D. Place the file in the .config folder in your Application Source Bundle

Explanation: Answer - B and C
The AWS documentation mentions the following: AWS Elastic Beanstalk supports a large number of configuration options that modify the settings applied to resources in your environment. Several of these options have default values that can be overridden to customize your environment. Other options can be configured to enable additional features. Elastic Beanstalk supports two methods of saving configuration option settings. Configuration files in YAML or JSON format can be included in your application's source code in a directory named .ebextensions and deployed as part of your application source bundle. You create and manage configuration files locally.
Option A is invalid because the configuration file needs to be in YAML or JSON format.
Option D is invalid because the configuration file needs to be placed in the .ebextensions folder.
For more information on environment configuration options, refer to the AWS documentation.

Your company currently uses Puppet as its configuration management software. You are the development lead and now have to deploy an application for development onto AWS.
You have to leverage your company's existing Puppet scripts for the deployment of the environment. Which of the following would be the best service for deployment of the application?
A. AWS Elastic Beanstalk
B. AWS OpsWorks
C. AWS Redshift
D. AWS DynamoDB

Explanation: Answer - B
The AWS documentation mentions the following: AWS OpsWorks is a configuration management service that helps you configure and operate applications in a cloud enterprise by using Puppet or Chef. AWS OpsWorks Stacks and AWS OpsWorks for Chef Automate let you use Chef for configuration management, while OpsWorks for Puppet Enterprise lets you configure a Puppet Enterprise master for enforcing the desired state of your infrastructure and automating on-demand tasks.
Option A is incorrect since OpsWorks should be preferred because it supports Puppet.
Options C and D are incorrect since these are data stores.
For more information on OpsWorks, refer to the AWS documentation.

A company is planning on developing an application that is going to make use of a DynamoDB table. The structure of the table is given below:

AttributeName   Type     Description
CustomerID      String   Automatically generated GUID
CustomerName    String   Name of the customer
Location        String   Place/Area
Interests       String   Wish list of products

Which of the following should be chosen as the partition key to ensure the MOST effective distribution of keys?
A. CustomerID
B. CustomerName
C. Location
D. Interests

Explanation: Answer - A
The most effective partition key is CustomerID: as a programmatically generated GUID it is guaranteed to be unique, which gives the most even key distribution.
For more information on DynamoDB, refer to the AWS documentation.

You are a developer for a company that has been hired to lead application development for a client. The application needs to interact with a backend data store. The application would need to perform many complex join operations on the data store.
Which of the following would be the ideal data store for the application?
A. AWS DynamoDB
B. AWS RDS
C. AWS Redshift
D. AWS S3

Explanation: Answer - B
Amazon Relational Database Service (Amazon RDS) is a web service that makes it easier to set up, operate, and scale a relational database in the cloud. It provides cost-efficient, resizable capacity for an industry-standard relational database and manages common database administration tasks. Since complex joins are required, it is better to choose one of the available relational database services.
Option A is incorrect since AWS DynamoDB does not support complex joins.
Option C is incorrect since Redshift is normally used for petabyte-scale data warehousing.
Option D is incorrect since S3 is object-level storage.
For more information on AWS RDS, refer to the AWS documentation.

Your company is planning on storing documents in an S3 bucket. The documents are sensitive, and employees should use Multi-Factor Authentication when trying to access them. Which of the following must be done to fulfil this requirement?
A. Ensure that encryption is enabled on the bucket using AWS server-side encryption
B. Ensure that encryption is enabled on the bucket using KMS keys
C. Ensure that a bucket policy is in place with a Deny effect and the condition "aws:MultiFactorAuthPresent": "false"
D. Ensure that a bucket policy is in place with a Deny effect and the condition "aws:MultiFactorAuthPresent": "true"

Explanation: Answer - C
The AWS documentation gives an example of a bucket policy which ensures that only MFA-authenticated users will have access to the bucket.
Options A and B are incorrect since the question is about MFA, not encryption.
Option D is incorrect since aws:MultiFactorAuthPresent should be checked against the false value in a Deny policy.
For more information on this use case, refer to the AWS documentation.

Your development team currently uses Jenkins for managing the CI/CD process. You need to move the process onto AWS. Which of the following services would be ideal for this requirement?
A. AWS CodeBuild
B. AWS CodePipeline
C. AWS Elastic Beanstalk
D. AWS OpsWorks

Explanation: Answer - B
CodePipeline is a continuous delivery service for fast and reliable application updates. Jenkins is a popular continuous integration and continuous delivery tool that can build and test your software projects continuously while offering various delivery options as well as a very extensible interface powered by Jenkins plugins. The AWS CodePipeline plugin for Jenkins helps to implement the CI/CD process.
Option A is incorrect. The question asks about migrating your CI/CD tooling to AWS, and CodeBuild is not a fully-fledged CI/CD solution; it is solely a build tool. AWS CodeBuild is a fully managed build service that compiles source code, runs tests, and produces software packages that are ready to deploy.
Options C and D are incorrect since these services are used for managing application environments.
For more information on AWS CodePipeline, refer to the AWS documentation.

An application is being designed to make use of DynamoDB. As per the requirements, the table will accept items of 6 KB in size, and the number of requests per second is estimated to be around 10. If strong consistency is required, what should the read capacity be set to for the table?
A. 5
B. 10
C. 20
D. 40

Explanation: Answer - C
The throughput capacity for the table is calculated as follows: since the item size is 6 KB, it is counted in chunks of 4 KB rounded up, which gives 2 read units per item. With around 10 requests per second, that gives 2 × 10 = 20. Since strong consistency is required, the read capacity should be set to 20 (eventually consistent reads would need only half of this). Based on these calculations, all other options are incorrect.
For more information on DynamoDB throughput, refer to the AWS documentation.

You've created a Lambda function with the default settings. You add code to this function which makes calls to DynamoDB. When you test the function, it does not complete its execution. Which of the following are probable causes? Choose 2 answers from the options given below.
A. The IAM role attached to the function does not have access to DynamoDB
B. The timeout of the function has been reached
C. You need to deploy the function first
D. You need to create a version for the function first

Explanation: Answer - A and B
These are given as requirements in the AWS documentation.
Option C is incorrect since a separate deployment is not needed when testing from the AWS console.
Option D is incorrect since creating a version is not a prerequisite for the function to run.
For more information on the AWS Lambda resource model, refer to the AWS documentation.

You've just configured a Lambda function that sits behind the API Gateway service. When you try to invoke the Lambda function via API Gateway from JavaScript in your HTML page, you receive the following error: No 'Access-Control-Allow-Origin' header is present on the requested resource. Origin 'null' is therefore not allowed access. What can be done to resolve this error?
A. Enable CORS for the Lambda function
B. Enable CORS for the methods in API Gateway
C. Change the IAM policy for the Lambda function to enable anonymous access
D. Change the IAM policy for API Gateway to enable anonymous access

Explanation: Answer - B
The AWS documentation mentions the following: when your API's resources receive requests from a domain other than the API's own domain, you must enable cross-origin resource sharing (CORS) for selected methods on the resource. This amounts to having your API respond to the OPTIONS preflight request with at least the following CORS-required response headers: Access-Control-Allow-Methods, Access-Control-Allow-Headers and Access-Control-Allow-Origin.
Option A is incorrect because CORS is set at the API Gateway level and not on the Lambda function.
Options C and D are incorrect since the IAM policy is not the reason why the error is occurring.
For more information on CORS for API Gateway, refer to the AWS documentation.

As a developer, you have created some Lambda functions and are now hosting them via the AWS API Gateway service. You need to control access to the API Gateway. Which of the following can be incorporated to control access? Choose 2 answers from the options given below.
A. Amazon Cognito user pools
B. Lambda authorizers
C. API methods
D. API stages

Explanation: Answer - A and B
The AWS documentation mentions the following: as an alternative to using IAM roles and policies or Lambda authorizers (formerly known as custom authorizers), you can use an Amazon Cognito user pool to control who can access your API in Amazon API Gateway.
Option C is invalid since methods are used to define operations such as GET and POST for your API.
Option D is invalid since stages are used to host the different stages of your API.
For more information on using API Gateway with Cognito, refer to the AWS documentation.

You are planning on hosting a static website using the features available with S3. Which of the following steps need to be carried out to ensure that you can host your static website in S3? Choose 3 answers from the options given below.
A. Enable website hosting
B. Enable versioning for the bucket
C. Configure an index document
D. Ensure that permissions are set for website access

Explanation: Answer - A, C and D
This is given in the AWS documentation under "Configuring a Bucket for Website Hosting": you can host a static website in an Amazon Simple Storage Service (Amazon S3) bucket; however, doing so requires some configuration. The required configurations are: enabling website hosting, configuring index document support, and permissions required for website access.
Option B is invalid since versioning is not a prerequisite for hosting websites in S3.
For more information on S3 website hosting, refer to the AWS documentation.

You are developing an application which will make use of Kinesis Data Firehose for streaming records onto the Simple Storage Service. Your company policy mandates that all data needs to be encrypted at rest. How can you achieve this with Kinesis Data Firehose? Choose 2 answers from the options given below.
A. Enable encryption for the Kinesis Data Firehose delivery stream
B. Install an SSL certificate in Kinesis Data Firehose
C. Ensure that all data records are transferred via SSL
D. Ensure that Kinesis streams are used to transfer the data from the producers

Explanation: Answer - A and D
This is given in the AWS documentation: if you have sensitive data, you can enable server-side data encryption when you use Amazon Kinesis Data Firehose. However, this is only possible if you use a Kinesis stream as your data source. When you configure a Kinesis stream as the data source of a Kinesis Data Firehose delivery stream, Kinesis Data Firehose no longer stores the data at rest; instead, the data is stored in the Kinesis stream.
Options B and C are invalid because SSL is used for encrypting data in transit, not at rest.
For more information on data encryption with Kinesis Data Firehose, refer to the AWS documentation.

You're the team lead for an application.
You have been instructed to make use of Jenkins for your CI/CD pipeline and other AWS services for deployment purposes. Which of the following would you consider to fulfil this requirement? Select 2 options.
A. Configure an EC2 instance with Jenkins installed
B. Configure access keys on the EC2 instance to access CodePipeline
C. Configure an IAM role for EC2 to access CodePipeline
D. Use the AWS CodeBuild service

Explanation: Answer - A and C
This is given in the AWS documentation: as a best practice, when you use a Jenkins build provider for your pipeline's build or test action, install Jenkins on an Amazon EC2 instance and configure a separate EC2 instance profile. Make sure the instance profile grants Jenkins only the AWS permissions required to perform tasks for your project, such as retrieving files from Amazon S3. The instance profile provides applications running on an Amazon EC2 instance with the credentials to access other AWS services; as a result, you do not need to configure AWS credentials (AWS access key and secret key).
Option B is incorrect since storing access keys on the instance is not the secure way to access AWS resources.
Option D is incorrect since, as per the question, you have been told to make use of Jenkins.
For more information on best practices for CodePipeline, refer to the AWS documentation.

You are developing a system that will be sending messages to an SQS queue. Another application running on an EC2 instance will be used to process the messages. Which of the following are BEST practices when it comes to making COST-effective use of SQS queues? Choose 2 answers from the options given below.
A. Use short polling for SQS queues
B. Use long polling for SQS queues
C. Group the SQS API operations in batches
D. Use single queue operations

Explanation: Answer - B and C
This is given in the AWS documentation under "Reducing Amazon SQS Costs": to reduce costs, batch your message actions. To send, receive, and delete messages, and to change the message visibility timeout for multiple messages with a single action, use the Amazon SQS batch API actions. To combine client-side buffering with request batching, use long polling together with the buffered asynchronous client included with the AWS SDK for Java. Note that the Amazon SQS Buffered Asynchronous Client doesn't currently support FIFO queues. Because of what is mentioned in the AWS documentation as best practices, the other options are invalid.
For more information on reducing costs for SQS, refer to the AWS documentation.

Your application currently makes use of SQS standard queues. The requirements for the application have now changed, and there is now a need for exactly-once processing of messages. How can you achieve this?
A. Use the AWS Console to convert the standard queue to a FIFO queue
B. Use the AWS CLI to convert the standard queue to a FIFO queue
C. Add the .fifo suffix to the existing queue
D. Create a new FIFO queue and point the application to the new queue

Explanation: Answer - D
This is clearly mentioned in the AWS documentation under "Moving from a Standard Queue to a FIFO Queue": if you have an existing application that uses standard queues and you want to take advantage of the ordering or exactly-once processing features of FIFO queues, you need to configure the queue and your application correctly. You can't convert an existing standard queue into a FIFO queue. To make the move, you must either create a new FIFO queue for your application or delete your existing standard queue and recreate it as a FIFO queue.
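The .ebextensions mechanism from the Elastic Beanstalk question above can be sketched concretely. Configuration files may be YAML or JSON; the snippet below generates a JSON packages config into the .ebextensions directory of a source bundle using only the standard library. The package names (git, jq) are illustrative assumptions, not anything mandated by the question.

```python
import json
import os

# Illustrative yum packages to install on the Linux servers;
# any package names your application needs would go here.
config = {"packages": {"yum": {"git": [], "jq": []}}}

# Elastic Beanstalk reads *.config files (YAML or JSON) from the
# .ebextensions directory at the root of the application source bundle.
os.makedirs(".ebextensions", exist_ok=True)
with open(".ebextensions/packages.config", "w") as f:
    json.dump(config, f, indent=2)
```

Zipping the application directory (including .ebextensions) and deploying it as the source bundle would then apply this configuration when the environment's instances launch.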
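The MFA bucket-policy question above can also be made concrete. Below is a sketch of the Deny policy it describes, built as a Python dict; the bucket name is a placeholder, and the commented-out boto3 call shows where the policy would be applied. This uses the BoolIfExists condition operator, a common pattern for the aws:MultiFactorAuthPresent key, rather than a policy copied from the question itself.

```python
import json

def mfa_deny_policy(bucket: str) -> dict:
    """Bucket policy that denies all S3 actions when the request
    is not MFA-authenticated (aws:MultiFactorAuthPresent is false)."""
    return {
        "Version": "2012-10-17",
        "Statement": [{
            "Sid": "DenyNonMFAAccess",
            "Effect": "Deny",
            "Principal": "*",
            "Action": "s3:*",
            "Resource": [
                f"arn:aws:s3:::{bucket}",
                f"arn:aws:s3:::{bucket}/*",
            ],
            # The condition from the question: deny when MFA is absent.
            "Condition": {
                "BoolIfExists": {"aws:MultiFactorAuthPresent": "false"}
            },
        }],
    }

policy_json = json.dumps(mfa_deny_policy("example-docs-bucket"), indent=2)
# To apply (requires credentials; the bucket name is hypothetical):
# boto3.client("s3").put_bucket_policy(
#     Bucket="example-docs-bucket", Policy=policy_json)
```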
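The read-capacity arithmetic from the DynamoDB throughput question can be written as a small helper. This is a sketch of the standard provisioning formula: round the item size up to 4 KB units, multiply by requests per second, and halve the result for eventually consistent reads.

```python
import math

def read_capacity_units(item_size_kb: float, reads_per_sec: int,
                        strongly_consistent: bool = True) -> int:
    """Provisioned RCUs: one strongly consistent read of up to 4 KB
    per second costs 1 unit; eventually consistent reads cost half."""
    units_per_read = math.ceil(item_size_kb / 4)  # 6 KB -> 2 units
    rcu = units_per_read * reads_per_sec          # 2 * 10 = 20
    if not strongly_consistent:
        rcu = math.ceil(rcu / 2)                  # eventual consistency halves it
    return rcu

print(read_capacity_units(6, 10))         # strongly consistent -> 20
print(read_capacity_units(6, 10, False))  # eventually consistent -> 10
```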
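The SQS cost question above recommends batching API operations. SendMessageBatch accepts at most 10 entries per call, so a sender typically chunks its messages first; a minimal sketch of that chunking is below (the boto3 call is commented out, since it needs real credentials and a queue URL).

```python
def to_batch_entries(messages, batch_size=10):
    """Chunk message bodies into SendMessageBatch entry lists
    (SQS allows at most 10 entries per batch call); entry Ids
    only need to be unique within a single request."""
    batches = []
    for start in range(0, len(messages), batch_size):
        chunk = messages[start:start + batch_size]
        batches.append([{"Id": str(start + i), "MessageBody": body}
                        for i, body in enumerate(chunk)])
    return batches

batches = to_batch_entries([f"msg-{n}" for n in range(25)])
print(len(batches))  # 3 batches: 10 + 10 + 5 messages
# For each batch (hypothetical queue URL, requires credentials):
# sqs = boto3.client("sqs")
# for batch in batches:
#     sqs.send_message_batch(QueueUrl=queue_url, Entries=batch)
```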

Institution: AWS Certified Developer Associate
Degree: AWS Certified Developer Associate











