Comprehensive Questions with Verified Answers (Graded A+)
1. Set up asynchronous data replication to another RDS DB instance
hosted in another AWS Region Answer: Create a read replica
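A minimal boto3 sketch of creating a cross-Region read replica; the instance identifiers, account ID, and Regions are placeholders. Note the client runs in the *destination* Region:

```python
import boto3

rds = boto3.client("rds", region_name="us-west-2")

rds.create_db_instance_read_replica(
    DBInstanceIdentifier="mydb-replica",
    # ARN of the source instance in the primary Region (placeholder)
    SourceDBInstanceIdentifier="arn:aws:rds:us-east-1:123456789012:db:mydb",
    SourceRegion="us-east-1",  # lets boto3 presign the cross-Region call
)
```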
2. A parallel file system for "hot" (frequently accessed) data Answer:
Amazon FSx for Lustre
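A hedged sketch of provisioning an FSx for Lustre file system with boto3; the subnet ID is a placeholder and SCRATCH_2 is one possible deployment type for short-lived, high-throughput workloads:

```python
import boto3

fsx = boto3.client("fsx")

fsx.create_file_system(
    FileSystemType="LUSTRE",
    StorageCapacity=1200,  # GiB; 1200 is the Lustre minimum
    SubnetIds=["subnet-0123456789abcdef0"],  # placeholder
    LustreConfiguration={"DeploymentType": "SCRATCH_2"},
)
```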
3. Implement synchronous data replication across AZs w/ automatic
failover in Amazon RDS Answer: Enable Multi-AZ deployment in Amazon RDS
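A minimal sketch, assuming an existing instance (placeholder name): enabling Multi-AZ provisions a synchronous standby in another AZ with automatic failover.

```python
import boto3

rds = boto3.client("rds")

rds.modify_db_instance(
    DBInstanceIdentifier="mydb",  # placeholder
    MultiAZ=True,
    ApplyImmediately=True,
)
```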
4. A storage service to host "cold" (infrequently accessed) data
Answer: Amazon S3 Glacier
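A hedged sketch of writing directly to the Glacier storage class at upload time; bucket, key, and file name are placeholders:

```python
import boto3

s3 = boto3.client("s3")

# GLACIER stores the object in S3 Glacier Flexible Retrieval on upload.
s3.put_object(
    Bucket="my-archive-bucket",
    Key="logs/2023/archive.tar.gz",
    Body=open("archive.tar.gz", "rb"),
    StorageClass="GLACIER",
)
```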
5. Set up a relational DB and a DR plan w/ an RPO of 1 second and RTO
of less than 1 minute Answer: Use Amazon Aurora Global Database
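A minimal sketch, assuming an existing Aurora cluster (placeholder ARN): promote it into a global database, then attach a secondary cluster in another Region.

```python
import boto3

rds_primary = boto3.client("rds", region_name="us-east-1")
rds_primary.create_global_cluster(
    GlobalClusterIdentifier="my-global-db",
    SourceDBClusterIdentifier="arn:aws:rds:us-east-1:123456789012:cluster:mydb",
)

rds_secondary = boto3.client("rds", region_name="eu-west-1")
rds_secondary.create_db_cluster(
    DBClusterIdentifier="my-global-db-eu",
    Engine="aurora-mysql",
    GlobalClusterIdentifier="my-global-db",
)
```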
6. Monitor database metrics and send email notifications if a specific
threshold has been breached Answer: Create an SNS topic and add the topic to the
CloudWatch alarm as an alarm action
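A boto3 sketch of the full chain; the metric, threshold, instance identifier, and email are placeholder examples (RDS CPU above 80% for 5 minutes):

```python
import boto3

sns = boto3.client("sns")
cloudwatch = boto3.client("cloudwatch")

topic_arn = sns.create_topic(Name="db-alerts")["TopicArn"]
sns.subscribe(TopicArn=topic_arn, Protocol="email", Endpoint="ops@example.com")

cloudwatch.put_metric_alarm(
    AlarmName="rds-high-cpu",
    Namespace="AWS/RDS",
    MetricName="CPUUtilization",
    Dimensions=[{"Name": "DBInstanceIdentifier", "Value": "mydb"}],
    Statistic="Average",
    Period=300,
    EvaluationPeriods=1,
    Threshold=80.0,
    ComparisonOperator="GreaterThanThreshold",
    AlarmActions=[topic_arn],  # alarm action fires the email via SNS
)
```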
7. Set up a DNS failover to a static website Answer: Use Route 53 with the
failover option to a static S3 website bucket or CloudFront distribution
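A hedged sketch of the SECONDARY half of a failover record set; the hosted zone ID, domain, and S3 website endpoint are placeholders, and a matching PRIMARY record with a health check would point at the main endpoint:

```python
import boto3

route53 = boto3.client("route53")

route53.change_resource_record_sets(
    HostedZoneId="Z0123456789EXAMPLE",  # placeholder
    ChangeBatch={
        "Changes": [{
            "Action": "UPSERT",
            "ResourceRecordSet": {
                "Name": "www.example.com",
                "Type": "CNAME",
                "SetIdentifier": "secondary",
                "Failover": "SECONDARY",
                "TTL": 60,
                "ResourceRecords": [
                    {"Value": "my-failover-site.s3-website-us-east-1.amazonaws.com"}
                ],
            },
        }]
    },
)
```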
8. Implement an automated backup for all the EBS volumes Answer: Use
Amazon Data Lifecycle Manager to automate the creation of EBS snapshots
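A minimal DLM policy sketch; the role ARN and tag are placeholders, and the schedule (daily snapshots, keep 7) is illustrative:

```python
import boto3

dlm = boto3.client("dlm")

dlm.create_lifecycle_policy(
    ExecutionRoleArn="arn:aws:iam::123456789012:role/AWSDataLifecycleManagerDefaultRole",
    Description="Daily EBS snapshots",
    State="ENABLED",
    PolicyDetails={
        "ResourceTypes": ["VOLUME"],
        "TargetTags": [{"Key": "Backup", "Value": "true"}],  # placeholder tag
        "Schedules": [{
            "Name": "Daily",
            "CreateRule": {"Interval": 24, "IntervalUnit": "HOURS", "Times": ["03:00"]},
            "RetainRule": {"Count": 7},
        }],
    },
)
```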
9. Monitor the available swap space of your EC2 instances Answer: Install
the CloudWatch agent and monitor the SwapUtilization metric
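Once the agent is installed, the metric can be queried with boto3. A hedged sketch, assuming the agent's default config, where swap appears as swap_used_percent under the CWAgent namespace (SwapUtilization is the name used by the older monitoring scripts); the instance ID is a placeholder:

```python
import boto3
from datetime import datetime, timedelta

cloudwatch = boto3.client("cloudwatch")

stats = cloudwatch.get_metric_statistics(
    Namespace="CWAgent",
    MetricName="swap_used_percent",
    Dimensions=[{"Name": "InstanceId", "Value": "i-0123456789abcdef0"}],
    StartTime=datetime.utcnow() - timedelta(hours=1),
    EndTime=datetime.utcnow(),
    Period=300,
    Statistics=["Average"],
)
print(stats["Datapoints"])
```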
10. Implement a 90-day backup retention policy on Amazon Aurora
Answer: Use AWS Backup
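A sketch of a backup plan whose lifecycle deletes recovery points after 90 days, then assigns the Aurora cluster to it; role and cluster ARNs are placeholders:

```python
import boto3

backup = boto3.client("backup")

plan = backup.create_backup_plan(
    BackupPlan={
        "BackupPlanName": "aurora-90-day",
        "Rules": [{
            "RuleName": "daily",
            "TargetBackupVaultName": "Default",
            "ScheduleExpression": "cron(0 5 * * ? *)",
            "Lifecycle": {"DeleteAfterDays": 90},  # 90-day retention
        }],
    }
)

backup.create_backup_selection(
    BackupPlanId=plan["BackupPlanId"],
    BackupSelection={
        "SelectionName": "aurora-cluster",
        "IamRoleArn": "arn:aws:iam::123456789012:role/service-role/AWSBackupDefaultServiceRole",
        "Resources": ["arn:aws:rds:us-east-1:123456789012:cluster:mydb"],
    },
)
```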
11. Implement fanout messaging Answer: Create an SNS topic w/ a message
filtering policy and configure multiple SQS queues to subscribe to the topic
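A fanout sketch with placeholder queue ARNs; each subscription's filter policy matches on a hypothetical event_type message attribute, so each queue receives only the events it cares about:

```python
import boto3, json

sns = boto3.client("sns")

topic_arn = sns.create_topic(Name="orders")["TopicArn"]

for queue_arn, event in [
    ("arn:aws:sqs:us-east-1:123456789012:billing", "order_placed"),
    ("arn:aws:sqs:us-east-1:123456789012:shipping", "order_shipped"),
]:
    sns.subscribe(
        TopicArn=topic_arn,
        Protocol="sqs",
        Endpoint=queue_arn,
        Attributes={"FilterPolicy": json.dumps({"event_type": [event]})},
    )
```

Each queue's access policy must also allow the topic to send it messages.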
12. A DB that has a read replication latency of less than 1 second
Answer: Use Amazon Aurora w/ cross-region replicas
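A hedged sketch using the Aurora cross-Region replica API (identifiers and Regions are placeholders); the replica cluster is created in the target Region and points back at the source cluster ARN:

```python
import boto3

rds = boto3.client("rds", region_name="us-west-2")

rds.create_db_cluster(
    DBClusterIdentifier="mydb-replica",
    Engine="aurora-mysql",
    ReplicationSourceIdentifier="arn:aws:rds:us-east-1:123456789012:cluster:mydb",
)
```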
13. A specific type of ELB that uses UDP as the protocol for
communication between clients and thousands of game servers
around the world Answer: Use a Network LB for TCP/UDP protocols
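A sketch of an NLB with a UDP listener and target group; subnet, VPC, and port are placeholders. Note UDP target groups must health-check over TCP (or HTTP/HTTPS):

```python
import boto3

elbv2 = boto3.client("elbv2")

lb = elbv2.create_load_balancer(
    Name="game-nlb",
    Type="network",
    Subnets=["subnet-0123456789abcdef0"],  # placeholder
)["LoadBalancers"][0]

tg = elbv2.create_target_group(
    Name="game-servers",
    Protocol="UDP",
    Port=7777,
    VpcId="vpc-0123456789abcdef0",  # placeholder
    HealthCheckProtocol="TCP",
    HealthCheckPort="7777",
)["TargetGroups"][0]

elbv2.create_listener(
    LoadBalancerArn=lb["LoadBalancerArn"],
    Protocol="UDP",
    Port=7777,
    DefaultActions=[{"Type": "forward", "TargetGroupArn": tg["TargetGroupArn"]}],
)
```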
14. Monitor the memory and disk space utilization of an EC2
instance Answer: Install Amazon CloudWatch agent on the instance
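With the agent's default config, memory and disk usage are published as mem_used_percent and disk_used_percent in the CWAgent namespace. A hedged alarm sketch on the memory metric; instance ID and threshold are placeholders:

```python
import boto3

cloudwatch = boto3.client("cloudwatch")

cloudwatch.put_metric_alarm(
    AlarmName="high-memory",
    Namespace="CWAgent",
    MetricName="mem_used_percent",
    Dimensions=[{"Name": "InstanceId", "Value": "i-0123456789abcdef0"}],
    Statistic="Average",
    Period=300,
    EvaluationPeriods=2,
    Threshold=90.0,
    ComparisonOperator="GreaterThanThreshold",
)
```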
15. Retrieve a subset of data from a large CSV file stored in the S3
bucket Answer: Perform an S3 Select operation based on the bucket's name and object's
key
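An S3 Select sketch with a placeholder bucket, key, and SQL expression; the query runs server-side, so only matching rows come back over the wire:

```python
import boto3

s3 = boto3.client("s3")

resp = s3.select_object_content(
    Bucket="my-data-bucket",
    Key="data/large.csv",
    ExpressionType="SQL",
    Expression="SELECT s.name, s.total FROM s3object s "
               "WHERE CAST(s.total AS FLOAT) > 100",
    InputSerialization={"CSV": {"FileHeaderInfo": "USE"}},
    OutputSerialization={"CSV": {}},
)

# The response is an event stream; Records events carry the result bytes.
for event in resp["Payload"]:
    if "Records" in event:
        print(event["Records"]["Payload"].decode())
```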
16. Upload a 1 TB file to an S3 bucket Answer: Use the Amazon S3 multipart upload API
to upload large objects in parts
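A minimal sketch: boto3's managed transfer switches to the multipart API automatically above the threshold. Bucket, key, file name, and the 100 MiB part size are illustrative:

```python
import boto3
from boto3.s3.transfer import TransferConfig

s3 = boto3.client("s3")

config = TransferConfig(
    multipart_threshold=100 * 1024 * 1024,  # switch to multipart past 100 MiB
    multipart_chunksize=100 * 1024 * 1024,  # 100 MiB parts
)
s3.upload_file("backup.tar", "my-data-bucket", "backups/backup.tar", Config=config)
```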