Question #191 - Topic 1
A company collects temperature, humidity, and atmospheric pressure data
in cities across multiple continents. The average volume of data collected
per site each day is 500 GB. Each site has a high-speed internet
connection. The company's weather forecasting applications are based in a
single Region and analyze the data daily. What is the FASTEST way to
aggregate data from all of these global sites?
A. Enable Amazon S3 Transfer Acceleration on the destination bucket. Use
multipart uploads to directly upload site data to the destination bucket.
B. Upload site data to an Amazon S3 bucket in the closest AWS Region.
Use S3 cross-Region replication to copy objects to the destination bucket.
C. Schedule AWS Snowball jobs daily to transfer data to the closest AWS
Region. Use S3 cross-Region replication to copy objects to the destination
bucket.
D. Upload the data to an Amazon EC2 instance in the closest Region. …
ANSWER: A
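The winning option pairs S3 Transfer Acceleration with multipart uploads, which split a large object into parts uploaded in parallel. A minimal sketch of planning the parts for one site's 500 GB daily volume, using S3's documented multipart limits (parts of 5 MiB to 5 GiB, at most 10,000 parts per upload); the 100 MiB default part size here is an illustrative assumption, not a recommendation:

```python
import math

# S3 multipart upload service limits
MIN_PART = 5 * 1024**2    # 5 MiB minimum part size (except the last part)
MAX_PART = 5 * 1024**3    # 5 GiB maximum part size
MAX_PARTS = 10_000        # at most 10,000 parts per upload

def plan_parts(total_bytes: int, part_size: int = 100 * 1024**2) -> int:
    """Return the number of parts needed, validating S3's limits.

    part_size defaults to 100 MiB, an arbitrary choice for illustration.
    """
    if not MIN_PART <= part_size <= MAX_PART:
        raise ValueError("part size outside S3's 5 MiB - 5 GiB range")
    parts = math.ceil(total_bytes / part_size)
    if parts > MAX_PARTS:
        raise ValueError("too many parts; increase part_size")
    return parts

# One site's daily volume: 500 GB
print(plan_parts(500 * 10**9))
```

In practice boto3's `upload_file` handles the part splitting automatically, and Transfer Acceleration is switched on per bucket and used by sending requests through the accelerate endpoint, so site uploads travel over the AWS edge network instead of the public internet for most of the path.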
Question #192 - Topic 1
A company has a custom application running on an Amazon EC2 instance
that:
• Reads a large amount of data from Amazon S3
• Performs a multi-stage analysis
• Writes the results to Amazon DynamoDB
The application writes a significant number of large, temporary files during
the multi-stage analysis. The process performance depends on the
temporary storage performance. What would be the fastest storage option
for holding the temporary files?
A. Multiple Amazon S3 buckets with Transfer Acceleration for storage.
B. Multiple Amazon EBS drives with Provisioned IOPS and EBS
optimization.
C. Multiple Amazon EFS volumes using the Network File System version
4.1 (NFSv4.1) protocol.
D. Multiple instance store volumes with software RAID 0.
ANSWER: D
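Instance store volumes are physically attached to the host, and striping them with RAID 0 multiplies throughput, at the cost of durability (the data is lost when the instance stops, which is acceptable for temporary files). A minimal sketch of directing an application's temporary files at such a scratch array; the `/mnt/scratch` mount point is an assumption about where the RAID 0 array would be mounted, and the code falls back to the system temp directory when it is absent:

```python
import os
import tempfile

# Hypothetical mount point for the RAID 0 instance-store array.
SCRATCH_DIR = "/mnt/scratch"

def scratch_tempfile(suffix: str = ".tmp"):
    """Create a named temp file on the fast scratch volume if present,
    otherwise fall back to the system default temp directory."""
    base = SCRATCH_DIR if os.path.isdir(SCRATCH_DIR) else None
    return tempfile.NamedTemporaryFile(dir=base, suffix=suffix, delete=False)

# One analysis stage writes intermediate results; a later stage reads them back.
with scratch_tempfile(".stage1") as f:
    f.write(b"intermediate analysis results")
    path = f.name

with open(path, "rb") as f:
    data = f.read()
os.remove(path)
print(len(data))
```

Keeping the fallback means the same code runs on a developer machine and on the EC2 instance; only the mount point changes which disks absorb the I/O.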
Question #193 - Topic 1
A leasing company generates and emails PDF statements every month for
all its customers. Each statement is about 400 KB in size. Customers can
download their statements from the website for up to 30 days from when
the statements were generated. At the end of their 3-year lease, the
customers are emailed a ZIP file that contains all the statements. What is
the MOST cost-effective storage solution for this situation?
A. Store the statements using the Amazon S3 Standard storage class.
Create a lifecycle policy to move the statements to Amazon S3 Glacier
storage after 1 day.
B. Store the statements using the Amazon S3 Glacier storage class. Create
a lifecycle policy to move the statements to Amazon S3 Glacier Deep
Archive storage after 30 days.
C. Store the statements using the Amazon S3 Standard storage class.
Create a lifecycle policy to move the statements to Amazon S3 One Zone-
Infrequent Access (S3 One Zone-IA) …
ANSWER: D
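Whichever storage classes are chosen, the 30-day transition is expressed as an S3 lifecycle rule on the bucket. A sketch of such a rule as the configuration dict that boto3's `put_bucket_lifecycle_configuration` accepts; the rule ID, key prefix, and target storage class here are illustrative assumptions, not the exam's exact answer text:

```python
# Sketch of an S3 lifecycle configuration: after the 30-day download window,
# move statements to a colder storage class. ID and Prefix are placeholders.
lifecycle_config = {
    "Rules": [
        {
            "ID": "archive-statements",           # placeholder rule name
            "Filter": {"Prefix": "statements/"},  # placeholder key prefix
            "Status": "Enabled",
            "Transitions": [
                {"Days": 30, "StorageClass": "GLACIER"}
            ],
        }
    ]
}

# With boto3 this dict would be passed as LifecycleConfiguration to
# s3.put_bucket_lifecycle_configuration(Bucket=..., LifecycleConfiguration=...)
transition = lifecycle_config["Rules"][0]["Transitions"][0]
print(transition["Days"], transition["StorageClass"])
```

The lifecycle rule is what makes these designs cost-effective: statements sit in a class with fast retrieval only while customers can still download them, then age into archival storage automatically.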
COPYRIGHT©NINJANERD 2025/2026. YEAR PUBLISHED 2025. COMPANY REGISTRATION NUMBER: 619652435. TERMS OF USE. PRIVACY
STATEMENT. ALL RIGHTS RESERVED
Question #194 - Topic 1
A company recently released a new type of internet-connected sensor. The
company is expecting to sell thousands of sensors, which are designed to
stream high volumes of data each second to a central location. A solutions
architect must design a solution that ingests and stores data so that
engineering teams can analyze it in near-real time with millisecond
responsiveness. Which solution should the solutions architect recommend?
A. Use an Amazon SQS queue to ingest the data. Consume the data with
an AWS Lambda function, which then stores the data in Amazon Redshift.
B. Use an Amazon SQS queue to ingest the data. Consume the data with
an AWS Lambda function, which then stores the data in Amazon
DynamoDB.
C. Use Amazon Kinesis Data Streams to ingest the data. Consume the
data with an AWS Lambda function, which then stores the data in Amazon
Redshift.
D. Use Amazon Kinesis Data Streams to ingest the data. Consume the data
with an AWS Lambda function, which then stores the data in Amazon
DynamoDB.
ANSWER: D
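Kinesis Data Streams ingests each sensor reading as a record whose partition key determines which shard it lands on, so a well-chosen key spreads thousands of sensors across shards. A minimal sketch of shaping readings into the record format that a call like boto3's `kinesis.put_records` expects; the sensor IDs, field names, and stream name are invented for illustration:

```python
import json

def to_kinesis_records(readings):
    """Shape raw sensor readings into PutRecords-style entries.

    Using the sensor ID as the partition key spreads load across shards
    while keeping each sensor's readings ordered within its shard.
    """
    return [
        {
            "Data": json.dumps(r).encode("utf-8"),
            "PartitionKey": r["sensor_id"],
        }
        for r in readings
    ]

# Invented sample readings for illustration.
readings = [
    {"sensor_id": "sensor-001", "temp_c": 21.4},
    {"sensor_id": "sensor-002", "temp_c": 19.8},
]
records = to_kinesis_records(readings)
print(len(records), records[0]["PartitionKey"])
# With boto3: kinesis.put_records(StreamName="sensor-stream", Records=records)
```

Downstream, a Lambda consumer writes each record into DynamoDB, whose single-digit-millisecond reads are what satisfy the "millisecond responsiveness" requirement that rules out Redshift here.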
Question #195 - Topic 1