Professional-Cloud-Architect Exam
Google Cloud Architect Professional
Questions & Answers (Demo Version)
Buy Full Product Here: https://examslead.com/Professional-Cloud-Architect-practice-exam-dumps/
Version: 14.0
Case Study 1: TerramEarth
Company Overview
TerramEarth manufactures heavy equipment for the mining and agricultural industries. About 80% of
their business is from mining and 20% from agriculture. They currently have over 500 dealers and
service centers in 100 countries. Their mission is to build products that make their customers more
productive.
Solution Concept
There are 20 million TerramEarth vehicles in operation that collect 120 fields of data per second. Data
is stored locally on the vehicle and can be accessed for analysis when a vehicle is serviced. The data is
downloaded via a maintenance port. This same port can be used to adjust operational parameters,
allowing the vehicles to be upgraded in the field with new computing modules.
Approximately 200,000 vehicles are connected to a cellular network, allowing TerramEarth to collect
data directly. At a rate of 120 fields of data per second with 22 hours of operation per day,
TerramEarth collects a total of about 9 TB/day from these connected vehicles.
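(For scale: 200,000 connected vehicles × 120 fields/second × 22 hours × 3,600 seconds ≈ 1.9 × 10^12 field values per day, so the stated 9 TB/day implies an average of roughly 5 bytes per field value.)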
Existing Technical Environment
TerramEarth’s existing architecture is composed of Linux and Windows-based systems that reside in a
single U.S. west coast based data center. These systems gzip CSV files from the field, upload them via
FTP, and load the data into their data warehouse. Because this process takes time, aggregated reports
are based on data that is 3 weeks old.
With this data, TerramEarth has been able to preemptively stock replacement parts and reduce
unplanned downtime of their vehicles by 60%. However, because the data is stale, some customers
are without their vehicles for up to 4 weeks while they wait for replacement parts.
Business Requirements
Decrease unplanned vehicle downtime to less than 1 week.
Support the dealer network with more data on how their customers use their equipment to better
position new products and services.
Have the ability to partner with different companies – especially with seed and fertilizer suppliers in
the fast-growing agricultural business – to create compelling joint offerings for their customers.
Technical Requirements
Expand beyond a single datacenter to decrease latency to the American Midwest and east coast.
Create a backup strategy.
Increase security of data transfer from equipment to the datacenter.
Improve data in the data warehouse.
Use customer and equipment data to anticipate customer needs.
Application 1: Data ingest
A custom Python application reads uploaded data files from a single server and writes to the data
warehouse.
Compute:
- Windows Server 2008 R2
- 16 CPUs
- 128 GB of RAM
- 10 TB local HDD storage
Application 2: Reporting
An off-the-shelf application that business analysts use to run a daily report to see what equipment
needs repair. Only 2 analysts of a team of 10 (5 west coast, 5 east coast) can connect to the reporting
application at a time.
Compute:
Off-the-shelf application. License tied to the number of physical CPUs
- Windows Server 2008 R2
- 16 CPUs
- 32 GB of RAM
- 500 GB HDD
Data warehouse:
A single PostgreSQL server
- Red Hat Linux
- 64 CPUs
- 128 GB of RAM
- 4x 6TB HDD in RAID 0
Executive Statement
Our competitive advantage has always been in the manufacturing process, with our ability to build
better vehicles for lower cost than our competitors. However, new products with different
approaches are constantly being developed, and I’m concerned that we lack the skills to undergo the
next wave of transformations in our industry. My goals are to build our skills while addressing
immediate market needs through incremental innovations.
Question: 1
For this question, refer to the TerramEarth case study. To be compliant with the European GDPR
regulation, TerramEarth is required to delete data generated from its European customers after a
period of 36 months when it contains personal data. In the new architecture, this data will be stored
in both Cloud Storage and BigQuery. What should you do?
A. Create a BigQuery table for the European data, and set the table retention period to 36 months.
For Cloud Storage, use gsutil to enable lifecycle management using a DELETE action with an Age
condition of 36 months.
B. Create a BigQuery table for the European data, and set the table retention period to 36 months.
For Cloud Storage, use gsutil to create a SetStorageClass to NONE action with an Age condition of
36 months.
C. Create a BigQuery time-partitioned table for the European data, and set the partition expiration
period to 36 months. For Cloud Storage, use gsutil to enable lifecycle management using a DELETE
action with an Age condition of 36 months.
D. Create a BigQuery time-partitioned table for the European data, and set the partition expiration
period to 36 months. For Cloud Storage, use gsutil to create a SetStorageClass to NONE action with
an Age condition of 36 months.
Answer: C
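For reference, the partition-expiration-plus-lifecycle approach in option C can be sketched with the Google Cloud Python clients. This is a minimal sketch, not a reference implementation: the project, dataset, table, schema, and bucket names are hypothetical, and 36 months is approximated as 1,095 days.

from google.cloud import bigquery, storage

# Hypothetical identifiers for illustration only.
TABLE_ID = "my-project.eu_dataset.telemetry"
BUCKET = "tf-eu-telemetry"

# BigQuery: time-partitioned table whose partitions expire after ~36 months.
bq = bigquery.Client()
table = bigquery.Table(TABLE_ID, schema=[
    bigquery.SchemaField("vehicle_id", "STRING"),
    bigquery.SchemaField("event_time", "TIMESTAMP"),
])
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY,
    field="event_time",
    expiration_ms=1095 * 24 * 60 * 60 * 1000,  # ~36 months per partition
)
bq.create_table(table)

# Cloud Storage: lifecycle DELETE action with an Age condition of ~36 months.
gcs = storage.Client()
bucket = gcs.get_bucket(BUCKET)
bucket.add_lifecycle_delete_rule(age=1095)  # age is in days
bucket.patch()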
Question: 2
For this question, refer to the TerramEarth case study. TerramEarth has decided to store data files in
Cloud Storage. You need to configure a Cloud Storage lifecycle rule to store 1 year of data and
minimize file storage cost.
Which two actions should you take?
A. Create a Cloud Storage lifecycle rule with Age: “30”, Storage Class: “Standard”, and Action: “Set
to Coldline”, and create a second GCS life-cycle rule with Age: “365”, Storage Class: “Coldline”, and
Action: “Delete”.
B. Create a Cloud Storage lifecycle rule with Age: “30”, Storage Class: “Coldline”, and Action: “Set to
Nearline”, and create a second GCS life-cycle rule with Age: “91”, Storage Class: “Coldline”, and
Action: “Set to Nearline”.
C. Create a Cloud Storage lifecycle rule with Age: “90”, Storage Class: “Standard”, and Action: “Set
to Nearline”, and create a second GCS life-cycle rule with Age: “91”, Storage Class: “Nearline”, and
Action: “Set to Coldline”.
D. Create a Cloud Storage lifecycle rule with Age: “30”, Storage Class: “Standard”, and Action: “Set
to Coldline”, and create a second GCS life-cycle rule with Age: “365”, Storage Class: “Nearline”, and
Action: “Delete”.
Answer: A
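The two rules in option A map directly onto Cloud Storage lifecycle configuration. A minimal sketch with the Python client, assuming a hypothetical bucket name:

from google.cloud import storage

client = storage.Client()
bucket = client.get_bucket("tf-datafiles")  # hypothetical bucket

# Rule 1: after 30 days, move Standard objects to Coldline.
bucket.add_lifecycle_set_storage_class_rule(
    "COLDLINE", age=30, matches_storage_class=["STANDARD"]
)
# Rule 2: after 365 days, delete the (by then Coldline) objects.
bucket.add_lifecycle_delete_rule(age=365, matches_storage_class=["COLDLINE"])
bucket.patch()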
Question: 3
For this question, refer to the TerramEarth case study. You need to implement a reliable, scalable
GCP solution for the data warehouse for your company, TerramEarth. Considering the TerramEarth
business and technical requirements, what should you do?
A. Replace the existing data warehouse with BigQuery. Use table partitioning.
B. Replace the existing data warehouse with a Compute Engine instance with 96 CPUs.
C. Replace the existing data warehouse with BigQuery. Use federated data sources.
D. Replace the existing data warehouse with a Compute Engine instance with 96 CPUs. Add an
additional Compute Engine pre-emptible instance with 32 CPUs.
Answer: A
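Loading the field-uploaded gzip CSV files into a partitioned BigQuery table (rather than querying them in place as federated sources) could look like the sketch below; the bucket URI, destination table, and partitioning column are hypothetical, and the destination table is assumed to exist:

from google.cloud import bigquery

client = bigquery.Client()
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,  # infer the schema from the CSV headers
    time_partitioning=bigquery.TimePartitioning(field="event_time"),
)
load_job = client.load_table_from_uri(
    "gs://tf-datafiles/uploads/*.csv.gz",  # gzip CSV uploads from the field
    "my-project.warehouse.telemetry",      # hypothetical destination table
    job_config=job_config,
)
load_job.result()  # block until the load job completes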
Case Study 2: Mountkirk Games
Company Overview
Mountkirk Games makes online, session-based, multiplayer games for mobile platforms. They build
all of their games using some server-side integration. Historically, they have used cloud providers to
lease physical servers.
Due to the unexpected popularity of some of their games, they have had problems scaling their
global audience, application servers, MySQL databases, and analytics tools.
Their current model is to write game statistics to files and send them through an ETL tool that loads
them into a centralized MySQL database for reporting.
Solution Concept
Mountkirk Games is building a new game, which they expect to be very popular. They plan to deploy
the game’s backend on Google Compute Engine so they can capture streaming metrics, run intensive
analytics, and take advantage of its autoscaling server environment and integrate with a managed
NoSQL database.
Business Requirements
Increase to a global footprint.
Improve uptime – downtime is loss of players.
Increase efficiency of the cloud resources we use.
Reduce latency to all customers.
Technical Requirements
Requirements for Game Backend Platform
Dynamically scale up or down based on game activity.
Connect to a transactional database service to manage user profiles and game state.
Store game activity in a timeseries database service for future analysis.
As the system scales, ensure that data is not lost due to processing backlogs.
Run hardened Linux distro.
Requirements for Game Analytics Platform
Dynamically scale up or down based on game activity
Process incoming data on the fly directly from the game servers
Process data that arrives late because of slow mobile networks
Allow queries to access at least 10 TB of historical data
Process files that are regularly uploaded by users’ mobile devices
Executive Statement
Our last successful game did not scale well with our previous cloud provider, resulting in lower user
adoption and affecting the game’s reputation. Our investors want more key performance indicators
(KPIs) to evaluate the speed and stability of the game, as well as other metrics that provide deeper
insight into usage patterns so we can adapt the game to target users. Additionally, our current
technology stack cannot provide the scale we need, so we want to replace MySQL and move to an
environment that provides autoscaling, low latency load balancing, and frees us up from managing
physical servers.
Question: 4
For this question, refer to the Mountkirk Games case study. Mountkirk Games wants to migrate from
their current analytics and statistics reporting model to one that meets their technical requirements on
Google Cloud Platform.
Which two steps should be part of their migration plan? (Choose two.)
A. Evaluate the impact of migrating their current batch ETL code to Cloud Dataflow.
B. Write a schema migration plan to denormalize data for better performance in BigQuery.
C. Draw an architecture diagram that shows how to move from a single MySQL database to a MySQL
cluster.
D. Load 10 TB of analytics data from a previous game into a Cloud SQL instance, and run test queries
against the full dataset to confirm that they complete successfully.
E. Integrate Cloud Armor to defend against possible SQL injection attacks in analytics files uploaded
to Cloud Storage.
Answer: AB
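To evaluate option A, the current batch ETL could be re-expressed as an Apache Beam pipeline and run on Cloud Dataflow. A minimal sketch; the input path, record format, and destination table are assumed for illustration (the table is assumed to exist):

import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def parse_stat(line: str) -> dict:
    # Assumption: one JSON game-statistics record per line.
    return json.loads(line)

options = PipelineOptions(
    runner="DataflowRunner", project="my-project",
    region="us-central1", temp_location="gs://my-temp-bucket/tmp",
)
with beam.Pipeline(options=options) as p:
    (p
     | "Read" >> beam.io.ReadFromText("gs://game-stats/*.json")
     | "Parse" >> beam.Map(parse_stat)
     | "Write" >> beam.io.WriteToBigQuery(
         "my-project:analytics.game_stats",
         write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))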
Question: 5
For this question, refer to the Mountkirk Games case study. You need to analyze and define the
technical architecture for the compute workloads for your company, Mountkirk Games. Considering
the Mountkirk Games business and technical requirements, what should you do?
A. Create network load balancers. Use preemptible Compute Engine instances.
B. Create network load balancers. Use non-preemptible Compute Engine instances.
C. Create a global load balancer with managed instance groups and autoscaling policies. Use
preemptible Compute Engine instances.
D. Create a global load balancer with managed instance groups and autoscaling policies. Use non-
preemptible Compute Engine instances.
Answer: D
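Option D's managed instance group and autoscaling policy can be created with the google-cloud-compute client; a sketch assuming a hypothetical project, zone, and pre-existing instance template (the global HTTP(S) load balancer setup is omitted for brevity):

from google.cloud import compute_v1

# Hypothetical identifiers.
PROJECT, ZONE = "my-project", "us-central1-a"
TEMPLATE = f"projects/{PROJECT}/global/instanceTemplates/game-backend-template"

# Managed instance group built from the (non-preemptible) instance template.
mig = compute_v1.InstanceGroupManager(
    name="game-backend-mig",
    base_instance_name="game-backend",
    instance_template=TEMPLATE,
    target_size=3,
)
compute_v1.InstanceGroupManagersClient().insert(
    project=PROJECT, zone=ZONE, instance_group_manager_resource=mig
)

# Autoscaling policy: scale between 3 and 50 replicas on CPU utilization.
autoscaler = compute_v1.Autoscaler(
    name="game-backend-autoscaler",
    target=f"projects/{PROJECT}/zones/{ZONE}/instanceGroupManagers/game-backend-mig",
    autoscaling_policy=compute_v1.AutoscalingPolicy(
        min_num_replicas=3,
        max_num_replicas=50,
        cpu_utilization=compute_v1.AutoscalingPolicyCpuUtilization(
            utilization_target=0.6
        ),
    ),
)
compute_v1.AutoscalersClient().insert(
    project=PROJECT, zone=ZONE, autoscaler_resource=autoscaler
)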
Question: 6
For this question, refer to the Mountkirk Games case study. Mountkirk Games wants to design their
solution for the future in order to take advantage of cloud and technology improvements as they
become available. Which two steps should they take? (Choose two.)
A. Store as much analytics and game activity data as financially feasible today so it can be used to train
machine learning models to predict user behavior in the future.
B. Begin packaging their game backend artifacts in container images and running them on Kubernetes Engine
to improve the availability to scale up or down based on game activity.
C. Set up a CI/CD pipeline using Jenkins and Spinnaker to automate canary deployments and improve
development velocity.
D. Adopt a schema versioning tool to reduce downtime when adding new game features that require storing
additional player data in the database.
E. Implement a weekly rolling maintenance process for the Linux virtual machines so they can apply critical
kernel patches and package updates and reduce the risk of 0-day vulnerabilities.
Answer: AB
Question: 7
For this question, refer to the Mountkirk Games case study. Mountkirk Games wants you to design a
way to test the analytics platform’s resilience to changes in mobile network latency. What should you
do?
A. Deploy failure injection software to the game analytics platform that can inject additional latency to
mobile client analytics traffic.
B. Build a test client that can be run from a mobile phone emulator on a Compute Engine virtual
machine, and run multiple copies in Google Cloud Platform regions all over the world to generate
realistic traffic.
C. Add the ability to introduce a random amount of delay before beginning to process analytics files
uploaded from mobile devices.
D. Create an opt-in beta of the game that runs on players' mobile devices and collects response times
from analytics endpoints running in Google Cloud Platform regions all over the world.
Answer: A
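The failure-injection approach in option A amounts to wrapping the path that handles mobile analytics traffic with a configurable delay. A generic, self-contained sketch (the handler being wrapped is hypothetical):

import random
import time

def inject_latency(handler, min_delay_s=0.1, max_delay_s=5.0):
    """Wrap a traffic handler so every call is delayed by a random,
    configurable amount, simulating slow mobile networks."""
    def delayed(*args, **kwargs):
        time.sleep(random.uniform(min_delay_s, max_delay_s))
        return handler(*args, **kwargs)
    return delayed

# Hypothetical usage: delay each forwarded analytics event by up to 10 s.
# forward_event = inject_latency(forward_event, max_delay_s=10.0)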
Question: 8
For this question, refer to the Mountkirk Games case study. You need to analyze and define the
technical architecture for the database workloads for your company, Mountkirk Games. Considering
the business and technical requirements, what should you do?
A. Use Cloud SQL for time series data, and use Cloud Bigtable for historical data queries.
B. Use Cloud SQL to replace MySQL, and use Cloud Spanner for historical data queries.
C. Use Cloud Bigtable to replace MySQL, and use BigQuery for historical data queries.
D. Use Cloud Bigtable for time series data, use Cloud Spanner for transactional data, and use
BigQuery for historical data queries.
Answer: D
Question: 9
For this question, refer to the Mountkirk Games case study. Which managed storage option meets
Mountkirk’s technical requirement for storing game activity in a time series database service?
A. Cloud Bigtable
B. Cloud Spanner
C. BigQuery
D. Cloud Datastore
Answer: A
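A sketch of writing a game-activity point to Bigtable with a time-series-friendly row key (entity ID first, then a timestamp, so one player's points stay contiguous while writes spread across players); the instance, table, and column-family names are hypothetical:

import datetime

from google.cloud import bigtable

client = bigtable.Client(project="my-project")
table = client.instance("game-metrics").table("game_activity")

now = datetime.datetime.now(datetime.timezone.utc)
row_key = f"player42#{now:%Y%m%d%H%M%S}".encode()  # player ID + timestamp

row = table.direct_row(row_key)
row.set_cell("stats", "score", b"1200", timestamp=now)
row.set_cell("stats", "level", b"7", timestamp=now)
row.commit()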