Google Cloud Professional Cloud Architect Exam Practice Test

Page: 1 / 14
Total 276 questions
Question 1

The application reliability team at your company has added a debug feature to their backend service to send all server events to Google Cloud Storage for eventual analysis. The event records are at least 50 KB and at most 15 MB and are expected to peak at 3,000 events per second. You want to minimize data loss.

Which process should you implement?



Question 2

For this question, refer to the Mountkirk Games case study. You need to analyze and define the technical architecture for the database workloads for your company, Mountkirk Games. Considering the business and technical requirements, what should you do?



Question 3

You have deployed an application to Kubernetes Engine, and are using the Cloud SQL proxy container to make the Cloud SQL database available to the services running on Kubernetes. You are notified that the application is reporting database connection issues. Your company policies require a post-mortem. What should you do?



Answer : C


Question 4

You are creating a solution to remove backup files older than 90 days from your backup Cloud Storage bucket. You want to optimize ongoing Cloud Storage spend. What should you do?
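A common approach to this scenario is an Object Lifecycle Management rule on the bucket. A minimal configuration that deletes objects older than 90 days might look like this (a sketch, assuming deletion rather than a storage-class transition is the goal):

```json
{
  "rule": [
    {
      "action": {"type": "Delete"},
      "condition": {"age": 90}
    }
  ]
}
```

Saved as `lifecycle.json`, such a rule can be applied to a bucket with `gsutil lifecycle set lifecycle.json gs://BUCKET_NAME`, after which Cloud Storage removes matching objects automatically with no ongoing compute cost.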



Question 5

You have a Python web application with many dependencies that requires 0.1 CPU cores and 128 MB of memory to operate in production. You want to monitor and maximize machine utilization. You also want to reliably deploy new versions of the application. Which set of steps should you take?



Question 6

For this question, refer to the JencoMart case study.

JencoMart has decided to migrate user profile storage to Google Cloud Datastore and the application servers to Google Compute Engine (GCE). During the migration, the existing infrastructure will need access to Datastore to upload the data. What service account key-management strategy should you recommend?



Answer : A

https://cloud.google.com/iam/docs/understanding-service-accounts

Migrating data to Google Cloud Platform

Let's say that you have some data processing that happens on another cloud provider and you want to transfer the processed data to Google Cloud Platform. You can use a service account from the virtual machines on the external cloud to push the data to Google Cloud Platform. To do this, you must create and download a service account key when you create the service account and then use that key from the external process to call the Cloud Platform APIs.


https://cloud.google.com/iam/docs/understanding-service-accounts#migrating_data_to_google_cloud_platform
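The key created and downloaded in the process above is a JSON file with a fixed set of fields. As a minimal sketch, the external process could sanity-check the downloaded key before using it to call Cloud Platform APIs; the helper name is hypothetical, but the field names match the service account key JSON format Google issues:

```python
import json

# A subset of the fields present in every service account key file
# (key files also contain private_key_id, client_id, token_uri, etc.).
REQUIRED_FIELDS = {"type", "project_id", "private_key", "client_email"}


def looks_like_sa_key(raw: str) -> bool:
    """Sanity-check a downloaded service account key before the external
    process uses it to authenticate against Cloud Platform APIs."""
    try:
        info = json.loads(raw)
    except ValueError:
        return False
    if not isinstance(info, dict):
        return False
    # A service account key declares its type and carries the private key
    # plus the email identity of the service account it belongs to.
    return info.get("type") == "service_account" and REQUIRED_FIELDS <= info.keys()
```

A check like this catches the most common mistake in practice: pointing the external process at a user credential or an empty file instead of the service account key that was actually downloaded.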

Question 7

You write a Python script to connect to Google BigQuery from a Google Compute Engine virtual machine. The script is printing errors that it cannot connect to BigQuery. What should you do to fix the script?



Answer : B

The error is most likely caused by an access-scope issue. When you create a new instance, it runs as the Compute Engine default service account, but most API access, including BigQuery, is not enabled by default, so the service account lacks the required scope even though it may have the right IAM permissions. To fix an existing instance, stop it, edit its access scopes to enable BigQuery access, and restart it. Alternatively, running the script on a new virtual machine created with the BigQuery access scope enabled also works.

https://cloud.google.com/compute/docs/access/service-accounts
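On a running instance, the live scopes can be listed from the metadata server (`curl -H "Metadata-Flavor: Google" "http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/scopes"`). The check described above can be sketched as a small helper; the function name is hypothetical, but the scope URLs are the ones Google Cloud uses:

```python
# Access scopes that authorize BigQuery calls from a GCE instance.
BIGQUERY_SCOPE = "https://www.googleapis.com/auth/bigquery"
# The broad cloud-platform scope covers BigQuery (and most other APIs) too.
CLOUD_PLATFORM_SCOPE = "https://www.googleapis.com/auth/cloud-platform"


def can_reach_bigquery(instance_scopes):
    """Return True if the instance's access scopes allow BigQuery calls."""
    return any(s in (BIGQUERY_SCOPE, CLOUD_PLATFORM_SCOPE) for s in instance_scopes)
```

If this returns False for the scopes reported by the metadata server, the script will fail to connect regardless of the service account's IAM roles, which is exactly the symptom described in the question.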

