
Google Associate-Cloud-Engineer Dumps

Google Cloud Certified - Associate Cloud Engineer Questions and Answers

Question 1

You received a JSON file that contained a private key of a Service Account in order to get access to several resources in a Google Cloud project. You downloaded and installed the Cloud SDK and want to use this private key for authentication and authorization when performing gcloud commands. What should you do?

Options:

A.

Use the command gcloud auth login and point it to the private key

B.

Use the command gcloud auth activate-service-account and point it to the private key

C.

Place the private key file in the installation directory of the Cloud SDK and rename it to "credentials.json"

D.

Place the private key file in your home directory and rename it to "GOOGLE_APPLICATION_CREDENTIALS".
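
For reference, activating a downloaded service account key with the gcloud CLI typically looks like the sketch below; the account email, key path, and project ID are placeholders, not values from the question.

# Authenticate gcloud with the service account's private key file
gcloud auth activate-service-account sa-name@my-project.iam.gserviceaccount.com \
    --key-file=/path/to/key.json
# Point subsequent gcloud commands at the right project
gcloud config set project my-project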

Question 2

You have production and test workloads that you want to deploy on Compute Engine. Production VMs need to be in a different subnet than the test VMs. All the VMs must be able to reach each other over internal IP without creating additional routes. You need to set up VPC and the 2 subnets. Which configuration meets these requirements?

Options:

A.

Create a single custom VPC with 2 subnets. Create each subnet in a different region and with a different CIDR range.

B.

Create a single custom VPC with 2 subnets. Create each subnet in the same region and with the same CIDR range.

C.

Create 2 custom VPCs, each with a single subnet. Create each subnet in a different region and with a different CIDR range.

D.

Create 2 custom VPCs, each with a single subnet. Create each subnet in the same region and with the same CIDR range.
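
As a point of comparison, creating a custom-mode VPC with two non-overlapping subnets usually follows this pattern; the network name, regions, and CIDR ranges below are illustrative placeholders.

# Custom-mode VPC: no subnets are created automatically
gcloud compute networks create my-vpc --subnet-mode=custom
# Two subnets with distinct CIDR ranges; because they share one VPC,
# the default routes already allow internal communication between them
gcloud compute networks subnets create prod-subnet \
    --network=my-vpc --region=us-central1 --range=10.0.1.0/24
gcloud compute networks subnets create test-subnet \
    --network=my-vpc --region=us-east1 --range=10.0.2.0/24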

Question 3

You need to configure optimal data storage for files stored in Cloud Storage for minimal cost. The files are used in a mission-critical analytics pipeline that is used continually. The users are in Boston, MA (United States). What should you do?

Options:

A.

Configure regional storage for the region closest to the users. Configure a Nearline storage class.

B.

Configure regional storage for the region closest to the users. Configure a Standard storage class.

C.

Configure dual-regional storage for the dual region closest to the users. Configure a Nearline storage class.

D.

Configure dual-regional storage for the dual region closest to the users. Configure a Standard storage class.

Question 4

You are building a multi-player gaming application that will store game information in a database. As the popularity of the application increases, you are concerned about delivering consistent performance. You need to ensure an optimal gaming performance for global users, without increasing the management complexity. What should you do?

Options:

A.

Use Cloud SQL database with cross-region replication to store game statistics in the EU, US, and APAC regions.

B.

Use Cloud Spanner to store user data mapped to the game statistics.

C.

Use BigQuery to store game statistics with a Redis on Memorystore instance in the front to provide global consistency.

D.

Store game statistics in a Bigtable database partitioned by username.

Question 5

You need to host an application on a Compute Engine instance in a project shared with other teams. You want to prevent the other teams from accidentally causing downtime on that application. Which feature should you use?

Options:

A.

Use a Shielded VM.

B.

Use a Preemptible VM.

C.

Use a sole-tenant node.

D.

Enable deletion protection on the instance.
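
For context, deletion protection can be toggled on an existing instance with a single gcloud command; the instance name and zone below are placeholders.

# Prevent the instance from being deleted until protection is removed
gcloud compute instances update my-app-instance \
    --zone=us-central1-a --deletion-protection
# Later, protection can be lifted with --no-deletion-protection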

Question 6

Your auditor wants to view your organization's use of data in Google Cloud. The auditor is most interested in auditing who accessed data in Cloud Storage buckets. You need to help the auditor access the data they need. What should you do?

Options:

A.

Assign the appropriate permissions, and then use Cloud Monitoring to review metrics

B.

Use the export logs API to provide the Admin Activity Audit Logs in the format they want

C.

Turn on Data Access Logs for the buckets they want to audit, and then build a query in the log viewer that filters on Cloud Storage

D.

Assign the appropriate permissions, and then create a Data Studio report on Admin Activity Audit Logs
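
As an illustration, once Data Access logs are enabled for Cloud Storage, a log query along the following lines narrows the results to bucket data access. The filter reflects typical audit-log naming and uses a placeholder project ID; treat it as an assumption rather than an exact recipe.

# Read Cloud Storage data-access audit log entries for the project
gcloud logging read \
    'logName="projects/my-project/logs/cloudaudit.googleapis.com%2Fdata_access" AND resource.type="gcs_bucket"' \
    --limit=20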

Question 7

You are using Data Studio to visualize a table from your data warehouse that is built on top of BigQuery. Data is appended to the data warehouse during the day. At night, the daily summary is recalculated by overwriting the table. You just noticed that the charts in Data Studio are broken, and you want to analyze the problem. What should you do?

Options:

A.

Use the BigQuery interface to review the nightly job and look for any errors.

B.

Review the Error Reporting page in the Cloud Console to find any errors.

C.

In Cloud Logging, create a filter for your Data Studio report.

D.

Use the open source CLI tool, Snapshot Debugger, to find out why the data was not refreshed correctly.

Question 8

You are the organization and billing administrator for your company. The engineering team has the Project Creator role on the organization. You do not want the engineering team to be able to link projects to the billing account. Only the finance team should be able to link a project to a billing account, but they should not be able to make any other changes to projects. What should you do?

Options:

A.

Assign the finance team only the Billing Account User role on the billing account.

B.

Assign the engineering team only the Billing Account User role on the billing account.

C.

Assign the finance team the Billing Account User role on the billing account and the Project Billing Manager role on the organization.

D.

Assign the engineering team the Billing Account User role on the billing account and the Project Billing Manager role on the organization.

Question 9

The core business of your company is to rent out construction equipment at a large scale. All the equipment that is being rented out has been equipped with multiple sensors that send event information every few seconds. These signals can vary from engine status, distance traveled, fuel level, and more. Customers are billed based on the consumption monitored by these sensors. You expect high throughput – up to thousands of events per hour per device – and need to retrieve consistent data based on the time of the event. Storing and retrieving individual signals should be atomic. What should you do?

Options:

A.

Create a file in Cloud Storage per device and append new data to that file.

B.

Create a file in Cloud Filestore per device and append new data to that file.

C.

Ingest the data into Datastore. Store data in an entity group based on the device.

D.

Ingest the data into Cloud Bigtable. Create a row key based on the event timestamp.

Question 10

You are performing a monthly security check of your Google Cloud environment and want to know who has access to view data stored in your Google Cloud

Project. What should you do?

Options:

A.

Enable Audit Logs for all APIs that are related to data storage.

B.

Review the IAM permissions for any role that allows for data access.

C.

Review the Identity-Aware Proxy settings for each resource.

D.

Create a Data Loss Prevention job.

Question 11

You are using Google Kubernetes Engine with autoscaling enabled to host a new application. You want to expose this new application to the public, using HTTPS on a public IP address. What should you do?

Options:

A.

Create a Kubernetes Service of type NodePort for your application, and a Kubernetes Ingress to expose this Service via a Cloud Load Balancer.

B.

Create a Kubernetes Service of type ClusterIP for your application. Configure the public DNS name of your application using the IP of this Service.

C.

Create a Kubernetes Service of type NodePort to expose the application on port 443 of each node of the Kubernetes cluster. Configure the public DNS name of your application with the IP of every node of the cluster to achieve load-balancing.

D.

Create a HAProxy pod in the cluster to load-balance the traffic to all the pods of the application. Forward the public traffic to HAProxy with an iptable rule. Configure the DNS name of your application using the public IP of the node HAProxy is running on.

Question 12

You created a Kubernetes deployment by running kubectl run nginx --image=nginx --replicas=1. After a few days, you decided you no longer want this deployment. You identified the pod and deleted it by running kubectl delete pod. You noticed the pod got recreated.

$ kubectl get pods
NAME                     READY   STATUS    RESTARTS   AGE
nginx-84748895c4-nqqmt   1/1     Running   0          9m41s
$ kubectl delete pod nginx-84748895c4-nqqmt
pod nginx-84748895c4-nqqmt deleted
$ kubectl get pods
NAME                     READY   STATUS    RESTARTS   AGE
nginx-84748895c4-k6bzl   1/1     Running   0          25s

What should you do to delete the deployment and avoid the pod getting recreated?

Options:

A.

kubectl delete deployment nginx

B.

kubectl delete --deployment=nginx

C.

kubectl delete pod nginx-84748895c4-k6bzl --no-restart 2

D.

kubectl delete nginx

Question 13

You installed the Google Cloud CLI on your workstation and set the proxy configuration. However, you are worried that your proxy credentials will be recorded in the gcloud CLI logs. You want to prevent your proxy credentials from being logged. What should you do?

Options:

A.

Configure the username and password by using the gcloud config set proxy/username and gcloud config set proxy/password commands.

B.

Encode the username and password in SHA-256 encoding, and save them to a text file. Use the filename as the value in the gcloud config set core/custom_ca_certs_file command.

C.

Provide values for CLOUDSDK_USERNAME and CLOUDSDK_PASSWORD in the gcloud CLI tool configuration file.

D.

Set the CLOUDSDK_PROXY_USERNAME and CLOUDSDK_PROXY_PASSWORD properties by using environment variables in your command line tool.
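
For reference, the Cloud SDK reads proxy credentials from environment variables, which keeps them out of the persisted gcloud properties; a minimal sketch with placeholder values follows.

# Supply proxy credentials via environment variables instead of
# writing them to the gcloud configuration
export CLOUDSDK_PROXY_USERNAME=proxy-user
export CLOUDSDK_PROXY_PASSWORD='proxy-secret'
# Non-secret proxy settings can stay in the gcloud config, for example:
gcloud config set proxy/type http
gcloud config set proxy/address 10.0.0.10
gcloud config set proxy/port 3128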

Question 14

You want to select and configure a solution for storing and archiving data on Google Cloud Platform. You need to support compliance objectives for data from one geographic location. This data is archived after 30 days and needs to be accessed annually. What should you do?

Options:

A.

Select Multi-Regional Storage. Add a bucket lifecycle rule that archives data after 30 days to Coldline Storage.

B.

Select Multi-Regional Storage. Add a bucket lifecycle rule that archives data after 30 days to Nearline Storage.

C.

Select Regional Storage. Add a bucket lifecycle rule that archives data after 30 days to Nearline Storage.

D.

Select Regional Storage. Add a bucket lifecycle rule that archives data after 30 days to Coldline Storage.

Question 15

Your organization has user identities in Active Directory. Your organization wants to use Active Directory as their source of truth for identities. Your organization wants to have full control over the Google accounts used by employees for all Google services, including your Google Cloud Platform (GCP) organization. What should you do?

Options:

A.

Use Google Cloud Directory Sync (GCDS) to synchronize users into Cloud Identity.

B.

Use the cloud Identity APIs and write a script to synchronize users to Cloud Identity.

C.

Export users from Active Directory as a CSV and import them to Cloud Identity via the Admin Console.

D.

Ask each employee to create a Google account using self signup. Require that each employee use their company email address and password.

Question 16

You will have several applications running on different Compute Engine instances in the same project. You want to specify at a more granular level the service account each instance uses when calling Google Cloud APIs. What should you do?

Options:

A.

When creating the instances, specify a Service Account for each instance

B.

When creating the instances, assign the name of each Service Account as instance metadata

C.

After starting the instances, use gcloud compute instances update to specify a Service Account for each instance

D.

After starting the instances, use gcloud compute instances update to assign the name of the relevant Service Account as instance metadata

Question 17

You need to select and configure compute resources for a set of batch processing jobs. These jobs take around 2 hours to complete and are run nightly. You want to minimize service costs. What should you do?

Options:

A.

Select Google Kubernetes Engine. Use a single-node cluster with a small instance type.

B.

Select Google Kubernetes Engine. Use a three-node cluster with micro instance types.

C.

Select Compute Engine. Use preemptible VM instances of the appropriate standard machine type.

D.

Select Compute Engine. Use VM instance types that support micro bursting.

Question 18

A colleague handed over a Google Cloud Platform project for you to maintain. As part of a security checkup, you want to review who has been granted the Project Owner role. What should you do?

Options:

A.

In the console, validate which SSH keys have been stored as project-wide keys.

B.

Navigate to Identity-Aware Proxy and check the permissions for these resources.

C.

Enable Audit Logs on the IAM & admin page for all resources, and validate the results.

D.

Use the command gcloud projects get-iam-policy to view the current role assignments.
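
For reference, listing the members bound to the Project Owner role from the command line commonly uses a filter like the one below; the project ID is a placeholder and the flags are standard gcloud formatting options.

# Show only the members bound to roles/owner on the project
gcloud projects get-iam-policy my-project \
    --flatten="bindings[].members" \
    --filter="bindings.role:roles/owner" \
    --format="value(bindings.members)"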

Question 19

Your company's security vulnerability management policy wants a member of the security team to have visibility into vulnerabilities and other OS metadata for a specific Compute Engine instance. This Compute Engine instance hosts a critical application in your Google Cloud project. You need to implement your company's security vulnerability management policy. What should you do?

Options:

A.

• Ensure that the Ops Agent is installed on the Compute Engine instance.

• Create a custom metric in the Cloud Monitoring dashboard.

• Provide the security team member with access to this dashboard.

B.

• Ensure that the Ops Agent is installed on the Compute Engine instance.

• Provide the security team member the roles/osconfig.inventoryViewer permission.

C.

• Ensure that the OS Config agent is installed on the Compute Engine instance.

• Provide the security team member the roles/osconfig.vulnerabilityViewer permission.

D.

• Ensure that the OS Config agent is installed on the Compute Engine instance.

• Create a log sink to a BigQuery dataset.

• Provide the security team member with access to this dataset.

Question 20

Your company uses a large number of Google Cloud services centralized in a single project. All teams have specific projects for testing and development. The DevOps team needs access to all of the production services in order to perform their job. You want to prevent Google Cloud product changes from broadening their permissions in the future. You want to follow Google-recommended practices. What should you do?

Options:

A.

Grant all members of the DevOps team the role of Project Editor on the organization level.

B.

Grant all members of the DevOps team the role of Project Editor on the production project.

C.

Create a custom role that combines the required permissions. Grant the DevOps team the custom role on the production project.

D.

Create a custom role that combines the required permissions. Grant the DevOps team the custom role on the organization level.

Question 21

You have created a new project in Google Cloud through the gcloud command line interface (CLI) and linked a billing account. You need to create a new Compute

Engine instance using the CLI. You need to perform the prerequisite steps. What should you do?

Options:

A.

Create a Cloud Monitoring Workspace.

B.

Create a VPC network in the project.

C.

Enable the compute.googleapis.com API.

D.

Grant yourself the IAM role of Compute Admin.

Question 22

Your customer wants you to create a secure website with autoscaling based on the compute instance CPU load. You want to enhance performance by storing static content in Cloud Storage. Which resources are needed to distribute the user traffic?

Options:

A.

An internal HTTP(S) load balancer together with Identity-Aware Proxy to allow only HTTPS traffic.

B.

An external HTTP(S) load balancer to distribute the load and a URL map to target the requests for the static content to the Cloud Storage backend. Install the HTTPS certificates on the instance.

C.

An external HTTP(S) load balancer with a managed SSL certificate to distribute the load and a URL map to target the requests for the static content to the Cloud Storage backend.

D.

An external network load balancer pointing to the backend instances to distribute the load evenly. The web servers will forward the request to the Cloud Storage as needed.

Question 23

You recently discovered that your developers are using many service account keys during their development process. While you work on a long term improvement, you need to quickly implement a process to enforce short-lived service account credentials in your company. You have the following requirements:

• All service accounts that require a key should be created in a centralized project called pj-sa.

• Service account keys should only be valid for one day.

You need a Google-recommended solution that minimizes cost. What should you do?

Options:

A.

Implement a Cloud Run job to rotate all service account keys periodically in pj-sa. Enforce an org policy to deny service account key creation with an exception to pj-sa.

B.

Implement a Kubernetes CronJob to rotate all service account keys periodically. Disable attachment of service accounts to resources in all projects with an exception to pj-sa.

C.

Enforce an org policy constraint allowing the lifetime of service account keys to be 24 hours. Enforce an org policy constraint denying service account key creation with an exception on pj-sa.

D.

Enforce a DENY org policy constraint over the lifetime of service account keys for 24 hours. Disable attachment of service accounts to resources in all projects with an exception to pj-sa.

Question 24

During a recent audit of your existing Google Cloud resources, you discovered several users with email addresses outside of your Google Workspace domain.

You want to ensure that your resources are only shared with users whose email addresses match your domain. You need to remove any mismatched users, and you want to avoid having to audit your resources to identify mismatched users. What should you do?

Options:

A.

Create a Cloud Scheduler task to regularly scan your projects and delete mismatched users.

B.

Create a Cloud Scheduler task to regularly scan your resources and delete mismatched users.

C.

Set an organizational policy constraint to limit identities by domain to automatically remove mismatched users.

D.

Set an organizational policy constraint to limit identities by domain, and then retroactively remove the existing mismatched users.

Question 25

A company wants to build an application that stores images in a Cloud Storage bucket and wants to generate thumbnails as well as resize the images. They want to use a Google-managed service that can scale up and scale down to zero automatically with minimal effort. You have been asked to recommend a service. Which GCP service would you suggest?

Options:

A.

Google Compute Engine

B.

Google App Engine

C.

Cloud Functions

D.

Google Kubernetes Engine

Question 26

You are working with a user to set up an application in a new VPC behind a firewall. The user is concerned about data egress. You want to configure the fewest open egress ports. What should you do?

Options:

A.

Set up a low-priority (65534) rule that blocks all egress and a high-priority rule (1000) that allows only the appropriate ports.

B.

Set up a high-priority (1000) rule that pairs both ingress and egress ports.

C.

Set up a high-priority (1000) rule that blocks all egress and a low-priority (65534) rule that allows only the appropriate ports.

D.

Set up a high-priority (1000) rule to allow the appropriate ports.
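
As an illustration of the priority mechanics, a deny-all egress rule at the lowest priority combined with a higher-priority allow rule might look like this; the network name, ports, and destination range are placeholder assumptions.

# Low-priority (65534) rule: block all egress by default
gcloud compute firewall-rules create deny-all-egress \
    --network=my-vpc --direction=EGRESS --action=DENY \
    --rules=all --destination-ranges=0.0.0.0/0 --priority=65534
# High-priority (1000) rule: allow only the ports the application needs
gcloud compute firewall-rules create allow-app-egress \
    --network=my-vpc --direction=EGRESS --action=ALLOW \
    --rules=tcp:443 --destination-ranges=0.0.0.0/0 --priority=1000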

Question 27

You have an application that uses Cloud Spanner as a database backend to keep current state information about users. Cloud Bigtable logs all events triggered by users. You export Cloud Spanner data to Cloud Storage during daily backups. One of your analysts asks you to join data from Cloud Spanner and Cloud Bigtable for specific users. You want to complete this ad hoc request as efficiently as possible. What should you do?

Options:

A.

Create a Dataflow job that copies data from Cloud Bigtable and Cloud Storage for specific users.

B.

Create a Dataflow job that copies data from Cloud Bigtable and Cloud Spanner for specific users.

C.

Create a Cloud Dataproc cluster that runs a Spark job to extract data from Cloud Bigtable and Cloud Storage for specific users.

D.

Create two separate BigQuery external tables on Cloud Storage and Cloud Bigtable. Use the BigQuery console to join these tables through user fields, and apply appropriate filters.

Question 28

Your company developed a mobile game that is deployed on Google Cloud. Gamers are connecting to the game with their personal phones over the Internet. The game sends UDP packets to update the servers about the gamers' actions while they are playing in multiplayer mode. Your game backend can scale over multiple virtual machines (VMs), and you want to expose the VMs over a single IP address. What should you do?

Options:

A.

Configure an SSL Proxy load balancer in front of the application servers.

B.

Configure an Internal UDP load balancer in front of the application servers.

C.

Configure an External HTTP(s) load balancer in front of the application servers.

D.

Configure an External Network load balancer in front of the application servers.

Question 29

You are building a pipeline to process time-series data. Which Google Cloud Platform services should you put in boxes 1,2,3, and 4?


Options:

A.

Cloud Pub/Sub, Cloud Dataflow, Cloud Datastore, BigQuery

B.

Firebase Messages, Cloud Pub/Sub, Cloud Spanner, BigQuery

C.

Cloud Pub/Sub, Cloud Storage, BigQuery, Cloud Bigtable

D.

Cloud Pub/Sub, Cloud Dataflow, Cloud Bigtable, BigQuery

Question 30

You are the team lead of a group of 10 developers. You provided each developer with an individual Google Cloud Project that they can use as their personal sandbox to experiment with different Google Cloud solutions. You want to be notified if any of the developers are spending above $500 per month on their sandbox environment. What should you do?

Options:

A.

Create a single budget for all projects and configure budget alerts on this budget.

B.

Create a separate billing account per sandbox project and enable BigQuery billing exports. Create a Data Studio dashboard to plot the spending per billing account.

C.

Create a budget per project and configure budget alerts on all of these budgets.

D.

Create a single billing account for all sandbox projects and enable BigQuery billing exports. Create a Data Studio dashboard to plot the spending per project.

Question 31

You need to create a custom IAM role for use with a GCP service. All permissions in the role must be suitable for production use. You also want to clearly share with your organization the status of the custom role. This will be the first version of the custom role. What should you do?

Options:

A.

Use permissions in your role that use the ‘supported’ support level for role permissions. Set the role stage to ALPHA while testing the role permissions.

B.

Use permissions in your role that use the ‘supported’ support level for role permissions. Set the role stage to BETA while testing the role permissions.

C.

Use permissions in your role that use the ‘testing’ support level for role permissions. Set the role stage to ALPHA while testing the role permissions.

D.

Use permissions in your role that use the ‘testing’ support level for role permissions. Set the role stage to BETA while testing the role permissions.

Question 32

You are designing an application that lets users upload and share photos. You expect your application to grow really fast and you are targeting a worldwide audience. You want to delete uploaded photos after 30 days. You want to minimize costs while ensuring your application is highly available. Which GCP storage solution should you choose?

Options:

A.

Persistent SSD on VM instances.

B.

Cloud Filestore.

C.

Multiregional Cloud Storage bucket.

D.

Cloud Datastore database.

Question 33

Your company is using Google Workspace to manage employee accounts. Anticipated growth will increase the number of personnel from 100 employees to 1,000 employees within 2 years. Most employees will need access to your company's Google Cloud account. The systems and processes will need to support 10x growth without performance degradation, unnecessary complexity, or security issues. What should you do?

Options:

A.

Migrate the users to Active Directory. Connect the Human Resources system to Active Directory. Turn on Google Cloud Directory Sync (GCDS) for Cloud Identity. Turn on Identity Federation from Cloud Identity to Active Directory.

B.

Organize the users in Cloud Identity into groups. Enforce multi-factor authentication in Cloud Identity.

C.

Turn on identity federation between Cloud Identity and Google Workspace. Enforce multi-factor authentication for domain wide delegation.

D.

Use a third-party identity provider service through federation. Synchronize the users from Google Workspace to the third-party provider in real time.

Question 34

You have a virtual machine that is currently configured with 2 vCPUs and 4 GB of memory. It is running out of memory. You want to upgrade the virtual machine to have 8 GB of memory. What should you do?

Options:

A.

Rely on live migration to move the workload to a machine with more memory.

B.

Use gcloud to add metadata to the VM. Set the key to required-memory-size and the value to 8 GB.

C.

Stop the VM, change the machine type to n1-standard-8, and start the VM.

D.

Stop the VM, increase the memory to 8 GB, and start the VM.
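
For context, resizing a stopped VM's memory is done by changing its machine type, for example to a custom shape. This is a sketch with placeholder names; the custom CPU and memory values are assumptions, not values prescribed by the question.

# The VM must be stopped before its machine type can be changed
gcloud compute instances stop my-vm --zone=us-central1-a
# Switch to a custom machine type with 2 vCPUs and 8 GB of memory
gcloud compute instances set-machine-type my-vm \
    --zone=us-central1-a --custom-cpu=2 --custom-memory=8GB
gcloud compute instances start my-vm --zone=us-central1-a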

Question 35

You need to enable traffic between multiple groups of Compute Engine instances that are currently running in two different GCP projects. Each group of Compute Engine instances is running in its own VPC. What should you do?

Options:

A.

Verify that both projects are in a GCP Organization. Create a new VPC and add all instances.

B.

Verify that both projects are in a GCP Organization. Share the VPC from one project and request that the Compute Engine instances in the other project use this shared VPC.

C.

Verify that you are the Project Administrator of both projects. Create two new VPCs and add all instances.

D.

Verify that you are the Project Administrator of both projects. Create a new VPC and add all instances.

Question 36

You are migrating a production-critical on-premises application that requires 96 vCPUs to perform its task. You want to make sure the application runs in a similar environment on GCP. What should you do?

Options:

A.

When creating the VM, use machine type n1-standard-96.

B.

When creating the VM, use Intel Skylake as the CPU platform.

C.

Create the VM using Compute Engine default settings. Use gcloud to modify the running instance to have 96 vCPUs.

D.

Start the VM using Compute Engine default settings, and adjust as you go based on Rightsizing Recommendations.

Question 37

Your company implemented BigQuery as an enterprise data warehouse. Users from multiple business units run queries on this data warehouse. However, you notice that query costs for BigQuery are very high, and you need to control costs. Which two methods should you use? (Choose two.)

Options:

A.

Split the users from business units to multiple projects.

B.

Apply a user- or project-level custom query quota for BigQuery data warehouse.

C.

Create separate copies of your BigQuery data warehouse for each business unit.

D.

Split your BigQuery data warehouse into multiple data warehouses for each business unit.

E.

Change your BigQuery query model from on-demand to flat rate. Apply the appropriate number of slots to each Project.

Question 38

You are in charge of provisioning access for all Google Cloud users in your organization. Your company recently acquired a startup company that has their own Google Cloud organization. You need to ensure that your Site Reliability Engineers (SREs) have the same project permissions in the startup company's organization as in your own organization. What should you do?

Options:

A.

In the Google Cloud console for your organization, select Create role from selection, and choose destination as the startup company's organization

B.

In the Google Cloud console for the startup company, select Create role from selection and choose source as the startup company's Google Cloud organization.

C.

Use the gcloud iam roles copy command, and provide the Organization ID of the startup company's Google Cloud Organization as the destination.

D.

Use the gcloud iam roles copy command, and provide the project IDs of all projects in the startup company's organization as the destination.
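
For reference, gcloud iam roles copy can replicate a custom role between organizations; the sketch below uses placeholder role and organization IDs.

# Copy a custom role from your organization to the other organization
gcloud iam roles copy \
    --source="sreProjectAccess" \
    --destination="sreProjectAccess" \
    --source-organization=111111111111 \
    --dest-organization=222222222222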

Question 39

You are storing sensitive information in a Cloud Storage bucket. For legal reasons, you need to be able to record all requests that read any of the stored data. You want to make sure you comply with these requirements. What should you do?

Options:

A.

Enable the Identity Aware Proxy API on the project.

B.

Scan the bucket using the Data Loss Prevention API.

C.

Allow only a single Service Account access to read the data.

D.

Enable Data Access audit logs for the Cloud Storage API.

Question 40

You need to set up a policy so that videos stored in a specific Cloud Storage Regional bucket are moved to Coldline after 90 days, and then deleted after one year from their creation. How should you set up the policy?

Options:

A.

Use Cloud Storage Object Lifecycle Management using Age conditions with SetStorageClass and Delete actions. Set the SetStorageClass action to 90 days and the Delete action to 275 days (365 – 90)

B.

Use Cloud Storage Object Lifecycle Management using Age conditions with SetStorageClass and Delete actions. Set the SetStorageClass action to 90 days and the Delete action to 365 days.

C.

Use gsutil rewrite and set the Delete action to 275 days (365-90).

D.

Use gsutil rewrite and set the Delete action to 365 days.
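
For reference, an Object Lifecycle Management configuration expressing "Coldline after 90 days, delete after 365 days" is a small JSON document applied with gsutil; the bucket name is a placeholder, and the age condition counts days since object creation.

# lifecycle.json: move to Coldline at 90 days, delete at 365 days
cat > lifecycle.json <<'EOF'
{
  "rule": [
    {"action": {"type": "SetStorageClass", "storageClass": "COLDLINE"},
     "condition": {"age": 90}},
    {"action": {"type": "Delete"}, "condition": {"age": 365}}
  ]
}
EOF
gsutil lifecycle set lifecycle.json gs://my-video-bucket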

Question 41

You have a Bigtable instance that consists of three nodes that store personally identifiable information (PII) data. You need to log all read or write operations, including any metadata or configuration reads of this database table, in your company's Security Information and Event Management (SIEM) system. What should you do?

Options:

A.

• Navigate to Cloud Monitoring in the Google Cloud console, and create a custom monitoring job for the Bigtable instance to track all changes.

• Create an alert by using webhook endpoints, with the SIEM endpoint as a receiver.

B.

• Navigate to the Audit Logs page in the Google Cloud console, and enable Data Read, Data Write, and Admin Read logs for the Bigtable instance.

• Create a Pub/Sub topic as a Cloud Logging sink destination, and add your SIEM as a subscriber to the topic.

C.

• Install the Ops Agent on the Bigtable instance during configuration.

• Create a service account with read permissions for the Bigtable instance.

• Create a custom Dataflow job with this service account to export logs to the company's SIEM system.

D.

• Navigate to the Audit Logs page in the Google Cloud console, and enable Admin Write logs for the Bigtable instance.

• Create a Cloud Functions instance to export logs from Cloud Logging to your SIEM.

Question 42

You have developed a containerized web application that will serve internal colleagues during business hours. You want to ensure that no costs are incurred outside of the hours the application is used. You have just created a new Google Cloud project and want to deploy the application. What should you do?

Options:

A.

Deploy the container on Cloud Run for Anthos, and set the minimum number of instances to zero

B.

Deploy the container on Cloud Run (fully managed), and set the minimum number of instances to zero.

C.

Deploy the container on App Engine flexible environment with autoscaling, and set the value min_instances to zero in the app.yaml.

D.

Deploy the container on App Engine flexible environment with manual scaling, and set the value instances to zero in the app.yaml.

Question 43

You recently received a new Google Cloud project with an attached billing account where you will work. You need to create instances, set firewalls, and store data in Cloud Storage. You want to follow Google-recommended practices. What should you do?

Options:

A.

Use the gcloud CLI services enable cloudresourcemanager.googleapis.com command to enable all resources.

B.

Use the gcloud services enable compute.googleapis.com command to enable Compute Engine and the gcloud services enable storage-api.googleapis.com command to enable the Cloud Storage APIs.

C.

Open the Google Cloud console and enable all Google Cloud APIs from the API dashboard.

D.

Open the Google Cloud console and run gcloud init --project in a Cloud Shell.
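
For reference, enabling individual service APIs from the CLI looks like this, run in the context of the new project; the API names follow the options above.

# Enable only the APIs the work requires
gcloud services enable compute.googleapis.com
gcloud services enable storage-api.googleapis.com
# Confirm which APIs are now enabled
gcloud services list --enabled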

Question 44

Your company has embraced a hybrid cloud strategy where some of the applications are deployed on Google Cloud. A Virtual Private Network (VPN) tunnel connects your Virtual Private Cloud (VPC) in Google Cloud with your company's on-premises network. Multiple applications in Google Cloud need to connect to an on-premises database server, and you want to avoid having to change the IP configuration in all of your applications when the IP of the database changes.

What should you do?

Options:

A.

Configure Cloud NAT for all subnets of your VPC to be used when egressing from the VM instances.

B.

Create a private zone on Cloud DNS, and configure the applications with the DNS name.

C.

Configure the IP of the database as custom metadata for each instance, and query the metadata server.

D.

Query the Compute Engine internal DNS from the applications to retrieve the IP of the database.

Question 45

You have deployed an application on a Compute Engine instance. An external consultant needs to access the Linux-based instance. The consultant is connected to your corporate network through a VPN connection, but the consultant has no Google account. What should you do?

Options:

A.

Instruct the external consultant to use the gcloud compute ssh command line tool by using Identity-Aware Proxy to access the instance.

B.

Instruct the external consultant to use the gcloud compute ssh command line tool by using the public IP address of the instance to access it.

C.

Instruct the external consultant to generate an SSH key pair, and request the public key from the consultant.

Add the public key to the instance yourself, and have the consultant access the instance through SSH with their private key.

D.

Instruct the external consultant to generate an SSH key pair, and request the private key from the consultant.

Add the private key to the instance yourself, and have the consultant access the instance through SSH with their public key.

Question 46

You need to manage a Cloud Spanner Instance for best query performance. Your instance in production runs in a single Google Cloud region. You need to improve performance in the shortest amount of time. You want to follow Google best practices for service configuration. What should you do?

Options:

A.

Create an alert in Cloud Monitoring to alert when the percentage of high priority CPU utilization reaches 45%. If you exceed this threshold, add nodes to your instance.

B.

Create an alert in Cloud Monitoring to alert when the percentage of high priority CPU utilization reaches 45%. Use database query statistics to identify queries that result in high CPU usage, and then rewrite those queries to optimize their resource usage.

C.

Create an alert in Cloud Monitoring to alert when the percentage of high priority CPU utilization reaches 65%. If you exceed this threshold, add nodes to your instance.

D.

Create an alert in Cloud Monitoring to alert when the percentage of high priority CPU utilization reaches 65%. Use database query statistics to identify queries that result in high CPU usage, and then rewrite those queries to optimize their resource usage.

Question 47

You need to migrate invoice documents stored on-premises to Cloud Storage. The documents have the following storage requirements:

• Documents must be kept for five years.

• Up to five revisions of the same invoice document must be stored, to allow for corrections.

• Documents older than 365 days should be moved to lower cost storage tiers.

You want to follow Google-recommended practices to minimize your operational and development costs. What should you do?

Options:

A.

Enable retention policies on the bucket, and use Cloud Scheduler to invoke a Cloud Function to move or delete your documents based on their metadata.

B.

Enable retention policies on the bucket, use lifecycle rules to change the storage classes of the objects, set the number of versions, and delete old files.

C.

Enable object versioning on the bucket, and use Cloud Scheduler to invoke a Cloud Functions instance to move or delete your documents based on their metadata.

D.

Enable object versioning on the bucket, use lifecycle conditions to change the storage class of the objects, set the number of versions, and delete old files.

Question 48

You are working for a hospital that stores its medical images in an on-premises data room. The hospital wants to use Cloud Storage for archival storage of these images. The hospital wants an automated process to upload any new medical images to Cloud Storage. You need to design and implement a solution. What should you do?

Options:

A.

Deploy a Dataflow job from the batch template "Datastore to Cloud Storage". Schedule the batch job on the desired interval.

B.

In the Cloud Console, go to Cloud Storage. Upload the relevant images to the appropriate bucket.

C.

Create a script that uses the gsutil command line interface to synchronize the on-premises storage with Cloud Storage. Schedule the script as a cron job.

D.

Create a Pub/Sub topic, and enable a Cloud Storage trigger for the Pub/Sub topic. Create an application that sends all medical images to the Pub/Sub topic.
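
For context, a scheduled one-way synchronization from the on-premises data room to a bucket is often just a gsutil rsync invocation in cron; the paths, bucket name, and schedule below are placeholder assumptions.

# sync.sh: mirror the local image directory into the archive bucket
gsutil -m rsync -r /mnt/medical-images gs://hospital-image-archive
# Example crontab entry: run the script every night at 01:00
# 0 1 * * * /usr/local/bin/sync.sh >> /var/log/image-sync.log 2>&1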

Question 49

You used the gcloud container clusters command to create two Google Kubernetes Engine (GKE) clusters: prod-cluster and dev-cluster.

• prod-cluster is a standard cluster.

• dev-cluster is an Autopilot cluster.

When you run the kubectl get nodes command, you only see the nodes from prod-cluster. Which commands should you run to check the node status for dev-cluster?

Options:

A.

B.

C.

D.
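
For reference, pointing kubectl at the second cluster is a matter of fetching its credentials and switching the kubectl context; the location flag below is an assumption, since Autopilot clusters are regional.

# Fetch credentials for the Autopilot cluster and update kubeconfig
gcloud container clusters get-credentials dev-cluster --region=us-central1
# kubectl now targets dev-cluster; list its nodes
kubectl get nodes
# Contexts for both clusters remain available
kubectl config get-contexts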

Question 50

Your development team needs a new Jenkins server for their project. You need to deploy the server using the fewest steps possible. What should you do?

Options:

A.

Download and deploy the Jenkins Java WAR to App Engine Standard.

B.

Create a new Compute Engine instance and install Jenkins through the command line interface.

C.

Create a Kubernetes cluster on Compute Engine and create a deployment with the Jenkins Docker image.

D.

Use GCP Marketplace to launch the Jenkins solution.

Question 51

You deployed an App Engine application using gcloud app deploy, but it did not deploy to the intended project. You want to find out why this happened and where the application deployed. What should you do?

Options:

A.

Check the app.yaml file for your application and check project settings.

B.

Check the web-application.xml file for your application and check project settings.

C.

Go to Deployment Manager and review settings for deployment of applications.

D.

Go to Cloud Shell and run gcloud config list to review the Google Cloud configuration used for deployment.

Question 52

You are running a data warehouse on BigQuery. A partner company is offering a recommendation engine based on the data in your data warehouse. The partner company is also running their application on Google Cloud. They manage the resources in their own project, but they need access to the BigQuery dataset in your project. You want to provide the partner company with access to the dataset. What should you do?

Options:

A.

Create a Service Account in your own project, and grant this Service Account access to BigQuery in your project.

B.

Create a Service Account in your own project, and ask the partner to grant this Service Account access to BigQuery in their project

C.

Ask the partner to create a Service Account in their project, and have them give the Service Account access to BigQuery in their project

D.

Ask the partner to create a Service Account in their project, and grant their Service Account access to the BigQuery dataset in your project

Question 53

You are running multiple microservices in a Kubernetes Engine cluster. One microservice is rendering images. The microservice responsible for the image rendering requires a large amount of CPU time compared to the memory it requires. The other microservices are workloads that are optimized for n1-standard machine types. You need to optimize your cluster so that all workloads are using resources as efficiently as possible. What should you do?

Options:

A.

Assign the pods of the image rendering microservice a higher pod priority than the other microservices.

B.

Create a node pool with compute-optimized machine type nodes for the image rendering microservice. Use the node pool with general-purpose machine type nodes for the other microservices.

C.

Use the node pool with general-purpose machine type nodes for the image rendering microservice. Create a node pool with compute-optimized machine type nodes for the other microservices.

D.

Configure the required amount of CPU and memory in the resource requests specification of the image rendering microservice deployment. Keep the resource requests for the other microservices at the default.

Question 54

The storage costs for your application logs have far exceeded the project budget. The logs are currently being retained indefinitely in the Cloud Storage bucket myapp-gcp-ace-logs. You have been asked to remove logs older than 90 days from your Cloud Storage bucket. You want to optimize ongoing Cloud Storage spend. What should you do?

Options:

A.

Write a script that runs gsutil ls -l gs://myapp-gcp-ace-logs/** to find and remove items older than 90 days. Schedule the script with cron.

B.

Write a lifecycle management rule in JSON and push it to the bucket with gsutil lifecycle set config-json-file.

C.

Write a lifecycle management rule in XML and push it to the bucket with gsutil lifecycle set config-xml-file.

D.

Write a script that runs gsutil ls -lr gs://myapp-gcp-ace-logs/** to find and remove items older than 90 days. Repeat this process every morning.

Question 55

You need to assign a Cloud Identity and Access Management (Cloud IAM) role to an external auditor. The auditor needs to have permissions to review your Google Cloud Platform (GCP) Audit Logs and also to review your Data Access logs. What should you do?

Options:

A.

Assign the auditor the IAM role roles/logging.privateLogViewer. Perform the export of logs to Cloud Storage.

B.

Assign the auditor the IAM role roles/logging.privateLogViewer. Direct the auditor to also review the logs for changes to Cloud IAM policy.

C.

Assign the auditor’s IAM user to a custom role that has logging.privateLogEntries.list permission. Perform the export of logs to Cloud Storage.

D.

Assign the auditor’s IAM user to a custom role that has logging.privateLogEntries.list permission. Direct the auditor to also review the logs for changes to Cloud IAM policy.

Question 56

You have a workload running on Compute Engine that is critical to your business. You want to ensure that the data on the boot disk of this workload is backed up regularly. You need to be able to restore a backup as quickly as possible in case of disaster. You also want older backups to be cleaned automatically to save on cost. You want to follow Google-recommended practices. What should you do?

Options:

A.

Create a Cloud Function to create an instance template.

B.

Create a snapshot schedule for the disk using the desired interval.

C.

Create a cron job to create a new disk from the disk using gcloud.

D.

Create a Cloud Task to create an image and export it to Cloud Storage.
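
For reference, a disk snapshot schedule with automatic cleanup of old snapshots is created as a resource policy and then attached to the disk; the names, zone, retention, and start time below are placeholders.

# Daily snapshot schedule that keeps snapshots for 14 days
gcloud compute resource-policies create snapshot-schedule daily-boot-backup \
    --region=us-central1 --daily-schedule --start-time=04:00 \
    --max-retention-days=14
# Attach the schedule to the workload's boot disk
gcloud compute disks add-resource-policies my-workload \
    --zone=us-central1-a --resource-policies=daily-boot-backup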

Question 57

You have a developer laptop with the Cloud SDK installed on Ubuntu. The Cloud SDK was installed from the Google Cloud Ubuntu package repository. You want to test your application locally on your laptop with Cloud Datastore. What should you do?

Options:

A.

Export Cloud Datastore data using gcloud datastore export.

B.

Create a Cloud Datastore index using gcloud datastore indexes create.

C.

Install the google-cloud-sdk-datastore-emulator component using the apt-get install command.

D.

Install the cloud-datastore-emulator component using the gcloud components install command.
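
For context, on a Debian/Ubuntu machine where the SDK came from the apt repository, the local emulator workflow looks roughly like the sketch below; the env-init step is how client libraries discover the emulator endpoint.

# When the SDK was installed via apt, components are installed as packages
sudo apt-get install google-cloud-sdk-datastore-emulator
# Start the local Datastore emulator
gcloud beta emulators datastore start &
# Export DATASTORE_EMULATOR_HOST and related variables for the application
$(gcloud beta emulators datastore env-init)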

Question 58

Your application development team has created Docker images for an application that will be deployed on Google Cloud. Your team does not want to manage the infrastructure associated with this application. You need to ensure that the application can scale automatically as it gains popularity. What should you do?

Options:

A.

Create an Instance template with the container image, and deploy a Managed Instance Group with Autoscaling.

B.

Upload Docker images to Artifact Registry, and deploy the application on Google Kubernetes Engine using Standard mode.

C.

Upload Docker images to the Cloud Storage, and deploy the application on Google Kubernetes Engine using Standard mode.

D.

Upload Docker images to Artifact Registry, and deploy the application on Cloud Run.
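
For reference, deploying a container image from Artifact Registry to Cloud Run is a single command; the image path, service name, and region are placeholders.

# Deploy the image to a fully managed, autoscaling Cloud Run service
gcloud run deploy my-app \
    --image=us-central1-docker.pkg.dev/my-project/my-repo/my-app:v1 \
    --region=us-central1 --allow-unauthenticated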

Question 59

You need to set a budget alert for use of Compute Engine services on one of the three Google Cloud Platform projects that you manage. All three projects are linked to a single billing account. What should you do?

Options:

A.

Verify that you are the project billing administrator. Select the associated billing account and create a budget and alert for the appropriate project.

B.

Verify that you are the project billing administrator. Select the associated billing account and create a budget and a custom alert.

C.

Verify that you are the project administrator. Select the associated billing account and create a budget for the appropriate project.

D.

Verify that you are project administrator. Select the associated billing account and create a budget and a custom alert.

Question 60

You built an application on your development laptop that uses Google Cloud services. Your application uses Application Default Credentials for authentication and works fine on your development laptop. You want to migrate this application to a Compute Engine virtual machine (VM) and set up authentication using Google-recommended practices and minimal changes. What should you do?

Options:

A.

Assign appropriate access for Google services to the service account used by the Compute Engine VM.

B.

Create a service account with appropriate access for Google services, and configure the application to use this account.

C.

Store credentials for service accounts with appropriate access for Google services in a config file, and deploy this config file with your application.

D.

Store credentials for your user account with appropriate access for Google services in a config file, and deploy this config file with your application.

Question 61

You want to host your video encoding software on Compute Engine. Your user base is growing rapidly, and users need to be able to encode their videos at any time without interruption or CPU limitations. You must ensure that your encoding solution is highly available, and you want to follow Google-recommended practices to automate operations. What should you do?

Options:

A.

Deploy your solution on multiple standalone Compute Engine instances, and increase the number of existing instances when CPU utilization on Cloud Monitoring reaches a certain threshold.

B.

Deploy your solution on multiple standalone Compute Engine instances, and replace existing instances with high-CPU instances when CPU utilization on Cloud Monitoring reaches a certain threshold.

C.

Deploy your solution to an instance group, and increase the number of available instances whenever you see high CPU utilization in Cloud Monitoring.

D.

Deploy your solution to an instance group, and set the autoscaling based on CPU utilization.

Question 62

Your finance team wants to view the billing report for your projects. You want to make sure that the finance team does not get additional permissions to the project. What should you do?

Options:

A.

Add the group for the finance team to roles/billing.user role.

B.

Add the group for the finance team to roles/billing.admin role.

C.

Add the group for the finance team to roles/billing.viewer role.

D.

Add the group for the finance team to roles/billing.projectManager role.

Question 63

You have an application on a general-purpose Compute Engine instance that is experiencing excessive disk read throttling on its Zonal SSD Persistent Disk. The application primarily reads large files from disk. The disk size is currently 350 GB. You want to provide the maximum amount of throughput while minimizing costs. What should you do?

Options:

A.

Increase the size of the disk to 1 TB.

B.

Increase the allocated CPU to the instance.

C.

Migrate to use a Local SSD on the instance.

D.

Migrate to use a Regional SSD on the instance.

Question 64

Your company has workloads running on Compute Engine and on-premises. The Google Cloud Virtual Private Cloud (VPC) is connected to your WAN over a Virtual Private Network (VPN). You need to deploy a new Compute Engine instance and ensure that no public Internet traffic can be routed to it. What should you do?

Options:

A.

Create the instance without a public IP address.

B.

Create the instance with Private Google Access enabled.

C.

Create a deny-all egress firewall rule on the VPC network.

D.

Create a route on the VPC to route all traffic to the instance over the VPN tunnel.

Question 65

You are managing a project for the Business Intelligence (BI) department in your company. A data pipeline ingests data into BigQuery via streaming. You want the users in the BI department to be able to run the custom SQL queries against the latest data in BigQuery. What should you do?

Options:

A.

Create a Data Studio dashboard that uses the related BigQuery tables as a source and give the BI team view access to the Data Studio dashboard.

B.

Create a Service Account for the BI team and distribute a new private key to each member of the BI team.

C.

Use Cloud Scheduler to schedule a batch Dataflow job to copy the data from BigQuery to the BI team's internal data warehouse.

D.

Assign the IAM role of BigQuery User to a Google Group that contains the members of the BI team.
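
For reference, granting a predefined BigQuery role to a Google Group at the project level looks like this; the project ID and group address are placeholders.

# Grant the BI team's group the BigQuery User role on the project
gcloud projects add-iam-policy-binding my-project \
    --member="group:bi-team@example.com" \
    --role="roles/bigquery.user"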

Question 66

You are building a data lake on Google Cloud for your Internet of Things (IoT) application. The IoT application has millions of sensors that are constantly streaming structured and unstructured data to your backend in the cloud. You want to build a highly available and resilient architecture based on Google-recommended practices. What should you do?

Options:

A.

Stream data to Pub/Sub, and use Dataflow to send data to Cloud Storage

B.

Stream data to Pub/Sub, and use Storage Transfer Service to send data to BigQuery.

C.

Stream data to Dataflow, and use Storage Transfer Service to send data to BigQuery.

D.

Stream data to Dataflow, and use Dataprep by Trifacta to send data to Bigtable.

Question 67

Several employees at your company have been creating projects with Cloud Platform and paying for it with their personal credit cards, which the company reimburses. The company wants to centralize all these projects under a single, new billing account. What should you do?

Options:

A.

Contact cloud-billing@google.com with your bank account details and request a corporate billing account for your company.

B.

Create a ticket with Google Support and wait for their call to share your credit card details over the phone.

C.

In the Google Cloud Platform Console, go to the Resource Manager and move all projects to the root Organization.

D.

In the Google Cloud Platform Console, create a new billing account and set up a payment method.

Question 68

You create a Deployment with 2 replicas in a Google Kubernetes Engine cluster that has a single preemptible node pool. After a few minutes, you use kubectl to examine the status of your Pods and observe that one of them is still in Pending status.


What is the most likely cause?

Options:

A.

The pending Pod's resource requests are too large to fit on a single node of the cluster.

B.

Too many Pods are already running in the cluster, and there are not enough resources left to schedule the pending Pod.

C.

The node pool is configured with a service account that does not have permission to pull the container image used by the pending Pod.

D.

The pending Pod was originally scheduled on a node that has been preempted between the creation of the Deployment and your verification of the Pods’ status. It is currently being rescheduled on a new node.

Question 69

You need to manage a third-party application that will run on a Compute Engine instance. Other Compute Engine instances are already running with default configuration. Application installation files are hosted on Cloud Storage. You need to access these files from the new instance without allowing other virtual machines (VMs) to access these files. What should you do?

Options:

A.

Create the instance with the default Compute Engine service account. Grant the service account permissions on Cloud Storage.

B.

Create the instance with the default Compute Engine service account. Add metadata to the objects on Cloud Storage that matches the metadata on the new instance.

C.

Create a new service account and assign this service account to the new instance. Grant the service account permissions on Cloud Storage.

D.

Create a new service account and assign this service account to the new instance. Add metadata to the objects on Cloud Storage that matches the metadata on the new instance.
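
As an illustration, isolating access usually means a dedicated service account that is granted read access on the bucket and attached only to the new instance; all names below are placeholder assumptions.

# Dedicated service account for the new instance
gcloud iam service-accounts create app-installer
# Allow it to read the installation files in the bucket
gsutil iam ch \
    serviceAccount:app-installer@my-project.iam.gserviceaccount.com:objectViewer \
    gs://installer-files
# Attach the service account when creating the instance
gcloud compute instances create new-app-vm \
    --zone=us-central1-a \
    --service-account=app-installer@my-project.iam.gserviceaccount.com \
    --scopes=cloud-platform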

Question 70

You have a Compute Engine instance hosting a production application. You want to receive an email if the instance consumes more than 90% of its CPU resources for more than 15 minutes. You want to use Google services. What should you do?

Options:

A.

1. Create a consumer Gmail account.

2. Write a script that monitors the CPU usage.

3. When the CPU usage exceeds the threshold, have that script send an email using the Gmail account and smtp.gmail.com on port 25 as SMTP server.

B.

1. Create a Stackdriver Workspace, and associate your Google Cloud Platform (GCP) project with it.

2. Create an Alerting Policy in Stackdriver that uses the threshold as a trigger condition.

3. Configure your email address in the notification channel.

C.

1. Create a Stackdriver Workspace, and associate your GCP project with it.

2. Write a script that monitors the CPU usage and sends it as a custom metric to Stackdriver.

3. Create an uptime check for the instance in Stackdriver.

D.

1. In Stackdriver Logging, create a logs-based metric to extract the CPU usage by using this regular expression: CPU Usage: ([0-9]{1,3})%

2. In Stackdriver Monitoring, create an Alerting Policy based on this metric.

3. Configure your email address in the notification channel.

Question 71

All development (dev) teams in your organization are located in the United States. Each dev team has its own Google Cloud project. You want to restrict access so that each dev team can only create cloud resources in the United States (US). What should you do?

Options:

A.

Create a folder to contain all the dev projects. Create an organization policy to limit resources in US locations.

B.

Create an organization to contain all the dev projects. Create an Identity and Access Management (IAM) policy to limit the resources in US regions.

C.

Create an Identity and Access Management

D.

Create an Identity and Access Management (IAM)policy to restrict the resources locations in all dev projects. Apply the policy to all dev roles.

Question 72

You have created a code snippet that should be triggered whenever a new file is uploaded to a Cloud Storage bucket. You want to deploy this code snippet. What should you do?

Options:

A.

Use App Engine and configure Cloud Scheduler to trigger the application using Pub/Sub.

B.

Use Cloud Functions and configure the bucket as a trigger resource.

C.

Use Google Kubernetes Engine and configure a CronJob to trigger the application using Pub/Sub.

D.

Use Dataflow as a batch job, and configure the bucket as a data source.
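
For reference, a (1st gen) Cloud Function can be wired directly to object-finalize events on a bucket at deploy time; the function name, runtime, entry point, and bucket are placeholder assumptions.

# Deploy the snippet as a Cloud Function triggered on new objects
gcloud functions deploy process-upload \
    --runtime=python310 \
    --entry-point=process_upload \
    --trigger-resource=my-upload-bucket \
    --trigger-event=google.storage.object.finalize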

Question 73

You are operating a Google Kubernetes Engine (GKE) cluster for your company where different teams can run non-production workloads. Your Machine Learning (ML) team needs access to Nvidia Tesla P100 GPUs to train their models. You want to minimize effort and cost. What should you do?

Options:

A.

Ask your ML team to add the “accelerator: gpu” annotation to their pod specification.

B.

Recreate all the nodes of the GKE cluster to enable GPUs on all of them.

C.

Create your own Kubernetes cluster on top of Compute Engine with nodes that have GPUs. Dedicate this cluster to your ML team.

D.

Add a new, GPU-enabled node pool to the GKE cluster. Ask your ML team to add the cloud.google.com/gke-accelerator: nvidia-tesla-p100 nodeSelector to their pod specification.
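
For reference, adding a GPU node pool to an existing cluster, with autoscaling so the GPU nodes can scale down when idle, commonly looks like this; the cluster name, zone, and sizing are placeholder assumptions.

# GPU-enabled node pool that can scale down when unused
gcloud container node-pools create gpu-pool \
    --cluster=shared-cluster --zone=us-central1-a \
    --accelerator=type=nvidia-tesla-p100,count=1 \
    --num-nodes=1 --enable-autoscaling --min-nodes=0 --max-nodes=3
# Pods then target these nodes with the
# cloud.google.com/gke-accelerator: nvidia-tesla-p100 nodeSelector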

Question 74

Your company developed an application to deploy on Google Kubernetes Engine. Certain parts of the application are not fault-tolerant and are allowed to have downtime. Other parts of the application are critical and must always be available. You need to configure a Google Kubernetes Engine cluster while optimizing for cost. What should you do?

Options:

A.

Create a cluster with a single node-pool by using standard VMs. Label the fault-tolerant Deployments as spot-true.

B.

Create a cluster with a single node-pool by using Spot VMs. Label the critical Deployments as spot-false.

C.

Create a cluster with both a Spot VM node pool and a node pool by using standard VMs. Deploy the critical deployments on the Spot VM node pool and the fault-tolerant deployments on the node pool by using standard VMs.

D.

Create a cluster with both a Spot VM node pool and a node pool by using standard VMs. Deploy the critical deployments on the node pool by using standard VMs and the fault-tolerant deployments on the Spot VM node pool.

Question 75

Your company runs its Linux workloads on Compute Engine instances. Your company will be working with a new operations partner that does not use Google Accounts. You need to grant access to the instances to your operations partner so they can maintain the installed tooling. What should you do?

Options:

A.

Enable Cloud IAP for the Compute Engine instances, and add the operations partner as a Cloud IAP Tunnel User.

B.

Tag all the instances with the same network tag. Create a firewall rule in the VPC to grant TCP access on port 22 for traffic from the operations partner to instances with the network tag.

C.

Set up Cloud VPN between your Google Cloud VPC and the internal network of the operations partner.

D.

Ask the operations partner to generate SSH key pairs, and add the public keys to the VM instances.

Question 76

Your organization uses G Suite for communication and collaboration. All users in your organization have a G Suite account. You want to grant some G Suite users access to your Cloud Platform project. What should you do?

Options:

A.

Enable Cloud Identity in the GCP Console for your domain.

B.

Grant them the required IAM roles using their G Suite email address.

C.

Create a CSV sheet with all users’ email addresses. Use the gcloud command line tool to convert them into Google Cloud Platform accounts.

D.

In the G Suite console, add the users to a special group called cloud-console-users@yourdomain.com. Rely on the default behavior of the Cloud Platform to grant users access if they are members of this group.

Question 77

You have a number of applications that have bursty workloads and are heavily dependent on topics to decouple publishing systems from consuming systems. Your company would like to go serverless to enable developers to focus on writing code without worrying about infrastructure. Your solution architect has already identified Cloud Pub/Sub as a suitable alternative for decoupling systems. You have been asked to identify a suitable GCP Serverless service that is easy to use with Cloud Pub/Sub. You want the ability to scale down to zero when there is no traffic in order to minimize costs. You want to follow Google recommended practices. What should you suggest?

Options:

A.

Cloud Run for Anthos

B.

Cloud Run

C.

App Engine Standard

D.

Cloud Functions.

Question 78

Your company set up a complex organizational structure on Google Cloud Platform. The structure includes hundreds of folders and projects. Only a few team members should be able to view the hierarchical structure. You need to assign minimum permissions to these team members and you want to follow Google-recommended practices. What should you do?

Options:

A.

Add the users to roles/browser role.

B.

Add the users to roles/iam.roleViewer role.

C.

Add the users to a group, and add this group to roles/browser role.

D.

Add the users to a group, and add this group to roles/iam.roleViewer role.

Question 79

Your company is moving its continuous integration and delivery (CI/CD) pipeline to Compute Engine instances. The pipeline will manage the entire cloud infrastructure through code. How can you ensure that the pipeline has appropriate permissions while your system is following security best practices?

Options:

A.

• Add a step for human approval to the CI/CD pipeline before the execution of the infrastructure provisioning.

• Use the human approver's IAM account for the provisioning.

B.

• Attach a single service account to the compute instances.

• Add minimal rights to the service account.

• Allow the service account to impersonate a Cloud Identity user with elevated permissions to create, update, or delete resources.

C.

• Attach a single service account to the compute instances.

• Add all required Identity and Access Management (IAM) permissions to this service account to create, update, or delete resources

D.

• Create multiple service accounts, one for each pipeline, with the appropriate minimal Identity and Access Management (IAM) permissions.

• Use a secret manager service to store the key files of the service accounts.

• Allow the CI/CD pipeline to request the appropriate secrets during the execution of the pipeline.

Question 80

You want to add a new auditor to a Google Cloud Platform project. The auditor should be allowed to read, but not modify, all project items.

How should you configure the auditor's permissions?

Options:

A.

Create a custom role with view-only project permissions. Add the user's account to the custom role.

B.

Create a custom role with view-only service permissions. Add the user's account to the custom role.

C.

Select the built-in IAM project Viewer role. Add the user's account to this role.

D.

Select the built-in IAM service Viewer role. Add the user's account to this role.
