
Google Professional-Cloud-Security-Engineer Dumps

Google Cloud Certified - Professional Cloud Security Engineer Questions and Answers

Question 1

Your company's storage team manages all product images within a specific Google Cloud project. To maintain control, you must isolate access to Cloud Storage for this project, allowing the storage team to manage restrictions at the project level. They must be restricted to using corporate computers. What should you do?

Options:

A.

Employ organization-level firewall rules to block all traffic to Cloud Storage. Create exceptions for specific service accounts used by the storage team within their project.

B.

Implement VPC Service Controls by establishing an organization-wide service perimeter with all projects. Configure ingress and egress rules to restrict access to Cloud Storage based on IP address ranges.

C.

Use Context-Aware Access. Create an access level that defines the required context. Apply it as an organization policy specifically at the project level, restricting access to Cloud Storage based on that context.

D.

Use Identity and Access Management (IAM) roles at the project level within the storage team's project. Grant the storage team granular permissions on the project's Cloud Storage resources.

Question 2

Your company is moving to Google Cloud. You plan to sync your users first by using Google Cloud Directory Sync (GCDS). Some employees have already created Google Cloud accounts by using their company email addresses that were created outside of GCDS. You must create your users on Cloud Identity.

What should you do?

Options:

A.

Configure GCDS and use GCDS search rules to sync these users.

B.

Use the transfer tool to migrate unmanaged users.

C.

Write a custom script to identify existing Google Cloud users and call the Admin SDK Directory API to transfer their account.

D.

Configure GCDS and use GCDS exclusion rules to ensure users are not suspended.

Question 3

Your Google Cloud organization distributes administrative capabilities to each team by provisioning a Google Cloud project with the Owner role (roles/owner). The organization contains thousands of Google Cloud projects. Security Command Center Premium has surfaced multiple open_mysql_port findings. You are enforcing guardrails and need to prevent these types of common misconfigurations.

What should you do?

Options:

A.

Create a firewall rule for each virtual private cloud (VPC) to deny traffic from 0.0.0.0/0 with priority 0.

B.

Create a hierarchical firewall policy configured at the organization to deny all connections from 0.0.0.0/0.

C.

Create a Google Cloud Armor security policy to deny traffic from 0.0.0.0/0.

D.

Create a hierarchical firewall policy configured at the organization to allow connections only from internal IP ranges

Question 4

You have stored company approved compute images in a single Google Cloud project that is used as an image repository. This project is protected with VPC Service Controls and exists in the perimeter along with other projects in your organization. This lets other projects deploy images from the image repository project. A team requires deploying a third-party disk image that is stored in an external Google Cloud organization. You need to grant read access to the disk image so that it can be deployed into the perimeter.

What should you do?

Options:

A.

1. Update the perimeter.
2. Configure the egressTo field to set identityType to any_identity.
3. Configure the egressFrom field to include the external Google Cloud project number as an allowed resource and the serviceName to compute.googleapis.com.

B.

Allow the external project by using the organization policy constraints/compute.trustedImageProjects.

C.

1. Update the perimeter.
2. Configure the egressTo field to include the external Google Cloud project number as an allowed resource and the serviceName to compute.googleapis.com.
3. Configure the egressFrom field to set identityType to any_identity.

D.

1. Update the perimeter.
2. Configure the ingressFrom field to set identityType to any_identity.
3. Configure the ingressTo field to include the external Google Cloud project number as an allowed resource and the serviceName to compute.googleapis.com.
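Note: For reference, ingress and egress rules like the ones described in these options are attached to the service perimeter as structured policy. The sketch below shows a generic ingress-rule shape as a Python dictionary; the project number, the wildcard resources, and the method selector are placeholders, and the field names follow the Access Context Manager ingress policy schema as commonly documented, so verify against the current reference before use.

    # Illustrative sketch only: the general shape of an ingress rule that lets an
    # identity read Compute Engine images from an external project into a perimeter.
    # "123456789012" is a placeholder for the external project number.
    ingress_rule = {
        "ingressFrom": {
            "identityType": "ANY_IDENTITY",
            "sources": [{"resource": "projects/123456789012"}],
        },
        "ingressTo": {
            "resources": ["*"],
            "operations": [
                {
                    "serviceName": "compute.googleapis.com",
                    "methodSelectors": [{"method": "*"}],
                }
            ],
        },
    }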

Question 5

You need to create a VPC that enables your security team to control network resources such as firewall rules. How should you configure the network to allow for separation of duties for network resources?

Options:

A.

Set up multiple VPC networks, and set up multi-NIC virtual appliances to connect the networks.

B.

Set up VPC Network Peering, and allow developers to peer their network with a Shared VPC.

C.

Set up a VPC in a project. Assign the Compute Network Admin role to the security team, and assign the Compute Admin role to the developers.

D.

Set up a Shared VPC where the security team manages the firewall rules, and share the network with developers via service projects.

Question 6

An organization receives an increasing number of phishing emails.

Which method should be used to protect employee credentials in this situation?

Options:

A.

Multifactor Authentication

B.

A strict password policy

C.

Captcha on login pages

D.

Encrypted emails

Question 7

You are onboarding new users into Cloud Identity and discover that some users have created consumer user accounts using the corporate domain name. How should you manage these consumer user accounts with Cloud Identity?

Options:

A.

Use Google Cloud Directory Sync to convert the unmanaged user accounts.

B.

Create a new managed user account for each consumer user account.

C.

Use the transfer tool for unmanaged user accounts.

D.

Configure single sign-on using a customer's third-party provider.

Question 8

Your organization needs to restrict the types of Google Cloud services that can be deployed within specific folders to enforce compliance requirements. You must apply these restrictions only to the designated folders, without affecting other parts of the resource hierarchy. You want to use the most efficient and simple method. What should you do?

Options:

A.

Implement IAM conditions on service account creation within each folder.

B.

Create a global organization policy at the organization level with the Restrict Resource Service Usage constraint, and apply exceptions for other folders.

C.

Create an organization policy at the folder level using the Restrict Resource Service Usage constraint, and define the allowed services per folder.

D.

Configure VPC Service Controls perimeters around each folder, and define the allowed services within the perimeter.

Question 9

Your organization enforces a custom organization policy that disables the use of Compute Engine VM instances with external IP addresses. However, a regulated business unit requires an exception to temporarily use external IPs for a third-party audit process. The regulated business workload must comply with least privilege principles and minimize policy drift. You need to ensure secure policy management and proper handling. What should you do?

Options:

A.

Apply the restrictive organization policy at the organization level. Create an IAM custom role with permissions to bypass organization policies. Assign the custom role to the regulated business team for the specific project.

B.

Modify the custom organization policy at the organization level to allow external IPs for all projects. Configure VPC firewall rules to restrict egress traffic except for the regulated business workload.

C.

Apply the custom organization policy at the organization level to restrict external IPs. Move the regulated business workload to a separate folder. Override the policy at that folder level.

D.

Create a folder. Apply the restrictive organization policy for non-regulated business workloads in the folder. Place the regulated business workload in that folder.

Question 10

Your organization is using Active Directory and wants to configure Security Assertion Markup Language (SAML). You must set up and enforce single sign-on (SSO) for all users.

What should you do?

Options:

A.

1. Manage SAML profile assignments.
2. Enable OpenID Connect (OIDC) in your Active Directory (AD) tenant.
3. Verify the domain.

B.

1. Create a new SAML profile.
2. Upload the X.509 certificate.
3. Enable the change password URL.
4. Configure Entity ID and ACS URL in your IdP.

C.

1. Create a new SAML profile.
2. Populate the sign-in and sign-out page URLs.
3. Upload the X.509 certificate.
4. Configure Entity ID and ACS URL in your IdP.

D.

1. Configure prerequisites for OpenID Connect (OIDC) in your Active Directory (AD) tenant.
2. Verify the AD domain.
3. Decide which users should use SAML.
4. Assign the pre-configured profile to the select organizational units (OUs) and groups.

Question 11

Your organization uses Google Workspace Enterprise Edition for authentication. You are concerned about employees leaving their laptops unattended for extended periods of time after authenticating into Google Cloud. You must prevent malicious people from using an employee's unattended laptop to modify their environment.

What should you do?

Options:

A.

Create a policy that requires employees to not leave their sessions open for long durations.

B.

Review and disable unnecessary Google Cloud APIs.

C.

Require strong passwords and 2SV through a security token or Google Authenticator.

D.

Set the session length timeout for Google Cloud services to a shorter duration.

Question 12

Your organization wants to be continuously evaluated against CIS Google Cloud Computing Foundations Benchmark v1.3.0 (CIS Google Cloud Foundation 1.3). Some of the controls are irrelevant to your organization and must be disregarded in evaluation. You need to create an automated system or process to ensure that only the relevant controls are evaluated.

What should you do?

Options:

A.

Mark all security findings that are irrelevant with a tag and a value that indicates a security exception. Select all marked findings and mute them on the console every time they appear. Activate Security Command Center (SCC) Premium.

B.

Activate Security Command Center (SCC) Premium. Create a rule to mute the security findings in SCC so they are not evaluated.

C.

Download all findings from Security Command Center (SCC) to a CSV file. Mark the findings that are part of CIS Google Cloud Foundation 1.3 in the file. Ignore the entries that are irrelevant and out of scope for the company.

D.

Ask an external audit company to provide independent reports that include the needed CIS benchmarks. In the scope of the audit, clarify that some of the controls are not needed and must be disregarded.

Question 13

Your organization is developing a sophisticated machine learning (ML) model to predict customer behavior for targeted marketing campaigns. The BigQuery dataset used for training includes sensitive personal information. You must design the security controls around the AI/ML pipeline. Data privacy must be maintained throughout the model's lifecycle, and you must ensure that personal data is not used in the training process. Additionally, you must restrict access to the dataset to an authorized subset of people only. What should you do?

Options:

A.

Implement at-rest encryption by using customer-managed encryption keys (CMEK) for the pipeline. Implement strict Identity and Access Management (IAM) policies to control access to BigQuery.

B.

De-identify sensitive data before model training by using Cloud Data Loss Prevention (DLP) APIs, and implement strict Identity and Access Management (IAM) policies to control access to BigQuery.

C.

Implement Identity-Aware Proxy to enforce context-aware access to BigQuery and models based on user identity and device.

D.

Deploy the model on Confidential VMs for enhanced protection of data and code while in use. Implement strict Identity and Access Management (IAM) policies to control access to BigQuery.

Question 14

Your organization has 3 TB of information in BigQuery and Cloud SQL. You need to develop a cost-effective, scalable, and secure strategy to anonymize the personally identifiable information (PII) that exists today. What should you do?

Options:

A.

Scan your BigQuery and Cloud SQL data using the Cloud DLP data profiling feature. Use the data profiling results to create a de-identification strategy with either Cloud Sensitive Data Protection's de-identification templates or custom configurations.

B.

Create a new BigQuery dataset and Cloud SQL instance. Copy a small subset of the data to these new locations. Use Cloud Data Loss Prevention API to scan this subset for PII. Based on the results, create a custom anonymization script and apply the script to the entire 3 TB dataset in the original locations.

C.

Export all 3TB of data from BigQuery and Cloud SQL to Cloud Storage. Use Cloud Sensitive Data Protection to anonymize the exported data. Re-import the anonymized data back into BigQuery and Cloud SQL.

D.

Inspect a representative sample of the data in BigQuery and Cloud SQL to identify PII. Based on this analysis, develop a custom script to anonymize the identified PII.
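Note: As a concrete illustration of the de-identification approach referenced in these options, the minimal Python sketch below calls the Sensitive Data Protection (Cloud DLP) API to replace detected email addresses with their infoType name. The project ID and sample text are placeholder assumptions.

    # Minimal de-identification sketch using the Cloud DLP / Sensitive Data
    # Protection client library (google-cloud-dlp). Project ID is a placeholder.
    from google.cloud import dlp_v2

    dlp = dlp_v2.DlpServiceClient()
    parent = "projects/my-project"  # placeholder

    response = dlp.deidentify_content(
        request={
            "parent": parent,
            "inspect_config": {"info_types": [{"name": "EMAIL_ADDRESS"}]},
            "deidentify_config": {
                "info_type_transformations": {
                    "transformations": [
                        # Replace each finding with its infoType name, e.g. [EMAIL_ADDRESS]
                        {"primitive_transformation": {"replace_with_info_type_config": {}}}
                    ]
                }
            },
            "item": {"value": "Contact jane.doe@example.com for details."},
        }
    )
    print(response.item.value)  # e.g. "Contact [EMAIL_ADDRESS] for details."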

Question 15

Your organization is moving virtual machines (VMs) to Google Cloud. You must ensure that operating system images that are used across your projects are trusted and meet your security requirements.

What should you do?

Options:

A.

Implement an organization policy to enforce that boot disks can only be created from images that come from the trusted image project.

B.

Create a Cloud Function that is automatically triggered when a new virtual machine is created from the trusted image repository. Verify that the image is not deprecated.

C.

Implement an organization policy constraint that enables the Shielded VM service on all projects to enforce the trusted image repository usage.

D.

Automate a security scanner that verifies that no common vulnerabilities and exposures (CVEs) are present in your trusted image repository.

Question 16

The CISO of your highly regulated organization has mandated that all AI applications running in production must be based on Google first-party models. Your security team has now implemented the Model Garden's organization policy meant to centrally control access and user actions on these approved models at the production folder level. However, it appears that someone has overwritten the policy. This has allowed developers to access third-party models on a particular production project. You need to resolve the issue with a solution that prevents a repeat occurrence. What should you do?

Options:

A.

Withdraw the Organization Policy Administrator role from all non-security team principals at the organization level.

B.

Withdraw the Organization Policy Administrator role from all non-security team principals at the production folder level.

C.

Implement a security posture based on the secure_ai_extended template to notify the security team of any policy changes at the organization level.

D.

Implement a security posture based on the secure_ai_extended template to notify the security team of any policy changes at the production folder level.

Question 17

Your organization is using Google Workspace, Google Cloud, and a third-party SIEM. You need to export events such as user logins, successful logins, and failed logins to the SIEM. Logs need to be ingested in real time or near real time. What should you do?

Options:

A.

Create a Cloud Logging sink to export relevant authentication logs to a Pub/Sub topic for SIEM subscription.

B.

Poll Cloud Logging for authentication events using the gcloud logging read tool. Forward the events to the SIEM.

C.

Configure Google Workspace to directly send logs to the API endpoint of the third-party SIEM.

D.

Create a Cloud Storage bucket as a sink for all logs. Configure the SIEM to periodically scan the bucket for new log files.
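Note: To make option A concrete, a Cloud Logging sink that routes audit log entries to a Pub/Sub topic for SIEM subscription can be created with the google-cloud-logging client roughly as sketched below. The project ID, sink name, topic, and filter expression are illustrative assumptions; tune the filter to the login events you actually need.

    # Sketch: create a project-level Cloud Logging sink that exports audit logs
    # to a Pub/Sub topic for near-real-time SIEM ingestion.
    from google.cloud import logging

    client = logging.Client(project="my-project")  # placeholder project
    sink = client.sink(
        "siem-auth-export",  # placeholder sink name
        filter_='logName:"cloudaudit.googleapis.com"',  # placeholder filter
        destination="pubsub.googleapis.com/projects/my-project/topics/siem-auth-events",
    )
    if not sink.exists():
        sink.create()
    # Remember to grant the sink's writer identity the Pub/Sub Publisher role
    # on the destination topic so exported entries can be delivered.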

Question 18

You are a consultant for an organization that is considering migrating their data from its private cloud to Google Cloud. The organization’s compliance team is not familiar with Google Cloud and needs guidance on how compliance requirements will be met on Google Cloud. One specific compliance requirement is for customer data at rest to reside within specific geographic boundaries. Which option should you recommend for the organization to meet their data residency requirements on Google Cloud?

Options:

A.

Organization Policy Service constraints

B.

Shielded VM instances

C.

Access control lists

D.

Geolocation access controls

E.

Google Cloud Armor

Question 19

An application running on a Compute Engine instance needs to read data from a Cloud Storage bucket. Your team does not allow Cloud Storage buckets to be globally readable and wants to ensure the principle of least privilege.

Which option meets the requirement of your team?

Options:

A.

Create a Cloud Storage ACL that allows read-only access from the Compute Engine instance’s IP address and allows the application to read from the bucket without credentials.

B.

Use a service account with read-only access to the Cloud Storage bucket, and store the credentials to the service account in the config of the application on the Compute Engine instance.

C.

Use a service account with read-only access to the Cloud Storage bucket to retrieve the credentials from the instance metadata.

D.

Encrypt the data in the Cloud Storage bucket using Cloud KMS, and allow the application to decrypt the data with the KMS key.
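Note: The pattern in option C relies on Application Default Credentials: on a Compute Engine instance the client library fetches short-lived tokens for the attached service account from the metadata server, so no key file is stored with the application. A minimal sketch, with bucket and object names as placeholders:

    # Sketch: read an object using the attached service account's credentials,
    # which the client library obtains from the Compute Engine metadata server.
    from google.cloud import storage

    client = storage.Client()  # Application Default Credentials, no key file
    bucket = client.bucket("example-product-images")                 # placeholder bucket
    data = bucket.blob("catalog/item-001.png").download_as_bytes()   # placeholder object
    print(f"Downloaded {len(data)} bytes")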

Question 20

Your company recently published a security policy to minimize the usage of service account keys. On-premises Windows-based applications are interacting with Google Cloud APIs. You need to implement Workload Identity Federation (WIF) with your identity provider on-premises.

What should you do?

Options:

A.

Set up a workload identity pool with your corporate Active Directory Federation Service (ADFS). Configure a rule to let principals in the pool impersonate the Google Cloud service account.

B.

Set up a workload identity pool with your corporate Active Directory Federation Service (ADFS). Let all principals in the pool impersonate the Google Cloud service account.

C.

Set up a workload identity pool with an OpenID Connect (OIDC) service on the same machine. Configure a rule to let principals in the pool impersonate the Google Cloud service account.

D.

Set up a workload identity pool with an OpenID Connect (OIDC) service on the same machine. Let all principals in the pool impersonate the Google Cloud service account.
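Note: With Workload Identity Federation, the on-premises application exchanges a token from the external identity provider for short-lived Google credentials through a credential configuration file rather than a downloaded service account key. A minimal sketch, assuming a credential configuration has already been generated (for example with gcloud iam workload-identity-pools create-cred-config) and saved locally; the file path is a placeholder:

    # Sketch: authenticate with a Workload Identity Federation credential
    # configuration file instead of a service account key.
    import google.auth
    from google.auth.transport.requests import Request

    credentials, project_id = google.auth.load_credentials_from_file(
        "wif-credential-config.json",  # placeholder path to the generated config
        scopes=["https://www.googleapis.com/auth/cloud-platform"],
    )
    credentials.refresh(Request())  # exchanges the external token for a Google access token
    print("Obtained federated access token:", bool(credentials.token))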

Question 21

Your organization recently activated the Security Command Center (SCC) Standard tier. There are a few Cloud Storage buckets that were accidentally made accessible to the public. You need to investigate the impact of the incident and remediate it.

What should you do?

Options:

A.

1. Remove the Identity and Access Management (IAM) binding granting access to allUsers from the buckets.
2. Apply the organization policy storage.uniformBucketLevelAccess to prevent regressions.
3. Query the data access logs to report on unauthorized access.

B.

1. Change bucket permissions to limit access.
2. Query the data access audit logs for any unauthorized access to the buckets.
3. After the misconfiguration is corrected, mute the finding in the Security Command Center.

C.

1. Change permissions to limit access for authorized users.
2. Enforce a VPC Service Controls perimeter around all the production projects to immediately stop any unauthorized access.
3. Review the administrator activity audit logs to report on any unauthorized access.

D.

1. Change the bucket permissions to limit access.
2. Query the bucket usage logs to report on unauthorized access to the data.
3. Enforce the organization policy storage.publicAccessPrevention to avoid regressions.

Question 22

You perform a security assessment on a customer architecture and discover that multiple VMs have public IP addresses. After providing a recommendation to remove the public IP addresses, you are told those VMs need to communicate to external sites as part of the customer's typical operations. What should you recommend to reduce the need for public IP addresses in your customer's VMs?

Options:

A.

Google Cloud Armor

B.

Cloud NAT

C.

Cloud Router

D.

Cloud VPN

Question 23

You define central security controls in your Google Cloud environment. For one of the folders in your organization, you set an organization policy to deny the assignment of external IP addresses to VMs. Two days later, you receive an alert about a new VM with an external IP address under that folder.

What could have caused this alert?

Options:

A.

The VM was created with a static external IP address that was reserved in the project before the organizational policy rule was set.

B.

The organizational policy constraint wasn't properly enforced and is running in "dry run" mode.

C.

At the project level, the organizational policy constraint has been overridden with an "allow" value.

D.

The policy constraint on the folder level does not have any effect because of an "allow" value for that constraint on the organizational level.

Question 24

Your company has recently enabled Security Command Center at the organization level. You need to implement runtime threat detection for applications running in containers within projects residing in the production folder. Specifically, you need to be notified if additional libraries are loaded or malicious scripts are executed within these running containers. You need to configure Security Command Center to meet this requirement while ensuring findings are visible within Security Command Center. What should you do?

Options:

A.

Ensure that the containers in the production folder are running on hosts that are using Container-Optimized OS.

B.

Enable Container Threat Detection in Security Command Center Premium tier for the projects within the production folder.

C.

Configure Security Health Analytics within Security Command Center to monitor container runtime vulnerabilities in the production folder.

D.

Create log-based metrics and alerts in Cloud Logging and Cloud Monitoring for suspicious container activity within the production folder.

Question 25

You are setting up Cloud Identity for your company's Google Cloud organization. User accounts will be provisioned from Microsoft Entra ID through Directory Sync and there will be a single sign-on through Entra ID. You need to secure the super administrator accounts for the organization. Your solution must follow the principle of least privilege and implement strong authentication. What should you do?

Options:

A.

Create dedicated accounts for super administrators. Ensure that 2-step verification is enforced for the super administrator accounts in Entra ID.

B.

Create dedicated accounts for super administrators. Enforce Google 2-step verification for the super administrator accounts.

C.

Create accounts that combine the organization administrator and the super administrator privileges. Ensure that 2-step verification is enforced for the super administrator accounts in Entra ID.

D.

Create accounts that combine the organization administrators and the super administrator privileges. Enforce Google 2-step verification for the super administrator accounts.

Question 26

Your team wants to centrally manage GCP IAM permissions from their on-premises Active Directory Service. Your team wants to manage permissions by AD group membership.

What should your team do to meet these requirements?

Options:

A.

Set up Cloud Directory Sync to sync groups, and set IAM permissions on the groups.

B.

Set up SAML 2.0 Single Sign-On (SSO), and assign IAM permissions to the groups.

C.

Use the Cloud Identity and Access Management API to create groups and IAM permissions from Active Directory.

D.

Use the Admin SDK to create groups and assign IAM permissions from Active Directory.

Question 27

Your organization operates a hybrid cloud environment and has recently deployed a private Artifact Registry repository in Google Cloud. On-premises developers cannot resolve the Artifact Registry hostname and therefore cannot push or pull artifacts. You've verified the following:

Connectivity to Google Cloud is established by Cloud VPN or Cloud Interconnect.

No custom DNS configurations exist on-premises.

There is no route to the internet from the on-premises network.

You need to identify the cause and enable the developers to push and pull artifacts. What is likely causing the issue and what should you do to fix the issue?

Options:

A.

Artifact Registry requires external HTTP/HTTPS access. Create a new firewall rule allowing ingress traffic on ports 80 and 443 from the developer's IP ranges.

B.

Private Google Access is not enabled for the subnet hosting the Artifact Registry. Enable Private Google Access for the appropriate subnet.

C.

On-premises DNS servers lack the necessary records to resolve private Google API domains. Create DNS records for restricted.googleapis.com or private.googleapis.com pointing to Google's published IP ranges.

D.

Developers must be granted the artifactregistry.writer IAM role. Grant the relevant developer group this role.

Question 28

You are responsible for managing your company’s identities in Google Cloud. Your company enforces 2-Step Verification (2SV) for all users. You need to reset a user’s access, but the user lost their second factor for 2SV. You want to minimize risk. What should you do?

Options:

A.

On the Google Admin console, select the appropriate user account, and generate a backup code to allow the user to sign in. Ask the user to update their second factor.

B.

On the Google Admin console, temporarily disable the 2SV requirements for all users. Ask the user to log in and add their new second factor to their account. Re-enable the 2SV requirement for all users.

C.

On the Google Admin console, select the appropriate user account, and temporarily disable 2SV for this account Ask the user to update their second factor, and then re-enable 2SV for this account.

D.

On the Google Admin console, use a super administrator account to reset the user account's credentials. Ask the user to update their credentials after their first login.

Question 29

Your company uses Google Cloud and has publicly exposed network assets. You want to discover the assets and perform a security audit on these assets by using a software tool in the least amount of time.

What should you do?

Options:

A.

Run a platform security scanner on all instances in the organization.

B.

Notify Google about the pending audit and wait for confirmation before performing the scan.

C.

Contact a Google approved security vendor to perform the audit.

D.

Identify all external assets by using Cloud Asset Inventory and then run a network security scanner against them.
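Note: Option D can be approximated programmatically with the Cloud Asset Inventory search API, which enumerates resources across an organization before the list is handed to a network security scanner. A sketch; the organization ID and the chosen asset types are placeholder assumptions:

    # Sketch: list externally addressable resources with Cloud Asset Inventory
    # before running a network security scanner against them.
    from google.cloud import asset_v1

    client = asset_v1.AssetServiceClient()
    scope = "organizations/123456789012"  # placeholder organization ID

    results = client.search_all_resources(
        request={
            "scope": scope,
            # Example asset types that commonly expose public endpoints.
            "asset_types": [
                "compute.googleapis.com/Address",
                "compute.googleapis.com/ForwardingRule",
            ],
        }
    )
    for resource in results:
        print(resource.name, resource.asset_type, resource.location)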

Question 30

Your organization hosts a financial services application running on Compute Engine instances for a third-party company. The third-party company’s servers that will consume the application also run on Compute Engine in a separate Google Cloud organization. You need to configure a secure network connection between the Compute Engine instances. You have the following requirements:

The network connection must be encrypted.

The communication between servers must be over private IP addresses.

What should you do?

Options:

A.

Configure a Cloud VPN connection between your organization's VPC network and the third party's that is controlled by VPC firewall rules.

B.

Configure a VPC peering connection between your organization's VPC network and the third party's that is controlled by VPC firewall rules.

C.

Configure a VPC Service Controls perimeter around your Compute Engine instances, and provide access to the third party via an access level.

D.

Configure an Apigee proxy that exposes your Compute Engine-hosted application as an API, and is encrypted with TLS which allows access only to the third party.

Question 31

Last week, a company deployed a new App Engine application that writes logs to BigQuery. No other workloads are running in the project. You need to validate that all data written to BigQuery was done using the App Engine Default Service Account.

What should you do?

Options:

A.

1. Use Stackdriver Logging and filter on BigQuery Insert Jobs.
2. Click on the email address in line with the App Engine Default Service Account in the authentication field.
3. Click Hide Matching Entries.
4. Make sure the resulting list is empty.

B.

1. Use Stackdriver Logging and filter on BigQuery Insert Jobs.
2. Click on the email address in line with the App Engine Default Service Account in the authentication field.
3. Click Show Matching Entries.
4. Make sure the resulting list is empty.

C.

1. In BigQuery, select the related dataset.
2. Make sure the App Engine Default Service Account is the only account that can write to the dataset.

D.

1. Go to the IAM section on the project.
2. Validate that the App Engine Default Service Account is the only account that has a role that can write to BigQuery.
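Note: The log-based validation in options A and B can also be done programmatically: filter the audit logs for BigQuery insert jobs and confirm that no entry was authenticated by anything other than the App Engine default service account. A rough sketch; the project ID, service account address, and exact filter fields are placeholder assumptions based on the legacy BigQuery audit log format:

    # Sketch: look for BigQuery insert-job audit entries written by any identity
    # other than the App Engine default service account.
    from google.cloud import logging

    client = logging.Client(project="my-project")  # placeholder project
    log_filter = (
        'resource.type="bigquery_resource" '
        'AND protoPayload.methodName="jobservice.insert" '
        'AND NOT protoPayload.authenticationInfo.principalEmail='
        '"my-project@appspot.gserviceaccount.com"'  # placeholder default SA
    )

    unexpected = list(client.list_entries(filter_=log_filter))
    print("OK" if not unexpected else f"{len(unexpected)} entries from other identities")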

Question 32

How should a customer reliably deliver Stackdriver logs from GCP to their on-premises SIEM system?

Options:

A.

Send all logs to the SIEM system via an existing protocol such as syslog.

B.

Configure every project to export all their logs to a common BigQuery DataSet, which will be queried by the SIEM system.

C.

Configure Organizational Log Sinks to export logs to a Cloud Pub/Sub Topic, which will be sent to the SIEM via Dataflow.

D.

Build a connector for the SIEM to query for all logs in real time from the GCP RESTful JSON APIs.

Question 33

Your financial services company has an audit requirement under a strict regulatory framework that requires comprehensive, immutable audit trails for all administrative and data access activity that ensures that data is kept for seven years. Your current logging is fragmented across individual projects. You need to establish a centralized, tamper-proof, long-term logging solution accessible for audits. What should you do?

Options:

A.

Implement Pub/Sub to stream all audit logs from each project in real-time to an external Security Information and Event Management (SIEM) for long-term analysis.

B.

Establish organization-level Cloud Logging sinks to export Cloud Audit Logs to a dedicated Cloud Storage bucket with object retention lock.

C.

Enable Security Command Center across the organization to gain centralized visibility into threats and manage compliance posture for all Google Cloud projects.

D.

Individually configure Cloud Audit Logs for all Google Cloud services in each project. Store the logs in regional Cloud Logging buckets with 30-day retention policies.
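Note: For the long-term, tamper-resistant storage described in option B, the destination bucket is typically configured with a retention policy that is then locked, making the retention period irreversible for the life of the bucket. A minimal sketch with placeholder project and bucket names and a seven-year retention period:

    # Sketch: set and lock a 7-year retention policy on the audit-log archive
    # bucket so exported log objects cannot be deleted or overwritten early.
    from google.cloud import storage

    SEVEN_YEARS_SECONDS = 7 * 365 * 24 * 60 * 60

    client = storage.Client(project="central-logging-project")  # placeholder
    bucket = client.get_bucket("org-audit-log-archive")          # placeholder bucket

    bucket.retention_period = SEVEN_YEARS_SECONDS
    bucket.patch()

    # Locking is permanent: once locked, the retention period cannot be
    # reduced or removed.
    bucket.lock_retention_policy()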

Question 34

Your organization’s Google Cloud VMs are deployed via an instance template that configures them with a public IP address in order to host web services for external users. The VMs reside in a service project that is attached to a host (VPC) project containing one custom Shared VPC for the VMs. You have been asked to reduce the exposure of the VMs to the internet while continuing to service external users. You have already recreated the instance template without a public IP address configuration to launch the managed instance group (MIG). What should you do?

Options:

A.

Deploy a Cloud NAT Gateway in the service project for the MIG.

B.

Deploy a Cloud NAT Gateway in the host (VPC) project for the MIG.

C.

Deploy an external HTTP(S) load balancer in the service project with the MIG as a backend.

D.

Deploy an external HTTP(S) load balancer in the host (VPC) project with the MIG as a backend.

Question 35

You have the following resource hierarchy. There is an organization policy at each node in the hierarchy as shown. Which load balancer types are denied in VPC A?

[Figure: resource hierarchy diagram showing the organization policy applied at each node; not reproduced here.]

Options:

A.

All load balancer types are denied in accordance with the global node’s policy.

B.

INTERNAL_TCP_UDP, INTERNAL_HTTP_HTTPS is denied in accordance with the folder’s policy.

C.

EXTERNAL_TCP_PROXY, EXTERNAL_SSL_PROXY are denied in accordance with the project’s policy.

D.

EXTERNAL_TCP_PROXY, EXTERNAL_SSL_PROXY, INTERNAL_TCP_UDP, and INTERNAL_HTTP_HTTPS are denied in accordance with the folder and project’s policies.

Question 36

Your organization is rolling out a new continuous integration and delivery (CI/CD) process to deploy infrastructure and applications in Google Cloud. Many teams will use their own instances of the CI/CD workflow, which will run on Google Kubernetes Engine (GKE). The CI/CD pipelines must be designed to securely access Google Cloud APIs.

What should you do?

Options:

A.

1. Create a dedicated service account for the CI/CD pipelines.
2. Run the deployment pipelines in a dedicated node pool in the GKE cluster.
3. Use the service account that you created as the identity for the nodes in the pool to authenticate to the Google Cloud APIs.

B.

1. Create service accounts for each deployment pipeline.
2. Generate private keys for the service accounts.
3. Securely store the private keys as Kubernetes secrets accessible only by the pods that run the specific deploy pipeline.

C.

1. Create individual service accounts for each deployment pipeline.
2. Add an identifier for the pipeline in the service account naming convention.
3. Ensure each pipeline runs on dedicated pods.
4. Use Workload Identity to map a deployment pipeline pod with a service account.

D.

1. Create two service accounts: one for the infrastructure and one for the application deployment.
2. Use workload identities to let the pods run the two pipelines and authenticate with the service accounts.
3. Run the infrastructure and application pipelines in separate namespaces.

Question 37

Your security team wants to implement a defense-in-depth approach to protect sensitive data stored in a Cloud Storage bucket. Your team has the following requirements:

The Cloud Storage bucket in Project A can only be readable from Project B.

The Cloud Storage bucket in Project A cannot be accessed from outside the network.

Data in the Cloud Storage bucket cannot be copied to an external Cloud Storage bucket.

What should the security team do?

Options:

A.

Enable domain restricted sharing in an organization policy, and enable uniform bucket-level access on the Cloud Storage bucket.

B.

Enable VPC Service Controls, create a perimeter around Projects A and B, and include the Cloud Storage API in the service perimeter configuration.

C.

Enable Private Access in both Project A and B's networks with strict firewall rules that allow communication between the networks.

D.

Enable VPC Peering between Project A and B's networks with strict firewall rules that allow communication between the networks.

Question 38

You want to limit the images that can be used as the source for boot disks. These images will be stored in a dedicated project.

What should you do?

Options:

A.

Use the Organization Policy Service to create a compute.trustedImageProjects constraint on the organization level. List the trusted project as the whitelist in an allow operation.

B.

Use the Organization Policy Service to create a compute.trustedImageProjects constraint on the organization level. List the trusted projects as the exceptions in a deny operation.

C.

In Resource Manager, edit the project permissions for the trusted project. Add the organization as member with the role: Compute Image User.

D.

In Resource Manager, edit the organization permissions. Add the project ID as member with the role: Compute Image User.

Question 39

Your global defense company is migrating top-secret classified data to BigQuery and Cloud Storage. National security regulations demand that master encryption key material never leaves the accredited on-premises cryptographic hardware. You must retain the unilateral ability to revoke data access, independent of any cloud provider. What should you do?

Options:

A.

Use customer-supplied encryption keys (CSEKs) by providing your own encryption keys with each data operation in Cloud Storage and BigQuery.

B.

Use customer-managed encryption keys (CMEKs) for the BigQuery datasets and Cloud Storage buckets. Store the keys in Cloud Key Management Service (Cloud KMS).

C.

Import existing on-premises master encryption keys into Cloud Key Management Service (Cloud KMS). Use the imported keys for BigQuery and Cloud Storage encryption.

D.

Configure Cloud External Key Manager (Cloud EKM) for the BigQuery datasets and Cloud Storage buckets. Integrate EKM with your existing on-premises hardware security modules (HSMs).
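Note: Whichever key source is chosen (Cloud KMS, imported key material, or Cloud EKM), applying it to Cloud Storage typically means setting the bucket's default Cloud KMS key so that newly written objects are encrypted with it; with Cloud EKM the key resource points at the externally managed key. A sketch with placeholder project, bucket, and key resource names:

    # Sketch: set a default Cloud KMS (or EKM-backed) key on a bucket so that
    # newly written objects are encrypted with the customer-controlled key.
    from google.cloud import storage

    client = storage.Client(project="defense-data-project")  # placeholder
    bucket = client.get_bucket("classified-archive")          # placeholder bucket

    # Placeholder key resource name; for Cloud EKM this key would be created
    # with an external protection level pointing at the on-premises HSM.
    bucket.default_kms_key_name = (
        "projects/defense-data-project/locations/europe-west4/"
        "keyRings/classified/cryptoKeys/archive-key"
    )
    bucket.patch()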

Question 40

Your company is developing a new application for your organization. The application consists of two Cloud Run services, service A and service B. Service A provides a web-based user front-end. Service B provides back-end services that are called by service A. You need to set up identity and access management for the application. Your solution should follow the principle of least privilege. What should you do?

Options:

A.

Create a new service account with the permissions to run service A and service B. Require authentication for service B. Permit only the new service account to call the backend.

B.

Create two separate service accounts. Grant one service account the permissions to execute service A, and grant the other service account the permissions to execute service B. Require authentication for service B. Permit only the service account for service A to call the back-end.

C.

Use the Compute Engine default service account to run service A and service B. Require authentication for service B. Permit only the default service account to call the backend.

D.

Create three separate service accounts. Grant one service account the permissions to execute service A. Grant the second service account the permissions to run service B. Grant the third service account the permissions to communicate between both services A and B. Require authentication for service B. Call the back-end by authenticating with a service account key for the third service account.
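Note: In the least-privilege pattern described in option B, service A calls service B with a Google-signed ID token minted for service A's own runtime service account, so no keys are exported. A minimal sketch of the caller side; the service B URL is a placeholder:

    # Sketch: service A authenticates to service B (Cloud Run, auth required)
    # using an ID token for service A's runtime service account.
    import requests
    from google.auth.transport.requests import Request
    from google.oauth2 import id_token

    SERVICE_B_URL = "https://service-b-abc123-uc.a.run.app"  # placeholder URL

    # Fetches a Google-signed ID token for the calling service's identity;
    # on Cloud Run this uses the attached service account automatically.
    token = id_token.fetch_id_token(Request(), audience=SERVICE_B_URL)

    response = requests.get(
        SERVICE_B_URL,
        headers={"Authorization": f"Bearer {token}"},
        timeout=10,
    )
    response.raise_for_status()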

Question 41

Your Google Cloud environment has one organization node, one folder named "Apps," and several projects within that folder. The organization node enforces the constraints/iam.allowedPolicyMemberDomains organization policy, which allows members from the terramearth.com organization. The "Apps" folder enforces the constraints/iam.allowedPolicyMemberDomains organization policy, which allows members from the flowlogistic.com organization. It also has the inheritFromParent: false property.

You attempt to grant access to a project in the Apps folder to the user testuser@terramearth.com.

What is the result of your action and why?

Options:

A.

The action fails because a constraints/iam.allowedPolicyMemberDomains organization policy must be defined on the current project to deactivate the constraint temporarily.

B.

The action fails because a constraints/iam.allowedPolicyMemberDomains organization policy is in place and only members from the flowlogistic.com organization are allowed.

C.

The action succeeds because members from both organizations, terramearth.com or flowlogistic.com, are allowed on projects in the "Apps" folder.

D.

The action succeeds and the new member is successfully added to the project's Identity and Access Management (IAM) policy because all policies are inherited by underlying folders and projects.

Question 42

An organization is starting to move its infrastructure from its on-premises environment to Google Cloud Platform (GCP). The first step the organization wants to take is to migrate its current data backup and disaster recovery solutions to GCP for later analysis. The organization’s production environment will remain on- premises for an indefinite time. The organization wants a scalable and cost-efficient solution.

Which GCP solution should the organization use?

Options:

A.

BigQuery using a data pipeline job with continuous updates

B.

Cloud Storage using a scheduled task and gsutil

C.

Compute Engine Virtual Machines using Persistent Disk

D.

Cloud Datastore using regularly scheduled batch upload jobs

Question 43

You have created an OS image that is hardened per your organization’s security standards and is being stored in a project managed by the security team. As a Google Cloud administrator, you need to make sure all VMs in your Google Cloud organization can only use that specific OS image while minimizing operational overhead. What should you do? (Choose two.)

Options:

A.

Grant users the compute.imageUser role in their own projects.

B.

Grant users the compute.imageUser role in the OS image project.

C.

Store the image in every project that is spun up in your organization.

D.

Set up an image access organization policy constraint, and list the security team managed project in the projects allow list.

E.

Remove VM instance creation permission from users of the projects, and only allow you and your team to create VM instances.

Question 44

While migrating your organization’s infrastructure to GCP, a large number of users will need to access GCP Console. The Identity Management team already has a well-established way to manage your users and want to keep using your existing Active Directory or LDAP server along with the existing SSO password.

What should you do?

Options:

A.

Manually synchronize the data in Google domain with your existing Active Directory or LDAP server.

B.

Use Google Cloud Directory Sync to synchronize the data in Google domain with your existing Active Directory or LDAP server.

C.

Users sign in directly to the GCP Console using the credentials from your on-premises Kerberos compliant identity provider.

D.

Users sign in using OpenID (OIDC) compatible IdP, receive an authentication token, then use that token to log in to the GCP Console.

Question 45

You are setting up a CI/CD pipeline to deploy containerized applications to your production clusters on Google Kubernetes Engine (GKE). You need to prevent containers with known vulnerabilities from being deployed. You have the following requirements for your solution:

Must be cloud-native

Must be cost-efficient

Minimize operational overhead

How should you accomplish this? (Choose two.)

Options:

A.

Create a Cloud Build pipeline that will monitor changes to your container templates in a Cloud Source Repositories repository. Add a step to analyze Container Analysis results before allowing the build to continue.

B.

Use a Cloud Function triggered by log events in Google Cloud's operations suite to automatically scan your container images in Container Registry.

C.

Use a cron job on a Compute Engine instance to scan your existing repositories for known vulnerabilities and raise an alert if a non-compliant container image is found.

D.

Deploy Jenkins on GKE and configure a CI/CD pipeline to deploy your containers to Container Registry. Add a step to validate your container images before deploying your container to the cluster.

E.

In your CI/CD pipeline, add an attestation on your container image when no vulnerabilities have been found. Use a Binary Authorization policy to block deployments of containers with no attestation in your cluster.

Question 46

Your organization has an operational image classification model running on a managed AI service on Google Cloud. You are in a configuration review with stakeholders and must describe the security responsibilities for the image classification model. What should you do?

Options:

A.

Explain the development of custom network firewalls around the image classification service for deep intrusion detection and prevention. Describe vulnerability scanning tools for known vulnerabilities.

B.

Explain Google's shared responsibility model. Focus the configuration review on Identity and Access Management (IAM) permissions, secure data upload/download procedures, and monitoring logs for any potential malicious activity.

C.

Explain that using platform-as-a-service (PaaS) transfers security concerns to Google. Describe the need for strict API usage limits to protect against unexpected usage and billing spikes.

D.

Explain the security aspects of the code that transforms user-uploaded images using Google's service. Define Cloud IAM for fine-grained access control within the development team.

Question 47

Your organization leverages folders to represent different teams within your Google Cloud environment. To support Infrastructure as Code (IaC) practices, each team receives a dedicated service account upon onboarding. You want to ensure that teams have comprehensive permissions to manage resources within their assigned folders while adhering to the principle of least privilege. You must design the permissions for these team-based service accounts in the most effective way possible. What should you do?

Options:

A.

Grant each service account the folder administrator role on its respective folder.

B.

Grant each service account the project creator role at the organization level and use folder-level IAM conditions to restrict project creation to specific folders.

C.

Assign each service account the project editor role at the organization level and instruct teams to use IAM bindings at the folder level for fine-grained permissions.

D.

Assign each service account the folder IAM administrator role on its respective folder to allow teams to create and manage additional custom roles if needed.

Question 48

You are exporting application logs to Cloud Storage. You encounter an error message that the log sinks don't support uniform bucket-level access policies. How should you resolve this error?

Options:

A.

Change the access control model for the bucket

B.

Update your sink with the correct bucket destination.

C.

Add the roles/logging.logWriter Identity and Access Management (IAM) role to the bucket for the log sink identity.

D.

Add the roles/logging.bucketWriter Identity and Access Management (IAM) role to the bucket for the log sink identity.
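Note: For context on option A, the question attributes the error to the bucket's access control model; uniform bucket-level access versus fine-grained ACLs is a bucket setting that can be inspected or changed as sketched below. The bucket name is a placeholder, and note that uniform bucket-level access can generally only be reverted within 90 days of being enabled.

    # Sketch: inspect and change a bucket's access control model
    # (uniform bucket-level access vs. fine-grained ACLs).
    from google.cloud import storage

    client = storage.Client()
    bucket = client.get_bucket("application-log-archive")  # placeholder bucket

    print("Uniform bucket-level access enabled:",
          bucket.iam_configuration.uniform_bucket_level_access_enabled)

    # Switch to fine-grained (ACL-based) access control.
    bucket.iam_configuration.uniform_bucket_level_access_enabled = False
    bucket.patch()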

Question 49

Your application is deployed as a highly available cross-region solution behind a global external HTTP(S) load balancer. You notice significant spikes in traffic from multiple IP addresses but it is unknown whether the IPs are malicious. You are concerned about your application's availability. You want to limit traffic from these clients over a specified time interval.

What should you do?

Options:

A.

Configure a rate_based_ban action by using Google Cloud Armor and set the ban_duration_sec parameter to the specified time interval.

B.

Configure a deny action by using Google Cloud Armor to deny the clients that issued too many requests over the specified time interval.

C.

Configure a throttle action by using Google Cloud Armor to limit the number of requests per client over a specified time interval.

D.

Configure a firewall rule in your VPC to throttle traffic from the identified IP addresses.
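Note: To illustrate option A, a rate_based_ban rule in a Google Cloud Armor security policy is defined by its rateLimitOptions. The sketch below mirrors the REST resource shape as a Python dictionary; the priority, threshold, match expression, and ban duration are illustrative assumptions to adapt to your traffic profile.

    # Illustrative sketch of a Cloud Armor security policy rule (REST shape)
    # that temporarily bans clients exceeding a request-rate threshold.
    rate_based_ban_rule = {
        "priority": 1000,                      # placeholder priority
        "action": "rate_based_ban",
        "match": {
            "versionedExpr": "SRC_IPS_V1",
            "config": {"srcIpRanges": ["*"]},  # evaluate all clients
        },
        "rateLimitOptions": {
            "conformAction": "allow",
            "exceedAction": "deny(429)",
            "enforceOnKey": "IP",
            "rateLimitThreshold": {"count": 100, "intervalSec": 60},  # placeholder threshold
            "banDurationSec": 600,             # placeholder ban interval (seconds)
        },
    }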

Question 50

You are the project owner for a regulated workload that runs in a project you own and manage as an Identity and Access Management (IAM) admin. For an upcoming audit, you need to provide access reviews evidence. Which tool should you use?

Options:

A.

Policy Troubleshooter

B.

Policy Analyzer

C.

IAM Recommender

D.

Policy Simulator

Question 51

All logs in your organization are aggregated into a centralized Google Cloud logging project for analysis and long-term retention. While most of the log data can be viewed by operations teams, there are specific sensitive fields (e.g., protoPayload.authenticationInfo.principalEmail) that contain identifiable information that should be restricted only to security teams. You need to implement a solution that allows different teams to view their respective application logs in the centralized logging project. It must also restrict access to specific sensitive fields within those logs to only a designated security group. Your solution must ensure that other fields in the same log entry remain visible to other authorized groups. What should you do?

Options:

A.

Configure field-level access in Cloud Logging by defining data access policies that specify sensitive fields and the authorized principals.

B.

Use Cloud IAM custom roles with specific permissions on logging.privateLogEntries.list. Define field-level access within the custom role's conditions.

C.

Implement a log sink to exclude sensitive fields before logs are sent to the centralized logging project. Create separate sinks for sensitive data.

D.

Create a BigQuery authorized view on the exported log sink to filter out the sensitive fields based on user groups.

Question 52

An organization is moving applications to Google Cloud while maintaining a few mission-critical applications on-premises. The organization must transfer the data at a bandwidth of at least 50 Gbps. What should they use to ensure secure continued connectivity between sites?

Options:

A.

Dedicated Interconnect

B.

Cloud Router

C.

Cloud VPN

D.

Partner Interconnect

Question 53

Which Identity-Aware Proxy role should you grant to an Identity and Access Management (IAM) user to access HTTPS resources?

Options:

A.

Security Reviewer

B.

IAP-Secured Tunnel User

C.

IAP-Secured Web App User

D.

Service Broker Operator

Question 54

You are auditing all your Google Cloud resources in the production project. You want to identify all principals who can change firewall rules.

What should you do?

Options:

A.

Use Policy Analyzer to query the permissions compute.firewalls.create, compute.firewalls.update, or compute.firewalls.delete.

B.

Reference the Security Health Analytics - Firewall Vulnerability Findings in the Security Command Center.

C.

Use Policy Analyzer to query the permissions compute.firewalls.get or compute.firewalls.list.

D.

Use Firewall Insights to understand your firewall rules usage patterns.

Question 55

Which type of load balancer should you use to maintain client IP by default while using the standard network tier?

Options:

A.

SSL Proxy

B.

TCP Proxy

C.

Internal TCP/UDP

D.

TCP/UDP Network

Question 56

Your organization operates in a highly regulated industry and uses multiple Google Cloud services. You need to identify potential risks to regulatory compliance. Which situation introduces the greatest risk?

Options:

A.

Principals have broad IAM roles allowing the creation and management of Compute Engine VMs without a pre-defined hardening process.

B.

Sensitive data is stored in a Cloud Storage bucket with the uniform bucket-level access setting enabled.

C.

The security team mandates the use of customer-managed encryption keys (CMEK) for all data classified as sensitive.

D.

The audit team needs access to Cloud Audit Logs related to managed services like BigQuery.

Question 57

You need to set up two network segments: one with an untrusted subnet and the other with a trusted subnet. You want to configure a virtual appliance such as a next-generation firewall (NGFW) to inspect all traffic between the two network segments. How should you design the network to inspect the traffic?

Options:

A.

1. Set up one VPC with two subnets: one trusted and the other untrusted.
2. Configure a custom route for all traffic (0.0.0.0/0) pointed to the virtual appliance.

B.

1. Set up one VPC with two subnets: one trusted and the other untrusted.
2. Configure a custom route for all RFC1918 subnets pointed to the virtual appliance.

C.

1. Set up two VPC networks: one trusted and the other untrusted, and peer them together.
2. Configure a custom route on each network pointed to the virtual appliance.

D.

1. Set up two VPC networks: one trusted and the other untrusted.
2. Configure a virtual appliance using multiple network interfaces, with each interface connected to one of the VPC networks.

Question 58

A business unit at a multinational corporation signs up for GCP and starts moving workloads into GCP. The business unit creates a Cloud Identity domain with an organizational resource that has hundreds of projects.

Your team becomes aware of this and wants to take over managing permissions and auditing the domain resources.

Which type of access should your team grant to meet this requirement?

Options:

A.

Organization Administrator

B.

Security Reviewer

C.

Organization Role Administrator

D.

Organization Policy Administrator

Question 59

Options:

A.

Configure IAM permissions on individual Model Garden models to restrict access to specific models.

B.

Regularly audit user activity logs in Vertex AI to identify and revoke access to unapproved models.

C.

Train custom models within your Vertex AI project and restrict user access to these models.

D.

Implement an organization policy that restricts the vertexai.allowedModels constraint.

Question 60

Your company is migrating a customer database that contains personally identifiable information (PII) to Google Cloud. To prevent accidental exposure, this data must be protected at rest. You need to ensure that all PII is automatically discovered and redacted, or pseudonymized, before any type of analysis. What should you do?

Options:

A.

Implement Cloud Armor to protect the database from external threats and configure firewall rules to restrict network access to only authorized internal IP addresses.

B.

Configure Sensitive Data Protection to scan the database for PII using both predefined and custom infoTypes and to mask sensitive data.

C.

Use Cloud KMS to encrypt the database at rest with a customer-managed encryption key (CMEK). Implement VPC Service Controls.

D.

Create Cloud Storage buckets with object versioning enabled, and use IAM policies to restrict access to the data. Use Data Loss Prevention API (DLP API) on the buckets to scan for sensitive data and generate detection alerts.

Question 61

Your organization has an application hosted in Cloud Run. You must control access to the application by using Cloud Identity-Aware Proxy (IAP) with these requirements:

Only users from the AppDev group may have access.

Access must be restricted to internal network IP addresses.

What should you do?

Options:

A.

Configure IAP to enforce multi-factor authentication (MFA) for all users and use network intrusion detection systems (NIDS) to block unauthorized access attempts.

B.

Configure firewall rules to limit access to IAP based on the AppDev group and source IP addresses.

C.

Create an access level that includes conditions for internal IP address ranges and AppDev groups. Apply this access level to the application's IAP policy.

D.

Deploy a VPN gateway and instruct the AppDev group to connect to the company network before accessing the application.

Question 62

As adoption of the Cloud Data Loss Prevention (DLP) API grows within the company, you need to optimize usage to reduce cost. DLP target data is stored in Cloud Storage and BigQuery. The location and region are identified as a suffix in the resource name.

Which cost reduction options should you recommend?

Options:

A.

Set appropriate rowsLimit value on BigQuery data hosted outside the US and set appropriate bytesLimitPerFile value on multiregional Cloud Storage buckets.

B.

Set appropriate rowsLimit value on BigQuery data hosted outside the US, and minimize transformation units on multiregional Cloud Storage buckets.

C.

Use rowsLimit and bytesLimitPerFile to sample data and use CloudStorageRegexFileSet to limit scans.

D.

Use FindingLimits and TimespanConfig to sample data and minimize transformation units.
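Note: The sampling parameters named in these options live in the inspect job's storage configuration. The sketch below shows where rowsLimit, bytesLimitPerFile, and a CloudStorageRegexFileSet would sit in a Python request dictionary for the Sensitive Data Protection API; the table, bucket, regex, and numeric values are placeholder assumptions.

    # Illustrative sketch: sampling limits in a Sensitive Data Protection
    # (Cloud DLP) inspect job configuration to reduce scan cost.
    bigquery_storage_config = {
        "big_query_options": {
            "table_reference": {
                "project_id": "my-project",   # placeholder
                "dataset_id": "eu_dataset",   # placeholder
                "table_id": "customers",      # placeholder
            },
            "rows_limit": 10000,              # sample at most 10,000 rows
            "sample_method": "RANDOM_START",
        }
    }

    gcs_storage_config = {
        "cloud_storage_options": {
            "file_set": {
                "regex_file_set": {
                    "bucket_name": "my-multiregion-bucket",  # placeholder
                    "include_regex": ["reports/.*\\.csv"],   # limit scanned paths
                }
            },
            "bytes_limit_per_file": 10 * 1024 * 1024,        # scan at most 10 MB per file
        }
    }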

Question 63

You are creating a new infrastructure CI/CD pipeline to deploy hundreds of ephemeral projects in your Google Cloud organization to enable your users to interact with Google Cloud. You want to restrict the use of the default networks in your organization while following Google-recommended best practices. What should you do?

Options:

A.

Enable the constraints/compute.skipDefaultNetworkCreation organization policy constraint at the organization level.

B.

Create a cron job to trigger a daily Cloud Function to automatically delete all default networks for each project.

C.

Grant your users the IAM Owner role at the organization level. Create a VPC Service Controls perimeter around the project that restricts the compute.googleapis.com API.

D.

Only allow your users to use your CI/CD pipeline with a predefined set of infrastructure templates they can deploy to skip the creation of the default networks.

Question 64

Options:

A.

Implement a Cloud Function that scans the environment variables multiple times a day, and creates a finding in Security Command Center if secrets are discovered.

B.

Implement regular peer reviews to assess the environment variables and identify secrets in your Cloud Functions. Raise a security incident if secrets are discovered.

C.

Use Sensitive Data Protection to scan the environment variables multiple times per day, and create a finding in Security Command Center if secrets are discovered.

D.

Integrate dynamic application security testing into the CI/CD pipeline that scans the application code for the Cloud Functions. Fail the build process if secrets are discovered.

Question 65

A customer wants to make it convenient for their mobile workforce to access a CRM web interface that is hosted on Google Cloud Platform (GCP). The CRM can only be accessed by someone on the corporate network. The customer wants to make it available over the internet. Your team requires an authentication layer in front of the application that supports two-factor authentication.

Which GCP product should the customer implement to meet these requirements?

Options:

A.

Cloud Identity-Aware Proxy

B.

Cloud Armor

C.

Cloud Endpoints

D.

Cloud VPN

Question 66

You are part of a security team investigating a compromised service account key. You need to audit which new resources were created by the service account.

What should you do?

Options:

A.

Query Data Access logs.

B.

Query Admin Activity logs.

C.

Query Access Transparency logs.

D.

Query Stackdriver Monitoring Workspace.
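
To illustrate option B, a hedged gcloud logging read query that lists Admin Activity entries generated by the compromised service account; the service account address, project ID, and lookback window are placeholders.

# Admin Activity audit logs record resource-creating and resource-modifying calls.
gcloud logging read \
  'logName="projects/PROJECT_ID/logs/cloudaudit.googleapis.com%2Factivity" AND protoPayload.authenticationInfo.principalEmail="compromised-sa@PROJECT_ID.iam.gserviceaccount.com"' \
  --project=PROJECT_ID --freshness=30d \
  --format='table(timestamp, protoPayload.methodName, protoPayload.resourceName)'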

Question 67

Your organization must comply with the regulation to keep instance logging data within Europe. Your workloads will be hosted in the Netherlands in region europe-west4 in a new project. You must configure Cloud Logging to keep your data in the country.

What should you do?

Options:

A.

Configure the organization policy constraint gcp.resourceLocations to europe-west4.

B.

Set the logging storage region to europe-west4 by using the gcloud CLI logging settings update command.

C.

Create a new log bucket in europe-west4, and redirect the _Default bucket to the new bucket.

D.

Configure log sink to export all logs into a Cloud Storage bucket in europe-west4.

Question 68

Your organization operates in a highly regulated environment and has a stringent set of compliance requirements for protecting customer data. You must encrypt data while in use to meet regulations. What should you do?

Options:

A.

Use customer-managed encryption keys (CMEK) and Cloud KMS to enable your organization to control their keys for data encryption in Cloud SQL.

B.

Enable the use of customer-supplied encryption keys (CSEK) in the Google Compute Engine VMs to give your organization maximum control over their VM disk encryption.

C.

Establish a trusted execution environment with a Confidential VM.

D.

Use a Shielded VM to ensure a secure boot with integrity monitoring for the application environment.
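
A minimal sketch of option C: creating a Confidential VM so data is encrypted while in use inside a trusted execution environment. The machine type, zone, and image are assumptions; Confidential VMs require a supported machine family such as N2D.

# Create a Confidential VM (AMD SEV-based trusted execution environment).
gcloud compute instances create confidential-app-vm \
  --zone=us-central1-a \
  --machine-type=n2d-standard-2 \
  --confidential-compute \
  --maintenance-policy=TERMINATE \
  --image-family=ubuntu-2204-lts --image-project=ubuntu-os-cloud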

Question 69

Your organization previously stored files in Cloud Storage by using Google Managed Encryption Keys (GMEK), but has recently updated the internal policy to require Customer Managed Encryption Keys (CMEK). You need to re-encrypt the files quickly and efficiently with minimal cost.

What should you do?

Options:

A.

Encrypt the files locally, and then use gsutil to upload the files to a new bucket.

B.

Copy the files to a new bucket with CMEK enabled in a secondary region

C.

Reupload the files to the same Cloud Storage bucket specifying a key file by using gsutil.

D.

Change the encryption type on the bucket to CMEK, and rewrite the objects
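
To make option D concrete, a hedged gsutil sketch that sets a default CMEK on the bucket and rewrites the objects in place so they are re-encrypted without downloading or re-uploading them; the bucket and key names are placeholders.

# Set the bucket's default Cloud KMS key (the Cloud Storage service agent needs
# roles/cloudkms.cryptoKeyEncrypterDecrypter on the key).
gsutil kms encryption \
  -k projects/PROJECT_ID/locations/us/keyRings/storage-ring/cryptoKeys/storage-key \
  gs://example-bucket

# Rewrite the existing objects in place so they pick up the new encryption key.
gsutil -m rewrite -r -k gs://example-bucket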

Question 70

A company has redundant mail servers in different Google Cloud Platform regions and wants to route customers to the nearest mail server based on location.

How should the company accomplish this?

Options:

A.

Configure TCP Proxy Load Balancing as a global load balancing service listening on port 995.

B.

Create a Network Load Balancer to listen on TCP port 995 with a forwarding rule to forward traffic based on location.

C.

Use Cross-Region Load Balancing with an HTTP(S) load balancer to route traffic to the nearest region.

D.

Use Cloud CDN to route the mail traffic to the closest origin mail server based on client IP address.

Question 71

You want to use the gcloud command-line tool to authenticate using a third-party single sign-on (SSO) SAML identity provider. Which options are necessary to ensure that authentication is supported by the third-party identity provider (IdP)? (Choose two.)

Options:

A.

SSO SAML as a third-party IdP

B.

Identity Platform

C.

OpenID Connect

D.

Identity-Aware Proxy

E.

Cloud Identity

Question 72

Your organization develops software involved in many open source projects and is concerned about software supply chain threats. You need to deliver provenance for the build to demonstrate that the software is untampered.

What should you do?

Options:

A.

• 1. Generate Supply Chain Levels for Software Artifacts (SLSA) level 3 assurance by using Cloud Build.• 2. View the build provenance in the Security insights side panel within the Google Cloud console.

B.

• 1. Review the software process.• 2. Generate private and public key pairs and use Pretty Good Privacy (PGP) protocols to sign the output software artifacts together with a file containing the address of your enterprise and point of contact.• 3. Publish the PGP signed attestation to your public web page.

C.

• 1. Publish the software code on GitHub as open source.• 2. Establish a bug bounty program, and encourage the open source community to review, report, and fix the vulnerabilities.

D.

• 1. Hire an external auditor to review and provide provenance.• 2. Define the scope and conditions.• 3. Get support from the Security department or representative.• 4. Publish the attestation to your public web page.
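
As a hedged sketch of option A, Cloud Build can be asked to generate verifiable build provenance, and the result can also be inspected from the CLI; the repository path, image name, digest, and the requestedVerifyOption setting are assumptions.

# Assumed cloudbuild.yaml option that requests provenance generation for the built image.
cat >> cloudbuild.yaml <<'EOF'
options:
  requestedVerifyOption: VERIFIED
EOF

# After the build, view the SLSA build provenance attached to the artifact.
gcloud artifacts docker images describe \
  us-central1-docker.pkg.dev/PROJECT_ID/app-repo/app-image@sha256:DIGEST \
  --show-provenance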

Question 73

A company migrated their entire data center to Google Cloud Platform. It is running thousands of instances across multiple projects managed by different departments. You want to have a historical record of what was running in Google Cloud Platform at any point in time.

What should you do?

Options:

A.

Use Resource Manager on the organization level.

B.

Use Forseti Security to automate inventory snapshots.

C.

Use Stackdriver to create a dashboard across all projects.

D.

Use Security Command Center to view all assets across the organization.

Question 74

Your organization is migrating its primary web application from on-premises to Google Kubernetes Engine (GKE). You must advise the development team on how to grant their applications access to Google Cloud services from within GKE according to security recommended practices. What should you do?

Options:

A.

Create an application-specific IAM service account and generate a user-managed service account key for it. Inject the key to the workload by storing it as a Kubernetes secret within the same namespace as the application.

B.

Enable Workload Identity for GKE. Assign a Kubernetes service account to the application and configure that Kubernetes service account to act as an Identity and Access Management (IAM) service account. Grant the required roles to the IAM service account.

C.

Configure the GKE nodes to use the default Compute Engine service account.

D.

Create a user-managed service account with only the roles required for the specific workload. Assign this service account to the GKE nodes.
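
A minimal sketch of option B under assumed names (cluster prod-cluster, namespace app-ns, Kubernetes service account app-ksa, IAM service account app-gsa):

# Enable Workload Identity on the cluster.
gcloud container clusters update prod-cluster --zone=us-central1-a \
  --workload-pool=PROJECT_ID.svc.id.goog

# Create the Kubernetes service account the workload will run as.
kubectl create serviceaccount app-ksa --namespace app-ns

# Allow the Kubernetes service account to impersonate the IAM service account.
gcloud iam service-accounts add-iam-policy-binding \
  app-gsa@PROJECT_ID.iam.gserviceaccount.com \
  --role=roles/iam.workloadIdentityUser \
  --member="serviceAccount:PROJECT_ID.svc.id.goog[app-ns/app-ksa]"

# Annotate the Kubernetes service account with the IAM service account it maps to.
kubectl annotate serviceaccount app-ksa --namespace app-ns \
  iam.gke.io/gcp-service-account=app-gsa@PROJECT_ID.iam.gserviceaccount.com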

Question 75

A company allows every employee to use Google Cloud Platform. Each department has a Google Group, with all department members as group members. If a department member creates a new project, all members of that department should automatically have read-only access to all new project resources. Members of any other department should not have access to the project. You need to configure this behavior.

What should you do to meet these requirements?

Options:

A.

Create a Folder per department under the Organization. For each department’s Folder, assign the Project Viewer role to the Google Group related to that department.

B.

Create a Folder per department under the Organization. For each department’s Folder, assign the Project Browser role to the Google Group related to that department.

C.

Create a Project per department under the Organization. For each department’s Project, assign the Project Viewer role to the Google Group related to that department.

D.

Create a Project per department under the Organization. For each department’s Project, assign the Project Browser role to the Google Group related to that department.
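
A short sketch of option A with assumed names (one department folder and its Google Group):

# Create a folder for the department under the organization.
gcloud resource-manager folders create \
  --display-name="Engineering" --organization=ORGANIZATION_ID

# Grant the department group read-only access to every project created in the folder.
gcloud resource-manager folders add-iam-policy-binding FOLDER_ID \
  --member="group:engineering@example.com" --role="roles/viewer"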

Question 76

Your organization has had a few recent DDoS attacks. You need to authenticate responses to domain name lookups. Which Google Cloud service should you use?

Options:

A.

Cloud DNS with DNSSEC

B.

Cloud NAT

C.

HTTP(S) Load Balancing

D.

Google Cloud Armor
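
For option A, a one-line hedged sketch that turns on DNSSEC for an existing Cloud DNS managed zone (the zone name is an assumption):

# Enable DNSSEC signing so resolvers can authenticate responses for the zone.
gcloud dns managed-zones update example-public-zone --dnssec-state=on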

Question 77

Your organization is worried about recent news headlines regarding application vulnerabilities in production applications that have led to security breaches. You want to automatically scan your deployment pipeline for vulnerabilities and ensure only scanned and verified containers can run in the environment. What should you do?

Options:

A.

Enable Binary Authorization and create attestations of scans.

B.

Use gcloud artifacts docker images describe LOCATION-docker.pkg.dev/PROJECT_ID/REPOSITORY/IMAGE_ID@sha256:HASH --show-package-vulnerability in your CI/CD pipeline, and trigger a pipeline failure for critical vulnerabilities.

C.

Use Kubernetes role-based access control (RBAC) as the source of truth for cluster access by granting "container.clusters.get" to limited users. Restrict deployment access by allowing these users to generate a kubeconfig file containing the configuration for access to the GKE cluster.

D.

Enforce the use of Cloud Code for development so users receive real-time security feedback on vulnerable libraries and dependencies before they check in their code.
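
A hedged sketch of option A: create an attestor backed by a Container Analysis note and require attestations in the Binary Authorization policy. The note, project, and attestor names are assumptions, and the attestation itself would be created by the CI/CD pipeline only after a successful vulnerability scan.

# Create an attestor that references an existing Container Analysis note.
gcloud container binauthz attestors create vuln-scan-attestor \
  --attestation-authority-note=vuln-scan-note \
  --attestation-authority-note-project=PROJECT_ID

# Export the policy, add a rule requiring attestations from the attestor, then re-import it.
gcloud container binauthz policy export > policy.yaml
# (edit policy.yaml: set evaluationMode to REQUIRE_ATTESTATION and list the attestor)
gcloud container binauthz policy import policy.yaml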

Question 78

An office manager at your small startup company is responsible for matching payments to invoices and creating billing alerts. For compliance reasons, the office manager is only permitted to have the Identity and Access Management (IAM) permissions necessary for these tasks. Which two IAM roles should the office manager have? (Choose two.)

Options:

A.

Organization Administrator

B.

Project Creator

C.

Billing Account Viewer

D.

Billing Account Costs Manager

E.

Billing Account User

Question 79

You are a security administrator at your company. Per Google-recommended best practices, you implemented the domain restricted sharing organization policy to allow only required domains to access your projects. An engineering team is now reporting that users at an external partner outside your organization domain cannot be granted access to the resources in a project. How should you make an exception for your partner's domain while following the stated best practices?

Options:

A.

Turn off the domain restricted sharing organization policy. Set the policy value to "Allow All."

B.

Turn off the domain restricted sharing organization policy. Provide the external partners with the required permissions using Google's Identity and Access Management (IAM) service.

C.

Turn off the domain restricted sharing organization policy. Add each partner's Google Workspace customer ID to a Google group, add the Google group as an exception under the organization policy, and then turn the policy back on.

D.

Turn off the domain restricted sharing organization policy. Set the policy value to "Custom." Add each external partner's Cloud Identity or Google Workspace customer ID as an exception under the organization policy, and then turn the policy back on.
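
To make option D concrete, a hedged sketch of the updated organization policy; the organization ID and the Workspace or Cloud Identity customer IDs (values beginning with C) are placeholders.

# Assumed policy file: allow identities from your own directory and the partner's directory.
cat > allowed_domains_policy.yaml <<'EOF'
name: organizations/ORGANIZATION_ID/policies/iam.allowedPolicyMemberDomains
spec:
  rules:
  - values:
      allowedValues:
      - C0xxxxxxxx   # your organization's directory customer ID
      - C0yyyyyyyy   # external partner's directory customer ID
EOF
gcloud org-policies set-policy allowed_domains_policy.yaml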

Question 80

A customer deployed an application on Compute Engine that takes advantage of the elastic nature of cloud computing.

How can you work with Infrastructure Operations Engineers to best ensure that Windows Compute Engine VMs are up to date with all the latest OS patches?

Options:

A.

Build new base images when patches are available, and use a CI/CD pipeline to rebuild VMs, deploying incrementally.

B.

Federate a Domain Controller into Compute Engine, and roll out weekly patches via Group Policy Object.

C.

Use Deployment Manager to provision updated VMs into new serving Instance Groups (IGs).

D.

Reboot all VMs during the weekly maintenance window and allow the StartUp Script to download the latest patches from the internet.

Question 81

Your team wants to make sure Compute Engine instances running in your production project do not have public IP addresses. The frontend application Compute Engine instances will require public IPs. The product engineers have the Editor role to modify resources. Your team wants to enforce this requirement.

How should your team meet these requirements?

Options:

A.

Enable Private Access on the VPC network in the production project.

B.

Remove the Editor role and grant the Compute Admin IAM role to the engineers.

C.

Set up an organization policy to only permit public IPs for the front-end Compute Engine instances.

D.

Set up a VPC network with two subnets: one with public IPs and one without public IPs.
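
A hedged sketch of option C using the compute.vmExternalIpAccess list constraint; the project, zone, and instance names are assumptions.

# Assumed policy: only the named front-end instances may have external IP addresses.
cat > external_ip_policy.yaml <<'EOF'
name: projects/PROJECT_ID/policies/compute.vmExternalIpAccess
spec:
  rules:
  - values:
      allowedValues:
      - projects/PROJECT_ID/zones/us-central1-a/instances/frontend-1
      - projects/PROJECT_ID/zones/us-central1-a/instances/frontend-2
EOF
gcloud org-policies set-policy external_ip_policy.yaml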

Question 82

You work for a healthcare provider that is expanding into the cloud to store and process sensitive patient data. You must ensure the chosen Google Cloud configuration meets these strict regulatory requirements:

Data must reside within specific geographic regions.

Certain administrative actions on patient data require explicit approval from designated compliance officers.

Access to patient data must be auditable.

What should you do?

Options:

A.

Select multiple standard Google Cloud regions for high availability. Implement Access Control Lists (ACLs) on individual storage objects containing patient data. Enable Cloud Audit Logs.

B.

Deploy an Assured Workloads environment in multiple regions for redundancy. Utilize custom IAM roles with granular permissions. Isolate network-level data by using VPC Service Controls.

C.

Deploy an Assured Workloads environment in an approved region. Configure Access Approval for sensitive operations on patient data. Enable both Cloud Audit Logs and Access Transparency.

D.

Select a standard Google Cloud region. Restrict access to patient data based on user location and job function by using Access Context Manager. Enable both Cloud Audit Logging and Access Transparency.

Question 83

You work for a large organization that recently implemented a 100 Gbps Cloud Interconnect connection between your Google Cloud environment and your on-premises edge router. While routinely checking the connectivity, you noticed that the connection is operational but there is an error message that indicates MACsec is operationally down. You need to resolve this error. What should you do?

Options:

A.

Ensure that the active pre-shared key created for MACsec is not expired on both the on-premises and Google edge routers.

B.

Ensure that the active pre-shared key matches on both the on-premises and Google edge routers.

C.

Ensure that the Cloud Interconnect connection supports MACsec.

D.

Ensure that the on-premises router is not down.

Question 84

Users are reporting an outage on your public-facing application that is hosted on Compute Engine. You suspect that a recent change to your firewall rules is responsible. You need to test whether your firewall rules are working properly. What should you do?

Options:

A.

Enable Firewall Rules Logging on the latest rules that were changed. Use Logs Explorer to analyze whether the rules are working correctly.

B.

Connect to a bastion host in your VPC. Use a network traffic analyzer to determine at which point your requests are being blocked.

C.

In a pre-production environment, disable all firewall rules individually to determine which one is blocking user traffic.

D.

Enable VPC Flow Logs in your VPC. Use Logs Explorer to analyze whether the rules are working correctly.
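
A short sketch of option A, assuming the recently changed rule is named allow-web-ingress:

# Turn on Firewall Rules Logging for the suspect rule.
gcloud compute firewall-rules update allow-web-ingress --enable-logging

# Inspect the firewall log entries, for example denied connections, from the CLI or Logs Explorer.
gcloud logging read \
  'logName="projects/PROJECT_ID/logs/compute.googleapis.com%2Ffirewall" AND jsonPayload.disposition="DENIED"' \
  --project=PROJECT_ID --freshness=1h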

Question 85

Your company has multiple teams needing access to specific datasets across various Google Cloud data services for different projects. You need to ensure that team members can only access the data relevant to their projects and prevent unauthorized access to sensitive information within BigQuery, Cloud Storage, and Cloud SQL. What should you do?

Options:

A.

Grant project-level group permissions by using specific Cloud IAM roles. Use BigQuery authorized views, Cloud Storage uniform bucket-level access, and Cloud SQL database roles.

B.

Configure an access level to control access to the Google Cloud console for users managing these data services. Require multi-factor authentication for all access attempts.

C.

Use VPC Service Controls to create security perimeters around the projects for BigQuery, Cloud Storage, and Cloud SQL services, restricting access based on the network origin of the requests.

D.

Enable project-level data access logs for BigQuery, Cloud Storage, and Cloud SQL. Configure log sinks to export these logs to Security Command Center to identify unauthorized access attempts.

Question 86

A customer needs to launch a 3-tier internal web application on Google Cloud Platform (GCP). The customer’s internal compliance requirements dictate that end-user access may only be allowed if the traffic seems to originate from a specific known good CIDR. The customer accepts the risk that their application will only have SYN flood DDoS protection. They want to use GCP’s native SYN flood protection.

Which product should be used to meet these requirements?

Options:

A.

Cloud Armor

B.

VPC Firewall Rules

C.

Cloud Identity and Access Management

D.

Cloud CDN
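
To illustrate option A, a hedged Cloud Armor sketch that allows only the known good CIDR and denies everything else; the policy name, IP range, and backend service name are assumptions.

# Create the security policy and an allow rule for the approved corporate CIDR.
gcloud compute security-policies create corp-only-policy \
  --description="Allow only the known good CIDR"
gcloud compute security-policies rules create 1000 \
  --security-policy=corp-only-policy \
  --src-ip-ranges="203.0.113.0/24" --action=allow

# Change the default (lowest priority) rule to deny all other traffic.
gcloud compute security-policies rules update 2147483647 \
  --security-policy=corp-only-policy --action=deny-403

# Attach the policy to the load balancer backend service fronting the web tier.
gcloud compute backend-services update web-backend-service \
  --security-policy=corp-only-policy --global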

Question 87

You are a security engineer at a finance company. Your organization plans to store data on Google Cloud, but your leadership team is worried about the security of their highly sensitive data. Specifically, your company is concerned about internal Google employees' ability to access your company's data on Google Cloud. What solution should you propose?

Options:

A.

Use customer-managed encryption keys.

B.

Use Google's Identity and Access Management (IAM) service to manage access controls on Google Cloud.

C.

Enable Admin activity logs to monitor access to resources.

D.

Enable Access Transparency logs with Access Approval requests for Google employees.

Question 88

A customer terminates an engineer and needs to make sure the engineer's Google account is automatically deprovisioned.

What should the customer do?

Options:

A.

Use the Cloud SDK with their directory service to remove their IAM permissions in Cloud Identity.

B.

Use the Cloud SDK with their directory service to provision and deprovision users from Cloud Identity.

C.

Configure Cloud Directory Sync with their directory service to provision and deprovision users from Cloud Identity.

D.

Configure Cloud Directory Sync with their directory service to remove their IAM permissions in Cloud Identity.

Question 89

Your organization operates Virtual Machines (VMs) with only private IPs in the Virtual Private Cloud (VPC), with internet access through Cloud NAT. Every day, you must patch all VMs with critical OS updates and provide summary reports.

What should you do?

Options:

A.

Validate that the egress firewall rules allow any outgoing traffic. Log in to each VM and execute OS-specific update commands. Configure a Cloud Scheduler job to apply critical patches daily.

B.

Ensure that VM Manager is installed and running on the VMs. In the OS patch management service, configure the patch jobs to update with critical patches daily.

C.

Assign public IPs to the VMs. Validate that the egress firewall rules allow any outgoing traffic. Log in to each VM, and configure a daily cron job to apply OS updates at night during low-activity periods.

D.

Copy the latest patches to a Cloud Storage bucket. Log in to each VM, download the patches from the bucket, and install them.
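
For option B, a hedged sketch of running an OS patch job across the fleet with VM Manager; the display name and filter are assumptions, and recurring daily patching would normally be defined as a patch deployment rather than an ad hoc job.

# Run a patch job against every VM in the project (requires the OS Config agent).
gcloud compute os-config patch-jobs execute \
  --instance-filter-all --display-name="critical-os-updates"

# Review completed jobs to build the summary report.
gcloud compute os-config patch-jobs list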

Question 90

You are deploying regulated workloads on Google Cloud. The regulation has data residency and data access requirements. It also requires that support is provided from the same geographical location as where the data resides.

What should you do?

Options:

A.

Enable Access Transparency Logging.

B.

Deploy resources only to regions permitted by data residency requirements

C.

Use Data Access logging and Access Transparency logging to confirm that no users are accessing data from another region.

D.

Deploy Assured Workloads.

Question 91

A customer has 300 engineers. The company wants to grant different levels of access and efficiently manage IAM permissions between users in the development and production environment projects.

Which two steps should the company take to meet these requirements? (Choose two.)

Options:

A.

Create a project with multiple VPC networks for each environment.

B.

Create a folder for each development and production environment.

C.

Create a Google Group for the Engineering team, and assign permissions at the folder level.

D.

Create an Organizational Policy constraint for each folder environment.

E.

Create projects for each environment, and grant IAM rights to each engineering user.

Question 92

In an effort to bring your company's messaging app into compliance with FIPS 140-2, a decision was made to use GCP compute and network services. The messaging app architecture includes a Managed Instance Group (MIG) that controls a cluster of Compute Engine instances. The instances use Local SSDs for data caching and UDP for instance-to-instance communications. The app development team is willing to make any changes necessary to comply with the standard.

Which options should you recommend to meet the requirements?

Options:

A.

Encrypt all cache storage and VM-to-VM communication using the BoringCrypto module.

B.

Set Disk Encryption on the Instance Template used by the MIG to customer-managed key and use BoringSSL for all data transit between instances.

C.

Change the app instance-to-instance communications from UDP to TCP and enable BoringSSL on clients' TLS connections.

D.

Set Disk Encryption on the Instance Template used by the MIG to Google-managed Key and use BoringSSL library on all instance-to-instance communications.

Question 93

You are a Cloud Identity administrator for your organization. In your Google Cloud environment, groups are used to manage user permissions. Each application team has a dedicated group. Your team is responsible for creating these groups, and the application teams can manage the team members on their own through the Google Cloud console. You must ensure that the application teams can only add users from within your organization to their groups.

What should you do?

Options:

A.

Change the configuration of the relevant groups in the Google Workspace Admin console to prevent external users from being added to the group.

B.

Set an Identity and Access Management (IAM) policy that includes a condition that restricts group membership to user principals that belong to your organization.

C.

Define an Identity and Access Management (IAM) deny policy that denies the assignment of principals that are outside your organization to the groups in scope.

D.

Export the Cloud Identity logs to BigQuery. Configure an alert for external members added to groups. Have the alert trigger a Cloud Function instance that removes the external members from the group.

Question 94

Your organization acquired a new workload. The Web and Application (App) servers will be running on Compute Engine in a newly created custom VPC. You are responsible for configuring a secure network communication solution that meets the following requirements:

Only allows communication between the Web and App tiers.

Enforces consistent network security when autoscaling the Web and App tiers.

Prevents Compute Engine Instance Admins from altering network traffic.

What should you do?

Options:

A.

1. Configure all running Web and App servers with respective network tags. 2. Create an allow VPC firewall rule that specifies the target/source with respective network tags.

B.

1. Configure all running Web and App servers with respective service accounts. 2. Create an allow VPC firewall rule that specifies the target/source with respective service accounts.

C.

1. Re-deploy the Web and App servers with instance templates configured with respective network tags. 2. Create an allow VPC firewall rule that specifies the target/source with respective network tags.

D.

1. Re-deploy the Web and App servers with instance templates configured with respective service accounts. 2. Create an allow VPC firewall rule that specifies the target/source with respective service accounts.
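
A minimal sketch of option D, assuming the Web tier runs as web-sa, the App tier runs as app-sa, and the App tier listens on TCP port 8080:

# Allow traffic only from instances running as the Web service account to
# instances running as the App service account.
gcloud compute firewall-rules create allow-web-to-app \
  --network=custom-vpc --direction=INGRESS --action=ALLOW --rules=tcp:8080 \
  --source-service-accounts=web-sa@PROJECT_ID.iam.gserviceaccount.com \
  --target-service-accounts=app-sa@PROJECT_ID.iam.gserviceaccount.com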

Question 95

A customer is running an analytics workload on Google Cloud Platform (GCP) where Compute Engine instances are accessing data stored on Cloud Storage. Your team wants to make sure that this workload will not be able to access, or be accessed from, the internet.

Which two strategies should your team use to meet these requirements? (Choose two.)

Options:

A.

Configure Private Google Access on the Compute Engine subnet

B.

Avoid assigning public IP addresses to the Compute Engine cluster.

C.

Make sure that the Compute Engine cluster is running on a separate subnet.

D.

Turn off IP forwarding on the Compute Engine instances in the cluster.

E.

Configure a Cloud NAT gateway.
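
A short sketch of options A and B combined, with assumed subnet and instance names:

# Enable Private Google Access so instances without external IPs can still reach Cloud Storage.
gcloud compute networks subnets update analytics-subnet \
  --region=us-central1 --enable-private-ip-google-access

# Create cluster instances without public IP addresses.
gcloud compute instances create analytics-worker-1 \
  --zone=us-central1-a --subnet=analytics-subnet --no-address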
