
Google Professional-Cloud-Security-Engineer Dumps

Google Cloud Certified - Professional Cloud Security Engineer Questions and Answers

Question 1

Your organization is using Google Workspace, Google Cloud, and a third-party SIEM. You need to export events such as user logins, successful logins, and failed logins to the SIEM. Logs need to be ingested in real time or near real time. What should you do?

Options:

A.

Create a Cloud Logging sink to export relevant authentication logs to a Pub/Sub topic for SIEM subscription.

B.

Poll Cloud Logging for authentication events using the gcloud logging read tool. Forward the events to the SIEM.

C.

Configure Google Workspace to directly send logs to the API endpoint of the third-party SIEM.

D.

Create a Cloud Storage bucket as a sink for all logs. Configure the SIEM to periodically scan the bucket for new log files.
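
For reference, the routing described in option A can be set up with the Cloud Logging client library. The sketch below is illustrative only: the project ID, topic name, and log filter are placeholders, and the exact filter depends on which Workspace and Cloud Audit Logs you share with the SIEM.

```python
# Illustrative sketch: route authentication-related audit log entries to a
# Pub/Sub topic that a third-party SIEM subscribes to.
# Project, topic, and filter values are placeholders.
from google.cloud import logging

client = logging.Client(project="example-project")

# Placeholder filter narrowing the sink to login-related audit log entries.
log_filter = (
    'logName:"cloudaudit.googleapis.com" '
    'AND protoPayload.methodName:"google.login"'
)

sink = client.sink(
    "siem-auth-events",
    filter_=log_filter,
    destination="pubsub.googleapis.com/projects/example-project/topics/siem-auth-events",
)

if not sink.exists():
    sink.create(unique_writer_identity=True)
    # The sink's writer identity must be granted roles/pubsub.publisher on the topic.
    print(f"Created sink {sink.name}; grant Pub/Sub Publisher to {sink.writer_identity}")
```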

Question 2

A company is backing up application logs to a Cloud Storage bucket shared with both analysts and the administrator. Analysts should only have access to logs that do not contain any personally identifiable information (PII). Log files containing PII should be stored in another bucket that is only accessible by the administrator.

What should you do?

Options:

A.

Use Cloud Pub/Sub and Cloud Functions to trigger a Data Loss Prevention scan every time a file is uploaded to the shared bucket. If the scan detects PII, have the function move the file into a Cloud Storage bucket that is only accessible by the administrator.

B.

Upload the logs to both the shared bucket and the bucket only accessible by the administrator. Create a job trigger using the Cloud Data Loss Prevention API. Configure the trigger to delete any files from the shared bucket that contain PII.

C.

On the bucket shared with both the analysts and the administrator, configure Object Lifecycle Management to delete objects that contain any PII.

D.

On the bucket shared with both the analysts and the administrator, configure a Cloud Storage Trigger that is only triggered when PII data is uploaded. Use Cloud Functions to capture the trigger and delete such files.
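
As a rough sketch of the pattern in option A, a Cloud Functions-style handler could run a DLP inspection on each newly uploaded object and relocate files that contain PII to an administrator-only bucket. The bucket names, info types, and trigger wiring below are assumptions for illustration only.

```python
# Sketch of a storage-triggered handler: inspect an uploaded log file with the
# DLP API and, if PII is found, move it to an administrator-only bucket.
# Bucket names and info types are placeholders.
import google.cloud.dlp_v2 as dlp_v2
from google.cloud import storage

PROJECT = "example-project"
SHARED_BUCKET = "shared-logs"
RESTRICTED_BUCKET = "admin-only-logs"

dlp = dlp_v2.DlpServiceClient()
gcs = storage.Client(project=PROJECT)


def handle_upload(object_name: str) -> None:
    """Inspect one object; quarantine it if any PII findings are returned."""
    source_bucket = gcs.bucket(SHARED_BUCKET)
    blob = source_bucket.blob(object_name)
    content = blob.download_as_text()

    response = dlp.inspect_content(
        request={
            "parent": f"projects/{PROJECT}",
            "inspect_config": {
                "info_types": [{"name": "EMAIL_ADDRESS"}, {"name": "PHONE_NUMBER"}],
                "min_likelihood": dlp_v2.Likelihood.POSSIBLE,
            },
            "item": {"value": content},
        }
    )

    if response.result.findings:
        # Copy to the restricted bucket, then remove from the shared bucket.
        source_bucket.copy_blob(blob, gcs.bucket(RESTRICTED_BUCKET), object_name)
        blob.delete()
```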

Question 3

Your company has deployed an application on Compute Engine. The application is accessible by clients on port 587. You need to balance the load between the different instances running the application. The connection should be secured using TLS, and terminated by the Load Balancer.

What type of Load Balancing should you use?

Options:

A.

Network Load Balancing

B.

HTTP(S) Load Balancing

C.

TCP Proxy Load Balancing

D.

SSL Proxy Load Balancing

Question 4

You manage a fleet of virtual machines (VMs) in your organization. You have encountered issues with lack of patching in many VMs. You need to automate regular patching in your VMs and view the patch management data across multiple projects.

What should you do?

Choose 2 answers

Options:

A.

Deploy patches with VM Manager by using OS patch management

B.

View patch management data in VM Manager by using OS patch management.

C.

Deploy patches with Security Command Center by using Rapid Vulnerability Detection.

D.

View patch management data in a Security Command Center dashboard.

E.

View patch management data in Artifact Registry.

Question 5

Your company requires the security and network engineering teams to identify all network anomalies within and across VPCs, internal traffic from VMs to VMs, traffic between end locations on the internet and VMs, and traffic from VMs to Google Cloud services in production. Which method should you use?

Options:

A.

Define an organization policy constraint.

B.

Configure packet mirroring policies.

C.

Enable VPC Flow Logs on the subnet.

D.

Monitor and analyze Cloud Audit Logs.

Question 6

You’re developing the incident response plan for your company. You need to define the access strategy that your DevOps team will use when reviewing and investigating a deployment issue in your Google Cloud environment. There are two main requirements:

    Least-privilege access must be enforced at all times.

    The DevOps team must be able to access the required resources only during the deployment issue.

How should you grant access while following Google-recommended best practices?

Options:

A.

Assign the Project Viewer Identity and Access Management (IAM) role to the DevOps team.

B.

Create a custom IAM role with limited list/view permissions, and assign it to the DevOps team.

C.

Create a service account, and grant it the Project Owner IAM role. Give the Service Account User role on this service account to the DevOps team.

D.

Create a service account, and grant it limited list/view permissions. Give the Service Account User role on this service account to the DevOps team.

Question 7

A customer deploys an application to App Engine and needs to check for Open Web Application Security Project (OWASP) vulnerabilities.

Which service should be used to accomplish this?

Options:

A.

Cloud Armor

B.

Google Cloud Audit Logs

C.

Cloud Security Scanner

D.

Forseti Security

Question 8

Your company operates an application instance group that is currently deployed behind a Google Cloud load balancer in us-central-1 and is configured to use the Standard Tier network. The infrastructure team wants to expand to a second Google Cloud region, us-east-2. You need to set up a single external IP address to distribute new requests to the instance groups in both regions.

What should you do?

Options:

A.

Change the load balancer backend configuration to use network endpoint groups instead of instance groups.

B.

Change the load balancer frontend configuration to use the Premium Tier network, and add the new instance group.

C.

Create a new load balancer in us-east-2 using the Standard Tier network, and assign a static external IP address.

D.

Create a Cloud VPN connection between the two regions, and enable Google Private Access.

Question 9

An organization's security and risk management teams are concerned about where their responsibility lies for certain production workloads they are running in Google Cloud Platform (GCP), and where Google's responsibility lies. They are mostly running workloads using Google Cloud's Platform-as-a-Service (PaaS) offerings, primarily App Engine.

Which one of these areas in the technology stack would they need to focus on as their primary responsibility when using App Engine?

Options:

A.

Configuring and monitoring VPC Flow Logs

B.

Defending against XSS and SQLi attacks

C.

Managing the latest updates and security patches for the Guest OS

D.

Encrypting all stored data

Question 10

An organization’s typical network and security review consists of analyzing application transit routes, request handling, and firewall rules. They want to enable their developer teams to deploy new applications without the overhead of this full review.

How should you advise this organization?

Options:

A.

Use Forseti with Firewall filters to catch any unwanted configurations in production.

B.

Mandate use of infrastructure as code and provide static analysis in the CI/CD pipelines to enforce policies.

C.

Route all VPC traffic through customer-managed routers to detect malicious patterns in production.

D.

All production applications will run on-premises. Allow developers free rein in GCP as their dev and QA platforms.

Question 11

Your company’s detection and response team requires break-glass access to the Google Cloud organization in the event of a security investigation. At the end of each day, all security group membership is removed. You need to automate user provisioning to a Cloud Identity security group. You have created a service account to provision group memberships. Your solution must follow Google-recommended practices and comply with the principle of least privilege. What should you do?

Options:

A.

In Google Workspace, grant the service account client ID access to the scope, https://www.googleapis.com/auth/admin.directory.group, by using domain-wide delegation, and use a service account key.

B.

In Google Workspace, grant the service account client ID access to the scope, https://www.googleapis.com/auth/admin.directory.group, by using domain-wide delegation. Use Application Default Credentials with the resource-attached service account.

C.

In Google Workspace, grant the Groups Editor role to the service account. Enable the Cloud Identity API. Use a service account key.

D.

In Google Workspace, grant the Groups Editor role to the service account, enable the Cloud Identity API, and use Application Default Credentials with the resource-attached service account.
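
For context on the Application Default Credentials approach in option D, group memberships can be provisioned through the Admin SDK Directory API. The sketch below assumes the code runs on a resource with an attached service account that has already been granted the necessary Groups Editor privileges; the group and member addresses are placeholders.

```python
# Sketch: add a break-glass responder to a Cloud Identity security group using
# the Admin SDK Directory API with Application Default Credentials (no SA key).
# Group and member addresses are placeholders.
import google.auth
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/admin.directory.group"]

# Picks up the resource-attached service account at runtime.
credentials, _ = google.auth.default(scopes=SCOPES)
directory = build("admin", "directory_v1", credentials=credentials)

directory.members().insert(
    groupKey="break-glass@example.com",
    body={"email": "responder@example.com", "role": "MEMBER"},
).execute()
```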

Question 12

You are managing a set of Google Cloud projects that are contained in a folder named Data Warehouse. A new data analysis team has been approved to perform data analysis for all BigQuery data in the projects within the Data Warehouse folder. They should only be able to read the data and not have permissions to modify or delete the data. You want to reduce the operational overhead of provisioning access while adhering to the principle of least privilege. What should you do?

Options:

A.

Grant the BigQuery Data Viewer role at the dataset level for each BigQuery dataset within each project in the Data Warehouse folder.

B.

Grant the BigQuery Data Viewer role at the Data Warehouse folder level.

C.

Grant the BigQuery Data Viewer role at the project level for each project within the Data Warehouse folder.

D.

Grant the BigQuery Metadata Viewer role at the Data Warehouse folder level.
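
To illustrate what a folder-level grant (option B) looks like programmatically, the sketch below uses the Resource Manager v3 client to add a group binding for roles/bigquery.dataViewer on a folder. The folder ID and group address are placeholders.

```python
# Sketch: grant roles/bigquery.dataViewer to a group at the folder level so the
# grant is inherited by every project in the folder.
# Folder ID and group address are placeholders.
from google.cloud import resourcemanager_v3

client = resourcemanager_v3.FoldersClient()
folder = "folders/123456789"  # the "Data Warehouse" folder

policy = client.get_iam_policy(request={"resource": folder})
policy.bindings.add(
    role="roles/bigquery.dataViewer",
    members=["group:data-analysis-team@example.com"],
)
client.set_iam_policy(request={"resource": folder, "policy": policy})
```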

Question 13

Your company has deployed an artificial intelligence model in a central project. As this model has a lot of sensitive intellectual property and must be kept strictly isolated from the internet, you must expose the model endpoint only to a defined list of projects in your organization. What should you do?

Options:

A.

Create a central project to host Shared VPC networks that are provided to all other projects. Centrally administer all firewall rules in this project to grant access to the model.

B.

Activate Private Google Access in both the model project as well as in each project that needs to connect to the model. Create a firewall policy to allow connectivity to Private Google Access addresses.

C.

Within the model project, create an internal Application Load Balancer that points to the model endpoint. Expose this load balancer with Private Service Connect to a configured list of projects.

D.

Within the model project, create an external Application Load Balancer that points to the model endpoint. Create a Cloud Armor policy to restrict IP addresses to Google Cloud.

Question 14

A DevOps team will create a new container to run on Google Kubernetes Engine. As the application will be internet-facing, they want to minimize the attack surface of the container.

What should they do?

Options:

A.

Use Cloud Build to build the container images.

B.

Build small containers using small base images.

C.

Delete non-used versions from Container Registry.

D.

Use a Continuous Delivery tool to deploy the application.

Question 15

You have a highly sensitive BigQuery workload that contains personally identifiable information (PII) that you want to ensure is not accessible from the internet. To prevent data exfiltration, only requests from authorized IP addresses are allowed to query your BigQuery tables.

What should you do?

Options:

A.

Use a service perimeter and create an access level based on the authorized source IP address as the condition.

B.

Use Google Cloud Armor security policies defining an allowlist of authorized IP addresses at the global HTTPS load balancer.

C.

Use the Restrict allowed Google Cloud APIs and services organization policy constraint along with Cloud Data Loss Prevention (DLP).

D.

Use the Restrict Resource service usage organization policy constraint along with Cloud Data Loss Prevention (DLP).

Question 16

You define central security controls in your Google Cloud environment. For one of the folders in your organization, you set an organizational policy to deny the assignment of external IP addresses to VMs. Two days later, you receive an alert about a new VM with an external IP address under that folder.

What could have caused this alert?

Options:

A.

The VM was created with a static external IP address that was reserved in the project before the organizational policy rule was set.

B.

The organizational policy constraint wasn't properly enforced and is running in "dry run" mode.

C.

At the project level, the organizational policy control has been overridden with an "allow" value.

D.

The policy constraint at the folder level does not have any effect because of an "allow" value for that constraint at the organization level.

Question 17

Your organization is using GitHub Actions as a continuous integration and delivery (CI/CD) platform. You must enable access to Google Cloud resources from the CI/CD pipelines in the most secure way.

What should you do?

Options:

A.

Create a service account key and add it to the GitHub pipeline configuration file.

B.

Create a service account key and add it to the GitHub repository content.

C.

Configure a Google Kubernetes Engine cluster that uses Workload Identity to supply credentials to GitHub.

D.

Configure workload identity federation to use GitHub as an identity pool provider.

Question 18

Your company has been creating users manually in Cloud Identity to provide access to Google Cloud resources. Due to continued growth of the environment, you want to authorize the Google Cloud Directory Sync (GCDS) instance and integrate it with your on-premises LDAP server to onboard hundreds of users. You are required to:

Replicate user and group lifecycle changes from the on-premises LDAP server in Cloud Identity.

Disable any manually created users in Cloud Identity.

You have already configured the LDAP search attributes to include the users and security groups in scope for Google Cloud. What should you do next to complete this solution?

Options:

A.

1. Configure the option to suspend domain users not found in LDAP. 2. Set up a recurring GCDS task.

B.

1. Configure the option to delete domain users not found in LDAP. 2. Run GCDS after user and group lifecycle changes.

C.

1. Configure the LDAP search attributes to exclude manually created Cloud Identity users not found in LDAP. 2. Set up a recurring GCDS task.

D.

1. Configure the LDAP search attributes to exclude manually created Cloud Identity users not found in LDAP. 2. Run GCDS after user and group lifecycle changes.

Question 19

You are a Cloud Identity administrator for your organization. In your Google Cloud environment, groups are used to manage user permissions. Each application team has a dedicated group. Your team is responsible for creating these groups, and the application teams can manage the team members on their own through the Google Cloud console. You must ensure that the application teams can only add users from within your organization to their groups.

What should you do?

Options:

A.

Change the configuration of the relevant groups in the Google Workspace Admin console to prevent external users from being added to the group.

B.

Set an Identity and Access Management (IAM) policy that includes a condition that restricts group membership to user principals that belong to your organization.

C.

Define an Identity and Access Management (IAM) deny policy that denies the assignment of principals that are outside your organization to the groups in scope.

D.

Export the Cloud Identity logs to BigQuery. Configure an alert for external members added to groups. Have the alert trigger a Cloud Function instance that removes the external members from the group.

Question 20

Your company plans to move most of its IT infrastructure to Google Cloud. They want to leverage their existing on-premises Active Directory as an identity provider for Google Cloud. Which two steps should you take to integrate the company’s on-premises Active Directory with Google Cloud and configure access management? (Choose two.)

Options:

A.

Use Identity Platform to provision users and groups to Google Cloud.

B.

Use Cloud Identity SAML integration to provision users and groups to Google Cloud.

C.

Install Google Cloud Directory Sync and connect it to Active Directory and Cloud Identity.

D.

Create Identity and Access Management (IAM) roles with permissions corresponding to each Active Directory group.

E.

Create Identity and Access Management (IAM) groups with permissions corresponding to each Active Directory group.

Question 21

You will create a new Service Account that should be able to list the Compute Engine instances in the project. You want to follow Google-recommended practices.

What should you do?

Options:

A.

Create an Instance Template, and allow the Service Account Read Only access for the Compute Engine Access Scope.

B.

Create a custom role with the permission compute.instances.list and grant the Service Account this role.

C.

Give the Service Account the role of Compute Viewer, and use the new Service Account for all instances.

D.

Give the Service Account the role of Project Viewer, and use the new Service Account for all instances.
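
For option B, once the service account holds a custom role containing compute.instances.list, code running as that service account can enumerate instances with the Compute Engine client library. The project and zone below are placeholders.

```python
# Sketch: list Compute Engine instances in one zone while running as the
# service account that holds a custom role with compute.instances.list.
# Project and zone are placeholders.
from google.cloud import compute_v1

instances_client = compute_v1.InstancesClient()

for instance in instances_client.list(project="example-project", zone="us-central1-a"):
    print(instance.name, instance.status)
```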

Question 22

An organization receives an increasing number of phishing emails.

Which method should be used to protect employee credentials in this situation?

Options:

A.

Multifactor Authentication

B.

A strict password policy

C.

Captcha on login pages

D.

Encrypted emails

Question 23

You must ensure that the keys used for at-rest encryption of your data are compliant with your organization's security controls. One security control mandates that keys get rotated every 90 days. You must implement an effective detection strategy to validate if keys are rotated as required. What should you do?

Options:

A.

Analyze the crypto key versions of the keys by using data from Cloud Asset Inventory. If an active key is older than 90 days, send an alert message through your incident notification channel.

B.

Identify keys that have not been rotated by using Security Health Analytics. If a key is not rotated after 90 days, a finding in Security Command Center is raised.

C.

Assess the keys in the Cloud Key Management Service by implementing code in Cloud Run. If a key is not rotated after 90 days, raise a finding in Security Command Center.

D.

Define a metric that checks for timely key updates by using Cloud Logging. If a key is not rotated after 90 days, send an alert message through your incident notification channel.
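
As one way to implement the kind of detection this question describes, the sketch below walks Cloud KMS key rings and flags keys whose primary version is older than 90 days. The project and location are placeholders, and the alert delivery (incident notification channel) is omitted.

```python
# Sketch: flag Cloud KMS keys whose primary version is older than 90 days.
# Project and location are placeholders; alert delivery is omitted.
import datetime

from google.cloud import kms

client = kms.KeyManagementServiceClient()
parent = "projects/example-project/locations/us-central1"
threshold = datetime.timedelta(days=90)
now = datetime.datetime.now(datetime.timezone.utc)

for key_ring in client.list_key_rings(request={"parent": parent}):
    for key in client.list_crypto_keys(request={"parent": key_ring.name}):
        primary = key.primary  # only populated for symmetric encrypt/decrypt keys
        if primary.name and now - primary.create_time > threshold:
            print(f"Key {key.name}: primary version is older than 90 days")
```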

Question 24

You have been tasked with inspecting IP packet data for invalid or malicious content. What should you do?

Options:

A.

Use Packet Mirroring to mirror traffic to and from particular VM instances. Perform inspection using security software that analyzes the mirrored traffic.

B.

Enable VPC Flow Logs for all subnets in the VPC. Perform inspection on the Flow Logs data using Cloud Logging.

C.

Configure the Fluentd agent on each VM Instance within the VPC. Perform inspection on the log data using Cloud Logging.

D.

Configure Google Cloud Armor access logs to perform inspection on the log data.

Question 25

You manage your organization’s Security Operations Center (SOC). You currently monitor and detect network traffic anomalies in your VPCs based on network logs. However, you want to explore your environment using network payloads and headers. Which Google Cloud product should you use?

Options:

A.

Cloud IDS

B.

VPC Service Controls logs

C.

VPC Flow Logs

D.

Google Cloud Armor

E.

Packet Mirroring

Question 26

A business unit at a multinational corporation signs up for GCP and starts moving workloads into GCP. The business unit creates a Cloud Identity domain with an organizational resource that has hundreds of projects.

Your team becomes aware of this and wants to take over managing permissions and auditing the domain resources.

Which type of access should your team grant to meet this requirement?

Options:

A.

Organization Administrator

B.

Security Reviewer

C.

Organization Role Administrator

D.

Organization Policy Administrator

Question 27

You manage your organization's Security Operations Center (SOC). You currently monitor and detect network traffic anomalies in your Google Cloud VPCs based on packet header information. However, you want the capability to explore network flows and their payload to aid investigations. Which Google Cloud product should you use?

Options:

A.

Marketplace IDS

B.

VPC Flow Logs

C.

VPC Service Controls logs

D.

Packet Mirroring

E.

Google Cloud Armor Deep Packet Inspection

Question 28

You have noticed an increased number of phishing attacks across your enterprise user accounts. You want to implement the Google 2-Step Verification (2SV) option that uses a cryptographic signature to authenticate a user and verify the URL of the login page. Which Google 2SV option should you use?

Options:

A.

Titan Security Keys

B.

Google prompt

C.

Google Authenticator app

D.

Cloud HSM keys

Question 29

You work for an ecommerce company that stores sensitive customer data across multiple Google Cloud regions. The development team has built a new 3-tier application to process orders and must integrate the application into the production environment. You must design the network architecture to ensure strong security boundaries and isolation for the new application, facilitate secure remote maintenance by authorized third-party vendors, and follow the principle of least privilege. What should you do?

Options:

A.

Create separate VPC networks for each tier. Use VPC peering between application tiers and other required VPCs. Provide vendors with SSH keys and root access only to the instances within the VPC for maintenance purposes.

B.

Create a single VPC network and create different subnets for each tier. Create a new Google project specifically for the third-party vendors and grant the network admin role to the vendors. Deploy a VPN appliance and rely on the vendors' configurations to secure third-party access.

C.

Create separate VPC networks for each tier. Use VPC peering between application tiers and other required VPCs. Enable Identity-Aware Proxy (IAP) for remote access to management resources, limiting access to authorized vendors.

D.

Create a single VPC network and create different subnets for each tier. Create a new Google project specifically for the third-party vendors. Grant the vendors ownership of that project and the ability to modify the Shared VPC configuration.

Question 30

Your company has multiple teams needing access to specific datasets across various Google Cloud data services for different projects. You need to ensure that team members can only access the data relevant to their projects and prevent unauthorized access to sensitive information within BigQuery, Cloud Storage, and Cloud SQL. What should you do?

Options:

A.

Grant project-level group permissions by using specific Cloud IAM roles. Use BigQuery authorized views, Cloud Storage uniform bucket-level access, and Cloud SQL database roles.

B.

Configure an access level to control access to the Google Cloud console for users managing these data services. Require multi-factor authentication for all access attempts.

C.

Use VPC Service Controls to create security perimeters around the projects for BigQuery, Cloud Storage, and Cloud SQL services, restricting access based on the network origin of the requests.

D.

Enable project-level data access logs for BigQuery, Cloud Storage, and Cloud SQL. Configure log sinks to export these logs to Security Command Center to identify unauthorized access attempts.
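
To show what the BigQuery part of option A involves, the sketch below authorizes a team-facing view against a source dataset so team members can query the view without direct access to the underlying tables. Project, dataset, and view names are placeholders.

```python
# Sketch: authorize a team-facing view against a source dataset so analysts
# can query the view without reading the underlying tables directly.
# Project, dataset, and view names are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="data-project")

source_dataset = client.get_dataset("data-project.source_dataset")
view = client.get_table("team-project.team_dataset.orders_view")

entries = list(source_dataset.access_entries)
entries.append(
    # role=None marks this entry as an authorized view, not a user grant.
    bigquery.AccessEntry(None, "view", view.reference.to_api_repr())
)
source_dataset.access_entries = entries
client.update_dataset(source_dataset, ["access_entries"])
```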

Question 31

Your organization uses the top-tier folder to separate application environments (prod and dev). The developers need to see all application development audit logs, but they are not permitted to review production logs. Your security team can review all logs in production and development environments. You must grant Identity and Access Management (IAM) roles at the right resource level for the developers and security team while you ensure least privilege.

What should you do?

Options:

A.

1. Grant the logging.viewer role to the security team at the organization resource level. 2. Grant the logging.viewer role to the developer team at the folder resource level that contains all the dev projects.

B.

1. Grant the logging.viewer role to the security team at the organization resource level. 2. Grant the logging.admin role to the developer team at the organization resource level.

C.

1. Grant the logging.admin role to the security team at the organization resource level. 2. Grant the logging.viewer role to the developer team at the folder resource level that contains all the dev projects.

D.

1. Grant the logging.admin role to the security team at the organization resource level. 2. Grant the logging.admin role to the developer team at the organization resource level.

Question 32

A company is deploying their application on Google Cloud Platform. Company policy requires long-term data to be stored using a solution that can automatically replicate data over at least two geographic places.

Which Storage solution are they allowed to use?

Options:

A.

Cloud Bigtable

B.

Cloud BigQuery

C.

Compute Engine SSD Disk

D.

Compute Engine Persistent Disk

Question 33

A company migrated their entire data center to Google Cloud Platform. It is running thousands of instances across multiple projects managed by different departments. You want to have a historical record of what was running in Google Cloud Platform at any point in time.

What should you do?

Options:

A.

Use Resource Manager on the organization level.

B.

Use Forseti Security to automate inventory snapshots.

C.

Use Stackdriver to create a dashboard across all projects.

D.

Use Security Command Center to view all assets across the organization.

Question 34

You run applications on Cloud Run. You already enabled container analysis for vulnerability scanning. However, you are concerned about the lack of control on the applications that are deployed. You must ensure that only trusted container images are deployed on Cloud Run.

What should you do?

Choose 2 answers

Options:

A.

Enable Binary Authorization on the existing Kubernetes cluster.

B.

Set the organization policy constraint constraints/run.allowedBinaryAuthorizationPolicies to the list of allowed Binary Authorization policy names.

C.

Set the organization policy constraint constraints/compute.trustedImageProjects to the list of projects that contain the trusted container images.

D.

Enable Binary Authorization on the existing Cloud Run service.

E.

Use Cloud Run breakglass to deploy an image that meets the Binary Authorization policy by default.

Question 35

A patch for a vulnerability has been released, and a DevOps team needs to update their running containers in Google Kubernetes Engine (GKE).

How should the DevOps team accomplish this?

Options:

A.

Use Puppet or Chef to push out the patch to the running container.

B.

Verify that auto upgrade is enabled; if so, Google will upgrade the nodes in a GKE cluster.

C.

Update the application code or apply a patch, build a new image, and redeploy it.

D.

Configure containers to automatically upgrade when the base image is available in Container Registry.

Question 36

A customer implements Cloud Identity-Aware Proxy for their ERP system hosted on Compute Engine. Their security team wants to add a security layer so that the ERP systems only accept traffic from Cloud Identity-Aware Proxy.

What should the customer do to meet these requirements?

Options:

A.

Make sure that the ERP system can validate the JWT assertion in the HTTP requests.

B.

Make sure that the ERP system can validate the identity headers in the HTTP requests.

C.

Make sure that the ERP system can validate the x-forwarded-for headers in the HTTP requests.

D.

Make sure that the ERP system can validate the user’s unique identifier headers in the HTTP requests.
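
To illustrate option A, the backend can verify the signed header that IAP adds to each request (x-goog-iap-jwt-assertion) with the google-auth library. The expected audience string below is a placeholder; it depends on your project number and backend service ID.

```python
# Sketch: validate the JWT assertion that Identity-Aware Proxy adds to
# requests in the x-goog-iap-jwt-assertion header. The audience value is a
# placeholder tied to your project number and backend service ID.
from google.auth.transport import requests
from google.oauth2 import id_token

EXPECTED_AUDIENCE = "/projects/PROJECT_NUMBER/global/backendServices/SERVICE_ID"


def validate_iap_jwt(iap_jwt: str) -> dict:
    """Return the decoded claims if the assertion is valid, otherwise raise."""
    return id_token.verify_token(
        iap_jwt,
        requests.Request(),
        audience=EXPECTED_AUDIENCE,
        certs_url="https://www.gstatic.com/iap/verify/public_key",
    )
```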

Question 37

Your organization wants to be General Data Protection Regulation (GDPR) compliant. You want to ensure that your DevOps teams can only create Google Cloud resources in the Europe regions.

What should you do?

Options:

A.

Use the org policy constraint "Restrict Resource Service Usage" on your Google Cloud organization node.

B.

Use Identity and Access Management (IAM) custom roles to ensure that your DevOps team can only create resources in the Europe regions.

C.

Use the org policy constraint "Google Cloud Platform - Resource Location Restriction" on your Google Cloud organization node.

D.

Use Identity-Aware Proxy (IAP) with Access Context Manager to restrict the location of Google Cloud resources.

Question 38

An organization adopts Google Cloud Platform (GCP) for application hosting services and needs guidance on setting up password requirements for their Cloud Identity account. The organization has a password policy requirement that corporate employee passwords must have a minimum number of characters.

Which Cloud Identity password guidelines can the organization use to inform their new requirements?

Options:

A.

Set the minimum length for passwords to be 8 characters.

B.

Set the minimum length for passwords to be 10 characters.

C.

Set the minimum length for passwords to be 12 characters.

D.

Set the minimum length for passwords to be 6 characters.

Question 39

An employer wants to track how bonus compensations have changed over time to identify employee outliers and correct earning disparities. This task must be performed without exposing the sensitive compensation data for any individual and must be reversible to identify the outlier.

Which Cloud Data Loss Prevention API technique should you use to accomplish this?

Options:

A.

Generalization

B.

Redaction

C.

CryptoHashConfig

D.

CryptoReplaceFfxFpeConfig

Question 40

You are implementing a new web application on Google Cloud that will be accessed from your on-premises network. To provide protection from threats like malware, you must implement transport layer security (TLS) interception for incoming traffic to your application. What should you do?

Options:

A.

Configure Secure Web Proxy. Offload the TLS traffic in the load balancer, inspect the traffic, and forward the traffic to the web application.

B.

Configure an internal proxy load balancer. Offload the TLS traffic in the load balancer, inspect the traffic, and forward the traffic to the web application.

C.

Configure a hierarchical firewall policy. Enable TLS interception by using Cloud Next Generation Firewall (NGFW) Enterprise.

D.

Configure a VPC firewall rule. Enable TLS interception by using Cloud Next Generation Firewall (NGFW) Enterprise.

Question 41

Your organization recently activated the Security Command Center (SCC) Standard tier. There are a few Cloud Storage buckets that were accidentally made accessible to the public. You need to investigate the impact of the incident and remediate it.

What should you do?

Options:

A.

1. Remove the Identity and Access Management (IAM) bindings granting access to allUsers from the buckets. 2. Apply the organization policy constraint storage.uniformBucketLevelAccess to prevent regressions. 3. Query the data access logs to report on unauthorized access.

B.

1. Change bucket permissions to limit access. 2. Query the data access audit logs for any unauthorized access to the buckets. 3. After the misconfiguration is corrected, mute the finding in Security Command Center.

C.

1. Change permissions to limit access for authorized users. 2. Enforce a VPC Service Controls perimeter around all the production projects to immediately stop any unauthorized access. 3. Review the administrator activity audit logs to report on any unauthorized access.

D.

1. Change the bucket permissions to limit access. 2. Query the bucket usage logs to report on unauthorized access to the data. 3. Enforce the organization policy constraint storage.publicAccessPrevention to avoid regressions.
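
The remediation step these options share, removing public access from the bucket, can be scripted with the Cloud Storage client. The sketch below strips allUsers and allAuthenticatedUsers from a placeholder bucket's IAM policy.

```python
# Sketch: remove public members (allUsers / allAuthenticatedUsers) from a
# bucket's IAM policy. The bucket name is a placeholder.
from google.cloud import storage

client = storage.Client()
bucket = client.bucket("accidentally-public-bucket")

policy = bucket.get_iam_policy(requested_policy_version=3)
for binding in policy.bindings:
    # binding["members"] is a set; drop any public principals.
    binding["members"] = {
        member
        for member in binding["members"]
        if member not in ("allUsers", "allAuthenticatedUsers")
    }
bucket.set_iam_policy(policy)
```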

Question 42

You are in charge of migrating a legacy application from your company datacenters to GCP before the current maintenance contract expires. You do not know what ports the application is using and no documentation is available for you to check. You want to complete the migration without putting your environment at risk.

What should you do?

Options:

A.

Migrate the application into an isolated project using a “Lift & Shift” approach. Enable all internal TCP traffic using VPC Firewall rules. Use VPC Flow logs to determine what traffic should be allowed for the application to work properly.

B.

Migrate the application into an isolated project using a “Lift & Shift” approach in a custom network. Disable all traffic within the VPC and look at the Firewall logs to determine what traffic should be allowed for the application to work properly.

C.

Refactor the application into a micro-services architecture in a GKE cluster. Disable all traffic from outside the cluster using Firewall Rules. Use VPC Flow logs to determine what traffic should be allowed for the application to work properly.

D.

Refactor the application into a micro-services architecture hosted in Cloud Functions in an isolated project. Disable all traffic from outside your project using Firewall Rules. Use VPC Flow logs to determine what traffic should be allowed for the application to work properly.

Question 43

A database administrator notices malicious activities within their Cloud SQL instance. The database administrator wants to monitor the API calls that read the configuration or metadata of resources. Which logs should the database administrator review?

Options:

A.

Admin Activity

B.

System Event

C.

Access Transparency

D.

Data Access

Question 44

Your Google Cloud organization allows administrative capabilities to be distributed to each team through the provision of a Google Cloud project with the Owner role (roles/owner). The organization contains thousands of Google Cloud projects. Security Command Center Premium has surfaced multiple open_mysql_port findings. You are enforcing the guardrails and need to prevent these types of common misconfigurations.

What should you do?

Options:

A.

Create a firewall rule for each virtual private cloud (VPC) to deny traffic from 0.0.0.0/0 with priority 0.

B.

Create a hierarchical firewall policy configured at the organization level to deny all connections from 0.0.0.0/0.

C.

Create a Google Cloud Armor security policy to deny traffic from 0.0.0.0/0.

D.

Create a hierarchical firewall policy configured at the organization level to allow connections only from internal IP ranges.

Question 45

Users are reporting an outage on your public-facing application that is hosted on Compute Engine. You suspect that a recent change to your firewall rules is responsible. You need to test whether your firewall rules are working properly. What should you do?

Options:

A.

Enable Firewall Rules Logging on the latest rules that were changed. Use Logs Explorer to analyze whether the rules are working correctly.

B.

Connect to a bastion host in your VPC. Use a network traffic analyzer to determine at which point your requests are being blocked.

C.

In a pre-production environment, disable all firewall rules individually to determine which one is blocking user traffic.

D.

Enable VPC Flow Logs in your VPC. Use Logs Explorer to analyze whether the rules are working correctly.

Question 46

You need to implement an encryption at-rest strategy that reduces key management complexity for non-sensitive data and protects sensitive data while providing the flexibility of controlling the key residency and rotation schedule. FIPS 140-2 L1 compliance is required for all data types. What should you do?

Options:

A.

Encrypt non-sensitive data and sensitive data with Cloud External Key Manager.

B.

Encrypt non-sensitive data and sensitive data with Cloud Key Management Service

C.

Encrypt non-sensitive data with Google default encryption, and encrypt sensitive data with Cloud External Key Manager.

D.

Encrypt non-sensitive data with Google default encryption, and encrypt sensitive data with Cloud Key Management Service.

Question 47

Your company requires the security and network engineering teams to identify all network anomalies and be able to capture payloads within VPCs. Which method should you use?

Options:

A.

Define an organization policy constraint.

B.

Configure packet mirroring policies.

C.

Enable VPC Flow Logs on the subnet.

D.

Monitor and analyze Cloud Audit Logs.

Question 48

An organization is migrating from their current on-premises productivity software systems to G Suite. Some network security controls were in place that were mandated by a regulatory body in their region for their previous on-premises system. The organization’s risk team wants to ensure that network security controls are maintained and effective in G Suite. A security architect supporting this migration has been asked to ensure that network security controls are in place as part of the new shared responsibility model between the organization and Google Cloud.

What solution would help meet the requirements?

Options:

A.

Ensure that firewall rules are in place to meet the required controls.

B.

Set up Cloud Armor to ensure that network security controls can be managed for G Suite.

C.

Network security is a built-in solution and is Google Cloud's responsibility for SaaS products like G Suite.

D.

Set up an array of Virtual Private Cloud (VPC) networks to control network security as mandated by the relevant regulation.

Question 49

You are part of a security team investigating a compromised service account key. You need to audit which new resources were created by the service account.

What should you do?

Options:

A.

Query Data Access logs.

B.

Query Admin Activity logs.

C.

Query Access Transparency logs.

D.

Query Stackdriver Monitoring Workspace.
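
For option B, the sketch below pulls Admin Activity audit log entries whose authenticated principal is the compromised service account. The service account address and the log filter are placeholders to adapt to your project.

```python
# Sketch: list Admin Activity audit log entries generated by a specific
# service account to see which resources it created or modified.
# The project and service account address are placeholders.
from google.cloud import logging

client = logging.Client(project="example-project")

log_filter = (
    'logName:"logs/cloudaudit.googleapis.com%2Factivity" '
    'AND protoPayload.authenticationInfo.principalEmail='
    '"compromised-sa@example-project.iam.gserviceaccount.com"'
)

for entry in client.list_entries(filter_=log_filter, order_by=logging.DESCENDING):
    # Each entry's payload carries the audit record (method, resource, caller).
    print(entry.timestamp, entry.payload)
```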

Question 50

Your organization uses Google Workspace Enterprise Edition for authentication. You are concerned about employees leaving their laptops unattended for extended periods of time after authenticating into Google Cloud. You must prevent malicious people from using an employee's unattended laptop to modify their environment.

What should you do?

Options:

A.

Create a policy that requires employees to not leave their sessions open for long durations.

B.

Review and disable unnecessary Google Cloud APIs.

C.

Require strong passwords and 2SV through a security token or Google Authenticator.

D.

Set the session length timeout for Google Cloud services to a shorter duration.

Question 51

Your security team wants to reduce the risk of user-managed keys being mismanaged and compromised. To achieve this, you need to prevent developers from creating user-managed service account keys for projects in their organization. How should you enforce this?

Options:

A.

Configure Secret Manager to manage service account keys.

B.

Enable an organization policy to disable service accounts from being created.

C.

Enable an organization policy to prevent service account keys from being created.

D.

Remove the iam.serviceAccounts.getAccessToken permission from users.

Question 52

Your organization recently deployed a new application on Google Kubernetes Engine. You need to deploy a solution to protect the application. The solution has the following requirements:

Scans must run at least once per week

Must be able to detect cross-site scripting vulnerabilities

Must be able to authenticate using Google accounts

Which solution should you use?

Options:

A.

Google Cloud Armor

B.

Web Security Scanner

C.

Security Health Analytics

D.

Container Threat Detection

Question 53

A large e-retailer is moving to Google Cloud Platform with its ecommerce website. The company wants to ensure payment information is encrypted between the customer’s browser and GCP when the customers checkout online.

What should they do?

Options:

A.

Configure an SSL Certificate on an L7 Load Balancer and require encryption.

B.

Configure an SSL Certificate on a Network TCP Load Balancer and require encryption.

C.

Configure the firewall to allow inbound traffic on port 443, and block all other inbound traffic.

D.

Configure the firewall to allow outbound traffic on port 443, and block all other outbound traffic.

Question 54

You are setting up a CI/CD pipeline to deploy containerized applications to your production clusters on Google Kubernetes Engine (GKE). You need to prevent containers with known vulnerabilities from being deployed. You have the following requirements for your solution:

Must be cloud-native

Must be cost-efficient

Minimize operational overhead

How should you accomplish this? (Choose two.)

Options:

A.

Create a Cloud Build pipeline that will monitor changes to your container templates in a Cloud Source Repositories repository. Add a step to analyze Container Analysis results before allowing the build to continue.

B.

Use a Cloud Function triggered by log events in Google Cloud's operations suite to automatically scan your container images in Container Registry.

C.

Use a cron job on a Compute Engine instance to scan your existing repositories for known vulnerabilities and raise an alert if a non-compliant container image is found.

D.

Deploy Jenkins on GKE and configure a CI/CD pipeline to deploy your containers to Container Registry. Add a step to validate your container images before deploying your container to the cluster.

E.

In your CI/CD pipeline, add an attestation on your container image when no vulnerabilities have been found. Use a Binary Authorization policy to block deployments of containers with no attestation in your cluster.

Question 55

You are setting up Cloud Identity for your company's Google Cloud organization. User accounts will be provisioned from Microsoft Entra ID through Directory Sync, and there will be single sign-on through Entra ID. You need to secure the super administrator accounts for the organization. Your solution must follow the principle of least privilege and implement strong authentication. What should you do?

Options:

A.

Create dedicated accounts for super administrators. Ensure that 2-step verification is enforced for the super administrator accounts in Entra ID.

B.

Create dedicated accounts for super administrators. Enforce Google 2-step verification for the super administrator accounts.

C.

Create accounts that combine the organization administrator and the super administrator privileges. Ensure that 2-step verification is enforced for the super administrator accounts in Entra ID.

D.

Create accounts that combine the organization administrators and the super administrator privileges. Enforce Google 2-step verification for the super administrator accounts.

Question 56

A website design company recently migrated all customer sites to App Engine. Some sites are still in progress and should only be visible to customers and company employees from any location.

Which solution will restrict access to the in-progress sites?

Options:

A.

Upload an .htaccess file containing the customer and employee user accounts to App Engine.

B.

Create an App Engine firewall rule that allows access from the customer and employee networks and denies all other traffic.

C.

Enable Cloud Identity-Aware Proxy (IAP), and allow access to a Google Group that contains the customer and employee user accounts.

D.

Use Cloud VPN to create a VPN connection between the relevant on-premises networks and the company’s GCP Virtual Private Cloud (VPC) network.

Question 57

Your organization relies heavily on virtual machines (VMs) in Compute Engine. Due to team growth and resource demands, VM sprawl is becoming problematic. Maintaining consistent security hardening and timely package updates poses an increasing challenge. You need to centralize VM image management and automate the enforcement of security baselines throughout the virtual machine lifecycle. What should you do?

Options:

A.

Activate Security Command Center Enterprise. Use VM discovery and posture management features to monitor hardening state and trigger automatic responses upon detection of issues.

B.

Create a Cloud Build trigger to build a pipeline that generates hardened VM images. Run vulnerability scans in the pipeline, and store images with passing scans in a registry. Use instance templates pointing to this registry.

C.

Configure the sole-tenancy feature in Compute Engine for all projects. Set up custom organization policies in Policy Controller to restrict the operating systems and image sources that teams are allowed to use.

D.

Use VM Manager to automatically distribute and apply patches to VMs across your projects. Integrate VM Manager with hardened, organization-standard VM images stored in a central repository.

Question 58

You are responsible for protecting highly sensitive data in BigQuery. Your operations teams need access to this data, but given privacy regulations, you want to ensure that they cannot read the sensitive fields such as email addresses and first names. These specific sensitive fields should only be available on a need-to-know basis to the HR team. What should you do?

Options:

A.

Perform data masking with the DLP API and store that data in BigQuery for later use.

B.

Perform data redaction with the DLP API and store that data in BigQuery for later use.

C.

Perform data inspection with the DLP API and store that data in BigQuery for later use.

D.

Perform tokenization for pseudonymization with the DLP API and store that data in BigQuery for later use.
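
As a small example of the masking approach in option A, the sketch below uses the DLP API to mask email addresses and person names in free text before it is loaded into BigQuery. The info types and sample text are placeholders.

```python
# Sketch: mask sensitive fields (email addresses, person names) in text with
# the DLP API before the data is loaded into BigQuery. Values are placeholders.
import google.cloud.dlp_v2 as dlp_v2

dlp = dlp_v2.DlpServiceClient()
parent = "projects/example-project"

response = dlp.deidentify_content(
    request={
        "parent": parent,
        "inspect_config": {
            "info_types": [{"name": "EMAIL_ADDRESS"}, {"name": "PERSON_NAME"}]
        },
        "deidentify_config": {
            "info_type_transformations": {
                "transformations": [
                    {
                        "primitive_transformation": {
                            "character_mask_config": {"masking_character": "#"}
                        }
                    }
                ]
            }
        },
        "item": {"value": "Contact Jane Doe at jane.doe@example.com"},
    }
)
print(response.item.value)  # masked output
```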

Question 59

You work for a large organization where each business unit has thousands of users. You need to delegate management of access control permissions to each business unit. You have the following requirements:

Each business unit manages access controls for their own projects.

Each business unit manages access control permissions at scale.

Business units cannot access other business units' projects.

Users lose their access if they move to a different business unit or leave the company.

Users and access control permissions are managed by the on-premises directory service.

What should you do? (Choose two.)

Options:

A.

Use VPC Service Controls to create perimeters around each business unit's project.

B.

Organize projects in folders, and assign permissions to Google groups at the folder level.

C.

Group business units based on Organization Units (OUs) and manage permissions based on OUs.

D.

Create a project naming convention, and use Google's IAM Conditions to manage access based on the prefix of project names.

E.

Use Google Cloud Directory Sync to synchronize users and group memberships in Cloud Identity.

Question 60

An engineering team is launching a web application that will be public on the internet. The web application is hosted in multiple GCP regions and will be directed to the respective backend based on the URL request.

Your team wants to avoid exposing the application directly on the internet and wants to deny traffic from a specific list of malicious IP addresses.

Which solution should your team implement to meet these requirements?

Options:

A.

Cloud Armor

B.

Network Load Balancing

C.

SSL Proxy Load Balancing

D.

NAT Gateway

Question 61

You are consulting with a client that requires end-to-end encryption of application data (including data in transit, data in use, and data at rest) within Google Cloud. Which options should you utilize to accomplish this? (Choose two.)

Options:

A.

External Key Manager

B.

Customer-supplied encryption keys

C.

Hardware Security Module

D.

Confidential Computing and Istio

E.

Client-side encryption

Question 62

Your organization wants to be continuously evaluated against the CIS Google Cloud Computing Foundations Benchmark v1.3.0 (CIS Google Cloud Foundation 1.3). Some of the controls are irrelevant to your organization and must be disregarded in evaluation. You need to create an automated system or process to ensure that only the relevant controls are evaluated.

What should you do?

Options:

A.

Mark all security findings that are irrelevant with a tag and a value that indicates a security exception. Select all marked findings and mute them on the console every time they appear. Activate Security Command Center (SCC) Premium.

B.

Activate Security Command Center (SCC) Premium. Create a rule to mute the security findings in SCC so they are not evaluated.

C.

Download all findings from Security Command Center (SCC) to a CSV file. Mark the findings that are part of CIS Google Cloud Foundation 1.3 in the file. Ignore the entries that are irrelevant and out of scope for the company.

D.

Ask an external audit company to provide independent reports including the needed CIS benchmarks. In the scope of the audit, clarify that some of the controls are not needed and must be disregarded.

Question 63

Options:

A.

Do not use Vertex AI for sensitive data. Use only public data with minimal privacy requirements.

B.

Contact Google support to opt out of model tuning.

C.

Do nothing. Vertex AI foundation models are frozen by default and do not use your data for model-tuning purposes.

D.

Encrypt your data by using customer-managed encryption keys (CMEK) to have full control over encryption key access.

Question 64

Employees at your company use their personal computers to access your organization's Google Cloud console. You need to ensure that users can only access the Google Cloud console from their corporate-issued devices and verify that they have a valid enterprise certificate.

What should you do?

Options:

A.

Implement an Identity and Access Management (IAM) conditional policy to verify the device certificate.

B.

Implement a VPC firewall policy. Activate packet inspection, and create an allow rule to validate and verify the device certificate.

C.

Implement an organization policy to verify the certificate from the access context.

D.

Implement an Access Policy in BeyondCorp Enterprise to verify the device certificate. Create an access binding with the access policy just created.

Question 65

A manager wants to start retaining security event logs for 2 years while minimizing costs. You write a filter to select the appropriate log entries.

Where should you export the logs?

Options:

A.

BigQuery datasets

B.

Cloud Storage buckets

C.

Stackdriver Logging

D.

Cloud Pub/Sub topics

Question 66

You are migrating an on-premises data warehouse to BigQuery, Cloud SQL, and Cloud Storage. You need to configure security services in the data warehouse. Your company compliance policies mandate that the data warehouse must:

• Protect data at rest with full lifecycle management on cryptographic keys

• Implement a separate key management provider from data management

• Provide visibility into all encryption key requests

What services should be included in the data warehouse implementation?

Choose 2 answers

Options:

A.

Customer-managed encryption keys

B.

Customer-Supplied Encryption Keys

C.

Key Access Justifications

D.

Access Transparency and Approval

E.

Cloud External Key Manager

Question 67

You have been tasked with configuring Security Command Center for your organization’s Google Cloud environment. Your security team needs to receive alerts of potential crypto mining in the organization’s compute environment and alerts for common Google Cloud misconfigurations that impact security. Which Security Command Center features should you use to configure these alerts? (Choose two.)

Options:

A.

Event Threat Detection

B.

Container Threat Detection

C.

Security Health Analytics

D.

Cloud Data Loss Prevention

E.

Google Cloud Armor
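
Both Event Threat Detection and Security Health Analytics surface their results as Security Command Center findings. As a sketch, the snippet below lists active findings across all sources in an organization; the organization ID and filter are placeholders, and alert routing (for example, to Pub/Sub notifications) is omitted.

```python
# Sketch: list active Security Command Center findings (e.g., from Event
# Threat Detection or Security Health Analytics) across all sources.
# The organization ID is a placeholder.
from google.cloud import securitycenter

client = securitycenter.SecurityCenterClient()
all_sources = "organizations/123456789012/sources/-"

results = client.list_findings(
    request={"parent": all_sources, "filter": 'state="ACTIVE"'}
)
for result in results:
    finding = result.finding
    print(finding.category, finding.resource_name)
```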

Question 68

Your company is moving to Google Cloud. You plan to sync your users first by using Google Cloud Directory Sync (GCDS). Some employees have already created Google Cloud accounts by using their company email addresses that were created outside of GCDS. You must create your users on Cloud Identity.

What should you do?

Options:

A.

Configure GCDS and use GCDS search rules to sync these users.

B.

Use the transfer tool to migrate unmanaged users.

C.

Write a custom script to identify existing Google Cloud users and call the Admin SDK Directory API to transfer their account.

D.

Configure GCDS and use GCDS exclusion rules to ensure users are not suspended.

Question 69

Your organization is using Vertex AI Workbench Instances. You must ensure that newly deployed instances are automatically kept up-to-date and that users cannot accidentally alter settings in the operating system. What should you do?

Options:

A.

Enable the VM Manager and ensure the corresponding Google Compute Engine instances are added.

B.

Enforce the disableRootAccess and requireAutoUpgradeSchedule organization policies for newly deployed instances.

C.

Assign the AI Notebooks Runner and AI Notebooks Viewer roles to the users of the AI Workbench Instances.

D.

Implement a firewall rule that prevents Secure Shell access to the corresponding Google Compute Engine instances by using tags.

Question 70

Your organization deploys a large number of containerized applications on Google Kubernetes Engine (GKE). Node updates are currently applied manually. Audit findings show that a critical patch has not been installed due to a missed notification. You need to design a more reliable, cloud-first, and scalable process for node updates. What should you do?

Options:

A.

Migrate the cluster infrastructure to a self-managed Kubernetes environment for greater control over the patching process.

B.

Develop a custom script to continuously check for patch availability, download patches, and apply the patches across all components of the cluster.

C.

Schedule a daily reboot for all nodes to automatically upgrade.

D.

Configure node auto-upgrades for node pools in the maintenance windows.

Question 71

When working with agents in a support center via online chat, an organization’s customers often share pictures of their documents with personally identifiable information (PII). The organization that owns the support center is concerned that the PII is being stored in their databases as part of the regular chat logs they retain for review by internal or external analysts for customer service trend analysis.

Which Google Cloud solution should the organization use to help resolve this concern for the customer while still maintaining data utility?

Options:

A.

Use Cloud Key Management Service (KMS) to encrypt the PII data shared by customers before storing it for analysis.

B.

Use Object Lifecycle Management to make sure that all chat records with PII in them are discarded and not saved for analysis.

C.

Use the image inspection and redaction actions of the DLP API to redact PII from the images before storing them for analysis.

D.

Use the generalization and bucketing actions of the DLP API solution to redact PII from the texts before storing them for analysis.
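
For option C, the DLP API's image redaction method can blank out detected PII in an uploaded picture before it is stored. The sketch below assumes a local JPEG file and a small set of info types; both are placeholders.

```python
# Sketch: redact detected PII (e.g., email addresses, phone numbers) from an
# image with the DLP API before storing it for analysis.
# File paths, project ID, and info types are placeholders.
import google.cloud.dlp_v2 as dlp_v2

dlp = dlp_v2.DlpServiceClient()

with open("customer-document.jpg", "rb") as f:
    image_bytes = f.read()

response = dlp.redact_image(
    request={
        "parent": "projects/example-project",
        "inspect_config": {
            "info_types": [{"name": "EMAIL_ADDRESS"}, {"name": "PHONE_NUMBER"}]
        },
        "byte_item": {
            "type_": dlp_v2.ByteContentItem.BytesType.IMAGE_JPEG,
            "data": image_bytes,
        },
    }
)

with open("customer-document-redacted.jpg", "wb") as f:
    f.write(response.redacted_image)
```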

Question 72

Your organization has had a few recent DDoS attacks. You need to authenticate responses to domain name lookups. Which Google Cloud service should you use?

Options:

A.

Cloud DNS with DNSSEC

B.

Cloud NAT

C.

HTTP(S) Load Balancing

D.

Google Cloud Armor

Question 73

Your organization needs to restrict the types of Google Cloud services that can be deployed within specific folders to enforce compliance requirements. You must apply these restrictions only to the designated folders without affecting other parts of the resource hierarchy. You want to use the most efficient and simple method. What should you do?

Options:

A.

Create an organization policy at the folder level using the "Restrict Resource Service Usage" constraint and define the allowed services per folder.

B.

Implement IAM conditions on service account creation within each folder.

C.

Create a global organization policy at the organization level with the "Restrict Resource Service Usage" constraint and apply exceptions for other folders.

D.

Configure VPC Service Controls perimeters around each folder and define the allowed services within the perimeter.

Question 74

Your company is concerned about unauthorized parties gaining access to the Google Cloud environment by using a fake login page. You must implement a solution to protect against person-in-the-middle attacks.

Which security measure should you use?

Options:

A.

Text message or phone call code

B.

Security key

C.

Google Authenticator application

D.

Google prompt

Question 75

You are designing a new governance model for your organization's secrets that are stored in Secret Manager. Currently, secrets for Production and Non-Production applications are stored and accessed using service accounts. Your proposed solution must:

Provide granular access to secrets

Give you control over the rotation schedules for the encryption keys that wrap your secrets

Maintain environment separation

Provide ease of management

Which approach should you take?

Options:

A.

1. Use separate Google Cloud projects to store Production and Non-Production secrets. 2. Enforce access control to secrets using project-level Identity and Access Management (IAM) bindings. 3. Use customer-managed encryption keys to encrypt secrets.

B.

1. Use a single Google Cloud project to store both Production and Non-Production secrets. 2. Enforce access control to secrets using secret-level Identity and Access Management (IAM) bindings. 3. Use Google-managed encryption keys to encrypt secrets.

C.

1. Use separate Google Cloud projects to store Production and Non-Production secrets. 2. Enforce access control to secrets using secret-level Identity and Access Management (IAM) bindings. 3. Use Google-managed encryption keys to encrypt secrets.

D.

1. Use a single Google Cloud project to store both Production and Non-Production secrets. 2. Enforce access control to secrets using project-level Identity and Access Management (IAM) bindings. 3. Use customer-managed encryption keys to encrypt secrets.

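For context, Secret Manager supports secret-level IAM bindings and customer-managed encryption keys (CMEK), which is what gives you control over the rotation schedule of the keys that wrap your secrets. A minimal sketch of creating a CMEK-protected secret follows; all project, key ring, and secret names are hypothetical.

    # Minimal sketch: create a secret whose payload is protected with a
    # customer-managed Cloud KMS key. All resource names are hypothetical.
    from google.cloud import secretmanager

    client = secretmanager.SecretManagerServiceClient()

    kms_key = (
        "projects/prod-secrets/locations/us-central1/"
        "keyRings/secrets-keyring/cryptoKeys/secrets-key"
    )

    secret = client.create_secret(
        request={
            "parent": "projects/prod-secrets",
            "secret_id": "payments-api-key",
            "secret": {
                "replication": {
                    "user_managed": {
                        "replicas": [
                            {
                                "location": "us-central1",
                                "customer_managed_encryption": {"kms_key_name": kms_key},
                            }
                        ]
                    }
                }
            },
        }
    )

    # Store the first version of the secret value.
    client.add_secret_version(
        request={"parent": secret.name, "payload": {"data": b"s3cr3t-value"}}
    )
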
Question 76

You discovered that sensitive personally identifiable information (PII) is being ingested to your Google Cloud environment in the daily ETL process from an on-premises environment to your BigQuery datasets. You need to redact this data to obfuscate the PII, but need to re-identify it for data analytics purposes. Which components should you use in your solution? (Choose two.)

Options:

A.

Secret Manager

B.

Cloud Key Management Service

C.

Cloud Data Loss Prevention with cryptographic hashing

D.

Cloud Data Loss Prevention with automatic text redaction

E.

Cloud Data Loss Prevention with deterministic encryption using AES-SIV

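For reference, reversible obfuscation in Cloud DLP is done with deterministic encryption (AES-SIV) using a key you control, typically wrapped by Cloud KMS, so the same pipeline can later re-identify the values. A rough sketch follows; the project, KMS key name, wrapped key bytes, and surrogate infoType are hypothetical placeholders.

    # Rough sketch: de-identify email addresses with DLP deterministic encryption
    # (AES-SIV) using a KMS-wrapped key, so the values can be re-identified later
    # with reidentify_content. Key material and names are hypothetical.
    from google.cloud import dlp_v2

    client = dlp_v2.DlpServiceClient()

    deidentify_config = {
        "info_type_transformations": {
            "transformations": [
                {
                    "primitive_transformation": {
                        "crypto_deterministic_config": {
                            "crypto_key": {
                                "kms_wrapped": {
                                    "wrapped_key": b"<key material wrapped by Cloud KMS>",
                                    "crypto_key_name": (
                                        "projects/my-project/locations/global/"
                                        "keyRings/dlp-keyring/cryptoKeys/deid-key"
                                    ),
                                }
                            },
                            "surrogate_info_type": {"name": "EMAIL_TOKEN"},
                        }
                    }
                }
            ]
        }
    }

    response = client.deidentify_content(
        request={
            "parent": "projects/my-project/locations/global",
            "deidentify_config": deidentify_config,
            "inspect_config": {"info_types": [{"name": "EMAIL_ADDRESS"}]},
            "item": {"value": "Contact jane.doe@example.com for details."},
        }
    )
    print(response.item.value)  # surrogate token in place of the email address
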
Question 77

Your financial services company has an audit requirement under a strict regulatory framework: comprehensive, immutable audit trails for all administrative and data access activity, retained for seven years. Your current logging is fragmented across individual projects. You need to establish a centralized, tamper-proof, long-term logging solution accessible for audits. What should you do?

Options:

A.

Implement Pub/Sub to stream all audit logs from each project in real-time to an external Security Information and Event Management (SIEM) for long-term analysis.

B.

Establish organization-level Cloud Logging sinks to export Cloud Audit Logs to a dedicated Cloud Storage bucket with object retention lock.

C.

Enable Security Command Center across the organization to gain centralized visibility into threats and manage compliance posture for all Google Cloud projects.

D.

Individually configure Cloud Audit Logs for all Google Cloud services in each project. Store the logs in regional Cloud Logging buckets with 30-day retention policies.

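For context, option B combines two building blocks: an organization-level aggregated log sink and a Cloud Storage bucket with a locked retention policy (Bucket Lock), which makes objects undeletable for the retention period. The sketch below assumes hypothetical organization, project, and bucket names; note that locking a retention policy is irreversible.

    # Rough sketch: lock a seven-year retention policy on an archive bucket and
    # route logs to it with an organization-level aggregated sink.
    # All names are hypothetical; locking a retention policy cannot be undone.
    from google.cloud import storage
    from google.cloud.logging_v2.services.config_service_v2 import ConfigServiceV2Client

    # 1. Apply and lock the retention policy on the archive bucket.
    storage_client = storage.Client(project="central-logging-project")
    bucket = storage_client.get_bucket("org-audit-log-archive")
    bucket.retention_period = 7 * 365 * 24 * 60 * 60  # seven years, in seconds
    bucket.patch()
    bucket.lock_retention_policy()  # objects become undeletable until they expire

    # 2. Create an organization-level sink that routes logs to the bucket.
    #    In practice the sink's log filter would be restricted to Cloud Audit
    #    Logs (logName:"cloudaudit.googleapis.com/...").
    config_client = ConfigServiceV2Client()
    config_client.create_sink(
        request={
            "parent": "organizations/123456789012",
            "sink": {
                "name": "org-audit-archive-sink",
                "destination": "storage.googleapis.com/org-audit-log-archive",
                "include_children": True,
            },
        }
    )
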
Question 78

A customer wants to move their sensitive workloads to a Compute Engine-based cluster using Managed Instance Groups (MIGs). The jobs are bursty and must be completed quickly. They have a requirement to be able to manage and rotate the encryption keys.

Which boot disk encryption solution should you use on the cluster to meet this customer’s requirements?

Options:

A.

Customer-supplied encryption keys (CSEK)

B.

Customer-managed encryption keys (CMEK) using Cloud Key Management Service (KMS)

C.

Encryption by default

D.

Pre-encrypting files before transferring to Google Cloud Platform (GCP) for analysis

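For reference, customer-managed encryption keys (CMEK) let the customer manage and rotate keys in Cloud KMS while Compute Engine handles the disk encryption itself. A minimal sketch of referencing a KMS key on a boot disk follows; in a MIG setup the same key reference would normally live in the instance template. All names are hypothetical.

    # Minimal sketch: create a boot disk encrypted with a customer-managed
    # Cloud KMS key. Project, zone, image, and key names are hypothetical.
    from google.cloud import compute_v1

    disks_client = compute_v1.DisksClient()

    disk = compute_v1.Disk(
        name="batch-boot-disk",
        source_image="projects/debian-cloud/global/images/family/debian-12",
        size_gb=50,
        disk_encryption_key=compute_v1.CustomerEncryptionKey(
            kms_key_name=(
                "projects/my-project/locations/us-central1/"
                "keyRings/compute-keyring/cryptoKeys/boot-disk-key"
            )
        ),
    )

    operation = disks_client.insert(
        project="my-project", zone="us-central1-a", disk_resource=disk
    )
    operation.result()  # wait for the disk to be created
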
Question 79

Your company uses Google Cloud and has publicly exposed network assets. You want to discover the assets and perform a security audit on these assets by using a software tool in the least amount of time.

What should you do?

Options:

A.

Run a platform security scanner on all instances in the organization.

B.

Notify Google about the pending audit and wait for confirmation before performing the scan.

C.

Contact a Google approved security vendor to perform the audit.

D.

Identify all external assets by using Cloud Asset Inventory and then run a network security scanner against them.

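For context, Cloud Asset Inventory can enumerate externally reachable resources (external addresses, forwarding rules, instances) across the organization before any scanner is run. A minimal sketch follows; the organization ID and asset types are hypothetical choices.

    # Minimal sketch: list resource types that typically expose public endpoints
    # across the whole organization. The organization ID is hypothetical.
    from google.cloud import asset_v1

    client = asset_v1.AssetServiceClient()

    results = client.search_all_resources(
        request={
            "scope": "organizations/123456789012",
            "asset_types": [
                "compute.googleapis.com/Address",
                "compute.googleapis.com/ForwardingRule",
                "compute.googleapis.com/Instance",
            ],
        }
    )

    for resource in results:
        print(resource.asset_type, resource.name)
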
Question 80

Your organization must follow the Payment Card Industry Data Security Standard (PCI DSS). To prepare for an audit, you must detect deviations at an infrastructure-as-a-service level in your Google Cloud landing zone. What should you do?

Options:

A.

Create a data profile covering all payment-relevant data types. Configure Data Discovery and a risk analysis job in Google Cloud Sensitive Data Protection to analyze findings.

B.

Use the Google Cloud Compliance Reports Manager to download the latest version of the PCI DSS report. Analyze the report to detect deviations.

C.

Create an Assured Workloads folder in your Google Cloud organization. Migrate existing projects into the folder and monitor for deviations in the PCI DSS.

D.

Activate Security Command Center Premium. Use the Compliance Monitoring product to filter findings that may not be PCI DSS compliant.

Question 81

Your organization leverages folders to represent different teams within your Google Cloud environment. To support Infrastructure as Code (IaC) practices, each team receives a dedicated service account upon onboarding. You want to ensure that teams have comprehensive permissions to manage resources within their assigned folders while adhering to the principle of least privilege. You must design the permissions for these team-based service accounts in the most effective way possible. What should you do?

Options:

A.

Grant each service account the folder administrator role on its respective folder.

B.

Grant each service account the project creator role at the organization level and use folder-level IAM conditions to restrict project creation to specific folders.

C.

Assign each service account the project editor role at the organization level and instruct teams to use IAM bindings at the folder level for fine-grained permissions.

D.

Assign each service account the folder IAM administrator role on its respective folder to allow teams to create and manage additional custom roles if needed.

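For reference, folder-scoped IAM bindings keep a team's service account permissions confined to its own folder. The sketch below, assuming the google-cloud-resource-manager client, adds a binding on one folder; the folder ID, service account, and the role shown are hypothetical illustrations, not a recommendation of which role to pick.

    # Rough sketch: bind a team's IaC service account to a role on its own folder
    # only, instead of granting organization-wide access. IDs are hypothetical.
    from google.cloud import resourcemanager_v3

    client = resourcemanager_v3.FoldersClient()
    folder = "folders/987654321098"

    policy = client.get_iam_policy(request={"resource": folder})
    policy.bindings.add(
        role="roles/resourcemanager.folderAdmin",
        members=["serviceAccount:team-a-iac@my-project.iam.gserviceaccount.com"],
    )
    client.set_iam_policy(request={"resource": folder, "policy": policy})
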
Question 82

You need to set up two network segments: one with an untrusted subnet and the other with a trusted subnet. You want to configure a virtual appliance such as a next-generation firewall (NGFW) to inspect all traffic between the two network segments. How should you design the network to inspect the traffic?

Options:

A.

1. Set up one VPC with two subnets: one trusted and the other untrusted. 2. Configure a custom route for all traffic (0.0.0.0/0) pointed to the virtual appliance.

B.

1. Set up one VPC with two subnets: one trusted and the other untrusted. 2. Configure a custom route for all RFC1918 subnets pointed to the virtual appliance.

C.

1. Set up two VPC networks: one trusted and the other untrusted, and peer them together. 2. Configure a custom route on each network pointed to the virtual appliance.

D.

1. Set up two VPC networks: one trusted and the other untrusted. 2. Configure a virtual appliance using multiple network interfaces, with each interface connected to one of the VPC networks.

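For context, option D relies on a multi-NIC appliance: each VPC network attaches its own interface on the same VM, and IP forwarding must be enabled so the appliance can route traffic it did not originate. A minimal sketch follows; all network, subnet, image, and project names are hypothetical.

    # Minimal sketch: a virtual appliance VM with one NIC in the untrusted VPC
    # and one NIC in the trusted VPC. All resource names are hypothetical.
    from google.cloud import compute_v1

    instance = compute_v1.Instance(
        name="ngfw-appliance",
        machine_type="zones/us-central1-a/machineTypes/n2-standard-4",
        can_ip_forward=True,  # required so the appliance can forward inspected traffic
        disks=[
            compute_v1.AttachedDisk(
                boot=True,
                auto_delete=True,
                initialize_params=compute_v1.AttachedDiskInitializeParams(
                    source_image="projects/my-project/global/images/ngfw-image"
                ),
            )
        ],
        network_interfaces=[
            compute_v1.NetworkInterface(
                network="global/networks/untrusted-vpc",
                subnetwork="regions/us-central1/subnetworks/untrusted-subnet",
            ),
            compute_v1.NetworkInterface(
                network="global/networks/trusted-vpc",
                subnetwork="regions/us-central1/subnetworks/trusted-subnet",
            ),
        ],
    )

    operation = compute_v1.InstancesClient().insert(
        project="my-project", zone="us-central1-a", instance_resource=instance
    )
    operation.result()
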
Question 83

Your organization operates in a highly regulated industry and uses multiple Google Cloud services. You need to identify potential risks to regulatory compliance. Which situation introduces the greatest risk?

Options:

A.

Principals have broad IAM roles allowing the creation and management of Compute Engine VMs without a pre-defined hardening process.

B.

Sensitive data is stored in a Cloud Storage bucket with the uniform bucket-level access setting enabled.

C.

The security team mandates the use of customer-managed encryption keys (CMEK) for all data classified as sensitive.

D.

The audit team needs access to Cloud Audit Logs related to managed services like BigQuery.

Question 84

A retail customer allows users to upload comments and product reviews. The customer needs to make sure the text does not include sensitive data before the comments or reviews are published.

Which Google Cloud Service should be used to achieve this?

Options:

A.

Cloud Key Management Service

B.

Cloud Data Loss Prevention API

C.

BigQuery

D.

Cloud Security Scanner

Question 85

Your organization enforces a custom organization policy that disables the use of Compute Engine VM instances with external IP addresses. However, a regulated business unit requires an exception to temporarily use external IPs for a third-party audit process. The regulated business workload must comply with least privilege principles and minimize policy drift. You need to ensure secure policy management and proper handling. What should you do?

Options:

A.

Create a folder. Apply the restrictive organization policy for non-regulated business workloads in the folder. Place the regulated business workload in that folder.

B.

Apply the custom organization policy at the organization level to restrict external IPs. Move the regulated business workload to a separate folder. Override the policy at that folder level.

C.

Create an IAM custom role with permissions to bypass organization policies. Assign the custom role to the regulated business team for the specific project.

D.

Modify the custom organization policy at the organization level to allow external IPs for all projects. Configure VPC firewall rules to restrict egress traffic except for the regulated business workload.

Question 86

You have stored company-approved compute images in a single Google Cloud project that is used as an image repository. This project is protected with VPC Service Controls and exists in the perimeter along with other projects in your organization. This lets other projects deploy images from the image repository project. A team needs to deploy a third-party disk image that is stored in an external Google Cloud organization. You need to grant read access to the disk image so that it can be deployed into the perimeter.

What should you do?

Options:

A.

1. Update the perimeter. 2. Configure the egressTo field to set identityType to any_identity. 3. Configure the egressFrom field to include the external Google Cloud project number as an allowed resource and the serviceName to compute.googleapis.com.

B.

Allow the external project by using the organization policy constraint constraints/compute.trustedImageProjects.

C.

1. Update the perimeter. 2. Configure the egressTo field to include the external Google Cloud project number as an allowed resource and the serviceName to compute.googleapis.com. 3. Configure the egressFrom field to set identityType to any_identity.

D.

1. Update the perimeter. 2. Configure the ingressFrom field to set identityType to any_identity. 3. Configure the ingressTo field to include the external Google Cloud project number as an allowed resource and the serviceName to compute.googleapis.com.

Question 87

You are responsible for the operation of your company's application that runs on Google Cloud. The database for the application will be maintained by an external partner. You need to give the partner team access to the database. This access must be restricted solely to the database and cannot extend to any other resources within your company's network. Your solution should follow Google-recommended practices. What should you do?

Options:

A.

Add a public IP address to the application's database. Create database users for each of the partner's employees. Securely distribute the credentials for these users to the partner team.

B.

Create accounts for the partner team in your corporate identity provider. Synchronize these accounts with Google Cloud Identity. Grant the accounts access to the database.

C.

Ask the partner team to set up Cloud Identity accounts within their own corporate environment and identity provider. Grant the partner’s Cloud Identity accounts access to the database.

D.

Configure Workforce Identity Federation for the partner. Connect the identity pool provider to the partner's identity provider. Grant the workforce pool resources access to the database.

Question 88

Your organization operates a hybrid cloud environment and has recently deployed a private Artifact Registry repository in Google Cloud. On-premises developers cannot resolve the Artifact Registry hostname and therefore cannot push or pull artifacts. You've verified the following:

Connectivity to Google Cloud is established by Cloud VPN or Cloud Interconnect.

No custom DNS configurations exist on-premises.

There is no route to the internet from the on-premises network.

You need to identify the cause and enable the developers to push and pull artifacts. What is likely causing the issue and what should you do to fix the issue?

Options:

A.

Artifact Registry requires external HTTP/HTTPS access. Create a new firewall rule allowing ingress traffic on ports 80 and 443 from the developer's IP ranges.

B.

Private Google Access is not enabled for the subnet hosting the Artifact Registry. Enable Private Google Access for the appropriate subnet.

C.

On-premises DNS servers lack the necessary records to resolve private Google API domains. Create DNS records for restricted.googleapis.com or private.googleapis.com pointing to Google's published IP ranges.

D.

Developers must be granted the artifactregistry.writer IAM role. Grant the relevant developer group this role.

Question 89

A customer wants to make it convenient for their mobile workforce to access a CRM web interface that is hosted on Google Cloud Platform (GCP). The CRM can only be accessed by someone on the corporate network. The customer wants to make it available over the internet. Your team requires an authentication layer in front of the application that supports two-factor authentication.

Which GCP product should the customer implement to meet these requirements?

Options:

A.

Cloud Identity-Aware Proxy

B.

Cloud Armor

C.

Cloud Endpoints

D.

Cloud VPN
