Google Professional-Cloud-Security-Engineer Google Cloud Certified - Professional Cloud Security Engineer Exam Practice Test

Google Cloud Certified - Professional Cloud Security Engineer Questions and Answers

Question 1

You manage your organization's Security Operations Center (SOC). You currently monitor and detect network traffic anomalies in your Google Cloud VPCs based on packet header information. However, you want the capability to explore network flows and their payload to aid investigations. Which Google Cloud product should you use?

Options:

A.

Marketplace IDS

B.

VPC Flow Logs

C.

VPC Service Controls logs

D.

Packet Mirroring

E.

Google Cloud Armor Deep Packet Inspection
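
For context, this is roughly how the Packet Mirroring setup referenced in option D is created with the gcloud CLI; the region, network, subnet, and collector forwarding-rule names are placeholders, and the exact flags may vary by CLI version.

    # Mirror traffic from a subnet to a collector internal load balancer (for example,
    # one fronting IDS appliances) so full packet payloads can be inspected.
    gcloud compute packet-mirrorings create soc-mirror \
        --region=us-central1 \
        --network=prod-vpc \
        --collector-ilb=ids-collector-rule \
        --mirrored-subnets=prod-subnet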

Question 2

You are a Cloud Identity administrator for your organization. In your Google Cloud environment, groups are used to manage user permissions. Each application team has a dedicated group. Your team is responsible for creating these groups, and the application teams can manage the team members on their own through the Google Cloud console. You must ensure that the application teams can only add users from within your organization to their groups.

What should you do?

Options:

A.

Change the configuration of the relevant groups in the Google Workspace Admin console to prevent external users from being added to the group.

B.

Set an Identity and Access Management (IAM) policy that includes a condition that restricts group membership to user principals that belong to your organization.

C.

Define an Identity and Access Management (IAM) deny policy that denies the assignment of principals that are outside your organization to the groups in scope.

D.

Export the Cloud Identity logs to BigQuery. Configure an alert for external members added to groups. Have the alert trigger a Cloud Function instance that removes the external members from the group.

Question 3

After completing a security vulnerability assessment, you learned that cloud administrators leave Google Cloud CLI sessions open for days. You need to reduce the risk of attackers who might exploit these open sessions by setting these sessions to the minimum duration.

What should you do?

Options:

A.

Set the session duration for the Google session control to one hour.

B.

Set the reauthentication frequency for the Google Cloud session control to one hour.

C.

Set the organization policy constraint constraints/iam.allowServiceAccountCredentialLifetimeExtension to one hour.

D.

Set the organization policy constraint constraints/iam.serviceAccountKeyExpiryHours to one hour and inheritFromParent to false.

Question 4

You are designing a new governance model for your organization's secrets that are stored in Secret Manager. Currently, secrets for Production and Non-Production applications are stored and accessed using service accounts. Your proposed solution must:

  • Provide granular access to secrets.
  • Give you control over the rotation schedules for the encryption keys that wrap your secrets.
  • Maintain environment separation.
  • Provide ease of management.

Which approach should you take?

Options:

A.

1. Use separate Google Cloud projects to store Production and Non-Production secrets.

2. Enforce access control to secrets using project-level Identity and Access Management (IAM) bindings.

3. Use customer-managed encryption keys to encrypt secrets.

B.

1. Use a single Google Cloud project to store both Production and Non-Production secrets.

2. Enforce access control to secrets using secret-level Identity and Access Management (IAM) bindings.

3. Use Google-managed encryption keys to encrypt secrets.

C.

1. Use separate Google Cloud projects to store Production and Non-Production secrets.

2. Enforce access control to secrets using secret-level Identity and Access Management (IAM) bindings.

3. Use Google-managed encryption keys to encrypt secrets.

D.

1. Use a single Google Cloud project to store both Production and Non-Production secrets.

2. Enforce access control to secrets using project-level Identity and Access Management (IAM) bindings.

3. Use customer-managed encryption keys to encrypt secrets.
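
As a point of reference for the CMEK and secret-level IAM ideas that appear in these options, a minimal sketch with the gcloud CLI is shown below; the project, key ring, key, secret, and service account names are all hypothetical.

    # Create a secret wrapped with a customer-managed KMS key (single user-managed location).
    gcloud secrets create prod-db-password \
        --project=prod-secrets \
        --replication-policy=user-managed \
        --locations=europe-west4 \
        --kms-key-name=projects/prod-secrets/locations/europe-west4/keyRings/secrets-kr/cryptoKeys/secrets-key

    # Grant access on the individual secret rather than the whole project.
    gcloud secrets add-iam-policy-binding prod-db-password \
        --project=prod-secrets \
        --member=serviceAccount:prod-app@prod-app.iam.gserviceaccount.com \
        --role=roles/secretmanager.secretAccessor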

Question 5

Your organization must comply with the regulation to keep instance logging data within Europe. Your workloads will be hosted in the Netherlands in region europe-west4 in a new project. You must configure Cloud Logging to keep your data in the country.

What should you do?

Options:

A.

Configure the organization policy constraint gcp.resourceLocations to europe-west4.

B.

Set the logging storage region to europe-west4 by using the gcloud CLI logging settings update command.

C.

Create a new log bucket in europe-west4, and redirect the _Default bucket to the new bucket.

D.

Configure log sink to export all logs into a Cloud Storage bucket in europe-west4.
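
For reference, the "logging settings update" command mentioned in option B is typically invoked along these lines; the organization ID is a placeholder and the command assumes a recent gcloud CLI.

    # Set the default log storage location so new log buckets are created in europe-west4.
    gcloud logging settings update \
        --organization=123456789012 \
        --storage-location=europe-west4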

Question 6

An organization adopts Google Cloud Platform (GCP) for application hosting services and needs guidance on setting up password requirements for their Cloud Identity account. The organization has a password policy requirement that corporate employee passwords must have a minimum number of characters.

Which Cloud Identity password guidelines can the organization use to inform their new requirements?

Options:

A.

Set the minimum length for passwords to be 8 characters.

B.

Set the minimum length for passwords to be 10 characters.

C.

Set the minimum length for passwords to be 12 characters.

D.

Set the minimum length for passwords to be 6 characters.

Question 7

Your Security team believes that a former employee of your company gained unauthorized access to Google Cloud resources some time in the past 2 months by using a service account key. You need to confirm the unauthorized access and determine the user activity. What should you do?

Options:

A.

Use Security Health Analytics to determine user activity.

B.

Use the Cloud Monitoring console to filter audit logs by user.

C.

Use the Cloud Data Loss Prevention API to query logs in Cloud Storage.

D.

Use the Logs Explorer to search for user activity.

Question 8

Your security team wants to implement a defense-in-depth approach to protect sensitive data stored in a Cloud Storage bucket. Your team has the following requirements:

  • The Cloud Storage bucket in Project A can only be readable from Project B.
  • The Cloud Storage bucket in Project A cannot be accessed from outside the network.
  • Data in the Cloud Storage bucket cannot be copied to an external Cloud Storage bucket.

What should the security team do?

Options:

A.

Enable domain restricted sharing in an organization policy, and enable uniform bucket-level access on the Cloud Storage bucket.

B.

Enable VPC Service Controls, create a perimeter around Projects A and B, and include the Cloud Storage API in the service perimeter configuration.

C.

Enable Private Access in both Project A and B's networks with strict firewall rules that allow communication between the networks.

D.

Enable VPC Peering between Project A and B's networks with strict firewall rules that allow communication between the networks.
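
For context, a VPC Service Controls perimeter of the kind described in option B is usually defined roughly as follows; the access policy ID and project numbers are placeholders.

    # Create a perimeter around Projects A and B and restrict the Cloud Storage API.
    gcloud access-context-manager perimeters create storage_perimeter \
        --policy=987654321 \
        --title="storage-perimeter" \
        --resources=projects/111111111111,projects/222222222222 \
        --restricted-services=storage.googleapis.com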

Question 9

Your company wants to determine what products they can build to help customers improve their credit scores depending on their age range. To achieve this, you need to join user information in the company's banking app with customers' credit score data received from a third party. While using this raw data will allow you to complete this task, it exposes sensitive data, which could be propagated into new systems.

This risk needs to be addressed using de-identification and tokenization with Cloud Data Loss Prevention while maintaining the referential integrity across the database. Which cryptographic token format should you use to meet these requirements?

Options:

A.

Deterministic encryption

B.

Secure, key-based hashes

C.

Format-preserving encryption

D.

Cryptographic hashing

Question 10

A company is backing up application logs to a Cloud Storage bucket shared with both analysts and the administrator. Analysts should only have access to logs that do not contain any personally identifiable information (PII). Log files containing PII should be stored in another bucket that is only accessible by the administrator.

What should you do?

Options:

A.

Use Cloud Pub/Sub and Cloud Functions to trigger a Data Loss Prevention scan every time a file is uploaded to the shared bucket. If the scan detects PII, have the function move the file into a Cloud Storage bucket that is only accessible by the administrator.

B.

Upload the logs to both the shared bucket and the bucket only accessible by the administrator. Create a job trigger using the Cloud Data Loss Prevention API. Configure the trigger to delete any files from the shared bucket that contain PII.

C.

On the bucket shared with both the analysts and the administrator, configure Object Lifecycle Management to delete objects that contain any PII.

D.

On the bucket shared with both the analysts and the administrator, configure a Cloud Storage Trigger that is only triggered when PII data is uploaded. Use Cloud Functions to capture the trigger and delete such files.

Question 11

Your privacy team uses crypto-shredding (deleting encryption keys) as a strategy to delete personally identifiable information (PII). You need to implement this practice on Google Cloud while still utilizing the majority of the platform’s services and minimizing operational overhead. What should you do?

Options:

A.

Use client-side encryption before sending data to Google Cloud, and delete encryption keys on-premises

B.

Use Cloud External Key Manager to delete specific encryption keys.

C.

Use customer-managed encryption keys to delete specific encryption keys.

D.

Use Google default encryption to delete specific encryption keys.

Question 12

You need to implement an encryption-at-rest strategy that protects sensitive data and reduces key management complexity for non-sensitive data. Your solution has the following requirements:

  • Schedule key rotation for sensitive data.
  • Control which region the encryption keys for sensitive data are stored in.
  • Minimize the latency to access encryption keys for both sensitive and non-sensitive data.

What should you do?

Options:

A.

Encrypt non-sensitive data and sensitive data with Cloud External Key Manager.

B.

Encrypt non-sensitive data and sensitive data with Cloud Key Management Service.

C.

Encrypt non-sensitive data with Google default encryption, and encrypt sensitive data with Cloud External Key Manager.

D.

Encrypt non-sensitive data with Google default encryption, and encrypt sensitive data with Cloud Key Management Service.
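
To illustrate the Cloud KMS controls these options refer to (regional key storage plus scheduled rotation), here is a minimal sketch; the key ring, key, rotation period, and timestamp are hypothetical values.

    # Regional key ring and key with automatic rotation for the sensitive data.
    gcloud kms keyrings create sensitive-kr --location=europe-west4

    gcloud kms keys create sensitive-key \
        --location=europe-west4 \
        --keyring=sensitive-kr \
        --purpose=encryption \
        --rotation-period=90d \
        --next-rotation-time=2025-01-01T00:00:00Z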

Question 13

You need to centralize your team’s logs for production projects. You want your team to be able to search and analyze the logs using Logs Explorer. What should you do?

Options:

A.

Enable Cloud Monitoring workspace, and add the production projects to be monitored.

B.

Use Logs Explorer at the organization level and filter for production project logs.

C.

Create an aggregate org sink at the parent folder of the production projects, and set the destination to a Cloud Storage bucket.

D.

Create an aggregate org sink at the parent folder of the production projects, and set the destination to a logs bucket.
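
For reference, an aggregated sink that routes logs from a folder's child projects into a central logs bucket looks roughly like this; the folder ID, central project, bucket, and filter are placeholders.

    # Aggregated folder-level sink with a Cloud Logging bucket as the destination.
    gcloud logging sinks create prod-central-sink \
        logging.googleapis.com/projects/central-logging/locations/global/buckets/prod-logs \
        --folder=345678901234 \
        --include-children \
        --log-filter='severity>=DEFAULT'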

Question 14

A patch for a vulnerability has been released, and a DevOps team needs to update their running containers in Google Kubernetes Engine (GKE).

How should the DevOps team accomplish this?

Options:

A.

Use Puppet or Chef to push out the patch to the running container.

B.

Verify that auto upgrade is enabled; if so, Google will upgrade the nodes in a GKE cluster.

C.

Update the application code or apply a patch, build a new image, and redeploy it.

D.

Configure containers to automatically upgrade when the base image is available in Container Registry.

Question 15

Your organization recently activated the Security Command Center (SCC) standard tier. There are a few Cloud Storage buckets that were accidentally made accessible to the public. You need to investigate the impact of the incident and remediate it.

What should you do?

Options:

A.

1. Remove the Identity and Access Management (IAM) binding granting access to allUsers from the buckets.

2. Apply the organization policy constraint storage.uniformBucketLevelAccess to prevent regressions.

3. Query the data access logs to report on unauthorized access.

B.

1. Change bucket permissions to limit access.

2. Query the data access audit logs for any unauthorized access to the buckets.

3. After the misconfiguration is corrected, mute the finding in the Security Command Center.

C.

1. Change permissions to limit access for authorized users.

2. Enforce a VPC Service Controls perimeter around all the production projects to immediately stop any unauthorized access.

3. Review the administrator activity audit logs to report on any unauthorized access.

D.

1. Change the bucket permissions to limit access.

2. Query the bucket usage logs to report on unauthorized access to the data.

3. Enforce the organization policy constraint storage.publicAccessPrevention to avoid regressions.

Question 16

You are responsible for managing your company’s identities in Google Cloud. Your company enforces 2-Step Verification (2SV) for all users. You need to reset a user’s access, but the user lost their second factor for 2SV. You want to minimize risk. What should you do?

Options:

A.

On the Google Admin console, select the appropriate user account, and generate a backup code to allow the user to sign in. Ask the user to update their second factor.

B.

On the Google Admin console, temporarily disable the 2SV requirements for all users. Ask the user to log in and add their new second factor to their account. Re-enable the 2SV requirement for all users.

C.

On the Google Admin console, select the appropriate user account, and temporarily disable 2SV for this account. Ask the user to update their second factor, and then re-enable 2SV for this account.

D.

On the Google Admin console, use a super administrator account to reset the user account's credentials. Ask the user to update their credentials after their first login.

Question 17

A customer wants to run a batch processing system on VMs and store the output files in a Cloud Storage bucket. The networking and security teams have decided that no VMs may reach the public internet.

How should this be accomplished?

Options:

A.

Create a firewall rule to block internet traffic from the VM.

B.

Provision a NAT Gateway to access the Cloud Storage API endpoint.

C.

Enable Private Google Access on the VPC.

D.

Mount a Cloud Storage bucket as a local filesystem on every VM.

Question 18

You are working with a client who plans to migrate their data to Google Cloud. You are responsible for recommending an encryption service to manage their encrypted keys. You have the following requirements:

  • The master key must be rotated at least once every 45 days.
  • The solution that stores the master key must be FIPS 140-2 Level 3 validated.
  • The master key must be stored in multiple regions within the US for redundancy.

Which solution meets these requirements?

Options:

A.

Customer-managed encryption keys with Cloud Key Management Service

B.

Customer-managed encryption keys with Cloud HSM

C.

Customer-supplied encryption keys

D.

Google-managed encryption keys

Question 19

A company is using Google Kubernetes Engine (GKE) with container images of a mission-critical application. The company wants to scan the images for known security issues and securely share the report with the security team without exposing them outside Google Cloud.

What should you do?

Options:

A.

1. Enable Container Threat Detection in the Security Command Center Premium tier.

2. Upgrade all clusters that are not on a supported version of GKE to the latest possible GKE version.

3. View and share the results from the Security Command Center.

B.

1. Use an open source tool in Cloud Build to scan the images.

2. Upload reports to publicly accessible buckets in Cloud Storage by using gsutil.

3. Share the scan report link with your security department.

C.

1. Enable vulnerability scanning in the Artifact Registry settings.

2. Use Cloud Build to build the images.

3. Push the images to the Artifact Registry for automatic scanning.

4. View the reports in the Artifact Registry.

D.

1. Get a GitHub subscription.

2. Build the images in Cloud Build and store them in GitHub for automatic scanning.

3. Download the report from GitHub and share it with the security team.
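
For context, Artifact Registry's automatic scanning described in option C is driven by the Container Scanning API; the project and repository names below are placeholders, and the flags assume a recent gcloud CLI.

    # Enable automatic vulnerability scanning for images pushed to Artifact Registry.
    gcloud services enable containerscanning.googleapis.com --project=app-images

    # Review scan results (occurrences) for images in a repository.
    gcloud artifacts docker images list \
        us-central1-docker.pkg.dev/app-images/app-repo \
        --show-occurrences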

Question 20

A database administrator notices malicious activities within their Cloud SQL instance. The database administrator wants to monitor the API calls that read the configuration or metadata of resources. Which logs should the database administrator review?

Options:

A.

Admin Activity

B.

System Event

C.

Access Transparency

D.

Data Access

Question 21

You run applications on Cloud Run. You already enabled container analysis for vulnerability scanning. However, you are concerned about the lack of control on the applications that are deployed. You must ensure that only trusted container images are deployed on Cloud Run.

What should you do?

Choose 2 answers

Options:

A.

Enable Binary Authorization on the existing Kubernetes cluster.

B.

Set the organization policy constraint constraints/run.allowedBinaryAuthorizationPolicies to the list of allowed Binary Authorization policy names.

C.

Set the organization policy constraint constraints/compute.trustedImageProjects to the list of projects that contain the trusted container images.

D.

Enable Binary Authorization on the existing Cloud Run service.

E.

Use Cloud Run breakglass to deploy an image that meets the Binary Authorization policy by default.
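
As a reference for the Cloud Run option above, attaching a Binary Authorization policy to an existing service is typically a one-line update; the service name and region are placeholders.

    # Require images to pass the default Binary Authorization policy before deployment.
    gcloud run services update payments-api \
        --region=us-central1 \
        --binary-authorization=default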

Question 22

A customer has 300 engineers. The company wants to grant different levels of access and efficiently manage IAM permissions between users in the development and production environment projects.

Which two steps should the company take to meet these requirements? (Choose two.)

Options:

A.

Create a project with multiple VPC networks for each environment.

B.

Create a folder for each development and production environment.

C.

Create a Google Group for the Engineering team, and assign permissions at the folder level.

D.

Create an Organizational Policy constraint for each folder environment.

E.

Create projects for each environment, and grant IAM rights to each engineering user.

Question 23

You manage your organization’s Security Operations Center (SOC). You currently monitor and detect network traffic anomalies in your VPCs based on network logs. However, you want to explore your environment using network payloads and headers. Which Google Cloud product should you use?

Options:

A.

Cloud IDS

B.

VPC Service Controls logs

C.

VPC Flow Logs

D.

Google Cloud Armor

E.

Packet Mirroring

Question 24

Your company has deployed an application on Compute Engine. The application is accessible by clients on port 587. You need to balance the load between the different instances running the application. The connection should be secured using TLS, and terminated by the Load Balancer.

What type of Load Balancing should you use?

Options:

A.

Network Load Balancing

B.

HTTP(S) Load Balancing

C.

TCP Proxy Load Balancing

D.

SSL Proxy Load Balancing

Question 25

You have stored company-approved compute images in a single Google Cloud project that is used as an image repository. This project is protected with VPC Service Controls and exists in the perimeter along with other projects in your organization. This lets other projects deploy images from the image repository project. A team needs to deploy a third-party disk image that is stored in an external Google Cloud organization. You need to grant read access to the disk image so that it can be deployed into the perimeter.

What should you do?

Options:

A.

1. Update the perimeter.

2. Configure the egressTo field to set identityType to any_identity.

3. Configure the egressFrom field to include the external Google Cloud project number as an allowed resource and the serviceName to compute.googleapis.com.

B.

Allow the external project by using the organization policy constraints/compute.trustedImageProjects.

C.

1. Update the perimeter.

2. Configure the egressTo field to include the external Google Cloud project number as an allowed resource and the serviceName to compute.googleapis.com.

3. Configure the egressFrom field to set identityType to any_identity.

D.

1. Update the perimeter.

2. Configure the ingressFrom field to set identityType to any_identity.

3. Configure the ingressTo field to include the external Google Cloud project number as an allowed resource and the serviceName to compute.googleapis.com.
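
Because several of these options hinge on the egressFrom/egressTo structure, here is a rough sketch of how such an egress rule is commonly expressed and applied; the perimeter name, policy ID, external project number, and file name are placeholders, and the YAML layout assumes the current Access Context Manager format.

    # Write the egress rule to a file (hypothetical external project number): any
    # identity in the perimeter may read the external project's Compute Engine resources.
    cat > egress.yaml <<'EOF'
    - egressFrom:
        identityType: ANY_IDENTITY
      egressTo:
        operations:
        - serviceName: compute.googleapis.com
          methodSelectors:
          - method: "*"
        resources:
        - projects/333333333333
    EOF

    # Apply the rule to the existing perimeter.
    gcloud access-context-manager perimeters update image_perimeter \
        --policy=987654321 \
        --set-egress-policies=egress.yaml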

Question 26

A company is running workloads in a dedicated server room. They must only be accessed from within the private company network. You need to connect to these workloads from Compute Engine instances within a Google Cloud Platform project.

Which two approaches can you take to meet the requirements? (Choose two.)

Options:

A.

Configure the project with Cloud VPN.

B.

Configure the project with Shared VPC.

C.

Configure the project with Cloud Interconnect.

D.

Configure the project with VPC peering.

E.

Configure all Compute Engine instances with Private Access.

Question 27

You plan to deploy your cloud infrastructure using a CI/CD cluster hosted on Compute Engine. You want to minimize the risk of its credentials being stolen by a third party. What should you do?

Options:

A.

Create a dedicated Cloud Identity user account for the cluster. Use a strong self-hosted vault solution to store the user's temporary credentials.

B.

Create a dedicated Cloud Identity user account for the cluster. Enable the constraints/iam.disableServiceAccountCreation organization policy at the project level.

C.

Create a custom service account for the cluster. Enable the constraints/iam.disableServiceAccountKeyCreation organization policy at the project level.

D.

Create a custom service account for the cluster. Enable the constraints/iam.allowServiceAccountCredentialLifetimeExtension organization policy at the project level.
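
For reference, the service-account-key constraint mentioned in option C is a boolean organization policy that can be enforced at the project level roughly as follows; the project ID is a placeholder.

    # Block creation of user-managed (downloadable) service account keys in this project.
    gcloud resource-manager org-policies enable-enforce \
        iam.disableServiceAccountKeyCreation \
        --project=cicd-project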

Question 28

You are part of a security team investigating a compromised service account key. You need to audit which new resources were created by the service account.

What should you do?

Options:

A.

Query Data Access logs.

B.

Query Admin Activity logs.

C.

Query Access Transparency logs.

D.

Query Stackdriver Monitoring Workspace.
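
For context, Admin Activity audit entries for a specific principal can be pulled with a filter along these lines; the service account, project, and time window are placeholders.

    # List Admin Activity audit log entries recorded for the suspect service account.
    gcloud logging read \
        'logName:"cloudaudit.googleapis.com%2Factivity" AND protoPayload.authenticationInfo.principalEmail="suspect-sa@my-project.iam.gserviceaccount.com"' \
        --project=my-project \
        --freshness=60d \
        --limit=100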

Question 29

Which two implied firewall rules are defined on a VPC network? (Choose two.)

Options:

A.

A rule that allows all outbound connections

B.

A rule that denies all inbound connections

C.

A rule that blocks all inbound port 25 connections

D.

A rule that blocks all outbound connections

E.

A rule that allows all inbound port 80 connections

Question 30

Your organization uses BigQuery to process highly sensitive, structured datasets. Following the "need to know" principle, you need to create the Identity and Access Management (IAM) design to meet the needs of these users:

• Business user: must access curated reports.

• Data engineer: must administer the data lifecycle in the platform.

• Security operator: must review user activity on the data platform.

What should you do?

Options:

A.

Configure Data Access logs for BigQuery services, and grant the Project Viewer role to security operators.

B.

Generate a CSV data file based on the business user's needs, and send the data to their email addresses.

C.

Create curated tables in a separate dataset and assign the role roles/bigquery.dataViewer.

D.

Set row-based access control based on the "region" column, and filter the record from the United States for data engineers.

Question 31

A manager wants to start retaining security event logs for 2 years while minimizing costs. You write a filter to select the appropriate log entries.

Where should you export the logs?

Options:

A.

BigQuery datasets

B.

Cloud Storage buckets

C.

StackDriver logging

D.

Cloud Pub/Sub topics

Question 32

Last week, a company deployed a new App Engine application that writes logs to BigQuery. No other workloads are running in the project. You need to validate that all data written to BigQuery was done using the App Engine Default Service Account.

What should you do?

Options:

A.

1. Use StackDriver Logging and filter on BigQuery Insert Jobs.

2. Click on the email address in line with the App Engine Default Service Account in the authentication field.

3. Click Hide Matching Entries.

4. Make sure the resulting list is empty.

B.

1. Use StackDriver Logging and filter on BigQuery Insert Jobs.

2. Click on the email address in line with the App Engine Default Service Account in the authentication field.

3. Click Show Matching Entries.

4. Make sure the resulting list is empty.

C.

1. In BigQuery, select the related dataset.

2. Make sure the App Engine Default Service Account is the only account that can write to the dataset.

D.

1. Go to the IAM section on the project.

2. Validate that the App Engine Default Service Account is the only account that has a role that can write to BigQuery.

Question 33

You need to enable VPC Service Controls and allow changes to perimeters in existing environments without preventing access to resources. Which VPC Service Controls mode should you use?

Options:

A.

Cloud Run

B.

Native

C.

Enforced

D.

Dry run

Question 34

A company is running their webshop on Google Kubernetes Engine and wants to analyze customer transactions in BigQuery. You need to ensure that no credit card numbers are stored in BigQuery.

What should you do?

Options:

A.

Create a BigQuery view with regular expressions matching credit card numbers to query and delete affected rows.

B.

Use the Cloud Data Loss Prevention API to redact related infoTypes before data is ingested into BigQuery.

C.

Leverage Security Command Center to scan for the assets of type Credit Card Number in BigQuery.

D.

Enable Cloud Identity-Aware Proxy to filter out credit card numbers before storing the logs in BigQuery.

Question 35

You have created an OS image that is hardened per your organization’s security standards and is being stored in a project managed by the security team. As a Google Cloud administrator, you need to make sure all VMs in your Google Cloud organization can only use that specific OS image while minimizing operational overhead. What should you do? (Choose two.)

Options:

A.

Grant users the compute.imageUser role in their own projects.

B.

Grant users the compute.imageUser role in the OS image project.

C.

Store the image in every project that is spun up in your organization.

D.

Set up an image access organization policy constraint, and list the security team managed project in the projects allow list.

E.

Remove VM instance creation permission from users of the projects, and only allow you and your team to create VM instances.
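
To illustrate the image-access constraint referenced in option D, the trusted image projects policy is a list constraint that can be set roughly like this; the organization ID and project name are placeholders.

    # Only allow boot disk images that come from the security team's image project.
    gcloud resource-manager org-policies allow \
        compute.trustedImageProjects \
        projects/security-images-project \
        --organization=123456789012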

Question 36

You manage a mission-critical workload for your organization, which is in a highly regulated industry. The workload uses Compute Engine VMs to analyze and process the sensitive data after it is uploaded to Cloud Storage from the endpoint computers. Your compliance team has detected that this workload does not meet the data protection requirements for sensitive data. You need to meet these requirements:

• Manage the data encryption key (DEK) outside the Google Cloud boundary.

• Maintain full control of encryption keys through a third-party provider.

• Encrypt the sensitive data before uploading it to Cloud Storage.

• Decrypt the sensitive data during processing in the Compute Engine VMs.

• Encrypt the sensitive data in memory while in use in the Compute Engine VMs.

What should you do?

Choose 2 answers

Options:

A.

Create a VPC Service Controls service perimeter across your existing Compute Engine VMs and Cloud Storage buckets

B.

Migrate the Compute Engine VMs to Confidential VMs to access the sensitive data.

C.

Configure Cloud External Key Manager to encrypt the sensitive data before it is uploaded to Cloud Storage and decrypt the sensitive data after it is downloaded into your VMs

D.

Create Confidential VMs to access the sensitive data.

E.

Configure Customer Managed Encryption Keys to encrypt the sensitive data before it is uploaded to Cloud Storage, and decrypt the sensitive data after it is downloaded into your VMs.

Question 37

When working with agents in a support center via online chat, an organization’s customers often share pictures of their documents with personally identifiable information (PII). The organization that owns the support center is concerned that the PII is being stored in their databases as part of the regular chat logs they retain for review by internal or external analysts for customer service trend analysis.

Which Google Cloud solution should the organization use to help resolve this concern for the customer while still maintaining data utility?

Options:

A.

Use Cloud Key Management Service (KMS) to encrypt the PII data shared by customers before storing it for analysis.

B.

Use Object Lifecycle Management to make sure that all chat records with PII in them are discarded and not saved for analysis.

C.

Use the image inspection and redaction actions of the DLP API to redact PII from the images before storing them for analysis.

D.

Use the generalization and bucketing actions of the DLP API solution to redact PII from the texts before storing them for analysis.

Question 38

You are migrating an on-premises data warehouse to BigQuery, Cloud SQL, and Cloud Storage. You need to configure security services in the data warehouse. Your company compliance policies mandate that the data warehouse must:

• Protect data at rest with full lifecycle management on cryptographic keys.

• Implement a separate key management provider from data management.

• Provide visibility into all encryption key requests.

What services should be included in the data warehouse implementation?

Choose 2 answers

Options:

A.

Customer-managed encryption keys

B.

Customer-Supplied Encryption Keys

C.

Key Access Justifications

D.

Access Transparency and Approval

E.

Cloud External Key Manager

Question 39

You recently joined the networking team supporting your company's Google Cloud implementation. You are tasked with familiarizing yourself with the firewall rules configuration and providing recommendations based on your networking and Google Cloud experience. What product should you recommend to detect firewall rules that are overlapped by attributes from other firewall rules with higher or equal priority?

Options:

A.

Security Command Center

B.

Firewall Rules Logging

C.

VPC Flow Logs

D.

Firewall Insights

Question 40

Your company’s chief information security officer (CISO) is requiring business data to be stored in specific locations due to regulatory requirements that affect the company’s global expansion plans. After working on a plan to implement this requirement, you determine the following:

  • The services in scope are included in the Google Cloud data residency requirements.
  • The business data remains within specific locations under the same organization.
  • The folder structure can contain multiple data residency locations.
  • The projects are aligned to specific locations.

You plan to use the Resource Location Restriction organization policy constraint with very granular control. At which level in the hierarchy should you set the constraint?

Options:

A.

Organization

B.

Resource

C.

Project

D.

Folder

Question 41

You are in charge of migrating a legacy application from your company datacenters to GCP before the current maintenance contract expires. You do not know what ports the application is using and no documentation is available for you to check. You want to complete the migration without putting your environment at risk.

What should you do?

Options:

A.

Migrate the application into an isolated project using a “Lift & Shift” approach. Enable all internal TCP traffic using VPC Firewall rules. Use VPC Flow logs to determine what traffic should be allowed for the application to work properly.

B.

Migrate the application into an isolated project using a “Lift & Shift” approach in a custom network. Disable all traffic within the VPC and look at the Firewall logs to determine what traffic should be allowed for the application to work properly.

C.

Refactor the application into a micro-services architecture in a GKE cluster. Disable all traffic from outside the cluster using Firewall Rules. Use VPC Flow logs to determine what traffic should be allowed for the application to work properly.

D.

Refactor the application into a micro-services architecture hosted in Cloud Functions in an isolated project. Disable all traffic from outside your project using Firewall Rules. Use VPC Flow logs to determine what traffic should be allowed for the application to work properly.

Question 42

You want to use the gcloud command-line tool to authenticate using a third-party single sign-on (SSO) SAML identity provider. Which options are necessary to ensure that authentication is supported by the third-party identity provider (IdP)? (Choose two.)

Options:

A.

SSO SAML as a third-party IdP

B.

Identity Platform

C.

OpenID Connect

D.

Identity-Aware Proxy

E.

Cloud Identity

Question 43

Your security team wants to reduce the risk of user-managed keys being mismanaged and compromised. To achieve this, you need to prevent developers from creating user-managed service account keys for projects in their organization. How should you enforce this?

Options:

A.

Configure Secret Manager to manage service account keys.

B.

Enable an organization policy to disable service accounts from being created.

C.

Enable an organization policy to prevent service account keys from being created.

D.

Remove the iam.serviceAccounts.getAccessToken permission from users.

Question 44

A customer implements Cloud Identity-Aware Proxy for their ERP system hosted on Compute Engine. Their security team wants to add a security layer so that the ERP systems only accept traffic from Cloud Identity- Aware Proxy.

What should the customer do to meet these requirements?

Options:

A.

Make sure that the ERP system can validate the JWT assertion in the HTTP requests.

B.

Make sure that the ERP system can validate the identity headers in the HTTP requests.

C.

Make sure that the ERP system can validate the x-forwarded-for headers in the HTTP requests.

D.

Make sure that the ERP system can validate the user’s unique identifier headers in the HTTP requests.

Question 45

Your organization’s Google Cloud VMs are deployed via an instance template that configures them with a public IP address in order to host web services for external users. The VMs reside in a service project that is attached to a host (VPC) project containing one custom Shared VPC for the VMs. You have been asked to reduce the exposure of the VMs to the internet while continuing to service external users. You have already recreated the instance template without a public IP address configuration to launch the managed instance group (MIG). What should you do?

Options:

A.

Deploy a Cloud NAT Gateway in the service project for the MIG.

B.

Deploy a Cloud NAT Gateway in the host (VPC) project for the MIG.

C.

Deploy an external HTTP(S) load balancer in the service project with the MIG as a backend.

D.

Deploy an external HTTP(S) load balancer in the host (VPC) project with the MIG as a backend.

Question 46

Your company has been creating users manually in Cloud Identity to provide access to Google Cloud resources. Due to continued growth of the environment, you want to authorize the Google Cloud Directory Sync (GCDS) instance and integrate it with your on-premises LDAP server to onboard hundreds of users. You are required to:

  • Replicate user and group lifecycle changes from the on-premises LDAP server in Cloud Identity.
  • Disable any manually created users in Cloud Identity.

You have already configured the LDAP search attributes to include the users and security groups in scope for Google Cloud. What should you do next to complete this solution?

Options:

A.

1. Configure the option to suspend domain users not found in LDAP.

2. Set up a recurring GCDS task.

B.

1. Configure the option to delete domain users not found in LDAP.

2. Run GCDS after user and group lifecycle changes.

C.

1. Configure the LDAP search attributes to exclude manually created Cloud Identity users not found in LDAP.

2. Set up a recurring GCDS task.

D.

1. Configure the LDAP search attributes to exclude manually created Cloud Identity users not found in LDAP.

2. Run GCDS after user and group lifecycle changes.

Question 47

A company is deploying their application on Google Cloud Platform. Company policy requires long-term data to be stored using a solution that can automatically replicate data over at least two geographic places.

Which Storage solution are they allowed to use?

Options:

A.

Cloud Bigtable

B.

Cloud BigQuery

C.

Compute Engine SSD Disk

D.

Compute Engine Persistent Disk

Question 48

Applications often require access to “secrets” - small pieces of sensitive data at build or run time. The administrator managing these secrets on GCP wants to keep a track of “who did what, where, and when?” within their GCP projects.

Which two log streams would provide the information that the administrator is looking for? (Choose two.)

Options:

A.

Admin Activity logs

B.

System Event logs

C.

Data Access logs

D.

VPC Flow logs

E.

Agent logs

Question 49

You perform a security assessment on a customer architecture and discover that multiple VMs have public IP addresses. After providing a recommendation to remove the public IP addresses, you are told those VMs need to communicate to external sites as part of the customer's typical operations. What should you recommend to reduce the need for public IP addresses in your customer's VMs?

Options:

A.

Google Cloud Armor

B.

Cloud NAT

C.

Cloud Router

D.

Cloud VPN
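
For reference, the Cloud NAT option is normally built from a Cloud Router plus a NAT configuration; the network, region, and names below are placeholders.

    # Give internal-only VMs outbound internet access without public IP addresses.
    gcloud compute routers create nat-router \
        --network=prod-vpc \
        --region=us-central1

    gcloud compute routers nats create prod-nat \
        --router=nat-router \
        --region=us-central1 \
        --auto-allocate-nat-external-ips \
        --nat-all-subnet-ip-ranges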

Question 50

An organization is starting to move its infrastructure from its on-premises environment to Google Cloud Platform (GCP). The first step the organization wants to take is to migrate its ongoing data backup and disaster recovery solutions to GCP. The organization's on-premises production environment is going to be the next phase for migration to GCP. Stable networking connectivity between the on-premises environment and GCP is also being implemented.

Which GCP solution should the organization use?

Options:

A.

BigQuery using a data pipeline job with continuous updates via Cloud VPN

B.

Cloud Storage using a scheduled task and gsutil via Cloud Interconnect

C.

Compute Engine Virtual Machines using Persistent Disk via Cloud Interconnect

D.

Cloud Datastore using regularly scheduled batch upload jobs via Cloud VPN

Question 51

A large financial institution is moving its Big Data analytics to Google Cloud Platform. They want to have maximum control over the encryption process of data stored at rest in BigQuery.

What technique should the institution use?

Options:

A.

Use Cloud Storage as a federated Data Source.

B.

Use a Cloud Hardware Security Module (Cloud HSM).

C.

Customer-managed encryption keys (CMEK).

D.

Customer-supplied encryption keys (CSEK).

Question 52

You have the following resource hierarchy. There is an organization policy at each node in the hierarchy as shown. Which load balancer types are denied in VPC A?

Options:

A.

All load balancer types are denied in accordance with the global node’s policy.

B.

INTERNAL_TCP_UDP, INTERNAL_HTTP_HTTPS is denied in accordance with the folder’s policy.

C.

EXTERNAL_TCP_PROXY, EXTERNAL_SSL_PROXY are denied in accordance with the project’s policy.

D.

EXTERNAL_TCP_PROXY, EXTERNAL_SSL_PROXY, INTERNAL_TCP_UDP, and INTERNAL_HTTP_HTTPS are denied in accordance with the folder and project’s policies.

Question 53

You are auditing all your Google Cloud resources in the production project. You want to identify all principals who can change firewall rules.

What should you do?

Options:

A.

Use Policy Analyzer to query the permissions compute.firewalls.create, compute.firewalls.update, or compute.firewalls.delete.

B.

Reference the Security Health Analytics - Firewall Vulnerability Findings in the Security Command Center.

C.

Use Policy Analyzer to query the permissions compute.firewalls.get or compute.firewalls.list.

D.

Use Firewall Insights to understand your firewall rules usage patterns.
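
For context, Policy Analyzer queries of the kind described in these options are run through Cloud Asset Inventory; the organization ID is a placeholder and the flag usage assumes a recent gcloud CLI.

    # Ask which principals hold firewall-modifying permissions anywhere in the organization.
    gcloud asset analyze-iam-policy \
        --organization=123456789012 \
        --permissions=compute.firewalls.create,compute.firewalls.update,compute.firewalls.delete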

Question 54

Your team needs to make sure that a Compute Engine instance does not have access to the internet or to any Google APIs or services.

Which two settings must remain disabled to meet these requirements? (Choose two.)

Options:

A.

Public IP

B.

IP Forwarding

C.

Private Google Access

D.

Static routes

E.

IAM Network User Role

Question 55

You manage one of your organization's Google Cloud projects (Project A). A VPC Service Controls (VPC SC) perimeter is blocking API access requests to this project, including Pub/Sub. A resource running under a service account in another project (Project B) needs to collect messages from a Pub/Sub topic in your project. Project B is not included in a VPC SC perimeter. You need to provide access from Project B to the Pub/Sub topic in Project A using the principle of least privilege.

What should you do?

Options:

A.

Configure an ingress policy for the perimeter in Project A and allow access for the service account in Project B to collect messages.

B.

Create an access level that allows a developer in Project B to subscribe to the Pub/Sub topic that is located in Project A.

C.

Create a perimeter bridge between Project A and Project B to allow the required communication between both projects.

D.

Remove the Pub/Sub API from the list of restricted services in the perimeter configuration for Project A.

Question 56

You are a member of your company's security team. You have been asked to reduce your Linux bastion host external attack surface by removing all public IP addresses. Site Reliability Engineers (SREs) require access to the bastion host from public locations so they can access the internal VPC while off-site. How should you enable this access?

Options:

A.

Implement Cloud VPN for the region where the bastion host lives.

B.

Implement OS Login with 2-step verification for the bastion host.

C.

Implement Identity-Aware Proxy TCP forwarding for the bastion host.

D.

Implement Google Cloud Armor in front of the bastion host.
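
As a reference for the IAP TCP forwarding option, the usual pattern is a firewall rule for Google's published IAP range plus tunneled SSH; the network, rule, instance, and zone names are placeholders (35.235.240.0/20 is the documented IAP source range).

    # Allow IAP-originated SSH to reach the bastion host.
    gcloud compute firewall-rules create allow-iap-ssh \
        --network=prod-vpc \
        --direction=INGRESS \
        --action=ALLOW \
        --rules=tcp:22 \
        --source-ranges=35.235.240.0/20

    # SREs connect through the tunnel; the bastion needs no public IP address.
    gcloud compute ssh bastion-host --zone=us-central1-a --tunnel-through-iap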

Question 57

You are a security engineer at a finance company. Your organization plans to store data on Google Cloud, but your leadership team is worried about the security of their highly sensitive data. Specifically, your company is concerned about internal Google employees' ability to access your company's data on Google Cloud. What solution should you propose?

Options:

A.

Use customer-managed encryption keys.

B.

Use Google's Identity and Access Management (IAM) service to manage access controls on Google Cloud.

C.

Enable Admin activity logs to monitor access to resources.

D.

Enable Access Transparency logs with Access Approval requests for Google employees.

Question 58

Your company is using GSuite and has developed an application meant for internal usage on Google App Engine. You need to make sure that an external user cannot gain access to the application even when an employee’s password has been compromised.

What should you do?

Options:

A.

Enforce 2-factor authentication in GSuite for all users.

B.

Configure Cloud Identity-Aware Proxy for the App Engine Application.

C.

Provision user passwords using GSuite Password Sync.

D.

Configure Cloud VPN between your private network and GCP.

Question 59

You need to provide a corporate user account in Google Cloud for each of your developers and operational staff who need direct access to GCP resources. Corporate policy requires you to maintain the user identity in a third-party identity management provider and leverage single sign-on. You learn that a significant number of users are using their corporate domain email addresses for personal Google accounts, and you need to follow Google recommended practices to convert existing unmanaged users to managed accounts.

Which two actions should you take? (Choose two.)

Options:

A.

Use Google Cloud Directory Sync to synchronize your local identity management system to Cloud Identity.

B.

Use the Google Admin console to view which managed users are using a personal account for their recovery email.

C.

Add users to your managed Google account and force users to change the email addresses associated with their personal accounts.

D.

Use the Transfer Tool for Unmanaged Users (TTUU) to find users with conflicting accounts and ask them to transfer their personal Google accounts.

E.

Send an email to all of your employees and ask those users with corporate email addresses for personal Google accounts to delete the personal accounts immediately.

Question 60

You are the project owner for a regulated workload that runs in a project you own and manage as an Identity and Access Management (IAM) admin. For an upcoming audit, you need to provide access reviews evidence. Which tool should you use?

Options:

A.

Policy Troubleshooter

B.

Policy Analyzer

C.

IAM Recommender

D.

Policy Simulator

Question 61

You are backing up application logs to a shared Cloud Storage bucket that is accessible to both the administrator and analysts. Analysts should not have access to logs that contain any personally identifiable information (PII). Log files containing PII should be stored in another bucket that is only accessible to the administrator. What should you do?

Options:

A.

Upload the logs to both the shared bucket and the bucket with PII that is only accessible to the administrator. Use the Cloud Data Loss Prevention API to create a job trigger. Configure the trigger to delete any files that contain PII from the shared bucket.

B.

On the shared bucket, configure Object Lifecycle Management to delete objects that contain PII.

C.

On the shared bucket, configure a Cloud Storage trigger that is only triggered when PII is uploaded. Use Cloud Functions to capture the trigger and delete the files that contain PII.

D.

Use Pub/Sub and Cloud Functions to trigger a Cloud Data Loss Prevention scan every time a file is uploaded to the administrator's bucket. If the scan does not detect PII, have the function move the objects into the shared Cloud Storage bucket.

Question 62

You are implementing data protection by design and in accordance with GDPR requirements. As part of design reviews, you are told that you need to manage the encryption key for a solution that includes workloads for Compute Engine, Google Kubernetes Engine, Cloud Storage, BigQuery, and Pub/Sub. Which option should you choose for this implementation?

Options:

A.

Cloud External Key Manager

B.

Customer-managed encryption keys

C.

Customer-supplied encryption keys

D.

Google default encryption

Question 63

Your customer has an on-premises Public Key Infrastructure (PKI) with a certificate authority (CA). You need to issue certificates for many HTTP load balancer frontends. The on-premises PKI should be minimally affected due to many manual processes, and the solution needs to scale.

What should you do?

Options:

A.

Use Certificate Manager to issue Google-managed public certificates and configure them at the HTTP load balancers in your infrastructure as code (IaC).

B.

Use Certificate Manager to import certificates issued from the on-premises PKI for the frontends. Leverage the gcloud tool for importing.

C.

Use a subordinate CA in the Google Certificate Authority Service from the on-premises PKI system to issue certificates for the load balancers.

D.

Use the web applications with PKCS12 certificates issued from a subordinate CA based on OpenSSL on-premises. Use the gcloud tool for importing. Use the external TCP/UDP network load balancer instead of an external HTTP load balancer.

Question 64

Your company conducts clinical trials and needs to analyze the results of a recent study that are stored in BigQuery. The interval when the medicine was taken contains start and stop dates. The interval data is critical to the analysis, but specific dates may identify a particular batch and introduce bias. You need to obfuscate the start and end dates for each row and preserve the interval data.

What should you do?

Options:

A.

Use bucketing to shift values to a predetermined date based on the initial value.

B.

Extract the date using TimePartConfig from each date field and append a random month and year.

C.

Use date shifting with the context set to the unique ID of the test subject.

D.

Use the FFX mode of format-preserving encryption (FPE) and maintain data consistency.

Question 65

A customer terminates an engineer and needs to make sure the engineer's Google account is automatically deprovisioned.

What should the customer do?

Options:

A.

Use the Cloud SDK with their directory service to remove their IAM permissions in Cloud Identity.

B.

Use the Cloud SDK with their directory service to provision and deprovision users from Cloud Identity.

C.

Configure Cloud Directory Sync with their directory service to provision and deprovision users from Cloud Identity.

D.

Configure Cloud Directory Sync with their directory service to remove their IAM permissions in Cloud Identity.

Question 66

You want to make sure that your organization’s Cloud Storage buckets cannot have data publicly available to the internet. You want to enforce this across all Cloud Storage buckets. What should you do?

Options:

A.

Remove Owner roles from end users, and configure Cloud Data Loss Prevention.

B.

Remove Owner roles from end users, and enforce domain restricted sharing in an organization policy.

C.

Configure uniform bucket-level access, and enforce domain restricted sharing in an organization policy.

D.

Remove *.setIamPolicy permissions from all roles, and enforce domain restricted sharing in an organization policy.
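
To illustrate the two controls that recur in these options, domain restricted sharing is an organization policy list constraint and uniform bucket-level access is a per-bucket setting; the organization ID, Workspace customer ID, and bucket name below are placeholders.

    # Restrict IAM members to identities from your own Cloud Identity / Workspace customer.
    gcloud resource-manager org-policies allow \
        iam.allowedPolicyMemberDomains C0abc1234 \
        --organization=123456789012

    # Disable per-object ACLs so access is governed only by IAM.
    gcloud storage buckets update gs://example-bucket --uniform-bucket-level-access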

Question 67

You will create a new Service Account that should be able to list the Compute Engine instances in the project. You want to follow Google-recommended practices.

What should you do?

Options:

A.

Create an Instance Template, and allow the Service Account Read Only access for the Compute Engine Access Scope.

B.

Create a custom role with the permission compute.instances.list and grant the Service Account this role.

C.

Give the Service Account the role of Compute Viewer, and use the new Service Account for all instances.

D.

Give the Service Account the role of Project Viewer, and use the new Service Account for all instances.
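
For reference, a least-privilege custom role limited to listing instances (option B) can be created and granted roughly as follows; the project, role, and service account names are placeholders.

    # Custom role with only the compute.instances.list permission.
    gcloud iam roles create instanceLister \
        --project=my-project \
        --title="Instance Lister" \
        --permissions=compute.instances.list

    # Grant the custom role to the new service account at the project level.
    gcloud projects add-iam-policy-binding my-project \
        --member=serviceAccount:lister@my-project.iam.gserviceaccount.com \
        --role=projects/my-project/roles/instanceLister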

Question 68

Your company is using Cloud Dataproc for its Spark and Hadoop jobs. You want to be able to create, rotate, and destroy symmetric encryption keys used for the persistent disks used by Cloud Dataproc. Keys can be stored in the cloud.

What should you do?

Options:

A.

Use the Cloud Key Management Service to manage the data encryption key (DEK).

B.

Use the Cloud Key Management Service to manage the key encryption key (KEK).

C.

Use customer-supplied encryption keys to manage the data encryption key (DEK).

D.

Use customer-supplied encryption keys to manage the key encryption key (KEK).

Question 69

Your team wants to limit users with administrative privileges at the organization level.

Which two roles should your team restrict? (Choose two.)

Options:

A.

Organization Administrator

B.

Super Admin

C.

GKE Cluster Admin

D.

Compute Admin

E.

Organization Role Viewer