
Snowflake ARA-C01 SnowPro Advanced: Architect Certification Exam Practice Test

Total 162 questions

SnowPro Advanced: Architect Certification Exam Questions and Answers

Question 1

Which of the following commands will use warehouse credits?

Options:

A.

SHOW TABLES LIKE 'SNOWFL%';

B.

SELECT MAX(FLAKE_ID) FROM SNOWFLAKE;

C.

SELECT COUNT(*) FROM SNOWFLAKE;

D.

SELECT COUNT(FLAKE_ID) FROM SNOWFLAKE GROUP BY FLAKE_ID;
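
For context on this question: metadata-only commands such as SHOW TABLES are served by the cloud services layer and do not require a running warehouse, while statements that scan and aggregate table data consume warehouse credits. A minimal sketch of how credit consumption could be verified after the fact, assuming the current role can read the SNOWFLAKE.ACCOUNT_USAGE share (the time window is illustrative):

-- Metadata-only command: served by the cloud services layer, no warehouse needed.
SHOW TABLES LIKE 'SNOWFL%';

-- Check which warehouses consumed credits over the last 24 hours.
SELECT warehouse_name,
       SUM(credits_used) AS total_credits_used
FROM snowflake.account_usage.warehouse_metering_history
WHERE start_time >= DATEADD('hour', -24, CURRENT_TIMESTAMP())
GROUP BY warehouse_name
ORDER BY total_credits_used DESC;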

Question 2

What integration object should be used to place restrictions on where data may be exported?

Options:

A.

Stage integration

B.

Security integration

C.

Storage integration

D.

API integration

Question 3

An Architect has designed a data pipeline that is receiving small CSV files from multiple sources. All of the files are landing in one location. Specific files are filtered for loading into Snowflake tables using the COPY command. The loading performance is poor.

What changes can be made to improve the data loading performance?

Options:

A.

Increase the size of the virtual warehouse.

B.

Create a multi-cluster warehouse and merge smaller files to create bigger files.

C.

Create a specific storage landing bucket to avoid file scanning.

D.

Change the file format from CSV to JSON.

Question 4

A table contains five columns and has millions of records. The cardinality distribution of the columns is shown below:

Columns C4 and C5 are mostly used by SELECT queries in the GROUP BY and ORDER BY clauses, whereas columns C1, C2, and C3 are heavily used in the filter and join conditions of SELECT queries.

The Architect must design a clustering key for this table to improve the query performance.

Based on Snowflake recommendations, how should the clustering key columns be ordered while defining the multi-column clustering key?

Options:

A.

C5, C4, C2

B.

C3, C4, C5

C.

C1, C3, C2

D.

C2, C1, C3
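
For reference, Snowflake's guidance for multi-column clustering keys is to order the columns from lowest to highest cardinality and to favor columns used most often in selective filters and joins. A minimal sketch of defining and inspecting a clustering key; the table and column names are placeholders, since the actual choice depends on the cardinality figures referenced above:

-- Define a multi-column clustering key, ordered from lowest to highest cardinality.
ALTER TABLE my_table CLUSTER BY (low_cardinality_col, higher_cardinality_col);

-- Inspect how well the table is clustered on that candidate key.
SELECT SYSTEM$CLUSTERING_INFORMATION('my_table', '(low_cardinality_col, higher_cardinality_col)');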

Question 5

A company has a table named Data that contains corrupted data. The company wants to recover the data as it was 5 minutes ago using cloning and Time Travel.

What command will accomplish this?

Options:

A.

CREATE CLONE TABLE Recover_Data FROM Data AT(OFFSET => -60*5);

B.

CREATE CLONE Recover_Data FROM Data AT(OFFSET => -60*5);

C.

CREATE TABLE Recover_Data CLONE Data AT(OFFSET => -60*5);

D.

CREATE TABLE Recover Data CLONE Data AT(TIME => -60*5);
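
For reference, Snowflake's documented syntax for cloning a table combined with Time Travel is CREATE TABLE ... CLONE ... AT, where OFFSET is expressed in seconds relative to the current time. A minimal sketch using the object names from the question:

-- Clone the table as it existed 5 minutes (300 seconds) ago.
CREATE TABLE recover_data CLONE data
  AT (OFFSET => -60*5);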

Question 6

An Architect needs to improve the performance of reports that pull data from multiple Snowflake tables, join, and then aggregate the data. Users access the reports using several dashboards. There are performance issues on Monday mornings between 9:00am-11:00am when many users check the sales reports.

The size of the group has increased from 4 to 8 users. Waiting times to refresh the dashboards have increased significantly. Currently this workload is being served by a virtual warehouse with the following parameters:

AUTO_RESUME = TRUE
AUTO_SUSPEND = 60
SIZE = Medium

What is the MOST cost-effective way to increase the availability of the reports?

Options:

A.

Use materialized views and pre-calculate the data.

B.

Increase the warehouse to size Large and set auto_suspend = 600.

C.

Use a multi-cluster warehouse in maximized mode with 2 size Medium clusters.

D.

Use a multi-cluster warehouse in auto-scale mode with 1 size Medium cluster, and set min_cluster_count = 1 and max_cluster_count = 4.
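
As background, queuing caused by rising concurrency is usually addressed with a multi-cluster warehouse (an Enterprise edition feature): in auto-scale mode, additional clusters are started, and billed, only while queries are queuing. A minimal sketch of converting the existing warehouse; the warehouse name REPORTING_WH is illustrative:

-- Keep the Medium size but allow the warehouse to auto-scale from 1 to 4 clusters.
ALTER WAREHOUSE reporting_wh SET
  WAREHOUSE_SIZE    = 'MEDIUM'
  MIN_CLUSTER_COUNT = 1
  MAX_CLUSTER_COUNT = 4
  SCALING_POLICY    = STANDARD
  AUTO_SUSPEND      = 60
  AUTO_RESUME       = TRUE;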

Question 7

Which Snowflake objects can be used in a data share? (Select TWO).

Options:

A.

Standard view

B.

Secure view

C.

Stored procedure

D.

External table

E.

Stream
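
For context, only certain object types, for example tables, external tables, secure views, secure materialized views, and secure UDFs, can be granted to a share; standard views, streams, and stored procedures cannot. A minimal sketch of building a share around a secure view; all object and account names are illustrative:

-- Create a share and grant it access to a database, schema, and secure view.
CREATE SHARE sales_share;
GRANT USAGE ON DATABASE sales_db TO SHARE sales_share;
GRANT USAGE ON SCHEMA sales_db.public TO SHARE sales_share;
GRANT SELECT ON VIEW sales_db.public.v_sales_secure TO SHARE sales_share;  -- must be a secure view

-- Make the share available to a specific consumer account.
ALTER SHARE sales_share ADD ACCOUNTS = consumer_org.consumer_account;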

Question 8

An Architect is using SnowCD to investigate a connectivity issue.

Which system function will provide a list of endpoints that the network must be able to access to use a specific Snowflake account, leveraging private connectivity?

Options:

A.

SYSTEM$ALLOWLIST()

B.

SYSTEM$GET_PRIVATELINK

C.

SYSTEM$AUTHORIZE_PRIVATELINK

D.

SYSTEM$ALLOWLIST_PRIVATELINK()

Question 9

Assuming all Snowflake accounts are using an Enterprise edition or higher, in which development and testing scenarios would copying of data be required and zero-copy cloning not be suitable? (Select TWO).

Options:

A.

Developers create their own datasets to work against transformed versions of the live data.

B.

Production and development run in different databases in the same account, and Developers need to see production-like data but with specific columns masked.

C.

Data is in a production Snowflake account that needs to be provided to Developers in a separate development/testing Snowflake account in the same cloud region.

D.

Developers create their own copies of a standard test database previously created for them in the development account, for their initial development and unit testing.

E.

The release process requires pre-production testing of changes with data of production scale and complexity. For security reasons, pre-production also runs in the production account.

Question 10

An Architect runs the following SQL query:

How can this query be interpreted?

Options:

A.

FILEROWS is a stage. FILE_ROW_NUMBER is the line number in the file.

B.

FILEROWS is the table. FILE_ROW_NUMBER is the line number in the table.

C.

FILEROWS is a file. FILE_ROW_NUMBER is the file format location.

D.

FILEROWS is the file format location. FILE_ROW_NUMBER is a stage.

Question 11

Consider the following scenario where a masking policy is applied on the CREDITCARDNO column of the CREDITCARDINFO table. The masking policy definition is as follows:

Sample data for the CREDITCARDINFO table is as follows:

NAME EXPIRYDATE CREDITCARDNO

JOHN DOE 2022-07-23 4321 5678 9012 1234

If the Snowflake system roles have not been granted any additional roles, what will be the result?

Options:

A.

The sysadmin can see the CREDITCARDNO column data in clear text.

B.

The owner of the table will see the CREDITCARDNO column data in clear text.

C.

Anyone with the PI_ANALYTICS role will see the last 4 characters of the CREDITCARDNO column data in clear text.

D.

Anyone with the PI_ANALYTICS role will see the CREDITCARDNO column as '****MASKED****'.
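
The masking policy definition and sample data in this question are shown as images that are not reproduced here, so the following is only an illustrative reconstruction of a policy that reveals the last four digits to a designated role and masks the value for everyone else; the policy name, role name, and masking expressions are assumptions:

-- Illustrative masking policy: PI_ANALYTICS sees the last 4 characters, all other roles see a masked value.
CREATE OR REPLACE MASKING POLICY mask_credit_card AS (val STRING)
  RETURNS STRING ->
  CASE
    WHEN CURRENT_ROLE() = 'PI_ANALYTICS' THEN REPEAT('*', 12) || RIGHT(val, 4)
    ELSE '****MASKED****'
  END;

-- Attach the policy to the credit card column.
ALTER TABLE creditcardinfo
  MODIFY COLUMN creditcardno SET MASKING POLICY mask_credit_card;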

Question 12

Based on the architecture in the image, how can the data from DB1 be copied into TBL2? (Select TWO).

(The answer choices A through E are presented as images of SQL commands and are not reproduced here.)

Options:

A.

Option A

B.

Option B

C.

Option C

D.

Option D

E.

Option E

Question 13

An Architect clones a database and all of its objects, including tasks. After the cloning, the tasks stop running.

Why is this occurring?

Options:

A.

Tasks cannot be cloned.

B.

The objects that the tasks reference are not fully qualified.

C.

Cloned tasks are suspended by default and must be manually resumed.

D.

The Architect has insufficient privileges to alter tasks on the cloned database.
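
As a point of reference, tasks in a cloned database (like newly created tasks) are suspended by default and have to be resumed explicitly before they run on their schedule. A minimal sketch; the database and task names are illustrative:

-- Cloned tasks show up with state = 'suspended'.
SHOW TASKS IN DATABASE dev_clone_db;

-- Resume a task so it starts executing on its schedule again.
ALTER TASK dev_clone_db.public.load_sales_task RESUME;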

Question 14

Based on the Snowflake object hierarchy, what securable objects belong directly to a Snowflake account? (Select THREE).

Options:

A.

Database

B.

Schema

C.

Table

D.

Stage

E.

Role

F.

Warehouse

Question 15

Role A has the following permissions:

. USAGE on db1

. USAGE and CREATE VIEW on schema1 in db1

. SELECT on table1 in schema1

Role B has the following permissions:

. USAGE on db2

. USAGE and CREATE VIEW on schema2 in db2

. SELECT on table2 in schema2

A user has Role A set as the primary role and Role B as a secondary role.

What command will fail for this user?

Options:

A.

use database db1; use schema schema1; create view v1 as select * from db2.schema2.table2;

B.

use database db2; use schema schema2; create view v2 as select * from db1.schema1.table1;

C.

use database db2; use schema schema2; select * from db1.schema1.table1 union select * from table2;

D.

use database db1; use schema schema1; select * from db2.schema2.table2;
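
A detail worth keeping in mind for this scenario: secondary roles contribute their privileges to queries and DML, but the authorization to create objects comes from the primary role only. A minimal sketch of that behavior, using the roles and objects from the question:

-- The primary role provides CREATE privileges; secondary roles only add query/DML privileges.
USE ROLE role_a;             -- primary role
USE SECONDARY ROLES ALL;     -- role_b's privileges now apply to queries

SELECT * FROM db2.schema2.table2;       -- allowed: SELECT can be satisfied by the secondary role
CREATE VIEW db2.schema2.v2 AS
  SELECT * FROM db1.schema1.table1;     -- fails: CREATE VIEW is checked against the primary role only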

Question 16

Which columns can be included in an external table schema? (Select THREE).

Options:

A.

VALUE

B.

METADATA$ROW_ID

C.

METADATA$ISUPDATE

D.

METADATA$FILENAME

E.

METADATA$FILE_ROW_NUMBER

F.

METADATA$EXTERNAL_TABLE_PARTITION
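
For reference, every external table contains the VALUE variant column plus the METADATA$FILENAME and METADATA$FILE_ROW_NUMBER pseudo-columns; any further columns are defined as expressions over VALUE. A minimal sketch; the stage, file format, and column expressions are illustrative:

-- Illustrative external table over CSV files: VALUE plus virtual columns derived from it.
CREATE OR REPLACE EXTERNAL TABLE ext_sales (
  sale_date DATE   AS (VALUE:c1::DATE),
  amount    NUMBER AS (VALUE:c2::NUMBER)
)
LOCATION = @my_ext_stage/sales/
FILE_FORMAT = (TYPE = CSV);

-- The metadata pseudo-columns can be queried alongside VALUE.
SELECT metadata$filename, metadata$file_row_number, value
FROM ext_sales
LIMIT 10;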

Question 17

An Architect needs to design a Snowflake account and database strategy to store and analyze large amounts of structured and semi-structured data. There are many business units and departments within the company. The requirements are scalability, security, and cost efficiency.

What design should be used?

Options:

A.

Create a single Snowflake account and database for all data storage and analysis needs, regardless of data volume or complexity.

B.

Set up separate Snowflake accounts and databases for each department or business unit, to ensure data isolation and security.

C.

Use Snowflake's data lake functionality to store and analyze all data in a central location, without the need for structured schemas or indexes.

D.

Use a centralized Snowflake database for core business data, and use separate databases for departmental or project-specific data.

Question 18

A user, analyst_user, has been granted the analyst_role and is deploying a SnowSQL script to run as a background service to extract data from Snowflake.

What steps should be taken to allow access only from the required IP addresses? (Select TWO).

Options:

A.

ALTER ROLE ANALYST_ROLE SET NETWORK_POLICY='ANALYST_POLICY';

B.

ALTER USER ANALYST_USER SET NETWORK_POLICY='ANALYST_POLICY';

C.

ALTER USER ANALYST_USER SET NETWORK_POLICY='10.1.1.20';

D.

USE ROLE SECURITYADMIN; CREATE OR REPLACE NETWORK POLICY ANALYST_POLICY ALLOWED_IP_LIST = ('10.1.1.20');

E.

USE ROLE USERADMIN; CREATE OR REPLACE NETWORK POLICY ANALYST_POLICY ALLOWED_IP_LIST = ('10.1.1.20');
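
For reference, a network policy is created by a role with the appropriate privilege (for example SECURITYADMIN) and is then attached either at the account level or to an individual user. A minimal sketch restricting the service user to a single IP address, using the names from the question:

-- Create the network policy with a sufficiently privileged role.
USE ROLE SECURITYADMIN;
CREATE OR REPLACE NETWORK POLICY analyst_policy
  ALLOWED_IP_LIST = ('10.1.1.20');

-- Attach the policy to the user running the background service.
ALTER USER analyst_user SET NETWORK_POLICY = 'ANALYST_POLICY';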

Question 19

A global company needs to securely share its sales and inventory data with a vendor using a Snowflake account.

The company has its Snowflake account in the AWS eu-west-2 Europe (London) region. The vendor's Snowflake account is on the Azure platform in the West Europe region. How should the company's Architect configure the data share?

Options:

A.

1. Create a share. 2. Add objects to the share. 3. Add a consumer account to the share for the vendor to access.

B.

1. Create a share. 2. Create a reader account for the vendor to use. 3. Add the reader account to the share.

C.

1. Create a new role called db_share. 2. Grant the db_share role privileges to read data from the company database and schema. 3. Create a user for the vendor. 4. Grant the db_share role to the vendor's users.

D.

1. Promote an existing database in the company's local account to primary. 2. Replicate the database to Snowflake on Azure in the West Europe region. 3. Create a share and add objects to the share. 4. Add a consumer account to the share for the vendor to access.
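
As background, direct shares only work between accounts on the same cloud platform and region; sharing across clouds or regions first requires replicating the database to a company-owned account in the consumer's region and sharing from there. A minimal sketch of that flow; the organization, account, and object names are illustrative:

-- In the primary account (AWS eu-west-2): enable replication to the company's Azure account.
ALTER DATABASE sales_db ENABLE REPLICATION TO ACCOUNTS myorg.azure_west_europe;

-- In the Azure West Europe account: create the secondary database and refresh it.
CREATE DATABASE sales_db AS REPLICA OF myorg.aws_london.sales_db;
ALTER DATABASE sales_db REFRESH;

-- Share from the Azure account and add the vendor as a consumer.
CREATE SHARE vendor_share;
GRANT USAGE ON DATABASE sales_db TO SHARE vendor_share;
ALTER SHARE vendor_share ADD ACCOUNTS = vendor_org.vendor_account;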

Question 20

A company has built a data pipeline using Snowpipe to ingest files from an Amazon S3 bucket. Snowpipe is configured to load data into staging database tables. Then a task runs to load the data from the staging database tables into the reporting database tables.

The company is satisfied with the availability of the data in the reporting database tables, but the reporting tables are not pruning effectively. Currently, a size 4X-Large virtual warehouse is being used to query all of the tables in the reporting database.

What step can be taken to improve the pruning of the reporting tables?

Options:

A.

Eliminate the use of Snowpipe and load the files into internal stages using PUT commands.

B.

Increase the size of the virtual warehouse to a size 5X-Large.

C.

Use an ORDER BY command to load the reporting tables.

D.

Create larger files for Snowpipe to ingest and ensure the staging frequency does not exceed 1 minute.
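
For context, pruning depends on how values are physically distributed across micro-partitions; loading or rewriting the reporting tables with the data sorted on the commonly filtered column co-locates related values and lets more partitions be skipped, without any warehouse resizing. A minimal sketch; the table and column names are illustrative:

-- Rebuild the reporting table with rows ordered on the common filter column,
-- so related values land in the same micro-partitions and prune well.
INSERT OVERWRITE INTO reporting_db.public.sales_report
SELECT *
FROM staging_db.public.sales_staging
ORDER BY sale_date;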

Question 21

An Architect has a design where files arrive every 10 minutes and are loaded into a primary database table using Snowpipe. A secondary database is refreshed every hour with the latest data from the primary database.

Based on this scenario, what Time Travel query options are available on the secondary database?

Options:

A.

A query using Time Travel in the secondary database is available for every hourly table version within the retention window.

B.

A query using Time Travel in the secondary database is available for every hourly table version within and outside the retention window.

C.

Using Time Travel, secondary database users can query every iterative version within each hour (the individual Snowpipe loads) in the retention window.

D.

Using Time Travel, secondary database users can query every iterative version within each hour (the individual Snowpipe loads) and outside the retention window.

Question 22

A table, EMP_TBL, has three records as shown:

The following variables are set for the session:

Which SELECT statements will retrieve all three records? (Select TWO).

Options:

A.

SELECT * FROM $tbl_ref WHERE $col_ref IN ('Name1','Name2','Name3');

B.

SELECT * FROM EMP_TBL WHERE identifier($col_ref) IN ('Name1','Name2','Name3');

C.

SELECT * FROM identifier WHERE NAME IN ($var1, $var2, $var3);

D.

SELECT * FROM identifier($tbl_ref) WHERE ID IN ('var1','var2','var3');

E.

SELECT * FROM $tbl_ref WHERE $col_ref IN ($var1, $var2, $var3);
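
For reference, session variables are referenced with a $ prefix and substitute literal values, while the IDENTIFIER() function lets a variable stand in for an object name. A minimal sketch of the pattern this question relies on; the variable values are illustrative:

-- Set session variables for the table name and the filter values.
SET tbl_ref = 'EMP_TBL';
SET (var1, var2, var3) = ('Name1', 'Name2', 'Name3');

-- IDENTIFIER() resolves the variable to a table name; $var references substitute literal values.
SELECT *
FROM IDENTIFIER($tbl_ref)
WHERE name IN ($var1, $var2, $var3);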

Question 23

How can the Snowpipe REST API be used to keep a log of data load history?

Options:

A.

Call insertReport every 20 minutes, fetching the last 10,000 entries.

B.

Call loadHistoryScan every minute for the maximum time range.

C.

Call insertReport every 8 minutes for a 10-minute time range.

D.

Call loadHistoryScan every 10 minutes for a 15-minute time range.

Question 24

The Data Engineering team at a large manufacturing company needs to engineer data coming from many sources to support a wide variety of use cases and data consumer requirements which include:

1) Finance and Vendor Management team members who require reporting and visualization

2) Data Science team members who require access to raw data for ML model development

3) Sales team members who require engineered and protected data for data monetization

What Snowflake data modeling approaches will meet these requirements? (Choose two.)

Options:

A.

Consolidate data in the company’s data lake and use EXTERNAL TABLES.

B.

Create a raw database for landing and persisting raw data entering the data pipelines.

C.

Create a set of profile-specific databases that aligns data with usage patterns.

D.

Create a single star schema in a single database to support all consumers’ requirements.

E.

Create a Data Vault as the sole data pipeline endpoint and have all consumers directly access the Vault.

Question 25

A company has several sites in different regions from which the company wants to ingest data.

Which of the following will enable this type of data ingestion?

Options:

A.

The company must have a Snowflake account in each cloud region to be able to ingest data to that account.

B.

The company must replicate data between Snowflake accounts.

C.

The company should provision a reader account to each site and ingest the data through the reader accounts.

D.

The company should use a storage integration for the external stage.

Question 26

What considerations need to be taken when using database cloning as a tool for data lifecycle management in a development environment? (Select TWO).

Options:

A.

Any pipes in the source are not cloned.

B.

Any pipes in the source referring to internal stages are not cloned.

C.

Any pipes in the source referring to external stages are not cloned.

D.

The clone inherits all granted privileges of all child objects in the source object, including the database.

E.

The clone inherits all granted privileges of all child objects in the source object, excluding the database.

Question 27

An Architect is troubleshooting a query with poor performance using the QUERY_HISTORY function. The Architect observes that the COMPILATION_TIME is greater than the EXECUTION_TIME.

What is the reason for this?

Options:

A.

The query is processing a very large dataset.

B.

The query has overly complex logic.

C.

The query is queued for execution.

D.

The query is reading from remote storage.

Question 28

Is it possible for a data provider account with a Snowflake Business Critical edition to share data with an Enterprise edition data consumer account?

Options:

A.

A Business Critical account cannot be a data sharing provider to an Enterprise consumer. Any consumer accounts must also be Business Critical.

B.

If a user in the provider account with role authority to create or alter share adds an Enterprise account as a consumer, it can import the share.

C.

If a user in the provider account with a share owning role sets share_restrictions to False when adding an Enterprise consumer account, it can import the share.

D.

If a user in the provider account has a share-owning role that also has the OVERRIDE SHARE RESTRICTIONS privilege and sets SHARE_RESTRICTIONS to False when adding an Enterprise consumer account, the consumer can import the share.

Question 29

A new user user_01 is created within Snowflake. The following two commands are executed:

Command 1→ SHOW GRANTS TO USER user_01;

Command 2→ SHOW GRANTS ON USER user_01;

What inferences can be made about these commands?

Options:

A.

Command 1 defines which user owns user_01. Command 2 defines all the grants which have been given to user_01.

B.

Command 1 defines all the grants which are given to user_01. Command 2 defines which user owns user_01.

C.

Command 1 defines which role owns user_01. Command 2 defines all the grants which have been given to user_01.

D.

Command 1 defines all the grants which are given to user_01. Command 2 defines which role owns user_01.