
SnowPro Advanced: Architect Certification Exam Questions and Answers

Question 1

A user executes the following commands sequentially, within a timeframe of 10 minutes from start to finish:

What would be the output of this query?

Options:

A.

Table T_SALES_CLONE successfully created.

B.

Time Travel data is not available for table T_SALES.

C.

The OFFSET => is not a valid clause in the clone operation.

D.

Syntax error: line 1 at position 58, unexpected 'at'.
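
For context, a minimal sketch of Snowflake's clone-with-Time-Travel syntax; the actual commands in the question are not shown here, so the names and offset below are hypothetical:

CREATE TABLE t_sales (id INT);
-- ...roughly ten minutes of activity...
CREATE TABLE t_sales_clone CLONE t_sales
  AT (OFFSET => -60*60);  -- requests the table as it existed one hour ago
-- If the offset predates the table's creation (as it would here), Snowflake
-- reports that Time Travel data is not available for the table.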

Question 2

An Architect uses COPY INTO with the ON_ERROR=SKIP_FILE option to bulk load CSV files into a table called TABLEA, using its table stage. One file named file5.csv fails to load. The Architect fixes the file and reloads it to the stage with the exact same file name it had previously.

Which commands should the Architect use to load only the file5.csv file from the stage? (Choose two.)

Options:

A.

COPY INTO tablea FROM @%tablea RETURN_FAILED_ONLY = TRUE;

B.

COPY INTO tablea FROM @%tablea;

C.

COPY INTO tablea FROM @%tablea FILES = ('file5.csv');

D.

COPY INTO tablea FROM @%tablea FORCE = TRUE;

E.

COPY INTO tablea FROM @%tablea NEW_FILES_ONLY = TRUE;

F.

COPY INTO tablea FROM @%tablea MERGE = TRUE;
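
For reference, a sketch of restricting a load to one re-staged file via the table stage, using documented COPY options; FORCE bypasses the load metadata that would otherwise skip a previously seen file name:

COPY INTO tablea
  FROM @%tablea
  FILES = ('file5.csv')  -- load only the named file
  FORCE = TRUE;          -- reload even though this file name appears in load history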

Question 3

How is the change of local time due to daylight saving time handled in Snowflake tasks? (Choose two.)

Options:

A.

A task scheduled in a UTC-based schedule will have no issues with the time changes.

B.

Task schedules can be designed to follow specified or local time zones to accommodate the time changes.

C.

A task will move to a suspended state during the daylight saving time change.

D.

A frequent task execution schedule like minutes may not cause a problem, but will affect the task history.

E.

A task schedule will follow only the specified time and will fail to handle lost or duplicated hours.
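
For reference, task schedules can be pinned to a time zone with a cron expression; a minimal sketch (warehouse, task, and target table names are hypothetical):

CREATE OR REPLACE TASK nightly_refresh
  WAREHOUSE = my_wh
  SCHEDULE = 'USING CRON 0 2 * * * America/Los_Angeles'  -- follows local time, including DST shifts
AS
  INSERT INTO audit_log (run_at) SELECT CURRENT_TIMESTAMP();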

Question 4

Which technique will efficiently ingest and consume semi-structured data for Snowflake data lake workloads?

Options:

A.

IDEF1X

B.

Schema-on-write

C.

Schema-on-read

D.

Information schema
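
A minimal schema-on-read sketch, assuming raw JSON is landed in a VARIANT column (table and JSON path names hypothetical):

CREATE OR REPLACE TABLE raw_events (v VARIANT);

SELECT
  v:event_type::STRING AS event_type,  -- structure is applied at query time
  v:user.id::NUMBER    AS user_id
FROM raw_events;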

Question 5

An Architect has a VPN_ACCESS_LOGS table in the SECURITY_LOGS schema containing timestamps of the connection and disconnection, username of the user, and summary statistics.

What should the Architect do to enable the Snowflake search optimization service on this table?

Options:

A.

Assume role with OWNERSHIP on future tables and ADD SEARCH OPTIMIZATION on the SECURITY_LOGS schema.

B.

Assume role with ALL PRIVILEGES including ADD SEARCH OPTIMIZATION in the SECURITY_LOGS schema.

C.

Assume role with OWNERSHIP on VPN_ACCESS_LOGS and ADD SEARCH OPTIMIZATION in the SECURITY_LOGS schema.

D.

Assume role with ALL PRIVILEGES on VPN_ACCESS_LOGS and ADD SEARCH OPTIMIZATION in the SECURITY_LOGS schema.
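
For reference, the two privileges the options revolve around, sketched with a hypothetical role name:

-- Privilege granted at the schema level:
GRANT ADD SEARCH OPTIMIZATION ON SCHEMA security_logs TO ROLE sec_admin;
-- The role must also own the table it enables the service on:
ALTER TABLE security_logs.vpn_access_logs ADD SEARCH OPTIMIZATION;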

Question 6

A company is using Snowflake on Azure in the Netherlands. The company's analyst team also has data in JSON format, stored in an Amazon S3 bucket in the AWS Singapore region, that the team wants to analyze.

The Architect has been given the following requirements:

1. Provide access to frequently changing data

2. Keep egress costs to a minimum

3. Maintain low latency

How can these requirements be met with the LEAST amount of operational overhead?

Options:

A.

Use a materialized view on top of an external table against the S3 bucket in AWS Singapore.

B.

Use an external table against the S3 bucket in AWS Singapore and copy the data into transient tables.

C.

Copy the data between providers from S3 to Azure Blob storage to collocate, then use Snowpipe for data ingestion.

D.

Use AWS Transfer Family to replicate data between the S3 bucket in AWS Singapore and an Azure Netherlands Blob storage, then use an external table against the Blob storage.
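
A sketch of option A's building blocks, assuming an existing external stage named sg_s3_stage pointing at the Singapore bucket (all names hypothetical; AUTO_REFRESH additionally requires event notifications configured on the bucket):

CREATE OR REPLACE EXTERNAL TABLE sg_events
  WITH LOCATION = @sg_s3_stage/events/
  FILE_FORMAT = (TYPE = JSON)
  AUTO_REFRESH = TRUE;

-- Materialize locally so repeated queries do not re-read S3 across regions:
CREATE MATERIALIZED VIEW sg_events_mv AS
  SELECT value:id::NUMBER AS id, value AS payload
  FROM sg_events;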

Question 7

Which Snowflake objects can be used in a data share? (Select TWO).

Options:

A.

Standard view

B.

Secure view

C.

Stored procedure

D.

External table

E.

Stream
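
For reference, a minimal sketch of putting a secure view into a share (database, schema, and account names hypothetical):

CREATE SECURE VIEW sales_db.public.sales_v AS
  SELECT region, amount FROM sales_db.public.sales;

CREATE SHARE sales_share;
GRANT USAGE ON DATABASE sales_db TO SHARE sales_share;
GRANT USAGE ON SCHEMA sales_db.public TO SHARE sales_share;
GRANT SELECT ON VIEW sales_db.public.sales_v TO SHARE sales_share;
ALTER SHARE sales_share ADD ACCOUNTS = myorg.consumer_acct;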

Question 8

A Snowflake Architect is working with Data Modelers and Table Designers to draft an ELT framework specifically for data loading using Snowpipe. The Table Designers will add a timestamp column that inserts the current timestamp as the default value as records are loaded into a table. The intent is to capture the time when each record gets loaded into the table; however, when tested, the timestamps are earlier than the LOAD_TIME column values returned by the COPY_HISTORY function or the COPY_HISTORY view (Account Usage).

Why is this occurring?

Options:

A.

The timestamps are different because there are parameter setup mismatches. The parameters need to be realigned.

B.

The Snowflake TIMEZONE parameter is different from the cloud provider's parameters, causing the mismatch.

C.

The Table Designer team has not used the LOCALTIMESTAMP or SYSTIMESTAMP functions in the Snowflake COPY statement.

D.

The CURRENT_TIMESTAMP is evaluated when the load operation is compiled in cloud services rather than when the record is inserted into the table.
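
A sketch of the table design under test, assuming a default-valued timestamp column (names hypothetical):

CREATE OR REPLACE TABLE landing (
  payload   VARIANT,
  loaded_at TIMESTAMP_LTZ DEFAULT CURRENT_TIMESTAMP()  -- per option D, this may be evaluated when
                                                       -- the COPY compiles, not per inserted record
);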

Question 9

A user can change object parameters using which of the following roles?

Options:

A.

ACCOUNTADMIN, SECURITYADMIN

B.

SYSADMIN, SECURITYADMIN

C.

ACCOUNTADMIN, USER with PRIVILEGE

D.

SECURITYADMIN, USER with PRIVILEGE
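
For reference, a sketch of changing an object parameter (table name hypothetical; the executing role needs sufficient privileges on the object):

ALTER TABLE my_db.public.orders
  SET DATA_RETENTION_TIME_IN_DAYS = 7;  -- an object-level parameter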

Question 10

Is it possible for a data provider account with a Snowflake Business Critical edition to share data with an Enterprise edition data consumer account?

Options:

A.

A Business Critical account cannot be a data sharing provider to an Enterprise consumer. Any consumer accounts must also be Business Critical.

B.

If a user in the provider account with role authority to create or alter share adds an Enterprise account as a consumer, it can import the share.

C.

If a user in the provider account with a share-owning role sets SHARE_RESTRICTIONS to false when adding an Enterprise consumer account, that account can import the share.

D.

If a user in the provider account has a share-owning role that also carries the OVERRIDE SHARE RESTRICTIONS privilege, and sets SHARE_RESTRICTIONS to false when adding an Enterprise consumer account, that account can import the share.
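
For reference, a sketch of the documented ALTER SHARE syntax involved (share and account names hypothetical):

ALTER SHARE market_share
  ADD ACCOUNTS = myorg.partner_enterprise
  SHARE_RESTRICTIONS = false;  -- permits a lower-edition consumer; requires the
                               -- OVERRIDE SHARE RESTRICTIONS privilege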

Question 11

An Architect has chosen to separate their Snowflake Production and QA environments using two separate Snowflake accounts.

The QA account is intended to run and test changes on data and database objects before pushing those changes to the Production account. It is a requirement that all database objects and data in the QA account need to be an exact copy of the database objects, including privileges and data in the Production account on at least a nightly basis.

Which is the LEAST complex approach for populating the QA account with the Production account's data and database objects on a nightly basis?

Options:

A.

1) Create a share in the Production account for each database

2) Share access to the QA account as a Consumer

3) The QA account creates a database directly from each share

4) Create clones of those databases on a nightly basis

5) Run tests directly on those cloned databases

B.

1) Create a stage in the Production account

2) Create a stage in the QA account that points to the same external object-storage location

3) Create a task that runs nightly to unload each table in the Production account into the stage

4) Use Snowpipe to populate the QA account

C.

1) Enable replication for each database in the Production account

2) Create replica databases in the QA account

3) Create clones of the replica databases on a nightly basis

4) Run tests directly on those cloned databases

D.

1) In the Production account, create an external function that connects into the QA account and returns all the data for one specific table

2) Run the external function as part of a stored procedure that loops through each table in the Production account and populates each table in the QA account
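
For reference, a sketch of the replication-plus-clone flow described in option C (account and database names hypothetical):

-- In the Production account:
ALTER DATABASE prod_db ENABLE REPLICATION TO ACCOUNTS myorg.qa_acct;

-- In the QA account:
CREATE DATABASE prod_db AS REPLICA OF myorg.prod_acct.prod_db;
ALTER DATABASE prod_db REFRESH;            -- run nightly, e.g. from a scheduled task
CREATE DATABASE qa_nightly CLONE prod_db;  -- zero-copy clone to run tests against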

Question 12

You are a Snowflake Architect in an organization. The business team has asked you to deploy a use case that requires loading some data for the team to visualize through Tableau. New data comes in every day, and the old data is no longer required.

Which type of table should be used in this case to optimize cost?

Options:

A.

TRANSIENT

B.

TEMPORARY

C.

PERMANENT
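
For reference, a minimal transient table sketch (names hypothetical); transient tables have no Fail-safe period, which trims storage costs for data that is replaced daily:

CREATE OR REPLACE TRANSIENT TABLE daily_feed (
  id      NUMBER,
  payload VARIANT
)
DATA_RETENTION_TIME_IN_DAYS = 0;  -- optionally drop Time Travel retention as well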

Question 13

An Architect on a new project has been asked to design an architecture that meets Snowflake security, compliance, and governance requirements as follows:

1) Use Tri-Secret Secure in Snowflake

2) Share some information stored in a view with another Snowflake customer

3) Hide portions of sensitive information from some columns

4) Use zero-copy cloning to refresh the non-production environment from the production environment

To meet these requirements, which design elements must be implemented? (Choose three.)

Options:

A.

Define row access policies.

B.

Use the Business-Critical edition of Snowflake.

C.

Create a secure view.

D.

Use the Enterprise edition of Snowflake.

E.

Use Dynamic Data Masking.

F.

Create a materialized view.
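
For reference, a sketch combining two of the listed elements, Dynamic Data Masking and a secure view (policy, role, and object names hypothetical):

CREATE MASKING POLICY email_mask AS (val STRING) RETURNS STRING ->
  CASE WHEN CURRENT_ROLE() IN ('PII_ADMIN') THEN val ELSE '***MASKED***' END;

ALTER TABLE customers MODIFY COLUMN email SET MASKING POLICY email_mask;

CREATE SECURE VIEW customers_v AS
  SELECT id, email FROM customers;  -- share this view, not the base table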

Question 14

A company has a Snowflake account named ACCOUNTA in AWS us-east-1 region. The company stores its marketing data in a Snowflake database named MARKET_DB. One of the company’s business partners has an account named PARTNERB in Azure East US 2 region. For marketing purposes the company has agreed to share the database MARKET_DB with the partner account.

Which of the following steps MUST be performed for the account PARTNERB to consume data from the MARKET_DB database?

Options:

A.

Create a new account (called AZABC123) in Azure East US 2 region. From account ACCOUNTA create a share of database MARKET_DB, create a new database out of this share locally in AWS us-east-1 region, and replicate this new database to AZABC123 account. Then set up data sharing to the PARTNERB account.

B.

From account ACCOUNTA create a share of database MARKET_DB, and create a new database out of this share locally in AWS us-east-1 region. Then make this database the provider and share it with the PARTNERB account.

C.

Create a new account (called AZABC123) in Azure East US 2 region. From account ACCOUNTA replicate the database MARKET_DB to AZABC123 and from this account set up the data sharing to the PARTNERB account.

D.

Create a share of database MARKET_DB, and create a new database out of this share locally in AWS us-east-1 region. Then replicate this database to the partner’s account PARTNERB.
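
For reference, a sketch of the replicate-then-share sequence (organization and account names hypothetical):

-- In ACCOUNTA (AWS us-east-1):
ALTER DATABASE market_db ENABLE REPLICATION TO ACCOUNTS myorg.azabc123;

-- In AZABC123 (Azure East US 2):
CREATE DATABASE market_db AS REPLICA OF myorg.accounta.market_db;
ALTER DATABASE market_db REFRESH;
-- ...then build a share from AZABC123 and add PARTNERB as a consumer,
-- as sketched under Question 7.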

Question 15

Assuming all Snowflake accounts are using an Enterprise edition or higher, in which development and testing scenarios would copying of data be required, and zero-copy cloning not be suitable? (Select TWO).

Options:

A.

Developers create their own datasets to work against transformed versions of the live data.

B.

Production and development run in different databases in the same account, and Developers need to see production-like data but with specific columns masked.

C.

Data is in a production Snowflake account that needs to be provided to Developers in a separate development/testing Snowflake account in the same cloud region.

D.

Developers create their own copies of a standard test database previously created for them in the development account, for their initial development and unit testing.

E.

The release process requires pre-production testing of changes with data of production scale and complexity. For security reasons, pre-production also runs in the production account.
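
For reference, zero-copy cloning is an in-account operation; a minimal sketch (names hypothetical):

CREATE DATABASE dev_db CLONE prod_db;  -- no data copied; both databases live in the same account
-- Moving data to a separate account instead requires replication or a physical copy.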

Question 16

A company is designing its serving layer for data that is in cloud storage. Multiple terabytes of the data will be used for reporting. Some data does not have a clear use case but could be useful for experimental analysis. This experimentation data changes frequently and is sometimes wiped out and replaced completely in a few days.

The company wants to centralize access control, provide a single point of connection for the end-users, and maintain data governance.

What solution meets these requirements while MINIMIZING costs, administrative effort, and development overhead?

Options:

A.

Import the data used for reporting into a Snowflake schema with native tables. Then create external tables pointing to the cloud storage folders used for the experimentation data. Then create two different roles with grants to the different datasets to match the different user personas, and grant these roles to the corresponding users.

B.

Import all the data in cloud storage to be used for reporting into a Snowflake schema with native tables. Then create a role that has access to this schema and manage access to the data through that role.

C.

Import all the data in cloud storage to be used for reporting into a Snowflake schema with native tables. Then create two different roles with grants to the different datasets to match the different user personas, and grant these roles to the corresponding users.

D.

Import the data used for reporting into a Snowflake schema with native tables. Then create views that have SELECT commands pointing to the cloud storage files for the experimentation data. Then create two different roles to match the different user personas, and grant these roles to the corresponding users.
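
For reference, a sketch of the two-role grant pattern that several options describe (role, database, and schema names hypothetical):

CREATE ROLE reporting_role;
CREATE ROLE experiment_role;

GRANT USAGE ON DATABASE serve_db TO ROLE reporting_role;
GRANT USAGE ON SCHEMA serve_db.reporting TO ROLE reporting_role;
GRANT SELECT ON ALL TABLES IN SCHEMA serve_db.reporting TO ROLE reporting_role;

GRANT USAGE ON DATABASE serve_db TO ROLE experiment_role;
GRANT USAGE ON SCHEMA serve_db.experiment TO ROLE experiment_role;
GRANT SELECT ON ALL EXTERNAL TABLES IN SCHEMA serve_db.experiment TO ROLE experiment_role;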

Question 17

Consider the following COPY command, which loads CSV-format data into a Snowflake table from an internal stage through a data transformation query.

This command results in the following error:

SQL compilation error: invalid parameter 'validation_mode'

Assuming the syntax is correct, what is the cause of this error?

Options:

A.

The VALIDATION_MODE parameter supports COPY statements that load data from external stages only.

B.

The VALIDATION_MODE parameter does not support COPY statements with CSV file formats.

C.

The VALIDATION_MODE parameter does not support COPY statements that transform data during a load.

D.

The RETURN_ALL_ERRORS value of the VALIDATION_MODE option is causing a compilation error.
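
For reference, a sketch of a COPY where VALIDATION_MODE is accepted, i.e. a plain load with no transformation query (stage and table names hypothetical):

COPY INTO tablea
  FROM @my_int_stage
  FILE_FORMAT = (TYPE = CSV)
  VALIDATION_MODE = RETURN_ERRORS;  -- validates the staged files without loading them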