
ARA-R01 Review Questions


SnowPro Advanced: Architect Recertification Exam Questions and Answers

Question 45

A Snowflake Architect created a new data share and wants to verify that consumers of the share can see only the intended records in its secure views.

What is the recommended way to validate data accessibility by the consumers?

Options:

A.

Create reader accounts as shown below and impersonate the consumers by logging in with their credentials.

create managed account reader_acct1 admin_name = user1, admin_password = 'Sdfed43da!44T', type = reader;

B.

Create a row access policy as shown below and assign it to the data share.

create or replace row access policy rap_acct as (acct_id varchar) returns boolean -> case when 'acct1_role' = current_role() then true else false end;

C.

Set the session parameter called SIMULATED_DATA_SHARING_CONSUMER as shown below in order to impersonate the consumer accounts.

alter session set simulated_data_sharing_consumer = 'Consumer Acct1';

D.

Alter the share settings as shown below, in order to impersonate a specific consumer account.

alter share sales_share set accounts = 'Consumer1' share_restrictions = true;
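
For reference, option C describes the validation mechanism Snowflake documents for secure views: the SIMULATED_DATA_SHARING_CONSUMER session parameter makes a secure view return only the rows the named consumer account would see. A minimal sketch of the workflow (the account, database, and view names are hypothetical):

alter session set simulated_data_sharing_consumer = 'consumer_acct1';  -- impersonate the consumer account
select * from sales_db.shared.customer_sv;                             -- returns the rows that consumer would see
alter session unset simulated_data_sharing_consumer;                   -- restore normal behavior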

Question 46

A healthcare company wants to share data with a medical institute. The institute is running a Standard edition of Snowflake; the healthcare company is running a Business Critical edition.

How can this data be shared?

Options:

A.

The healthcare company will need to change the institute’s Snowflake edition in the accounts panel.

B.

By default, sharing is supported from a Business Critical Snowflake edition to a Standard edition.

C.

Contact Snowflake and they will execute the share request for the healthcare company.

D.

Set the share_restrictions parameter on the shared object to false.
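
For context, sharing from a Business Critical account to a lower-edition account is blocked by default; an ACCOUNTADMIN on the provider side can override this per consumer via SHARE_RESTRICTIONS when adding the account to the share. A minimal sketch (the share and account names are hypothetical):

use role accountadmin;
alter share healthcare_share add accounts = medical_institute_acct share_restrictions = false;  -- explicitly allow the lower-edition consumer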

Question 47

A media company needs a data pipeline that will ingest customer review data into a Snowflake table and apply some transformations. The company also needs to use Amazon Comprehend for sentiment analysis and to make the de-identified final data set publicly available to advertising companies that use different cloud providers in different regions.

The pipeline needs to run continuously and efficiently as new records arrive in object storage, leveraging event notifications. Operational complexity, infrastructure maintenance (including platform upgrades and security), and development effort should all be minimal.

Which design will meet these requirements?

Options:

A.

Ingest the data using COPY INTO and use streams and tasks to orchestrate transformations. Export the data into Amazon S3 to do model inference with Amazon Comprehend and ingest the data back into a Snowflake table. Then create a listing in the Snowflake Marketplace to make the data available to other companies.

B.

Ingest the data using Snowpipe and use streams and tasks to orchestrate transformations. Create an external function to do model inference with Amazon Comprehend and write the final records to a Snowflake table. Then create a listing in the Snowflake Marketplace to make the data available to other companies.

C.

Ingest the data into Snowflake using Amazon EMR and PySpark with the Snowflake Spark connector. Apply transformations using another Spark job. Develop a Python program to do model inference by leveraging the Amazon Comprehend text analysis API. Then write the results to a Snowflake table and create a listing in the Snowflake Marketplace to make the data available to other companies.

D.

Ingest the data using Snowpipe and use streams and tasks to orchestrate transformations. Export the data into Amazon S3 to do model inference with Amazon Comprehend and ingest the data back into a Snowflake table. Then create a listing in the Snowflake Marketplace to make the data available to other companies.
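
For reference, the Snowpipe / streams / tasks / external function pattern in option B can be wired together roughly as follows. This is a sketch only: the stage, tables, warehouse, and the comprehend_sentiment external function (assumed to call Amazon Comprehend through a proxy such as Amazon API Gateway) are all hypothetical.

-- Landing table with a single VARIANT column, fed by an auto-ingest pipe
-- driven by S3 event notifications:
create table raw_reviews (v variant);

create pipe reviews_pipe auto_ingest = true as
  copy into raw_reviews from @reviews_stage file_format = (type = 'JSON');

-- Stream to capture newly ingested rows:
create stream raw_reviews_stream on table raw_reviews;

-- Target table for the transformed, scored records:
create table scored_reviews (review_id string, review_text string, sentiment variant);

-- Task that transforms new rows and scores them via the external function:
create task score_reviews
  warehouse = transform_wh
  schedule = '1 minute'
  when system$stream_has_data('RAW_REVIEWS_STREAM')
as
  insert into scored_reviews
  select v:review_id::string, v:review_text::string,
         comprehend_sentiment(v:review_text::string)
  from raw_reviews_stream;

alter task score_reviews resume;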

Question 48

A Snowflake Architect is setting up database replication to support a disaster recovery plan. The primary database has external tables.

How should the database be replicated?

Options:

A.

Create a clone of the primary database then replicate the database.

B.

Move the external tables to a database that is not replicated, then replicate the primary database.

C.

Replicate the database ensuring the replicated database is in the same region as the external tables.

D.

Share the primary database with an account in the same region that the database will be replicated to.
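
For context, database replication does not support external tables: a refresh of a secondary database fails if the primary contains them, which is why relocating the external tables to a non-replicated database (option B) is the documented workaround. A minimal sketch (all object names are hypothetical):

show external tables in database prod_db;        -- find the external tables to relocate

create database ext_db;                          -- separate, non-replicated home
create external table ext_db.public.events_ext
  location = @prod_db.public.events_stage
  file_format = (type = 'PARQUET');
drop external table prod_db.public.events_ext;

alter database prod_db enable replication to accounts myorg.dr_account;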
