What does Snowflake recommend a user do if they need to connect to Snowflake with a tool or technology that is not listed in Snowflake's partner ecosystem?
Use Snowflake's native API.
Use a custom-built connector.
Contact Snowflake Support for a new driver.
Connect through Snowflake's JDBC or ODBC drivers.
If a user needs to connect to Snowflake with a tool or technology that is not listed in Snowflake's partner ecosystem, Snowflake recommends using its JDBC or ODBC drivers. These drivers provide a standard method of connecting from various tools and programming languages to Snowflake, offering wide compatibility and flexibility. By using these drivers, users can establish connections to Snowflake from their applications, ensuring they can leverage the capabilities of Snowflake regardless of the specific tools or technologies they are using.
References:
Snowflake Documentation on Client Drivers
What situation is likely to cause data spillage when a query is run?
When the query contains multiple filters and no data is returned.
When a virtual warehouse runs out of memory while executing the query.
When the number of queries exceeds the max_concurrency_level parameter setting.
When running queries that exceed the statement_timeout_in_seconds parameter setting.
Given the statement template below, which database objects can be added to a share? (Select TWO).
GRANT &lt;privilege&gt; ON &lt;object_type&gt; &lt;object_name&gt; TO SHARE &lt;share_name&gt;;
Secure functions
Stored procedures
Streams
Tables
Tasks
In Snowflake, shares are used to share data across different Snowflake accounts securely. When you create a share, you can include various database objects that you want to share with consumers. According to Snowflake's documentation, the types of objects that can be shared include tables, external tables, secure views, secure materialized views, and secure functions (secure UDFs). Stored procedures, streams, and tasks cannot be added to a share. Therefore, the correct answers are secure functions and tables.
To add a table or a secure function to a share, you use the GRANT statement to grant privileges on the object to the share. The syntax involves specifying the type of object, the object name, and the share to which you are granting access. For example:
GRANT SELECT ON TABLE my_table TO SHARE my_share;
GRANT USAGE ON FUNCTION my_secure_function(VARCHAR) TO SHARE my_share;
These commands grant the SELECT privilege on a table named my_table and the USAGE privilege on a secure function named my_secure_function to a share named my_share. This enables the consumer of the share to access these objects according to the granted privileges.
Which function returns the URL of a stage using the stage name as the input?
BUILD_STAGE_FILE_URL
BUILD_SCOPED_FILE_URL
GET_PRESIGNED_URL
GET_STAGE_LOCATION
The function that returns the URL of a stage using the stage name as the input is GET_STAGE_LOCATION. Given a stage name (for example, GET_STAGE_LOCATION(@my_stage)), it returns the URL for the external or internal named stage. The other options operate on individual files rather than the stage itself: BUILD_STAGE_FILE_URL returns a permanent URL for a staged file, BUILD_SCOPED_FILE_URL returns a temporary scoped URL, and GET_PRESIGNED_URL generates a pre-signed URL that grants temporary access to a specific staged file without requiring Snowflake credentials.
References:
Snowflake Documentation: File Functions (GET_STAGE_LOCATION, GET_PRESIGNED_URL)
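A minimal call illustrating the function (the stage name is an assumption for the example):

```sql
-- Returns the cloud storage URL of the named stage,
-- e.g. s3://mybucket/path/ for an external stage on S3.
SELECT GET_STAGE_LOCATION(@my_stage);
```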
When an object is created in Snowflake, who owns the object?
The public role
The user's default role
The current active primary role
The owner of the parent schema
In Snowflake, when an object is created, it is owned by the role that is currently active. This active role is the one that is being used to execute the creation command. Ownership implies full control over the object, including the ability to grant and revoke access privileges. This is specified in Snowflake's documentation under the topic of Access Control, which states that "the role in use at the time of object creation becomes the owner of the object."
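A short sketch of this behavior (role, database, and table names are illustrative):

```sql
-- The currently active primary role becomes the owner of the new object.
USE ROLE analyst_role;
CREATE TABLE sales_db.public.orders (id INT, amount NUMBER(10,2));

-- The output lists ANALYST_ROLE as the owner of the table.
SHOW GRANTS ON TABLE sales_db.public.orders;
```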
References:
Snowflake Documentation: Object Ownership
What is a characteristic of a tag associated with a masking policy?
A tag can be dropped after a masking policy is assigned
A tag can have only one masking policy for each data type.
A tag can have multiple masking policies for each data type.
A tag can have multiple masking policies with varying data types
In Snowflake, a tag can be associated with only one masking policy for each data type. This means that for a given data type, you can define a single masking policy to be applied when a tag is used. Tags and masking policies are part of Snowflake's data classification and governance features, allowing for data masking based on the context defined by the tags.
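A sketch of tag-based masking under this one-policy-per-data-type rule (all names are illustrative):

```sql
-- A masking policy for STRING columns.
CREATE MASKING POLICY mask_string AS (val STRING) RETURNS STRING ->
  CASE WHEN CURRENT_ROLE() = 'PII_ADMIN' THEN val ELSE '***MASKED***' END;

CREATE TAG pii_tag;

-- Attach the STRING policy to the tag. Setting a second STRING policy on
-- the same tag would fail, but a policy for a different data type
-- (e.g. NUMBER) could be added alongside it.
ALTER TAG pii_tag SET MASKING POLICY mask_string;
```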
References:
Snowflake Documentation: Tag-Based Masking Policies
Authorization to execute CREATE
Primary role
Secondary role
Application role
Database role
In Snowflake, the authorization to execute CREATE <object> statements, such as creating tables, views, databases, etc., is determined by the role currently set as the user's primary role. The primary role of a user or session specifies the set of privileges (including creation privileges) that the user has. While users can have multiple roles, only the primary role is used to determine what objects the user can create unless explicitly specified in the session.
What does the Activity area of Snowsight allow users to do? (Select TWO).
Schedule automated data backups.
Explore each step of an executed query.
Monitor queries executed by users in an account.
Create and manage user roles and permissions.
Access Snowflake Marketplace to find and integrate datasets.
The Activity area of Snowsight, Snowflake's web interface, allows users to perform several important tasks related to query management and performance analysis. Among the options provided, the correct ones are:
B. Explore each step of an executed query: Snowsight provides detailed insights into query execution, including the ability to explore the execution plan of a query. This helps users understand how a query was processed, identify performance bottlenecks, and optimize query performance.
C. Monitor queries executed by users in an account: The Activity area enables users to monitor the queries that have been executed by users within the Snowflake account. This includes viewing the history of queries, their execution times, resources consumed, and other relevant metrics.
These features are crucial for effective query performance tuning and ensuring efficient use of Snowflake's resources.
References:
Snowflake Documentation on Snowsight: Using Snowsight
Which statement will trigger a stream to advance its offset?
DESCRIBE STREAM
ALTER STREAM
DROP STREAM
CREATE OR REPLACE STREAM
If a virtual warehouse is suspended, what happens to the warehouse cache?
The cache is dropped when the warehouse is suspended and is no longer available upon restart.
The warehouse cache persists for as long the warehouse exists, regardless of its suspension status.
The cache is maintained for up to two hours and can be restored if the warehouse is restarted within this limit.
The cache is maintained for the auto-suspend duration and can be restored if the warehouse is restarted within this limit.
When a virtual warehouse in Snowflake is suspended, the cache is dropped and is no longer available upon restart. This means that all cached data, including results and temporary data, are cleared from memory. The purpose of this behavior is to conserve resources while the warehouse is not active. Upon restarting the warehouse, it will need to reload any data required for queries from storage, which may result in a slower initial performance until the cache is repopulated. This is a critical consideration for managing performance and cost in Snowflake.
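The suspend/resume lifecycle can be exercised directly (the warehouse name is illustrative):

```sql
-- Suspending drops the warehouse's local (SSD/memory) cache.
ALTER WAREHOUSE my_wh SUSPEND;

-- After resuming, the first queries must repopulate the cache from
-- remote storage, so they may run slower than before the suspension.
ALTER WAREHOUSE my_wh RESUME;
```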
By default, which role can create resource monitors?
ACCOUNTADMIN
SECURITYADMIN
SYSADMIN
USERADMIN
By default, the role that can create resource monitors in Snowflake is the ACCOUNTADMIN role. Resource monitors are a crucial feature in Snowflake that allows administrators to track and control the consumption of compute resources, ensuring that usage stays within specified limits. The creation and management of resource monitors involve defining thresholds for credits usage, setting up notifications, and specifying actions to be taken when certain thresholds are exceeded.
Given the significant impact that resource monitors can have on the operational aspects and billing of a Snowflake account, the capability to create and manage them is restricted to the ACCOUNTADMIN role. This role has the broadest set of privileges in Snowflake, including the ability to manage all aspects of the account, such as users, roles, warehouses, databases, and resource monitors, among others.
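A sketch of creating and attaching a resource monitor (quota, thresholds, and names are illustrative assumptions):

```sql
USE ROLE ACCOUNTADMIN;

-- Monitor that notifies at 75% of the monthly quota and suspends at 100%.
CREATE RESOURCE MONITOR monthly_limit
  WITH CREDIT_QUOTA = 100
  FREQUENCY = MONTHLY
  START_TIMESTAMP = IMMEDIATELY
  TRIGGERS ON 75 PERCENT DO NOTIFY
           ON 100 PERCENT DO SUSPEND;

-- Attach the monitor to a warehouse.
ALTER WAREHOUSE my_wh SET RESOURCE_MONITOR = monthly_limit;
```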
References:
Snowflake Documentation on Resource Monitors: Managing Resource Monitors
The Snowflake VARIANT data type imposes a 16 MB size limit on what?
An individual row
An individual column
A view
A file in a stage
What best practice recommendations will help prevent timeouts when using the PUT command to load large data sets? (Select TWO).
Compress the files before loading.
Use a semi-structured file format.
Increase the PARALLEL option value.
Load the data into a table stage.
Enable the overwrite option.
To avoid timeouts during large data uploads with the PUT command in Snowflake, it is recommended to:
Compress files before loading: Compressed files are smaller and upload faster, reducing the risk of timeouts.
Increase the PARALLEL option value: This option allows more simultaneous upload threads, improving upload speed and efficiency for large datasets.
Semi-structured file formats and table staging do not directly impact timeouts, while enabling overwrite does not prevent timeouts but rather controls overwriting of existing files.
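Both recommendations can be applied in a single PUT command (the file path and stage name are illustrative):

```sql
-- AUTO_COMPRESS gzips the file on the client before upload;
-- PARALLEL raises the number of concurrent upload threads
-- (the default is 4; higher values speed up large uploads).
PUT file:///data/large_dataset.csv @my_stage
  AUTO_COMPRESS = TRUE
  PARALLEL = 16;
```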
Which Snowflake data governance feature supports resource usage monitoring?
Data classification
Column lineage
Access history
Object tagging
Which type of role can be granted to a share?
Account role
Custom role
Database role
Secondary role
In Snowflake, shares are used to share data between Snowflake accounts. The only type of role that can be granted to a share is a database role. Database roles are created within the database being shared, granted privileges on objects in that database, and then granted to the share. Account-level roles, including custom roles, cannot be granted to a share.
Granting database roles to a share allows the provider to bundle object privileges into named groups. Consumers of the share can then grant each database role to their own account roles, exposing different subsets of the shared data to different users in line with organizational policies and data governance standards.
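A sketch of granting a database role to a share (database, role, and share names are illustrative):

```sql
-- Define a database role and give it access to shared objects.
CREATE DATABASE ROLE sales_db.reader;
GRANT USAGE ON SCHEMA sales_db.public TO DATABASE ROLE sales_db.reader;
GRANT SELECT ON TABLE sales_db.public.orders TO DATABASE ROLE sales_db.reader;

-- Grant the database role to the share; consumers can then grant
-- sales_db.reader to their own account roles.
GRANT DATABASE ROLE sales_db.reader TO SHARE my_share;
```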
References:
Snowflake Documentation on Shares: Shares
Snowflake Documentation on Roles: Access Control
In Snowflake's data security framework, how does column-level security contribute to the protection of sensitive information? (Select TWO).
Implementation of column-level security will optimize query performance.
Column-level security supports encryption of the entire database.
Column-level security ensures that only the table owner can access the data.
Column-level security limits access to specific columns within a table based on user privileges
Column-level security allows the application of a masking policy to a column within a table or view.
Column-level security in Snowflake enhances data protection by restricting access and applying masking policies to sensitive data at the column level.
Limiting Access Based on User Privileges:
Column-level security allows administrators to define which users or roles have access to specific columns within a table.
This ensures that sensitive data is only accessible to authorized personnel, thereby reducing the risk of data breaches.
Application of Masking Policies:
Masking policies can be applied to columns to obfuscate sensitive data.
For example, credit card numbers can be masked to show only the last four digits, protecting the full number from being exposed.
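The credit card example above can be sketched as a masking policy applied to a column (table, column, and role names are assumptions):

```sql
-- Show only the last four digits to non-privileged roles.
CREATE MASKING POLICY mask_card AS (card STRING) RETURNS STRING ->
  CASE WHEN CURRENT_ROLE() = 'PAYMENTS_ADMIN' THEN card
       ELSE CONCAT('****-****-****-', RIGHT(card, 4)) END;

-- Attach the policy to the sensitive column.
ALTER TABLE payments MODIFY COLUMN card_number
  SET MASKING POLICY mask_card;
```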
References:
Snowflake Documentation: Column-Level Security
Snowflake Documentation: Dynamic Data Masking
Who can activate a network policy for users in a Snowflake account? (Select TWO)
ACCOUNTADMIN
USERADMIN
PUBLIC
SYSADMIN
Any role that has the global ATTACH POLICY privilege
Network policies in Snowflake are used to control access to Snowflake accounts based on IP address ranges. These policies can be activated by specific roles that have the necessary privileges.
Role: ACCOUNTADMIN:
The ACCOUNTADMIN role has full administrative rights across the Snowflake account.
This role can manage all aspects of the Snowflake environment, including network policies.
Role with Global ATTACH POLICY Privilege:
Any role that has been granted the global ATTACH POLICY privilege can activate network policies.
This privilege allows the role to attach policies that control network access to the account.
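A sketch of creating and activating an account-level network policy (the policy name and IP ranges are illustrative):

```sql
USE ROLE ACCOUNTADMIN;

-- Allow a corporate subnet while blocking one address within it.
CREATE NETWORK POLICY corp_policy
  ALLOWED_IP_LIST = ('192.168.1.0/24')
  BLOCKED_IP_LIST = ('192.168.1.99');

-- Activate the policy for the whole account.
ALTER ACCOUNT SET NETWORK_POLICY = corp_policy;
```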
References:
Snowflake Documentation: Network Policies
Snowflake's access control framework combines which models for securing data? (Select TWO).
Attribute-based Access Control (ABAC)
Discretionary Access Control (DAC)
Access Control List (ACL)
Role-based Access Control (RBAC)
Rule-based Access Control (RuBAC)
Snowflake's access control framework utilizes a combination of Discretionary Access Control (DAC) and Role-based Access Control (RBAC). DAC in Snowflake allows the object owner to grant access privileges to other roles. RBAC involves assigning roles to users and then granting privileges to those roles. Through roles, Snowflake manages which users have access to specific objects and what actions they can perform, which is central to security and governance in the Snowflake environment.
References:
Snowflake Documentation on Access Control
To use the overwrite option on insert, which privilege must be granted to the role?
TRUNCATE
DELETE
UPDATE
SELECT
To use the overwrite option on insert in Snowflake, the DELETE privilege must be granted to the role. This is because overwriting data during an insert operation implicitly involves deleting the existing data before inserting the new data.
Understanding the Overwrite Option: The overwrite option (INSERT OVERWRITE) allows you to replace existing data in a table with new data. This operation is particularly useful for batch-loading scenarios where the entire dataset needs to be refreshed.
Why DELETE Privilege is Required: Since the overwrite operation involves removing existing rows in the table, the executing role must have the DELETE privilege to carry out both the deletion of old data and the insertion of new data.
Granting DELETE Privilege:
To grant the DELETE privilege to a role, an account administrator can execute the following SQL command:
GRANT DELETE ON TABLE my_table TO ROLE my_role;
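With that privilege in place, the overwrite itself looks like this (table names are illustrative):

```sql
-- INSERT OVERWRITE truncates the target table before inserting, which is
-- why the role needs DELETE in addition to INSERT and SELECT privileges.
INSERT OVERWRITE INTO my_table
  SELECT * FROM staging_table;
```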
Which privilege is needed for a Snowflake user to see the definition of a secure view?
OWNERSHIP
MODIFY
CREATE
USAGE
To see the definition of a secure view in Snowflake, the minimum privilege required is OWNERSHIP of the view. Ownership grants the ability to view the definition as well as to modify or drop the view. Secure views are designed to protect sensitive data, and thus the definition of these views is restricted to users with sufficient privileges to ensure data security.
References:
Snowflake Documentation: Secure Views
What optional properties can a Snowflake user set when creating a virtual warehouse? (Select TWO).
Auto-suspend
Cache size
Default role
Resource monitor
Storage size
When creating a virtual warehouse in Snowflake, users have the option to set several properties to manage its behavior and resource usage. Two of these optional properties are Auto-suspend and Resource monitor.
Auto-suspend: This property defines the period of inactivity after which the warehouse will automatically suspend. This helps in managing costs by stopping the warehouse when it is not in use.
CREATE WAREHOUSE my_warehouse
WITH WAREHOUSE_SIZE = 'XSMALL'
AUTO_SUSPEND = 300; -- Auto-suspend after 5 minutes of inactivity
Resource monitor: Users can assign a resource monitor to a warehouse to control and limit the amount of credit usage. Resource monitors help in setting quotas and alerts for warehouse usage.
CREATE WAREHOUSE my_warehouse
WITH WAREHOUSE_SIZE = 'XSMALL'
RESOURCE_MONITOR = 'my_resource_monitor';
References:
Snowflake Documentation: Creating Warehouses
Snowflake Documentation: Resource Monitors
An external stage, many_stage, contains many directories, including one named app_files that contains CSV files.
How can all the CSV files from this directory be loaded into the table my_table without scanning files that are not needed?
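One way to restrict the load to that directory is to include it in the stage path of the COPY statement, so only files under app_files are listed and scanned (the file format options are assumptions):

```sql
-- Only files under the app_files/ prefix are considered for loading.
COPY INTO my_table
  FROM @many_stage/app_files/
  FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);
```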
Which view in SNOWFLAKE.ACCOUNT_USAGE shows from which IP address a user connected to Snowflake?
ACCESS_HISTORY
LOGIN_HISTORY
SESSIONS
QUERY_HISTORY
The LOGIN_HISTORY view in SNOWFLAKE.ACCOUNT_USAGE shows from which IP address a user connected to Snowflake. This view is particularly useful for auditing and monitoring purposes, as it helps administrators track login attempts, successful logins, and the geographical location of users based on their IP addresses.
Reference to Snowflake documentation on LOGIN_HISTORY:
Monitoring Login Attempts
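A typical query against this view (column names follow the LOGIN_HISTORY view; the LIMIT is an arbitrary choice):

```sql
-- Client IP addresses for recent logins. ACCOUNT_USAGE views retain up
-- to 365 days of history but have some ingestion latency.
SELECT user_name, client_ip, event_timestamp, is_success
FROM SNOWFLAKE.ACCOUNT_USAGE.LOGIN_HISTORY
ORDER BY event_timestamp DESC
LIMIT 100;
```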
Which Snowflake feature or service is primarily used for managing and monitoring data and user activities?
Snowsight
SnowSQL
Snowflake Marketplace
Streamlit
A Snowflake user accidentally deleted a table. The table no longer exists, but it is still within the data retention period. How can the table be restored using the LEAST amount of operational overhead?
Clone the table schema as it existed before the table was dropped.
Clone the database as it existed before the table was dropped.
Recreate the table and reload the data.
Run the UNDROP command against the table.
In Snowflake, if a table is accidentally dropped but still within the data retention period (also known as "Time Travel"), the simplest and most efficient recovery method is the UNDROP command. This command restores the deleted table to its previous state with minimal operational effort. Since Snowflake retains dropped table data for a specific retention period (up to 90 days for the Enterprise edition), UNDROP can quickly recover the table without the need for complex cloning or data reloading processes, making it ideal for accidental deletions.
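The recovery is a single statement (the table name is illustrative):

```sql
DROP TABLE important_table;    -- the accidental drop

-- Within the data retention period, one command restores the table,
-- including its data and metadata, with no reloading required.
UNDROP TABLE important_table;
```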
Which privilege is required on a virtual warehouse to abort any existing executing queries?
USAGE
OPERATE
MODIFY
MONITOR
The privilege required on a virtual warehouse to abort any existing executing queries is OPERATE. The OPERATE privilege on a virtual warehouse allows a user to perform operational tasks on the warehouse, including starting, stopping, and restarting the warehouse, as well as aborting running queries. This level of control is essential for managing resource utilization and ensuring that the virtual warehouse operates efficiently.
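A sketch of granting the privilege and aborting running queries (warehouse and role names are illustrative):

```sql
-- OPERATE allows starting, stopping, and aborting queries on the warehouse.
GRANT OPERATE ON WAREHOUSE my_wh TO ROLE ops_role;

-- A role with OPERATE can abort all queries currently running on it.
ALTER WAREHOUSE my_wh ABORT ALL QUERIES;
```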
References:
Snowflake Documentation on Access Control: Access Control Privileges
A Snowflake user needs to optimize the definition of a secure view, but the user cannot see the view.
What is the LEAST-privileged access or role that should be granted to the user to complete this task?
Grant the user the SYSADMIN role.
Grant the user the ownership privilege on the secure view.
Grant the user the IMPORTED PRIVILEGES privilege on the database.
Grant the user the SNOWFLAKE.OBJECT_VIEWER database role.
Which types of charts does Snowsight support? (Select TWO).
Area charts
Bar charts
Column charts
Radar charts
Scorecards
Snowsight, Snowflake's user interface for executing and analyzing queries, supports various types of visualizations to help users understand their data better. Among the supported types, area charts and bar charts are two common options. Area charts are useful for representing quantities through filled areas on the graph, often showing volume changes over time. Bar charts, on the other hand, are versatile for comparing different groups or categories of data. Both chart types are integral to data analysis, enabling users to visualize trends, patterns, and differences in their data effectively.
References:
Snowflake Documentation on Snowsight Visualizations
Which Snowflake function and command combination should be used to convert rows in a relational table to a single VARIANT column, and unload the rows Into a file in JSON format? (Select TWO).
PUT
GET
COPY
EXPORT
OBJECT_CONSTRUCT
To convert rows in a relational table to a single VARIANT column and unload the rows into a file in JSON format, you can use the COPY command in combination with the OBJECT_CONSTRUCT function. The OBJECT_CONSTRUCT function converts the row into a JSON object stored in a VARIANT column, and the COPY command can then be used to unload this data into a JSON file.
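The two pieces combine in a single unload statement (table and stage names are illustrative):

```sql
-- Each row becomes one JSON object in the unloaded file(s).
COPY INTO @my_stage/json_out/
FROM (
  SELECT OBJECT_CONSTRUCT(*) FROM my_table
)
FILE_FORMAT = (TYPE = JSON);
```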
References:
Snowflake Documentation: OBJECT_CONSTRUCT
Snowflake Documentation: COPY INTO
Based on a review of a Query Profile, which scenarios will benefit the MOST from the use of a data clustering key? (Select TWO).
A column that appears most frequently in order by operations
A column that appears most frequently in where operations
A column that appears most frequently in group by operations
A column that appears most frequently in aggregate operations
A column that appears most frequently in join operations
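Whichever columns are chosen, a clustering key is defined on the table like this (table and column names are illustrative):

```sql
-- Cluster the table's micro-partitions on the chosen columns.
ALTER TABLE sales CLUSTER BY (order_date, region);

-- Reports how well the table is clustered on those columns.
SELECT SYSTEM$CLUSTERING_INFORMATION('sales', '(order_date, region)');
```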
What is the purpose of the use of the VALIDATE command?
To view any queries that encountered an error
To verify that a SELECT query will run without error
To prevent a put statement from running if an error occurs
To see all errors from a previously run COPY INTO
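The VALIDATE table function inspects the errors from an earlier COPY INTO execution; a minimal sketch (the table name is illustrative):

```sql
-- Returns the rows that were rejected by the most recent COPY INTO
-- executed on this table in the current session ('_last' is a documented
-- shortcut; a specific query ID can be passed instead).
SELECT *
FROM TABLE(VALIDATE(my_table, JOB_ID => '_last'));
```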