Snowflake ARA-R01 SnowPro Advanced: Architect Recertification Exam Practice Test

Total 162 questions
Question 1

What transformations are supported in the below SQL statement? (Select THREE).

CREATE PIPE ... AS COPY ... FROM (...)



Answer : A, B, C

The SQL statement is a command for creating a pipe in Snowflake, which is an object that defines the COPY INTO <table> statement used by Snowpipe to load data from an ingestion queue into tables [1]. The statement uses a subquery in the FROM clause to transform the data from the staged files before loading it into the table [2].

The transformations supported in the subquery are as follows [2]:

Data can be filtered by an optional WHERE clause, which specifies a condition that must be satisfied by the rows returned by the subquery. For example:


create pipe mypipe as
copy into mytable
from (
  select * from @mystage
  where col1 = 'A' and col2 > 10
);

Columns can be reordered, which means changing the order of the columns in the subquery to match the order of the columns in the target table. For example:


create pipe mypipe as
copy into mytable (col1, col2, col3)
from (
  select col3, col1, col2 from @mystage
);

Columns can be omitted, which means excluding some columns from the subquery that are not needed in the target table. For example:


create pipe mypipe as
copy into mytable (col1, col2)
from (
  select col1, col2 from @mystage
);

The other options are not supported in the subquery, for the following reasons [2]:

Type casts are not supported, meaning the data type of a column cannot be changed in the subquery. For example, the following statement will cause an error:


create pipe mypipe as
copy into mytable (col1, col2)
from (
  select col1::date, col2 from @mystage
);

Incoming data cannot be joined with other tables, meaning the data from the staged files cannot be combined with data from another table in the subquery. For example, the following statement will cause an error:


create pipe mypipe as
copy into mytable (col1, col2, col3)
from (
  select s.col1, s.col2, t.col3 from @mystage s
  join othertable t on s.col1 = t.col1
);

The ON_ERROR = ABORT_STATEMENT copy option cannot be used, which would abort the entire load operation if any error occurred. It is a copy option of the COPY INTO <table> statement itself, not a clause of the subquery, so placing it inside the subquery causes an error. For example, the following statement will cause an error:


create pipe mypipe as
copy into mytable
from (
  select * from @mystage
  on_error = abort_statement
);
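For contrast, a minimal sketch (reusing the same hypothetical mytable and @mystage objects) of where the ON_ERROR copy option belongs in a standalone bulk load: it is appended to the COPY INTO <table> statement after the FROM clause, not placed inside the subquery.

-- bulk (non-pipe) load: the copy option follows the FROM clause
copy into mytable
from (
  select * from @mystage
)
on_error = abort_statement;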


1: CREATE PIPE | Snowflake Documentation

2: Transforming Data During a Load | Snowflake Documentation

Question 2

A user has the appropriate privilege to see unmasked data in a column.

If the user loads this column data into another column that does not have a masking policy, what will occur?



Answer : A

According to the SnowPro Advanced: Architect documents and learning resources, column masking policies are applied at query time based on the privileges of the user who runs the query. Therefore, if a user has the privilege to see unmasked data in a column, they will see the original data when they query that column. If they load this column data into another column that does not have a masking policy, the unmasked data will be loaded in the new column, and any user who can query the new column will see the unmasked data as well. The masking policy does not affect the underlying data in the column, only the query results.
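A minimal sketch of this behavior, using hypothetical table, column, policy, and role names: the masking policy protects only the source column, so data copied by a user who sees it unmasked is written to the new column as plain text.

-- hypothetical masking policy that hides values from all but one role
create masking policy email_mask as (val string) returns string ->
  case when current_role() = 'PII_READER' then val else '*** MASKED ***' end;

alter table customers modify column email set masking policy email_mask;

-- run by a user whose role sees the unmasked values: the plain-text data is
-- written to a column with no masking policy, so any user who can query
-- customer_copy sees it unmasked
create table customer_copy as
  select email from customers;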


Snowflake Documentation: Column Masking

Snowflake Learning: Column Masking

Question 3

A data share exists between a data provider account and a data consumer account. Five tables from the provider account are being shared with the consumer account. The consumer role has been granted the IMPORTED PRIVILEGES privilege.

What will happen to the consumer account if a new table (table_6) is added to the provider schema?



Answer : D

When a new table (table_6) is added to a schema in the provider's account that is part of a data share, the consumer will not automatically see the new table. The consumer will only be able to access the new table once the appropriate privileges are granted by the provider. The correct process, as outlined in option D, is to use the provider's ACCOUNTADMIN role to grant USAGE privileges on the database and schema, followed by SELECT privileges on the new table, to the share from which the consumer's database was created. This ensures that the consumer account can access the new table under the established data sharing setup. References:

Snowflake Documentation on Managing Access Control

Snowflake Documentation on Data Sharing
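A minimal provider-side sketch of the grants described above, assuming hypothetical database, schema, and share names:

-- provider account: expose the new table through the existing share
grant usage on database sales_db to share sales_share;
grant usage on schema sales_db.public to share sales_share;
grant select on table sales_db.public.table_6 to share sales_share;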


Question 4

An Architect is designing a file ingestion recovery solution. The project will use an internal named stage for file storage. Currently, in the case of an ingestion failure, the Operations team must manually download the failed file and check for errors.

Which downloading method should the Architect recommend that requires the LEAST amount of operational overhead?



Question 5

How do Snowflake databases that are created from shares differ from standard databases that are not created from shares? (Choose three.)



Answer : A, C, D

According to the SnowPro Advanced: Architect documents and learning resources, the ways that Snowflake databases that are created from shares differ from standard databases that are not created from shares are:

Shared databases are read-only. This means that the data consumers who access the shared databases cannot modify or delete the data or the objects in the databases. The data providers who share the databases have full control over the data and the objects, and can grant or revoke privileges on them [1].

Shared databases cannot be cloned. This means that the data consumers who access the shared databases cannot create a copy of the databases or the objects in the databases. The data providers who share the databases can clone the databases or the objects, but the clones are not automatically shared [2].

Shared databases are not supported by Time Travel. This means that the data consumers who access the shared databases cannot use the AT or BEFORE clause to query historical data or restore deleted data. The data providers who share the databases can use Time Travel on the databases or the objects, but the historical data is not visible to the data consumers [3].

The other options are incorrect because they are not ways in which databases created from shares differ from standard databases.

Option B is incorrect because shared databases do not need to be refreshed in order for new data to be visible. The data consumers who access the shared databases can see the latest data as soon as the data providers update the data [1].

Option E is incorrect because shared databases will not have the PUBLIC or INFORMATION_SCHEMA schemas unless these schemas are explicitly granted to the share. The data consumers who access the shared databases can only see the objects that the data providers grant to the share, and the PUBLIC and INFORMATION_SCHEMA schemas are not granted by default [4].

Option F is incorrect because shared databases cannot be created as transient databases. Transient databases do not support Time Travel or Fail-safe and can be dropped without affecting the retention period of the data. Shared databases are always created as permanent databases, regardless of the type of the source database [5].

References:

1: Introduction to Secure Data Sharing | Snowflake Documentation
2: Cloning Objects | Snowflake Documentation
3: Time Travel | Snowflake Documentation
4: Working with Shares | Snowflake Documentation
5: CREATE DATABASE | Snowflake Documentation
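A minimal consumer-side sketch of these differences, assuming hypothetical share, database, and role names:

-- create a read-only database from a share and grant access to it
create database shared_sales from share provider_account.sales_share;
grant imported privileges on database shared_sales to role analyst;

-- operations like these fail against a database created from a share:
-- insert into shared_sales.public.orders select ...;              -- shared data is read-only
-- create database sales_clone clone shared_sales;                 -- cloning is not supported
-- select * from shared_sales.public.orders at (offset => -3600);  -- Time Travel is not supported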


Question 6

Company A would like to share data in Snowflake with Company B. Company B is not on the same cloud platform as Company A.

What is required to allow data sharing between these two companies?



Answer : C

According to the SnowPro Advanced: Architect documents and learning resources, the requirement for data sharing between two companies that are not on the same cloud platform is to set up data replication to the region and cloud platform where the consumer resides. Data replication is a Snowflake feature that copies databases across accounts in different regions and cloud platforms. It allows a data provider to securely share data with consumers on a different region or cloud platform by creating a replica database in the consumer's region and cloud platform. The replica database is read-only and is kept synchronized with the primary database in the provider's account. Data replication is useful for scenarios where direct data sharing is not possible or desirable due to latency, compliance, or security reasons [1].

The other options are not required or feasible for data sharing between two companies on different cloud platforms.

Option A is incorrect because creating a pipeline to write shared data to a cloud storage location in the target cloud provider is not a secure or efficient way of sharing data. It would require additional steps to load the data from cloud storage into the consumer's account, and it would not leverage Snowflake's data sharing features.

Option B is incorrect because ensuring that all views are persisted is not relevant for data sharing across cloud platforms. Views can be shared across cloud platforms as long as they reference objects in the same database. Persisting views is an option to improve query performance, but it is not required for data sharing [2].

Option D is incorrect because Company A and Company B do not need to agree on a single cloud platform. Data sharing is possible across different cloud platforms using data replication or other methods, such as listings or auto-fulfillment [3].

References:

1: Replicating Databases Across Multiple Accounts | Snowflake Documentation
2: Persisting Views | Snowflake Documentation
3: Sharing Data Across Regions and Cloud Platforms | Snowflake Documentation
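A minimal sketch of the replication setup described above, with hypothetical organization, account, and database names; the provider replicates the database to an account in the consumer's region and cloud platform and shares it from there.

-- provider's primary account: allow replication to a provider-owned account
-- located in the consumer's region and cloud platform
alter database sales_db enable replication to accounts myorg.provider_target_account;

-- provider-owned account in the target region/cloud: create and refresh the replica
create database sales_db as replica of myorg.provider_primary_account.sales_db;
alter database sales_db refresh;

-- the share for Company B is then created from this account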


Question 7

A company needs to have the following features available in its Snowflake account:

1. Support for Multi-Factor Authentication (MFA)

2. A minimum of 2 months of Time Travel availability

3. Database replication in between different regions

4. Native support for JDBC and ODBC

5. Customer-managed encryption keys using Tri-Secret Secure

6. Support for Payment Card Industry Data Security Standards (PCI DSS)

In order to provide all the listed services, what is the MINIMUM Snowflake edition that should be selected during account creation?


