A company is designing a process for importing a large amount of IoT JSON data from cloud storage into Snowflake. New sets of IoT data are generated and uploaded approximately every 5 minutes. Once the IoT data is in Snowflake, the company needs up-to-date information from an external vendor to join to the data. This data is then presented to users through a dashboard that shows different levels of aggregation. The external vendor is a Snowflake customer. What solution will MINIMIZE complexity and MAXIMIZE performance?
1. Create an external table over the JSON data in cloud storage. 2. Create a task that runs every 5 minutes to run a transformation procedure on new data, based on a saved timestamp. 3. Ask the vendor to expose an API so an external function can be used to generate a call to join the data back to the IoT data in the transformation procedure. 4. Give the transformed table access to the dashboard tool. 5. Perform the aggregations on the dashboard tool.
1. Create an external table over the JSON data in cloud storage. 2. Create a task that runs every 5 minutes to run a transformation procedure on new data, based on a saved timestamp. 3. Ask the vendor to create a data share with the required data that can be imported into the company's Snowflake account. 4. Join the vendor's data back to the IoT data using a transformation procedure. 5. Create views over the larger dataset to perform the aggregations required by the dashboard. 6. Give the views access to the dashboard tool.
1. Create a Snowpipe to bring the JSON data into Snowflake. 2. Use streams and tasks to trigger a transformation procedure when new JSON data arrives. 3. Ask the vendor to expose an API so an external function call can be made to join the vendor's data back to the IoT data in a transformation procedure. 4. Create materialized views over the larger dataset to perform the aggregations required by the dashboard. 5. Give the materialized views access to the dashboard tool.
1. Create a Snowpipe to bring the JSON data into Snowflake. 2. Use streams and tasks to trigger a transformation procedure when new JSON data arrives. 3. Ask the vendor to create a data share with the required data that is then imported into the Snowflake account. 4. Join the vendor's data back to the IoT data in a transformation procedure. 5. Create materialized views over the larger dataset to perform the aggregations required by the dashboard. 6. Give the materialized views access to the dashboard tool.
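The Snowpipe-plus-streams-and-tasks pattern in the last two options can be sketched as below. This is a minimal illustration; all object names (stage, tables, procedure, warehouse) are hypothetical.

```sql
-- Continuously load new JSON files landing in the external stage.
CREATE PIPE iot_pipe AUTO_INGEST = TRUE AS
  COPY INTO raw_iot_json FROM @iot_stage FILE_FORMAT = (TYPE = 'JSON');

-- Track newly loaded rows for incremental processing.
CREATE STREAM raw_iot_stream ON TABLE raw_iot_json;

-- Run the transformation only when the stream actually has new data.
CREATE TASK transform_iot_task
  WAREHOUSE = transform_wh
  SCHEDULE = '5 MINUTE'
  WHEN SYSTEM$STREAM_HAS_DATA('RAW_IOT_STREAM')
AS
  CALL transform_iot_proc();

-- Tasks are created in a suspended state and must be resumed.
ALTER TASK transform_iot_task RESUME;
```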
A user, analyst_user, has been granted the analyst_role and is deploying a SnowSQL script to run as a background service that extracts data from Snowflake. What steps should be taken to allow the service to connect only from approved IP addresses? (Select TWO).
ALTER ROLE ANALYST_ROLE SET NETWORK_POLICY='ANALYST_POLICY';
ALTER USER ANALYST_USER SET NETWORK_POLICY='ANALYST_POLICY';
ALTER USER ANALYST_USER SET NETWORK_POLICY='10.1.1.20';
USE ROLE SECURITYADMIN; CREATE OR REPLACE NETWORK POLICY ANALYST_POLICY ALLOWED_IP_LIST = ('10.1.1.20');
USE ROLE USERADMIN; CREATE OR REPLACE NETWORK POLICY ANALYST_POLICY ALLOWED_IP_LIST = ('10.1.1.20');
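For context, the typical flow is to create the network policy with a role that holds the CREATE NETWORK POLICY privilege (SECURITYADMIN by default) and then attach it at the user level. A minimal sketch, with an illustrative policy name and IP:

```sql
-- SECURITYADMIN (or a role granted CREATE NETWORK POLICY) creates the policy.
USE ROLE SECURITYADMIN;
CREATE OR REPLACE NETWORK POLICY ANALYST_POLICY
  ALLOWED_IP_LIST = ('10.1.1.20');

-- Network policies are attached to users (or the account), not to roles.
ALTER USER ANALYST_USER SET NETWORK_POLICY = 'ANALYST_POLICY';
```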
You are a Snowflake architect in an organization. The business team has asked you to deploy a use case that requires loading data they can visualize through Tableau. New data arrives every day, and the old data is no longer required. Which type of table should you use in this case to optimize cost?
TRANSIENT
TEMPORARY
PERMANENT
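A transient table avoids Fail-safe storage charges and allows Time Travel to be reduced to zero, which suits data that is replaced daily. A minimal sketch, with illustrative names:

```sql
-- No Fail-safe period for transient tables; Time Travel set to 0 days
-- minimizes storage cost for data that is fully replaced each day.
CREATE TRANSIENT TABLE daily_feed (
    id      NUMBER,
    payload VARIANT
) DATA_RETENTION_TIME_IN_DAYS = 0;
```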
A company is storing large numbers of small JSON files (ranging from 1-4 KB) that are received from IoT devices and sent to a cloud provider. In any given hour, 100,000 files are added to the cloud provider. What is the MOST cost-effective way to bring this data into a Snowflake table?
An external table
A pipe
A stream
A copy command at regular intervals
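A pipe (Snowpipe) is designed for exactly this continuous micro-batch pattern: it loads files as they arrive using serverless compute billed per second plus a small per-file overhead, rather than keeping a warehouse running for scheduled COPY commands. A minimal sketch, with illustrative names and assuming cloud event notifications are configured for the stage:

```sql
-- AUTO_INGEST picks up event notifications from the cloud storage bucket.
CREATE PIPE iot_small_files_pipe AUTO_INGEST = TRUE AS
  COPY INTO iot_events
  FROM @iot_stage
  FILE_FORMAT = (TYPE = 'JSON');
```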
Which feature provides the capability to define an alternate cluster key for a table with an existing cluster key?
External table
Materialized view
Search optimization
Result cache
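A materialized view can declare its own cluster key, independent of the base table's, giving an alternate physical ordering for a different access path. A minimal sketch (illustrative names; materialized views require Enterprise edition or higher):

```sql
-- Base table may be clustered on, e.g., order_date; the materialized view
-- provides an alternate clustering on customer_id for point lookups.
CREATE MATERIALIZED VIEW orders_by_customer
  CLUSTER BY (customer_id)
AS
  SELECT order_id, customer_id, amount
  FROM orders;
```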
A retail company has 2000+ stores spread across the country. Store Managers report that they are having trouble running key reports related to inventory management, sales targets, payroll, and staffing during business hours. The Managers report that performance is poor and time-outs occur frequently. Currently all reports share the same Snowflake virtual warehouse. How should this situation be addressed? (Select TWO).
Use a Business Intelligence tool for in-memory computation to improve performance.
Configure a dedicated virtual warehouse for the Store Manager team.
Configure the virtual warehouse to be multi-clustered.
Configure the virtual warehouse to size 4X-Large.
Advise the Store Manager team to defer report execution to off-business hours.
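The combination of a dedicated warehouse and multi-cluster scaling can be sketched as below; the size, cluster counts, and name are illustrative, and multi-cluster warehouses require Enterprise edition or higher:

```sql
-- A dedicated warehouse isolates Store Manager reports from other workloads;
-- multi-cluster scale-out handles concurrency spikes during business hours.
CREATE WAREHOUSE store_manager_wh
  WAREHOUSE_SIZE   = 'MEDIUM'
  MIN_CLUSTER_COUNT = 1
  MAX_CLUSTER_COUNT = 4
  SCALING_POLICY   = 'STANDARD'
  AUTO_SUSPEND     = 300
  AUTO_RESUME      = TRUE;
```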
Which of the below commands will use warehouse credits?
SHOW TABLES LIKE 'SNOWFL%';
SELECT MAX(FLAKE_ID) FROM SNOWFLAKE;
SELECT COUNT(*) FROM SNOWFLAKE;
SELECT COUNT(FLAKE_ID) FROM SNOWFLAKE GROUP BY FLAKE_ID;
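The distinction here is that SHOW commands and simple metadata-answerable aggregates (such as COUNT(*) or MIN/MAX on a whole table) can be served from Snowflake's metadata layer without a running warehouse, while a GROUP BY must scan data on a warehouse. One way to verify which statements actually used compute, sketched against the standard query history table function:

```sql
-- Recent queries for the current session context; metadata-only statements
-- complete even when no warehouse is running.
SELECT query_text, warehouse_name, execution_time
FROM TABLE(INFORMATION_SCHEMA.QUERY_HISTORY())
ORDER BY start_time DESC
LIMIT 10;
```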
A company needs to have the following features available in its Snowflake account: 1. Support for Multi-Factor Authentication (MFA) 2. A minimum of 2 months of Time Travel availability 3. Database replication in between different regions 4. Native support for JDBC and ODBC 5. Customer-managed encryption keys using Tri-Secret Secure 6. Support for Payment Card Industry Data Security Standards (PCI DSS) In order to provide all the listed services, what is the MINIMUM Snowflake edition that should be selected during account creation?
Standard
Enterprise
Business Critical
Virtual Private Snowflake (VPS)
Which Snowflake objects can be used in a data share? (Select TWO).
Standard view
Secure view
Stored procedure
External table
Stream
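Shareable objects include tables, external tables, secure views, secure materialized views, and secure UDFs; standard views, streams, and stored procedures cannot be shared. Granting two of the shareable options to a share can be sketched as below (all names are illustrative):

```sql
CREATE SHARE vendor_share;

-- The share needs USAGE on the containing database and schema.
GRANT USAGE ON DATABASE analytics TO SHARE vendor_share;
GRANT USAGE ON SCHEMA analytics.public TO SHARE vendor_share;

-- Secure views and external tables are both valid share objects.
GRANT SELECT ON VIEW analytics.public.secure_sales_v TO SHARE vendor_share;
GRANT SELECT ON EXTERNAL TABLE analytics.public.ext_events TO SHARE vendor_share;
```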
A user has activated primary and secondary roles for a session. Which operation is the user prohibited from performing using the privileges of a secondary role?
Insert
Create
Delete
Truncate
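The behavior in question can be demonstrated with the session commands below (role names are illustrative). Secondary roles contribute their privileges to queries and DML, but CREATE statements are authorized only against the primary role:

```sql
USE ROLE analyst_role;        -- primary role for the session
USE SECONDARY ROLES ALL;      -- activate all other granted roles

-- INSERT, DELETE, and TRUNCATE can succeed via a secondary role's privileges.
-- CREATE TABLE (or any CREATE) requires the PRIMARY role to hold the
-- CREATE privilege on the target schema.
```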