
Time Travel

Overview

Difficulty: Beginner
Duration: 1h 33m
Students: 156
Rating: 5/5
Description

Snowflake is an insanely cool next generation SaaS data warehousing solution that operates in the cloud!

Engineered from the ground up, Snowflake takes advantage of the elasticity that the cloud provides – and is truly revolutionary in every aspect.

Harnessing the power of the cloud, Snowflake has unique capabilities in the form of unlimited and instant scalability, making it perhaps the ultimate data warehouse solution. Cloud elasticity is very much at the heart of Snowflake – making its unique architecture and value proposition difficult to compete with in the market.

From an end user perspective, Snowflake is incredibly appealing. Building data warehouses and petabyte-scale data solutions without having to worry about on-prem compute and storage issues means your focus remains solely on the data itself and, even more importantly, the analytics you derive from it.

In this course, you'll learn about the many distinguishing features that set Snowflake apart from its competitors.

For any feedback, queries, or suggestions relating to this course, please contact us at support@cloudacademy.com.

Learning Objectives

  • Learn about Snowflake and how it can provision cloud-hosted data warehouses
  • Learn how to administrate a Snowflake data warehouse
  • Learn how to scale Snowflake data warehouses instantly and on-demand
  • Learn how to use Snowflake to perform analytics on datasets at petabyte scale and beyond

Intended Audience

  • Anyone interested in learning about Snowflake, and the benefits of using it to build a data warehouse in the cloud

Prerequisites

To get the most from this course, it would help to have a basic understanding of:

  • Cloud computing and SaaS
  • Database administration
  • SQL
Transcript

Welcome back. In this lesson, I'll review time travel, a unique and powerful feature that Snowflake provides. Time travel enables access to historical data that was either deleted or updated at any point within a defined period. Let's begin. Time travel is a very cool feature that can be used to recover data that has been deleted, modified, or dropped, whether intentionally or unintentionally. With time travel, you can confidently restore data from a past point in time within a defined retention period, in near real-time. There are no special requirements to preconfigure periodic database snapshots, backups, or the like.

Time travel is a feature that is turned on and available by default. With time travel, you can confidently experiment on your data sets, knowing that if things go wrong, you can always recover quickly and easily. As already mentioned, because of Snowflake's unique storage architecture, recovery times are near instant. Time travel data recovery is only available within a defined retention period: 24 hours for the Standard edition of Snowflake, or up to 90 days for the Enterprise edition. Regardless, when the retention period expires, a further 7-day fail-safe period is available, during which you can consult with the Snowflake support team, who can recover data on your behalf. Beyond this period, deleted data and dropped objects are gone for good.
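As a sketch, the retention period can be inspected and adjusted per object with standard Snowflake SQL. The table name below is the one used throughout this lesson; the 90-day value assumes an Enterprise edition account:

```sql
-- Check the current retention period (in days) for a table
SHOW PARAMETERS LIKE 'DATA_RETENTION_TIME_IN_DAYS'
  IN TABLE cloudacademydb.training.courses;

-- Extend the retention period up to the Enterprise maximum of 90 days
ALTER TABLE cloudacademydb.training.courses
  SET DATA_RETENTION_TIME_IN_DAYS = 90;
```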

Note: only the SYSADMIN role can execute time travel queries within the 24-hour or 90-day retention period. Snowflake provides simple-to-use time travel SQL extensions, which make it super easy to either query deleted data from the past or restore dropped objects. The following slide presents a number of time travel select queries, which highlight some of the ways to query deleted table data.

In the first example, the select query leverages the AT and OFFSET keywords to query all historical data stored in the cloudacademydb.training.courses table as of 10 minutes ago. The second example leverages the AT and TIMESTAMP keywords to query all historical data stored in the cloudacademydb.training.courses table at the specified date time. The third and final query leverages the BEFORE and STATEMENT keywords to query all historical data stored in the cloudacademydb.training.courses table up to, but not including, any changes made by the specified statement.
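The three select queries described above might look like this (the timestamp and statement ID are illustrative placeholders, not values from the lesson):

```sql
-- 1. All rows as they existed 10 minutes (600 seconds) ago
SELECT * FROM cloudacademydb.training.courses AT(OFFSET => -60*10);

-- 2. All rows as they existed at a specific point in time
SELECT * FROM cloudacademydb.training.courses
  AT(TIMESTAMP => '2021-06-01 12:00:00'::timestamp_tz);

-- 3. All rows up to, but not including, the changes made by a given statement
SELECT * FROM cloudacademydb.training.courses
  BEFORE(STATEMENT => '019b2ee5-0500-8473-0043-4d8300073062');
```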

The following slide now presents a number of time travel cloning type queries, highlighting some of the ways to use time travel to clone objects. In the first example, the create schema command, in tandem with the CLONE keyword, is used to create a clone of the cloudacademydb.training schema and all of its objects as they existed one hour before the current time, naming it cloudacademydb.training_cloned. The second example is similar in intention to the first, but this time clones an actual table at a defined point in time, as specified by the provided timestamp. And the third and final example clones an entire database and all of its objects as they existed prior to the completion of the specified statement.
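The cloning examples described above could be written as follows (the cloned object names, timestamp, and statement ID are illustrative):

```sql
-- 1. Clone a schema and all of its objects as they existed one hour ago
CREATE SCHEMA cloudacademydb.training_cloned
  CLONE cloudacademydb.training AT(OFFSET => -3600);

-- 2. Clone a table as it existed at a specific point in time
CREATE TABLE cloudacademydb.training.courses_cloned
  CLONE cloudacademydb.training.courses
  AT(TIMESTAMP => '2021-06-01 12:00:00'::timestamp_tz);

-- 3. Clone a database as it existed before a given statement completed
CREATE DATABASE cloudacademydb_cloned
  CLONE cloudacademydb
  BEFORE(STATEMENT => '019b2ee5-0500-8473-0043-4d8300073062');
```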

The following slide now presents a number of time travel undrop-based queries, demonstrating some of the ways to use time travel to restore various dropped objects. Note that when a table, schema, or database is dropped, it is not permanently erased; rather, it can be quickly restored using the undrop command within the retention period applicable to the object. In the first example, the UNDROP keyword is used to restore a specific database table, in this case the cloudacademydb.training.courses table. The second example uses the UNDROP keyword to restore a specific schema and all of its child objects, in this case the cloudacademydb.training schema. And finally, the third example undrops an entire database, in this case the cloudacademydb database.
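The three undrop examples described above correspond to:

```sql
-- 1. Restore a dropped table
UNDROP TABLE cloudacademydb.training.courses;

-- 2. Restore a dropped schema and all of its child objects
UNDROP SCHEMA cloudacademydb.training;

-- 3. Restore an entire dropped database
UNDROP DATABASE cloudacademydb;
```

Each UNDROP succeeds only if the object was dropped within its applicable retention period and no object with the same name currently exists.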

About the Author

Students: 106,789 · Labs: 59 · Courses: 113 · Learning Paths: 91

Jeremy is a Content Lead Architect and DevOps SME here at Cloud Academy where he specializes in developing DevOps technical training documentation.

He has a strong background in software engineering, and has been coding with various languages, frameworks, and systems for the past 25+ years. In recent times, Jeremy has been focused on DevOps, Cloud (AWS, GCP, Azure), Security, Kubernetes, and Machine Learning.

Jeremy holds professional certifications for AWS, GCP, Terraform, and Kubernetes (CKA, CKAD, CKS).