
Object Storage with Google Cloud Storage

Overview
Difficulty: Beginner
Duration: 27m
Students: 697
Description

Data management is critical for all server infrastructures - whether cloud-based or earth-bound. Even small applications need to process, manage, and digest large quantities of data, whether as files, objects, or items in a database. So it's not surprising that all the major cloud platforms provide multiple data management solutions.

This course, designed and produced by our expert Linux system administrator David Clinton, offers a great overview of the data solutions available from Google. You will see Google's two managed database services, its object storage service, and BigQuery - the big data service into which Google poured its years of experience performing analytics and queries on massive datasets.

Who should take this course

"Data Management on Google Cloud Platform" is a beginner level course, so it's available without any special prerequisites. Nevertheless, you might benefit by first going through our "Introduction to Google Cloud Platform" course for a more general view of the whole GCP family. Also, some experience with the Linux CLI might be helpful for those videos that include terminal operations.

After this course, you might want to try "Getting Started with Google Compute Engine" to round out your GCP knowledge. And after that, check out our Google quiz questions: test yourself and increase your understanding at the same time thanks to our exclusive learning technology.


Transcript

Hi, and welcome to CloudAcademy.com's video series on Google Cloud Platform data management. In this video, we're going to explore Google Cloud Storage, which allows direct global storage and retrieval of any type or amount of data. I should note that document sharing is best done through Google Drive.

How to activate the Google Cloud Storage API

Google Cloud Storage is really meant for application deployments and data operations: APIs, online archives, backup replacement. The important elements to be aware of in Cloud Storage are projects, buckets, and objects. A project is the overall Google Cloud project within which you might have instances and other resources. Buckets are the containers for the objects you choose to store in Cloud Storage. And the objects are the videos or other data that you choose to leave there.
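As a rough sketch of that hierarchy, using placeholder names rather than anything created in this video:

# Project:  <your-project-id>                      the Google Cloud project that owns everything
#   Bucket:  gs://<bucket-name>                    a named container for objects
#     Object: gs://<bucket-name>/<object-name>     a single stored file or blob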

First though, we'll have to make sure that the Google Cloud Storage API is activated. Let's click on APIs, then APIs again, and make sure Cloud Storage is actually on. In this case it is. If it weren't, you could browse through the disabled APIs below and click their Off button to turn them on. But everything's already the way we need it right now.
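If you'd rather do this from a terminal and have the gcloud tool installed, something like the following should have the same effect as those console clicks; the exact service name has varied over console generations, so treat this as an approximate equivalent rather than what the video shows:

$ gcloud services enable storage-api.googleapis.com    # enable the Cloud Storage JSON API for the current project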

gsutil: access Google Cloud Storage from the command line

Now, from an internet-connected terminal session anywhere in the world, let's download and install gsutil, the Google Storage utility package, using wget, a utility that downloads a file from a specified URL - in this case, the gsutil.tar.gz archive. Next, we'll extract the contents of that archive using tar xzvf: x extracts, z handles the gzip compression, v makes it verbose, and f specifies the file name. We'll add gsutil's directory, /home/ubuntu/gsutil in this case, to the .bashrc file, so that anywhere on the system, typing gsutil will refer to this installation in /home/ubuntu. For that to take effect, we'll have to restart the shell, which means exiting and logging in again. At this point, it should work. Let's authenticate with gsutil config, which will connect our gsutil installation to our Google Cloud project.
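The terminal session goes roughly like this; the download URL is the one Google documented at the time, and /home/ubuntu is just the home directory used in the video:

$ wget https://storage.googleapis.com/pub/gsutil.tar.gz              # download the gsutil archive
$ tar xzvf gsutil.tar.gz                                             # x: extract, z: gunzip, v: verbose, f: file name
$ echo 'export PATH=${PATH}:/home/ubuntu/gsutil' >> ~/.bashrc        # make gsutil callable from anywhere
# log out and back in (or restart the shell) so the PATH change takes effect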

Since I also have the gcloud package installed, it seems that gsutil can piggyback on top of that. Let's run the command anyway, so we can get an idea of how gsutil config works on its own. Let's copy this URL and paste it into a browser, which will return an authorization code. I'll paste in the authorization code I was given.
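The authentication step, approximately as it appears on screen:

$ gsutil config
# gsutil prints an authorization URL; open it in a browser, sign in,
# and paste the code it returns back at the terminal prompt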

How to create a bucket with gsutil

What's your project ID? Our project ID is futuregraph-718. I'll note that you have to use this project ID rather than the name you gave your project, because the Google system doesn't recognize that name for any purpose besides your own visual identification. We don't need any proxy, so we should be good to go. Let's now create a bucket: mb stands for make bucket, and the bucket name will be tinybucket9922. We could, by the way, have added -c and then a storage class such as Durable Reduced Availability, which would be DRA.

Let's just show how that might have looked: -c to specify the class, and -l to specify a particular location, which might be US-CENTRAL1 - which, in fact, is the default. If you don't specify a location, that's where Google will put your bucket.
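Rendered as commands, the bucket-creation step looks roughly like this; tinybucket9922 is the name used in the video, and the -c and -l flags are optional:

$ gsutil mb gs://tinybucket9922                           # make a bucket with the default class and location
$ gsutil mb -c DRA -l US-CENTRAL1 gs://tinybucket9922     # or ask for Durable Reduced Availability in US-CENTRAL1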

Now let's list all the buckets associated with this project using ls. The only one so far is tinybucket9922, the one we've just created. Next, let's use gsutil mv to move a data file, data.text, from our local directory to gs://tinybucket9922 - that is, up to the Google Cloud Storage bucket.
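Those two operations, roughly as typed in the video:

$ gsutil ls                                   # list the buckets in this project
$ gsutil mv data.text gs://tinybucket9922     # move the local file data.text into the bucket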

It's gone. Take another listing of our home directory, and it's not there anymore: it's been moved from my directory up to the bucket. Finally, let's see how large the bucket has grown with gsutil du, which returns the size of the bucket. It's 1202 bytes, which is the size of the data.text file. Had there been other files and directories there, they would have been listed in this du operation, but we've seen everything that's there. So we have created a bucket, uploaded an object to it, and connected from our local session to see and manipulate the contents of this Google Cloud Storage bucket.
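And the size check; the output line is an approximation of what the video shows:

$ gsutil du gs://tinybucket9922               # report the size of everything in the bucket
1202  gs://tinybucket9922/data.text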

Google Cloud Storage web interface 

Of course, you can also access Google Cloud Storage through the browser interface. Click on Storage, then Storage browser. Let's create a bucket and give it a unique name - say, "mybucket" followed by some numbers - then click Create. Let's upload a file.

Let's say a JPEG. We can select Share publicly, click the public link, and this file is available to anybody on the internet - or at least to anybody we give the URL to - in just a few clicks.
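For reference, the same result can be reached from gsutil as well; a minimal sketch, assuming a hypothetical bucket and object name:

$ gsutil acl ch -u AllUsers:R gs://mybucket1234/photo.jpg     # grant read access to everyone
# the object is then reachable at https://storage.googleapis.com/mybucket1234/photo.jpg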

About the Author
David Clinton
Linux SysAdmin
Students: 11,905
Courses: 12
Learning Paths: 4

David taught high school for twenty years, worked as a Linux system administrator for five years, and has been writing since he could hold a crayon between his fingers. His childhood bedroom wall has since been repainted.

Having worked directly with all kinds of technology, David derives great pleasure from completing projects that draw on as many tools from his toolkit as possible.

Besides being a Linux system administrator with a strong focus on virtualization and security tools, David writes technical documentation and user guides, and creates technology training videos.

His favorite technology tool is the one that should be just about ready for release tomorrow. Or Thursday.