Deploying and Implementing Kubernetes Engine Resources
Deploying and Implementing App Engine and Cloud Functions Resources
Deploying and Implementing Data Solutions
Deploying a Solution Using GCP Marketplace
Deploying Resources Using Deployment Manager
This course has been designed to teach you how to deploy and implement Google Cloud Platform solutions. The content in this course will help prepare you for the Associate Cloud Engineer exam.
- To learn how to deploy Kubernetes Engine resources on Google Cloud Platform
- To learn how to deploy and implement App Engine and Cloud Functions resources
- To learn how to use Cloud Launcher (now called GCP Marketplace) and Deployment Manager
- Those who are preparing for the Associate Cloud Engineer exam
- Those looking to learn more about GCP networking and compute features
To get the most from this course, you should have some exposure to GCP resources, such as Kubernetes Engine, App Engine, Cloud Functions, Cloud Launcher, and Deployment Manager. However, this is not essential.
There are many ways to load data into your Cloud Storage Bucket. You can use the command line, APIs, the console, and even various programming languages such as Python, PHP, Ruby, and more.
To load data into Google Cloud using the console, open the Cloud Storage browser in the Google Cloud Platform Console and, while viewing the list of buckets, click the bucket that you want to upload data to. From the bucket's Objects tab, you can drag and drop files from your workstation onto the main pane in the GCP Console. Alternatively, click the Upload Files button, choose the files you want to upload, and then click Open.
Either way, uploading data into GCP from your workstation, using the console, is pretty straightforward.
You can also use the command line to get data into Google Cloud. To upload data to Google Cloud using the command line, you need to leverage gsutil. To upload data using gsutil, run the gsutil cp command that you see on your screen.
You'd replace OBJECT_LOCATION with the path to the data on your workstation that you want to upload, and you'd replace DESTINATION_BUCKET_NAME with the name of the bucket you are uploading to.
For example, if I wanted to upload finances.txt from my desktop, I'd replace OBJECT_LOCATION with Desktop/finances.txt. If I wanted to upload to a bucket named mybucket, I'd replace DESTINATION_BUCKET_NAME with mybucket.
My full command would look like the one that you now see on your screen.
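Since the on-screen command isn't reproduced here, a minimal sketch of it follows, using the lecture's example values (finances.txt on the desktop, a bucket named mybucket). The snippet builds and prints the command rather than running it, since running it requires an authenticated Cloud SDK install:

```shell
OBJECT_LOCATION="Desktop/finances.txt"   # file on the local workstation
DESTINATION_BUCKET_NAME="mybucket"       # target Cloud Storage bucket

# The full gsutil command (run it from a terminal with gsutil installed
# and authenticated via gcloud auth login):
CMD="gsutil cp ${OBJECT_LOCATION} gs://${DESTINATION_BUCKET_NAME}"
echo "$CMD"
# → gsutil cp Desktop/finances.txt gs://mybucket
```

Note that the destination is prefixed with gs://, which is how gsutil addresses Cloud Storage buckets.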
If you need, or want, to upload data to GCP using APIs, you have two main choices. You can leverage the JSON API or the XML API.
In either case, you first need to obtain an authorization access token from the OAuth 2.0 Playground, and then configure the playground to use your OAuth credentials.
Once you've got your OAuth credentials configured, you can use the API of your choice to upload data.
If using the JSON API, you need to use curl with a POST command to upload your data. The sample code on your screen would upload a file named photo.png from the desktop to a bucket called mybucket.
OAUTH2_TOKEN would be replaced with the access token that you obtained from the OAuth 2.0 Playground. If using the XML API, you need to use curl with a PUT command instead. The sample code on your screen would, again, upload a file named photo.png from the desktop to a bucket called mybucket, with OAUTH2_TOKEN replaced by the access token from the OAuth 2.0 Playground.
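As a stand-in for the on-screen samples, here is a sketch of the two curl invocations, using the lecture's example values (photo.png on the desktop, a bucket named mybucket). OAUTH2_TOKEN is the placeholder for the token from the OAuth 2.0 Playground; the URLs are Cloud Storage's JSON API media-upload endpoint and XML API object path. The commands are printed rather than executed, since a real token is needed:

```shell
TOKEN="OAUTH2_TOKEN"   # placeholder: paste the OAuth 2.0 Playground token here
BUCKET="mybucket"      # example bucket from the lecture
OBJECT="photo.png"     # example file from the lecture

# JSON API: POST to the media-upload endpoint, naming the object via ?name=
JSON_URL="https://storage.googleapis.com/upload/storage/v1/b/${BUCKET}/o?uploadType=media&name=${OBJECT}"

# XML API: PUT directly to the object's path within the bucket
XML_URL="https://storage.googleapis.com/${BUCKET}/${OBJECT}"

echo "curl -X POST --data-binary @Desktop/${OBJECT} -H \"Authorization: Bearer ${TOKEN}\" -H \"Content-Type: image/png\" \"${JSON_URL}\""
echo "curl -X PUT --data-binary @Desktop/${OBJECT} -H \"Authorization: Bearer ${TOKEN}\" -H \"Content-Type: image/png\" \"${XML_URL}\""
```

The structural difference between the two is visible in the URLs: the JSON API posts to a dedicated upload endpoint and passes the object name as a query parameter, while the XML API puts the data directly at the object's own path.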
So, as you can see, there are quite a few ways to get data into Google Cloud. In a production setting, you'd obviously use whatever method works for what you are trying to accomplish.
Tom is a 25+ year veteran of the IT industry, having worked in environments as large as 40k seats and as small as 50 seats. Throughout the course of a long and interesting career, he has built an in-depth skill set that spans numerous IT disciplines. Tom has designed and architected small, large, and global IT solutions.
In addition to the Cloud Platform and Infrastructure MCSE certification, Tom also carries several other Microsoft certifications. His ability to see things from a strategic perspective allows Tom to architect solutions that closely align with business needs.
In his spare time, Tom enjoys camping, fishing, and playing poker.