Logs Explorer
Difficulty
Intermediate
Duration
30m
Students
21
Ratings
5/5
Description

In this lesson you will learn how to diagnose database issues using Google’s Cloud Monitoring and Cloud Logging services.

Learning Objectives

  • Manage and minimize your system downtime
  • Optimize the performance of your Google databases

Intended Audience

  • Database administrators
  • Database engineers
  • Cloud architects
  • Anyone preparing for a Google Cloud certification

Prerequisites

  • Some experience working with databases
  • Access to a GCP account
Transcript

Eventually, you are going to discover a problem with one or more of your databases.  And when this happens, you will need a way to quickly track down the root cause.  Generally, the best way to do this is by looking at your log files.  Log files contain useful debug information, notices, warnings, errors, and critical events.  So when something breaks, you probably want to look at what happened just before the event. 

Of course, digging through log files is not easy.  They can be massive in size.  And they might be spread across multiple files and several different locations.  This is why Google created the Cloud Logging service.  Cloud Logging is designed to be a centralized repository for all your logs.  And this makes them much easier to locate and manage.  

Now, if you are using a Google-managed database, then you just need to make sure that logging is enabled for your instance.  If you are running a custom, unmanaged solution, then you will need to install and configure the Ops Agent instead.  After you do that, your logs will be automatically added to Cloud Logging.
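For reference, Google’s documented way to install the Ops Agent on a Linux VM is a short shell script; the sketch below assumes a VM with outbound internet access and systemd:

    # Download Google’s Ops Agent repository script and install the agent.
    curl -sSO https://dl.google.com/cloudagents/add-google-cloud-ops-agent-repo.sh
    sudo bash add-google-cloud-ops-agent-repo.sh --also-install

    # Confirm the agent is running before expecting logs in Cloud Logging.
    sudo systemctl status google-cloud-ops-agent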

In this lesson, I am going to show you how to use a tool called Logs Explorer.  Logs Explorer is how you access the log entries stored in Cloud Logging.  It also allows you to create and apply powerful filters, and it supports string matching.  So whenever something breaks, you can quickly find the log entries that preceded the event, and then scan for any warnings or errors that might be related.

To start off, you first need to log into the Google Cloud console and then search for “logs explorer”.

So here is the Logs Explorer page.  I know at first it can look rather intimidating.  But it is actually much easier to use than it looks.  All log entries are displayed down here in the bottom right.  And everything else is basically used to filter or search this list.  

For example, let’s say I wanted to look at all logs written in the last 12 hours.  I can select that here.  Notice that this updated the list of log entries.  So you can scroll through the entries here.  And if you want, you can try to read through everything.  But that is going to take forever.  It is much easier to use these filters on the left to narrow down this list to something more manageable.
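If you prefer the command line, the same 12-hour window can be pulled with gcloud; this is a sketch, and the limit value is arbitrary:

    # Read the most recent entries written in the last 12 hours.
    gcloud logging read --freshness=12h --limit=25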

Right now, this list includes log entries for everything.  And I mean everything.  Not only does this contain the logs for my Cloud SQL instance, but also for my Compute Engine VMs and other services.  So first, I probably only want to show things that are related to Cloud SQL.  And I can do that by picking the resource type here.

You will notice that once again, the log entries were updated.  It also added something to the query field.  Now this is a filter expression.  This particular expression tells Logs Explorer to only show logs from Cloud SQL.  As you add more filters, the query expression is going to be expanded.  So you can either add filters by clicking on the options to the left, or you can write your own filter expression at the top.  Either way works.
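For Cloud SQL, the expression that lands in the query field is just the monitored resource type, which looks like this:

    resource.type="cloudsql_database"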

Now I am currently viewing all log entries for Cloud SQL made in the last 12 hours.  But this is still a lot of records.  So I probably should apply some more filters.  Many times you are going to want to filter by severity.  So, if there were any critical or error messages present, I could filter by those.  But right now, the worst I have is some warnings.  So let me select that.  Ok, so now this is all the warnings generated in the last 12 hours for my Cloud SQL instance.
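Adding the severity filter extends the query.  Severity levels are ordered, so a comparison like the one below also matches ERROR and CRITICAL entries, and in the query editor, clauses on separate lines are implicitly ANDed:

    resource.type="cloudsql_database"
    severity>=WARNING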

There are many other filters available.  You can limit the entries to a specific log file.  You can limit entries to a project if you have multiple.  You can even pick specific databases or regions.  So basically, if my European clients experienced a major slowdown last Wednesday morning, I could construct an appropriate filter to show me the log entries for that time period.
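As a sketch of that scenario, the query below scopes results to a European region and an explicit time window; the region, dates, and times here are placeholders, not values from this demo:

    resource.type="cloudsql_database"
    resource.labels.region="europe-west1"
    timestamp>="2023-05-10T06:00:00Z"
    timestamp<="2023-05-10T12:00:00Z"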

Now, if you ever apply a filter that you don’t want, you can always remove it.  Just click on the clear button here.  Or you can just modify the filter expression here and then re-run the query like this.  I personally like the query editor, but if you don’t, you can hide it like this.

So once you have narrowed down your list of entries, you can start scrolling through the list like this.  In addition, there is a search option here.  And this will help you find what you are looking for.  You can easily highlight all “insecure” messages.  Or you can highlight any log entries containing the word “fail”.
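In the query language, that kind of string matching uses the : “contains” operator.  For example, a clause like the following matches entries whose text payload contains the word “fail”, while a bare quoted string with no field name searches across all fields instead:

    textPayload:"fail"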

When you find an entry you are interested in, just click on it to expand it.  The entries are formatted in JSON, so you might have to expand several fields.  There is also a button up here that you can use to fully expand or collapse a field.
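For orientation, entries follow the LogEntry schema, so a trimmed-down Cloud SQL entry looks roughly like this; every value below is a placeholder:

    {
      "insertId": "...",
      "logName": "projects/my-project/logs/cloudsql.googleapis.com%2Fmysql.err",
      "resource": {
        "type": "cloudsql_database",
        "labels": {
          "project_id": "my-project",
          "region": "us-central1",
          "database_id": "my-project:my-instance"
        }
      },
      "severity": "WARNING",
      "textPayload": "...",
      "timestamp": "2023-05-10T06:00:00Z"
    }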

So this is basically how you are going to search your logs.  You pick the times and dates you are interested in.  Then filter by resource type, severity, or whatever else.  And then use the search feature to find exactly what you are looking for.

Now there is one last thing I want to show you.  Logs Explorer is not the only way to go through these logs.  You can also export your log entries and use a different tool if you prefer.  To do that, just click on “More actions”, and then select “Create sink”.  A log sink will copy the specified log entries outside of Cloud Logging.

You need to give your sink a name.  Then you have to choose where you want the entries to be exported to.  You can choose things like Cloud Storage, BigQuery, or Splunk.  Since I picked Cloud Storage, I also have to specify a bucket.  And it looks like I need to create one.  

Next, I need to set my filters.  The filters specify which log entries I want to export.  So by default, it will copy the same expression I had set in Logs Explorer.  But I could overwrite this with something else if I wanted.  There is also an additional exclusion filter you can set down here, but I am going to leave that blank.  And then finally, I just have to click on “Create Sink”.  Ok, after a minute or so, the data should be accessible in my bucket.
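The same sink can be created from the command line; in this sketch, the sink name, bucket, and filter are placeholders:

    # Export matching entries to a Cloud Storage bucket.
    gcloud logging sinks create my-cloudsql-sink \
        storage.googleapis.com/my-log-bucket \
        --log-filter='resource.type="cloudsql_database" AND severity>=WARNING'

One thing to keep in mind: the command’s output includes the sink’s writer identity, and that service account typically needs to be granted write access on the destination bucket before entries will flow.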

And there you go.  Now you should understand the basics of using Cloud Logging.  And you should understand how to use Logs Explorer to track down the root cause of any database issue.

About the Author
Students
37100
Courses
44
Learning Paths
16

Daniel began his career as a Software Engineer, focusing mostly on web and mobile development. After twenty years of dealing with insufficient training and fragmented documentation, he decided to use his extensive experience to help the next generation of engineers.

Daniel has spent his most recent years designing and running technical classes for both Amazon and Microsoft. Today at Cloud Academy, he is working on building out an extensive Google Cloud training library.

When he isn’t working or tinkering in his home lab, Daniel enjoys BBQing, target shooting, and watching classic movies.