Logging is critical today given the volume and variety of data we deal with across different customer use cases. This course will enable you to take a more proactive approach to identifying faults and crashes in your applications through the effective use of Google Cloud Logging. You will also learn how to offload operational overhead to GCP through automated logging tools, resulting in a more productive operational pipeline for your team and organization.
Through this course, you will gain the skills required to stream log data to the Google Cloud Logging service and to use metrics to understand your system's behavior. The course starts with an introduction to the Cloud Logging service and then demonstrates how to stream logs using the Cloud Logging agent and the Python client library.
To get the most out of this course, you should already have an understanding of application logging.
This course is suited for anyone interested in logging using Google Cloud Platform (GCP) Cloud Logging.
- Source code for this course: https://github.com/cloudacademy/Managing-Application-Logs-and-Metrics-on-GCP
- Google Cloud fluentd source code: https://github.com/GoogleCloudPlatform/google-fluentd
- Google Cloud fluentd additional configurations: https://github.com/GoogleCloudPlatform/fluentd-catch-all-config
- Google Cloud fluentd output plugin configuration: https://cloud.google.com/logging/docs/agent/logging/configuration#cloud-fluentd-config
- Package release history: https://pypi.org/project/google-cloud-logging/#history
- Metrics Explorer pricing: https://cloud.google.com/stackdriver/pricing#metrics-chargeable
We have learned that Cloud Logging offers very helpful features, and before we start playing with Cloud Logging, let's see how it works under the hood. The Cloud Logging API is the single endpoint for sending logs to Google Cloud Logging. Logs can come from different places, such as a Fluentd logging agent running on an on-premises server, an application running on Google Kubernetes Engine, or a Cloud Logging agent running on a Google Compute Engine server.
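Regardless of where a log originates, every producer ultimately writes entries through that single API endpoint (`logging.googleapis.com/v2/entries:write`). As a rough illustration, the sketch below assembles the kind of JSON request body a client would POST to that endpoint; the project ID, log name, and payload fields are placeholders, not values from the course:

```python
import json

def build_write_request(project_id, log_id, severity, payload):
    """Assemble an entries.write request body for the Cloud Logging API.

    Every log producer -- Fluentd agents, GKE workloads, client
    libraries -- ends up sending entries shaped like this to the
    single logging.googleapis.com/v2/entries:write endpoint.
    """
    return {
        "entries": [
            {
                # Fully qualified log name: projects/PROJECT/logs/LOG_ID
                "logName": f"projects/{project_id}/logs/{log_id}",
                # The monitored resource the entry is attributed to;
                # "global" is the catch-all resource type.
                "resource": {"type": "global"},
                "severity": severity,
                # Structured payloads go in jsonPayload; a plain
                # string would use textPayload instead.
                "jsonPayload": payload,
            }
        ]
    }

request_body = build_write_request(
    "my-project", "my-app-log", "ERROR", {"message": "checkout failed"}
)
print(json.dumps(request_body, indent=2))
```

In practice the logging agent and the Python client library build and batch these entries for you; the point here is only that all producers converge on the same request shape.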
Once a log entry hits the Cloud Logging API, it is forwarded to the Log Router, where it gets evaluated. Based on any exclusion filters defined, it is then forwarded to the default log storage, where users can view the logs via the dashboard, search and analyze them, create log-based metrics, create alerts, and so on. Log entries can also be forwarded to other systems for different purposes using log sinks: for example, to BigQuery for advanced analytics, to Cloud Storage for archival, or to Pub/Sub to send logs to third-party platforms like Elasticsearch or Splunk.
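Conceptually, each sink the Log Router evaluates boils down to a filter expression plus a destination URI. The sketch below is illustrative rather than the actual `gcloud` or client-library API, but the destination URI formats match what Cloud Logging expects for BigQuery, Cloud Storage, and Pub/Sub sinks; the project, dataset, bucket, and topic names are placeholders:

```python
def sink_destination(kind, project, target):
    """Return the destination URI Cloud Logging expects for a log sink.

    target is a dataset name (bigquery), bucket name (storage),
    or topic name (pubsub); all names here are placeholders.
    """
    formats = {
        "bigquery": f"bigquery.googleapis.com/projects/{project}/datasets/{target}",
        "storage": f"storage.googleapis.com/{target}",
        "pubsub": f"pubsub.googleapis.com/projects/{project}/topics/{target}",
    }
    return formats[kind]

def make_sink(name, kind, project, target, log_filter):
    """Describe a sink: entries matching log_filter go to the destination."""
    return {
        "name": name,
        "destination": sink_destination(kind, project, target),
        # Filters use the Logging query language, e.g. route only
        # errors and worse to BigQuery for analysis.
        "filter": log_filter,
    }

sink = make_sink("errors-to-bq", "bigquery", "my-project",
                 "app_logs", "severity>=ERROR")
print(sink["destination"])
```

With a definition like this, entries matching `severity>=ERROR` would be routed to the BigQuery dataset while everything else stays in default log storage, which is exactly the fan-out the Log Router performs.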
Pradeep Bhadani is an IT consultant with over nine years of experience and holds various certifications related to AWS, GCP, and HashiCorp. He is recognized as a HashiCorp Ambassador and a Google Developers Expert (GDE) in Cloud for his knowledge and contributions to the community.
He has extensive experience building data platforms both in the cloud and on-premises through the use of DevOps strategies and automation. Pradeep is skilled at explaining technical concepts, helping teams and individuals upskill on the latest technologies.