
Google Cloud Logging


Last updated on June 26, 2023

Google Cloud Logging Cheat Sheet

  • An exabyte-scale, fully managed service for real-time log management. 
  • Helps you to securely store, search, analyze, and alert on all of your log data and events.

Features

  • Write any custom log, from any source, into Cloud Logging using the public write APIs (a sketch follows this list).
  • You can search, sort, and query logs through query statements, along with rich histogram visualizations, simple field explorers, and the ability to save your queries.
  • Integrates with Cloud Monitoring to set alerts on log events and on the logs-based metrics you have defined.
  • You can export data in real time to BigQuery to perform advanced analytics and SQL-like queries.
  • Cloud Logging helps you surface problems in your mountain of log data using Error Reporting, which automatically analyzes your logs for exceptions and intelligently aggregates them into meaningful error groups.
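
The sketch below illustrates the first three features using the google-cloud-logging Python client library: it writes a custom log entry, queries it back with a filter, and defines a logs-based metric that Cloud Monitoring alerts can reference. The log name, filter, and metric name are illustrative assumptions, not values prescribed by Google Cloud.

```python
# Minimal sketch with the google-cloud-logging client library.
# Assumes Application Default Credentials are already configured
# (e.g., via `gcloud auth application-default login`).
from google.cloud import logging

client = logging.Client()

# 1. Write custom log entries ("my-app-log" is an arbitrary example name).
logger = client.logger("my-app-log")
logger.log_text("Payment processed", severity="INFO")
logger.log_struct({"event": "payment", "amount_usd": 42.5}, severity="NOTICE")

# 2. Query the entries back with a Logging filter expression.
for entry in client.list_entries(filter_='logName:"my-app-log" AND severity>=INFO'):
    print(entry.timestamp, entry.payload)

# 3. Define a logs-based metric that a Cloud Monitoring alert can watch.
metric = client.metric(
    "payment-errors",  # illustrative metric name
    filter_='logName:"my-app-log" AND severity>=ERROR',
    description="Count of error-level entries in my-app-log",
)
metric.create()
```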

Cloud Audit Logs

Cloud Audit Logs maintains audit logs for each Google Cloud project, folder, and organization. There are four types of audit logs:

1. Admin Activity audit logs

  • Contains log entries for API calls or other administrative actions that modify the configuration or metadata of resources.
  • You must have the IAM role Logging/Logs Viewer or Project/Viewer to view these logs.
  • Admin Activity audit logs are always written and you can’t configure or disable them in any way.
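
Because Admin Activity audit logs always share the same log name suffix, they can be read back with a simple filter. A minimal sketch, assuming the same Python client library as above:

```python
# Reading Admin Activity audit logs; their logName always ends in
# "cloudaudit.googleapis.com%2Factivity".
from google.cloud import logging

client = logging.Client()
for entry in client.list_entries(
    filter_='logName:"cloudaudit.googleapis.com%2Factivity"'
):
    print(entry.timestamp, entry.log_name)
```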

2. Data Access audit logs

  • Contains log entries for API calls that read the configuration or metadata of resources, as well as user-driven API calls that create, modify, or read user-provided resource data.
  • You must have the IAM roles Logging/Private Logs Viewer or Project/Owner to view these logs.
  • You must explicitly enable Data Access audit logs to be written. They are disabled by default because they can be quite large.
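
Enabling them means editing the auditConfigs section of the project's IAM policy. A hedged example of the block you would add before re-applying the policy (Cloud Storage is just one possible service to enable):

```yaml
# Excerpt of an IAM policy fetched with:
#   gcloud projects get-iam-policy PROJECT_ID > policy.yaml
# Re-apply after editing with:
#   gcloud projects set-iam-policy PROJECT_ID policy.yaml
# storage.googleapis.com is an illustrative choice of service.
auditConfigs:
- service: storage.googleapis.com
  auditLogConfigs:
  - logType: ADMIN_READ
  - logType: DATA_READ
  - logType: DATA_WRITE
```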

3. System Event audit logs

  • Contains log entries for administrative actions taken by Google Cloud that modify the configuration of resources.
  • You must have the IAM role Logging/Logs Viewer or Project/Viewer to view these logs.
  • System Event audit logs are always written, so you can’t configure or disable them.
  • There is no additional charge for your System Event audit logs.

4. Policy Denied audit logs

  • Contains log entries recorded when a Google Cloud service denies access to a user or service account because of a security policy violation.
  • You must have the IAM role Logging/Logs Viewer or Project/Viewer to view these logs.
  • Policy Denied audit logs are generated by default, and your Cloud project is charged for their log storage.

Exporting Audit Logs

  • Log entries received by Logging can be exported to Cloud Storage buckets, BigQuery datasets, and Pub/Sub topics.
  • To export audit log entries outside of Logging:
    • Create a logs sink.
    • Give the sink a query that specifies the audit log types you want to export.
  • If you want to export audit log entries for a Google Cloud organization, folder, or billing account, review Aggregated sinks.
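
As a minimal sketch of those two steps with the Python client library (the sink name, filter, and bucket are illustrative assumptions):

```python
# Create a logs sink that exports all audit log entries to a
# pre-existing Cloud Storage bucket.
from google.cloud import logging

client = logging.Client()
sink = client.sink(
    "audit-logs-export",  # illustrative sink name
    filter_='logName:"cloudaudit.googleapis.com"',  # matches all audit log types
    destination="storage.googleapis.com/my-audit-bucket",  # assumed bucket
)
sink.create(unique_writer_identity=True)

# The sink writes as this service account; it must be granted write
# access (e.g., roles/storage.objectCreator) on the destination bucket.
print(sink.writer_identity)
```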

Pricing

  • All features of Cloud Logging are free to use; you are charged only for ingested log volume that exceeds the free allotment. Free usage allotments come with no upfront fees or commitments.

Validate Your Knowledge

Question 1

You are working as a Cloud Security Officer in your company. You are asked to log all read requests and activities on your Cloud Storage bucket where you store all of the company’s sensitive data. You need to enable this feature as soon as possible because this is also a compliance requirement that will be checked on the next audit.

What should you do?

  1. Enable Data Access audit logs for Cloud Storage.
  2. Enable the Identity-Aware Proxy feature on Cloud Storage.
  3. Enable Certificate Authority (CA) Service on the bucket.
  4. Enable Object Versioning on the bucket.

Correct Answer: 1

Google Cloud services write audit logs to help you answer the questions, “Who did what, where, and when?” Your Cloud projects contain only the audit logs for resources that are directly within the project. Other entities, such as folders, organizations, and Cloud Billing accounts, contain the audit logs for the entity itself.

You can enable Data Access audit logs from the IAM & Admin section of the Google Cloud console by selecting Audit Logs from the menu.

With Cloud Audit Logs, you can keep records of all API operations performed in Cloud Storage.

Cloud Audit Logs generates the following logs for Cloud Storage operations:

– Admin Activity logs: Entries for operations that modify the configuration or metadata of a project, bucket, or object.

– Data Access logs: Entries for operations that modify objects or read a project, bucket, or object. There are several sub-types of data access logs.
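
Once Data Access audit logs are enabled, the read operations on the bucket can be pulled back with a filter on the data_access log and the gcs_bucket resource type. A minimal sketch, assuming the google-cloud-logging Python client:

```python
# Reading Cloud Storage Data Access audit logs (covers read requests).
from google.cloud import logging

client = logging.Client()
data_access_filter = (
    'logName:"cloudaudit.googleapis.com%2Fdata_access" '
    'AND resource.type="gcs_bucket"'
)
for entry in client.list_entries(filter_=data_access_filter):
    print(entry.timestamp, entry.payload)
```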

Hence, the correct answer is: Enable Data Access audit logs for Cloud Storage.

The option that says: Enable the Identity-Aware Proxy feature on Cloud Storage is incorrect because this service doesn’t give you the ability to log API requests on Cloud Storage. As an identity and access service, Identity-Aware Proxy secures your applications and VMs by checking web requests and authenticating the requester via the Google Identity service.

The option that says: Enable Certificate Authority (CA) Service on the bucket is incorrect because this service is just used to deploy and manage the Certificate Authority component for your applications.

The option that says: Enable Object Versioning on the bucket is incorrect because this feature allows you to keep copies of objects when they are deleted or overwritten. It is a useful feature, but it doesn’t log Cloud Storage API operations.

References:
https://cloud.google.com/storage/docs/audit-logging
https://cloud.google.com/logging/docs/audit/configure-data-access

Note: This question was extracted from our Google Certified Associate Cloud Engineer Practice Exams.


Question 2

Your company runs hundreds of projects on the Google Cloud Platform. You are tasked to store the company’s audit log files for three years for compliance purposes. You need to implement a solution to store these audit logs in a cost-effective manner.

What should you do?

  1. On the Logs Router, create a sink with BigQuery as a destination to save audit logs.
  2. Configure all resources to be a publisher on a Cloud Pub/Sub topic and publish all the message logs received from the topic to Cloud SQL to store the logs.
  3. Develop a custom script written in Python that utilizes the Logging API to duplicate the logs generated by Operations Suite to BigQuery.
  4. Create a Cloud Storage bucket using a Coldline storage class. Then on the Logs Router, create a sink. Choose Cloud Storage as a sink service and select the bucket you previously created.

Correct Answer: 4

To keep audit logs for a longer period of time or to use more powerful search capabilities, you can export copies of your audit logs to Cloud Storage, BigQuery, or Pub/Sub. Using Pub/Sub, you can export to other applications, other repositories, and to third parties.

Exporting involves writing a filter that selects the log entries you want to export, and choosing a destination from the following options:

– Cloud Storage: JSON files stored in Cloud Storage buckets.

– BigQuery: Tables created in BigQuery datasets.

– Pub/Sub: JSON messages delivered to Pub/Sub topics. Supports third-party integrations, such as Splunk, with Logging.

– Another Google Cloud project: Log entries held in Cloud Logging logs buckets.

The filter and destination are held in an object called a sink. Sinks can be created in Google Cloud projects, organizations, folders, and billing accounts.

Among the sink destinations recommended by Google Cloud, Cloud Storage is the least expensive choice.
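
To make the recommended setup concrete, here is a hedged sketch using the google-cloud-storage and google-cloud-logging client libraries; the bucket name, location, and sink name are illustrative assumptions:

```python
# Sketch: Coldline bucket plus a Logs Router sink for long-term,
# low-cost audit log retention.
from google.cloud import logging, storage

# 1. Create the Coldline bucket (name and location are examples).
storage_client = storage.Client()
bucket = storage_client.bucket("company-audit-archive")
bucket.storage_class = "COLDLINE"
storage_client.create_bucket(bucket, location="us-central1")

# 2. Create a sink that routes audit log entries to that bucket.
logging_client = logging.Client()
sink = logging_client.sink(
    "audit-archive-sink",
    filter_='logName:"cloudaudit.googleapis.com"',
    destination="storage.googleapis.com/company-audit-archive",
)
sink.create(unique_writer_identity=True)

# Grant the returned writer identity objectCreator access on the bucket
# so exported entries can actually be written.
print(sink.writer_identity)
```

A bucket lifecycle rule that deletes objects after roughly three years (about 1,095 days) would round out the retention requirement.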

Hence, the correct answer is: Create a Cloud Storage bucket using a Coldline storage class. Then on the Logs Router, create a sink. Choose Cloud Storage as a sink service and select the bucket you previously created.

The option that says: On the Logs Router, create a sink with BigQuery as a destination to save audit logs is incorrect because storing logs in BigQuery is more expensive than storing them in Cloud Storage. The scenario requires a cost-effective solution, so this is not a good choice.

The option that says: Configure all resources to be a publisher on a Cloud Pub/Sub topic and publish all the message logs received from the topic to Cloud SQL to store the logs is incorrect because this is an expensive approach. Using Pub/Sub to forward logs incurs additional costs, including provisioning a Cloud SQL instance as log storage. Moreover, you don’t need to configure each resource to be a publisher on a Pub/Sub topic to receive the logs; the Logs Router already handles this.

The option that says: Develop a custom script written in Python that utilizes the Logging API to duplicate the logs generated by Operations Suite to BigQuery is incorrect because duplicating logs will definitely increase your cost. In addition, the logs will be stored on BigQuery, which is more expensive than exporting the logs to a Cloud Storage bucket.

References:
https://cloud.google.com/monitoring/audit-logging
https://cloud.google.com/logging/docs/export#sink-terms

Note: This question was extracted from our Google Certified Associate Cloud Engineer Practice Exams.

For more Google Cloud practice exam questions with detailed explanations, check out the Tutorials Dojo Portal:

Google Certified Associate Cloud Engineer Practice Exams

Google Cloud Logging Cheat Sheet References:

https://cloud.google.com/logging
https://cloud.google.com/error-reporting/docs/
https://cloud.google.com/logging/docs/audit
https://cloud.google.com/logging/docs/export/configure_export_v2
https://cloud.google.com/stackdriver/pricing


Written by: Jon Bonso

Jon Bonso is the co-founder of Tutorials Dojo, an EdTech startup and an AWS Digital Training Partner that provides high-quality educational materials in the cloud computing space. He graduated from Mapúa Institute of Technology in 2007 with a bachelor's degree in Information Technology. Jon holds 10 AWS Certifications and has been an active AWS Community Builder since 2020.
