
Centralize Logs From Multiple Projects On Google Cloud Platform


At DoiT International, we work with a variety of software companies around the globe. We frequently receive requests to solve similar problems from multiple customers. Recently, I witnessed several cases where organizations wanted to ship their logs from multiple Google Cloud Platform (GCP) projects into a single project for centralized access and observation.

A lot of companies ship their logs to third-party providers like Datadog, Splunk, and others; but in this post, I’ll illustrate how to accomplish log-file unification with access control using GCP’s Cloud Logging service (formerly Stackdriver) alone. The result is a simple and elegant solution to an emerging common need.


Demo architecture overview


For this example, I will create two test projects and configure their logs to ship to a central project. As an added bonus, I’ll show how to centralize monitoring and metrics as well (another common use case).

  1. Create three test projects on GCP, mike-test-log-view, mike-test-log-a, and mike-test-log-b
  2. Create a logs bucket in project mike-test-log-view and copy the path to the bucket
  3. Create a log sink of type Cloud Logging bucket, pointing to the path copied in step 2 for both projects mike-test-log-a and mike-test-log-b
  4. View details for each log sink and copy the writer entity Service Account email address (dynamically created for each)
  5. Edit the IAM roles in project mike-test-log-view and add each Service Account copied from the log sinks, granting the Logs Bucket Writer role to each
  6. Edit the IAM roles in project mike-test-log-view and add the role Logs View Accessor with a condition pointing to the path of your logs bucket (to restrict access by user)
  7. View the Logs Explorer page and click “Refine Scope” at the top, selecting “Scope by storage” and selecting your logs bucket

Step By Step

The following steps illustrate how you can ship logs from multiple Google Cloud Platform projects to a single centralized project.

Step 1: Create test projects

This step is self-explanatory. I created three projects as described above for demo purposes.
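If you prefer the CLI, the same projects can be created with gcloud. A minimal sketch (project IDs must be globally unique, so treat these as placeholders):

```shell
# Create the three demo projects (substitute your own unique IDs).
gcloud projects create mike-test-log-view --name="Log View"
gcloud projects create mike-test-log-a --name="Log A"
gcloud projects create mike-test-log-b --name="Log B"
```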

Step 2: Create logs bucket in view project

Visit Logs Storage, click “Create Logs Bucket”, then copy the path to the bucket
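The same bucket can be created from the CLI. A sketch, where the bucket name “central-logs” is a placeholder of my choosing (the article’s screenshots don’t name it):

```shell
# Create a log bucket in the central "view" project.
gcloud logging buckets create central-logs \
  --project=mike-test-log-view \
  --location=global \
  --description="Centralized logs from projects a and b"
```

The resulting bucket path takes the form projects/mike-test-log-view/locations/global/buckets/central-logs.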

Step 3: Create log sinks in test projects

Create the log sinks in test projects a and b.

Set sink destination to “Cloud Logging Bucket”
Select the “Use a logs bucket in another project” option
Log sink for test project “a”, appending destination with a path to logs bucket (after domain)
Log sink for test project “b”, appending destination with a path to logs bucket (after domain)
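The sinks can also be created with gcloud. A sketch, assuming the placeholder bucket name “central-logs” and a sink name “central-sink” of my choosing:

```shell
# Destination is the full path to the central project's log bucket.
DEST="logging.googleapis.com/projects/mike-test-log-view/locations/global/buckets/central-logs"

# Create one sink per source project, pointing at the central bucket.
gcloud logging sinks create central-sink "$DEST" --project=mike-test-log-a
gcloud logging sinks create central-sink "$DEST" --project=mike-test-log-b
```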

Step 4: View details for log sinks and note IAM writer identity

For each test project, in the list view of the “Log Router Sinks” page, click the “⋮” (three dots) menu at the far right of the row corresponding to your new log sink and select “View Details”.


Copy the “Writer identity” which is the service account dynamically created with the log sink. You will add this to the view project to allow it to write log entries to your central logs bucket.
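From the CLI, the writer identity can be read directly from the sink (assuming the placeholder sink name “central-sink” from the earlier sketch):

```shell
# Print the writer identity (service account) created for the sink.
gcloud logging sinks describe central-sink \
  --project=mike-test-log-a \
  --format='value(writerIdentity)'
```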

Step 5: Edit IAM roles in view project for log sink writers

For each log sink, in your central view project, visit the IAM administration page and add a member with the Logs Bucket Writer role using the copied “Writer identity” service account from step 4 as shown.

Adding IAM roles to allow log sinks to write log entries in view project
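A scripted version of this step, assuming the placeholder sink name “central-sink”. Note that the writerIdentity returned by describe already includes the “serviceAccount:” prefix, so it can be passed to --member as-is:

```shell
# For each source project, look up the sink's writer identity and grant
# it the Logs Bucket Writer role on the central view project.
for P in mike-test-log-a mike-test-log-b; do
  WRITER=$(gcloud logging sinks describe central-sink \
             --project="$P" --format='value(writerIdentity)')
  gcloud projects add-iam-policy-binding mike-test-log-view \
    --member="$WRITER" \
    --role="roles/logging.bucketWriter"
done
```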

Step 6: Edit IAM roles in view project for log viewers (users)

In order for users to view the logs in the Logs Explorer, you need to grant them access to the log view (the scope they select) by granting them the Logs View Accessor role.

Granting users the ability to edit the view (scope) for Logs Explorer

You can (and should) add further granularity to the user IAM role by adding a condition restricting access to only desired resources. In this case, it’s the path to the logs bucket you created. This allows you to restrict users’ view to only the logs and buckets you desire, useful for compliance controls.
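A CLI sketch of the conditional grant, assuming the placeholder bucket name “central-logs” and a hypothetical user email. The condition restricts the role to the bucket’s default “_AllLogs” view:

```shell
# Grant Logs View Accessor, restricted by an IAM condition to the
# central bucket's default view only.
gcloud projects add-iam-policy-binding mike-test-log-view \
  --member="user:someone@example.com" \
  --role="roles/logging.viewAccessor" \
  --condition='title=central-logs-only,expression=resource.name.endsWith("locations/global/buckets/central-logs/views/_AllLogs")'
```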


Step 7: View the logs

With the permissions in place, you should begin to see log entries in your view project within a few minutes. You must first click “Refine Scope” and select the desired log source as shown.

Refining scope of Logs Explorer view to explore logs shipped to the Logs Bucket from sinks

Congratulations! From your view project, you can now view logs from the other projects as shown.

Logs from test project “a” are visible in the “view” project
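You can also query the centralized logs from the CLI by pointing gcloud logging read at the bucket’s view (again assuming the placeholder bucket name “central-logs”):

```shell
# Read recent entries from the central bucket's default view --
# the CLI equivalent of scoping the Logs Explorer to the bucket.
gcloud logging read 'severity>=DEFAULT' \
  --project=mike-test-log-view \
  --bucket=central-logs \
  --view=_AllLogs \
  --location=global \
  --limit=10
```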

Bonus #1: Reduce costs with exclusion filters

You can disable the “_Default” log sink in your source projects to avoid paying for log ingestion and storage in multiple places.

You may also add exclusion filters (or inclusion filters) in your log sinks to control which services they ship and which they filter out.
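Both ideas can be sketched with gcloud (sink name and exclusion filter here are illustrative placeholders):

```shell
# Disable the _Default sink so entries are no longer also stored in the
# source project's own default bucket.
gcloud logging sinks update _Default --disabled --project=mike-test-log-a

# Add an exclusion to the central sink, e.g. drop load-balancer logs.
gcloud logging sinks update central-sink \
  --project=mike-test-log-a \
  --add-exclusion='name=no-lb-logs,filter=resource.type="http_load_balancer"'
```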

Bonus #2: Centralize your Cloud Monitoring

Google Cloud Operations (formerly Stackdriver) is a full-featured observability platform which, in addition to logging, includes another tool called Cloud Monitoring.

In a few clicks you can create a “Workspace” in your view project, and then select your other projects to centralize your monitoring and dashboards if you desire.

Cloud Monitoring allows you to centralize monitoring from GCP and AWS in one or more “Workspaces”


Hopefully, this article helps you better organize and manage your logging and observability across your organization. Follow me or view the DoiT Blog for more articles on tips & techniques, new features, best practices, and more for the public cloud.
