ShrewdStyle

Reputation: 520

GCP Storage Bucket Access Logs

If you set up a storage bucket as a static website, is there any way to trace who has accessed it? e.g. IP addresses, time viewed, etc.

I have looked in the Stackdriver logs, but they only show events, e.g. bucket created, files uploaded, etc.

Upvotes: 6

Views: 16139

Answers (2)

Paul Dawson

Reputation: 1382

You will need to configure access logs for public buckets. Then you may import them into BigQuery for analysis.

Use Access and Storage logs if:

  • You want to track access to public objects, such as assets in a bucket that you've configured to be a static website.

The access log fields give you all the information required, such as IP address, time, region, zone, headers, and read/write operations.
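For concreteness, here is a minimal sketch of that setup using the google-cloud-storage Python client. The bucket names and log object prefix are placeholders, and it assumes the log bucket uses ACL (fine-grained) access control, since Cloud Storage delivers usage logs by writing as the cloud-storage-analytics@google.com group:

```python
# Minimal sketch: enable usage/storage logging for a static-website bucket.
# "my-static-site" and "my-site-logs" are placeholder bucket names.
from google.cloud import storage

client = storage.Client()

# Bucket that will receive the usage log files (create it first if needed).
log_bucket = client.get_bucket("my-site-logs")

# Cloud Storage delivers the logs as the cloud-storage-analytics@google.com
# group, so grant that group write access on the log bucket.
log_bucket.acl.reload()
log_bucket.acl.group("cloud-storage-analytics@google.com").grant_write()
log_bucket.acl.save()

# Turn on usage logging for the website bucket, prefixing the log objects.
site_bucket = client.get_bucket("my-static-site")
site_bucket.enable_logging("my-site-logs", object_prefix="access-log")
site_bucket.patch()
```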

https://cloud.google.com/storage/docs/access-logs
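Once the log files start landing in the log bucket, they can be pulled into BigQuery. Here is a sketch with the google-cloud-bigquery client, using placeholder project/dataset/table names and the object prefix from the sketch above (Google also publishes a usage-log schema file you can supply instead of autodetect):

```python
# Minimal sketch: load delivered usage log files into a BigQuery table.
from google.cloud import bigquery

client = bigquery.Client()

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,  # each usage log file starts with a header row
    autodetect=True,      # or pass Google's published usage-log schema explicitly
)

load_job = client.load_table_from_uri(
    "gs://my-site-logs/access-log_usage_*",       # usage log objects from GCS
    "my-project.my_dataset.storage_usage_logs",   # placeholder destination table
    job_config=job_config,
)
load_job.result()  # wait for the load job to finish
```

From there you can query fields such as c_ip and time_micros to see who accessed which object and when.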

Upvotes: 6

Kolban

Reputation: 15276

In GCP (of which GCS is a part), there is the concept of Audit Logs. These are switched off by default and can be enabled on a product-by-product basis. For GCS, the Data Access logs include DATA_READ, which claims to log information on "getting object data".

However, before we go much further, there is a huge caveat. It reads:

Cloud Audit Logs does not track access to public objects.

This means that if you have exposed the objects as publicly readable (which is common for a website hosted on GCS), then no logs are captured for those reads.
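If the objects are not public and readers authenticate (so the caveat does not apply), a sketch of pulling the DATA_READ entries back with the Cloud Logging Python client might look like the following; "my-project" is a placeholder, and it assumes Data Access audit logging has already been enabled for Cloud Storage:

```python
# Sketch: list GCS Data Access audit log entries (authenticated reads only;
# per the caveat above, access to public objects never appears here).
from google.cloud import logging

client = logging.Client(project="my-project")

log_filter = (
    'logName="projects/my-project/logs/cloudaudit.googleapis.com%2Fdata_access" '
    'AND resource.type="gcs_bucket" '
    'AND protoPayload.methodName="storage.objects.get"'
)

for entry in client.list_entries(filter_=log_filter, page_size=50):
    # Each payload is an AuditLog record: caller identity, method, resource, etc.
    print(entry.timestamp, entry.payload)
```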


Upvotes: 2
