Reputation: 50
Is it possible to set-up and generate usage reports for Google Cloud Coldline Bucket?
I am looking to track items like: Filename, Filesize, Download URL, Requester IP, Requester GEO, Download Status, etc.
Upvotes: 2
Views: 3739
Reputation: 715
Cloud Monitoring in GCP provides built-in charts for Cloud Storage, such as request counts and bytes sent/received.
To use it, you first have to enable the Cloud Monitoring API.
Then go to:
Monitoring -> Dashboards -> Cloud Storage
This will show you charts of activity for your storage buckets.
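If you'd rather pull these numbers programmatically than read them off the dashboard, the same data is exposed through the Cloud Monitoring API. Here's a minimal sketch using the google-cloud-monitoring Python client; the project ID and the one-hour window are placeholder assumptions on my part:

import time
from google.cloud import monitoring_v3

client = monitoring_v3.MetricServiceClient()
project_name = "projects/my-project"  # assumption: replace with your project ID

now = time.time()
interval = monitoring_v3.TimeInterval(
    {
        "end_time": {"seconds": int(now)},
        "start_time": {"seconds": int(now - 3600)},  # last hour
    }
)

# Query the built-in Cloud Storage request-count metric.
results = client.list_time_series(
    request={
        "name": project_name,
        "filter": 'metric.type = "storage.googleapis.com/api/request_count"',
        "interval": interval,
        "view": monitoring_v3.ListTimeSeriesRequest.TimeSeriesView.FULL,
    }
)

for series in results:
    # Each series is labelled with the bucket name and the API method called.
    print(series.resource.labels["bucket_name"], series.metric.labels["method"])
    for point in series.points:
        print("  ", point.interval.end_time, point.value.int64_value)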
Upvotes: 1
Reputation: 1217
You can definitely track some of the things you've mentioned, like filename, download URL, and requester IP out-of-the-box. Other elements, like requester geo-location and download status, will require additional processing.
When it comes to Google Cloud Storage, you can enable logging for any kind of bucket you may have. There are currently two options for logging access to buckets, namely Cloud Audit Logging and Access & Storage logging. The first is more generic, in the sense that it tracks RESTful requests in real time, while the second is specific to Storage and can track more information about each access to a bucket. Given what you said you were looking for, Access & Storage logging seems to be the way to go for you.
Access & Storage logging will create CSV files with plenty of information on each access. You can find exactly what's stored in these here. An interesting thing to note is that there's a field named c_ip_region that's currently not in use, but may contain geo-location information in the future...
You can find information on how to enable Access & Storage logging on a bucket via gsutil here. It basically comes down to a few commands:
# create a bucket to hold the logs
gsutil mb gs://example-logs-bucket
# give Cloud Storage's analytics account write access to it
gsutil acl ch -g cloud-storage-analytics@google.com:W gs://example-logs-bucket
# keep the log objects project-private
gsutil defacl set project-private gs://example-logs-bucket
# turn logging on for the bucket you want to track
gsutil logging set on -b gs://example-logs-bucket [-o log_object_prefix] gs://example-bucket
Seeing as this produces CSV files, you can easily import them into BigQuery and query their contents. This import can be done via Dataflow or Cloud Functions (the latter is the best option if you want to customize the data before importing it; it can also be triggered by Storage events).
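For instance, here's a minimal sketch of a Python Cloud Function that loads each new usage log into BigQuery as it lands in the logs bucket. The dataset/table name and the use of schema autodetection are assumptions on my part; in practice you'd supply the documented usage-log schema:

from google.cloud import bigquery

def load_usage_log(event, context):
    # Triggered by a google.storage.object.finalize event on the logs bucket.
    # Usage logs are named <prefix>_usage_<timestamp>_<id>_v0; skip the
    # storage logs that land in the same bucket.
    if "_usage_" not in event["name"]:
        return

    client = bigquery.Client()
    uri = "gs://{}/{}".format(event["bucket"], event["name"])

    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,  # the usage log CSVs start with a header row
        autodetect=True,      # assumption: replace with the documented schema
        write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
    )

    # "my_dataset.gcs_usage_logs" is a placeholder destination table.
    job = client.load_table_from_uri(uri, "my_dataset.gcs_usage_logs", job_config=job_config)
    job.result()  # wait for the load job to finish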
Upvotes: 4