Reputation: 31
How have people implemented Production Access Controls (i.e. logging and reporting on access to compute instances by services and humans over SSH)? Our goal is to forward all user logon entries to our SIEM consistently across projects, and ideally to avoid project-specific Stackdriver sinks (and the associated setup and maintenance).
We've tried the following:
Issues we're seeing:
- Limited documentation on the filter format at the org level (it seems to differ from the project level for things like logName); the log_id() function does appear to work.
- Some log types appear at the org level (things like Cloud APIs activity), but syslog does not appear to get processed.
- Container-Optimized OS does not appear to enable ssh/sudo forwarding by default in fluentd (or I haven't found which log type carries this data). I do see these entries logged to journalctl on a test node.
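To illustrate the kind of org-level filter we have been testing (the exact log IDs that carry the ssh/sudo entries are part of what we're unsure about, so treat these as placeholders):

    # Hypothetical org-level filter: log_id() works at this level, but we have not
    # confirmed which log ID actually contains the ssh/sudo events.
    log_id("cloudaudit.googleapis.com/activity") OR log_id("syslog")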
Does anyone have a consistent way to achieve this?
Upvotes: 1
Views: 157
Reputation: 327
One way to approach this could be to export your logs to BigQuery via an aggregated sink. Note that when setting up a sink to export logs to BigQuery for all projects under the Organization, the 'includeChildren' field defaults to 'False' and must be set to 'True'. Once it is set to true, logs from all the projects, folders, and billing accounts contained in the sink's parent resource are available for export; if set to false, only the logs owned by the sink's parent resource itself are exported. You can then filter the logs you need in BigQuery.
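As a rough sketch (project, dataset, org ID and filter below are placeholders you would replace with your own):

    # Create a BigQuery dataset to receive the exported logs (placeholder names).
    bq mk --dataset my-logging-project:org_audit_logs

    # Create an aggregated sink at the organization level; --include-children is
    # what pulls in logs from every project and folder under the org.
    gcloud logging sinks create org-ssh-sink \
      bigquery.googleapis.com/projects/my-logging-project/datasets/org_audit_logs \
      --organization=123456789012 \
      --include-children \
      --log-filter='log_id("syslog") OR log_id("cloudaudit.googleapis.com/activity")'

    # The create command prints a writer identity (a service account); grant it
    # roles/bigquery.dataEditor on the dataset so the sink can write to it.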
Another way to approach this would be to script it, listing all the projects with: gcloud projects list | tail -n +2 | awk -F" " '{print $1}'
This output can be turned into an array to iterate over, and the logs for each project can then be retrieved using a command similar to the one in this doc, as in the sketch below.
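For example, a rough loop along these lines (the filter, freshness window, and output handling are placeholders to adapt):

    # List project IDs (skipping the header row), then pull matching log entries
    # per project into a local file.
    for project in $(gcloud projects list | tail -n +2 | awk '{print $1}'); do
      gcloud logging read 'log_id("syslog")' \
        --project="$project" \
        --freshness=1d \
        --format=json > "logs-${project}.json"
    done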
Not sure whether all of this helps solve or work around your issue, but I hope so.
Upvotes: 1
Reputation: 31
In case anyone else comes across this, we found the following:
We are still trying to get logs for gcloud ssh commands.
Upvotes: 1