Reputation: 4623
Currently my role is owner, but I'm not seeing a way to view all queries across all users. I'm mainly interested in seeing bytes processed for each successful query. Without this ability, it makes it really hard to determine where the costs are coming from without asking each user individually. Is there a way to do this through the UI or the CLI command?
Upvotes: 6
Views: 6019
Reputation: 945
In the top-right corner of the BigQuery UI, there is a checkbox labeled Show queries for all users.
Upvotes: 0
Reputation: 4623
Danny Kitt's answer got me part of the way there. Here's one of the Ruby scripts I wrote that ultimately gave me what I wanted. Should be easy enough to port to the language of your choice.
#!/usr/bin/ruby

require "json"
require "time"

MAX_RESULTS = 1000

# List recent jobs for all users in the project.
jobs_json = `bq ls --format json --jobs --all --max_results #{MAX_RESULTS}`
jobs = JSON.parse(jobs_json)

users = {}

jobs.each do |job|
  next unless job["Job Type"] == "query" && job["State"] == "SUCCESS"

  # Fetch the full job record; the ls output alone doesn't include statistics.
  job_data_json = `bq show --format=json --job #{job["jobId"]}`
  job_data = JSON.parse(job_data_json)

  # creationTime is epoch milliseconds; drop the last three digits for seconds.
  creation_time = Time.at(job_data["statistics"]["creationTime"][0..-4].to_i)
  user = job_data["user_email"]
  bytes_processed = job_data["statistics"]["totalBytesProcessed"].to_i

  next unless bytes_processed > 0

  users[user] ||= 0
  users[user] += bytes_processed
  puts "[#{creation_time}] #{user}: #{bytes_processed} (running total: #{users[user]})"
  # puts job_data["configuration"]["query"]["query"]
end

puts "\nFINAL:\n"
users.each do |user, bytes_processed|
  puts "#{user}: #{bytes_processed}"
end
Upvotes: 3
Reputation: 3251
In the CLI, you can run bq ls -j -a to retrieve jobs for all users in a project. I don't think this command returns the bytes processed for a successful query, but a bq show on the successful queries' job IDs would retrieve the relevant information.
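A minimal sketch of that two-step flow (the job ID below is a placeholder; substitute one taken from the ls output):

```shell
# List recent jobs for every user in the current project
bq ls -j -a -n 100

# Then inspect a specific job; statistics.totalBytesProcessed is the field of interest
bq show --format=prettyjson -j job_20141028_example
```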
Using the API directly, you can specify the allUsers parameter on a jobs.list request.
There is no way to view this information in the web UI at present.
Upvotes: 5