activelearner

Reputation: 7745

BigQuery - Get the total number of columns in a BigQuery table

Is there a way to query the total number of columns in a BigQuery table? I went through the BigQuery documentation but did not find anything relevant.

Thanks in advance!

Upvotes: 11

Views: 24441

Answers (8)

Samer Ayoub

Reputation: 1001

Using the Python client library for Google BigQuery:

from google.cloud import bigquery

bq_client = bigquery.Client.from_service_account_json(r"mypath\Service_Account_JSON_key_path")  # raw string for the Windows-style key path
table_id = "myproject.mydataset.mytable"
table = bq_client.get_table(table_id)  # API request.
print("The table {} has {} rows and {} columns".format(table_id, table.num_rows, len(table.schema)))

Upvotes: 0

Katy Q.

Reputation: 169

Using a SQL query and the built-in INFORMATION_SCHEMA views:

SELECT COUNT(DISTINCT column_name)
FROM `project_id`.name_of_dataset.INFORMATION_SCHEMA.COLUMNS
WHERE table_name = "name_of_table"

Upvotes: 16

Mikhail Berlyant

Reputation: 172993

You can now use INFORMATION_SCHEMA, a series of views that provide access to metadata about datasets, tables, and views.

For example:

SELECT * EXCEPT(is_generated, generation_expression, is_stored, is_updatable)
FROM `bigquery-public-data.hacker_news.INFORMATION_SCHEMA.COLUMNS`
WHERE table_name = 'stories'

The INFORMATION_SCHEMA.COLUMN_FIELD_PATHS view is also useful when you need all nested fields within a RECORD (or STRUCT) column, as in the sketch below.
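
A minimal sketch against the same public table, counting one row per field path (top-level and nested leaves alike); treat the exact semantics of COUNT(*) here as an assumption to verify for your schema:

SELECT COUNT(*) AS total_field_paths
FROM `bigquery-public-data.hacker_news.INFORMATION_SCHEMA.COLUMN_FIELD_PATHS`
WHERE table_name = 'stories'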

Upvotes: 0

Eduardo Conte

Reputation: 1203

In Node.js I used this code to get the number of columns:

const { BigQuery } = require('@google-cloud/bigquery');

const params = {
    bq_project_id: "my_project_id",     // YOUR PROJECT ID
    bq_dataset_id: "my_dataset_id",     // YOUR DATASET ID
    bq_table_id: "my_table_id",         // YOUR TABLE ID
    bq_keyFilename: "./my_bq_key.json"  // YOUR KEY PATH
};

const bigquery = new BigQuery({
    projectId: params.bq_project_id,
    keyFilename: params.bq_keyFilename,
});

async function colNums() {
    // table.get() resolves to [Table, apiResponse]; the raw API response includes the schema
    const resp = await bigquery.dataset(params.bq_dataset_id).table(params.bq_table_id).get();
    console.log(resp[1].schema.fields.length);
}
colNums();

I am not sure whether resp[1] applies to every client version; if you run into problems, inspect the resp object.
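
If resp[1] doesn't expose the schema in your version of the client, the Table object in resp[0] should carry the same table resource in its metadata property; that's an assumption worth verifying against your library version:

async function colNumsAlt() {
    // same table handle as above; metadata is populated after get()
    const [table] = await bigquery.dataset(params.bq_dataset_id).table(params.bq_table_id).get();
    console.log(table.metadata.schema.fields.length);
}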

Upvotes: 0

user475043

Reputation: 67

This query can also be useful: it serializes one row to JSON and counts the keys, which equals the number of columns.

#standardSQL
WITH table1 AS (
  SELECT "somename1" AS name, "someaddress1" adrs UNION ALL
  SELECT "somename2" AS name, "someaddress2" adrs UNION ALL
  SELECT "somename3" AS name, "someaddress3" adrs
)
SELECT ARRAY_LENGTH(REGEXP_EXTRACT_ALL(TO_JSON_STRING(table1), "\":")) AS total_columns
FROM table1
LIMIT 1

Upvotes: 1

Krishnaa

Reputation: 581

Just adding a snippet to get the schema in Python (using the legacy gcloud library):

from gcloud import bigquery  # legacy gcloud client library

client = bigquery.Client(project="project_id")
dataset = client.list_datasets()  # returns (list_of_datasets, next_page_token)
flag = 0
for ds in dataset[0]:
    if flag == 1:
        break
    if ds.name == "dataset_name":
        for table in ds.list_tables()[0]:
            if table.name == "table_name":
                table.reload()  # fetch the table metadata, including its schema
                no_columns = len(table.schema)
                flag = 1
                break

The no_columns variable then contains the number of columns of the requested table.

Upvotes: 1

David M Smith

Reputation: 2332

Here's an alternative that doesn't require jq, but is a little more "costly" ;-):

bq --format=csv query "select * FROM publicdata:samples.shakespeare LIMIT 1"|tail -n1|sed 's/[^,]//g' | wc -c

Note: I doubt this would work on tables that contain multiple repeated/nested columns.

Upvotes: 0

Pentium10

Reputation: 207912

There are a couple of ways to do this:

A. Using the bq command-line tool and the jq command-line JSON processor.

bq --format=json show publicdata:samples.shakespeare | jq '.schema.fields | length'

This outputs:

4

B. Using the REST API to make a Tables: get call

GET https://www.googleapis.com/bigquery/v2/projects/projectId/datasets/datasetId/tables/tableId

This returns the full table resource as JSON, which you can parse to read the length of schema.fields.

{
   "kind":"bigquery#table",
   "description":"This dataset is a word index of the works of Shakespeare, giving the number of times each word appears in each corpus.",
   "creationTime":"1335916045099",
   "tableReference":{
      "projectId":"publicdata",
      "tableId":"shakespeare",
      "datasetId":"samples"
   },
   "numRows":"164656",
   "numBytes":"6432064",
   "etag":"\"E7ZNanj79wmDHI9DmeCWoYoUpAE/MTQxMzkyNjgyNzI1Nw\"",
   "lastModifiedTime":"1413926827257",
   "type":"TABLE",
   "id":"publicdata:samples.shakespeare",
   "selfLink":"https://www.googleapis.com/bigquery/v2/projects/publicdata/datasets/samples/tables/shakespeare",
   "schema":{
      "fields":[
         {
            "description":"A single unique word (where whitespace is the delimiter) extracted from a corpus.",
            "type":"STRING",
            "name":"word",
            "mode":"REQUIRED"
         },
         {
            "description":"The number of times this word appears in this corpus.",
            "type":"INTEGER",
            "name":"word_count",
            "mode":"REQUIRED"
         },
         {
            "description":"The work from which this word was extracted.",
            "type":"STRING",
            "name":"corpus",
            "mode":"REQUIRED"
         },
         {
            "description":"The year in which this corpus was published.",
            "type":"INTEGER",
            "name":"corpus_date",
            "mode":"REQUIRED"
         }
      ]
   }
}
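
For instance, a quick sketch with curl and jq, using an OAuth token from the gcloud CLI (assumes gcloud is installed and authenticated; the project/dataset/table IDs are the public sample above):

curl -s -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  "https://www.googleapis.com/bigquery/v2/projects/publicdata/datasets/samples/tables/shakespeare" \
  | jq '.schema.fields | length'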

Upvotes: 6
