konkodi

Reputation: 145

Insufficient 'DISKS_TOTAL_GB' quota on Dataproc Serverless

I am ingesting 200+ files into BigQuery on Dataproc Serverless. The input files are not huge at all; each is only a few MB. Still, many jobs are failing with the error "Insufficient 'DISKS_TOTAL_GB' quota".

When I checked, I had 150 TB+ of disk quota available before the jobs started; it was the Dataproc jobs that consumed all of it. Is there any way to configure the persistent disk that gets allocated to each Dataproc job?

Upvotes: 2

Views: 657

Answers (1)

Jayadeep Jayaraman

Reputation: 2825

You can configure the disk size for Dataproc Serverless Spark workloads via the `spark.dataproc.driver.disk.size` and `spark.dataproc.executor.disk.size` properties, as described in the Dataproc Serverless documentation. Lowering these values (and/or the executor count) reduces the total persistent disk each batch consumes, so concurrent batches stay within your `DISKS_TOTAL_GB` quota.
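For example, you can pass these properties when submitting the batch with `gcloud`. This is a sketch: the batch name, region, and jar path are placeholders, and the disk sizes shown (`250g`) are illustrative values you should tune to your workload and quota.

```shell
# Submit a Dataproc Serverless Spark batch with smaller per-node disks.
# Placeholders: my-batch, us-central1, and the jar/class are hypothetical.
gcloud dataproc batches submit spark \
  --batch=my-batch \
  --region=us-central1 \
  --class=com.example.IngestJob \
  --jars=gs://my-bucket/ingest-job.jar \
  --properties="spark.dataproc.driver.disk.size=250g,spark.dataproc.executor.disk.size=250g"
```

With 200+ concurrent batches, the default per-node disk size multiplied by (driver + executors) per batch can easily exceed a regional `DISKS_TOTAL_GB` quota even when each input file is tiny, so sizing the disks down per batch is usually more effective than requesting a quota increase.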

Upvotes: 1

Related Questions