gsinha

Reputation: 1185

Is there a limit on the number of files in Google Cloud Storage ( GCS )?

I believe there should not be any limit, but I just wanted to confirm (since the official docs do not mention it):

  1. Is there a limit on the number of files in Google Cloud Storage (GCS)?
  2. Is there a performance impact (in access and write operation) if I have a very large number of files in GCS?
  3. Is there a limit on file name length (since I could use the filename to create a pseudo-directory structure, as sketched below)?
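
To illustrate point 3, here is a minimal sketch of the pseudo-directory idea using the `google-cloud-storage` Python client. The bucket name and object names are placeholders I made up; this assumes the library is installed and credentials are configured.

    from google.cloud import storage

    client = storage.Client()
    bucket = client.bucket("my-bucket")  # hypothetical bucket name

    # Slashes in object names are ordinary characters; they only *look* like
    # directories. GCS itself has a flat namespace.
    bucket.blob("logs/2024/01/app.log").upload_from_string("log line\n")
    bucket.blob("logs/2024/02/app.log").upload_from_string("log line\n")

    # Listing with prefix + delimiter simulates browsing one directory level.
    blobs = client.list_blobs("my-bucket", prefix="logs/2024/", delimiter="/")
    for blob in blobs:
        print(blob.name)           # objects directly "inside" logs/2024/ (none here)
    print(list(blobs.prefixes))    # "subdirectories", e.g. logs/2024/01/, logs/2024/02/
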

Upvotes: 10

Views: 10137

Answers (2)

Alex Martelli

Reputation: 881635

Re (3): per https://cloud.google.com/storage/docs/bucket-naming, bucket names are limited to 222 characters (with several other restrictions); per https://cloud.google.com/storage/docs/naming-objects, object names are limited to 1024 characters when UTF-8 encoded, with one hard requirement (they must not contain Carriage Return or Line Feed characters) and several "strongly recommended" conventions (no control characters, and certain punctuation characters are best avoided).
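
If you want to catch bad names before uploading, a client-side check against the documented rules cited above could look like the sketch below; the helper name and error messages are my own, not part of any GCS library.

    def validate_object_name(name: str) -> None:
        encoded = name.encode("utf-8")
        if not 1 <= len(encoded) <= 1024:
            raise ValueError("object names must be 1-1024 bytes when UTF-8 encoded")
        if "\r" in name or "\n" in name:
            raise ValueError("object names must not contain Carriage Return or Line Feed")
        # The docs also strongly recommend avoiding control characters and some
        # punctuation, but those are conventions rather than hard limits.

    validate_object_name("reports/2024/summary.txt")  # passes silently
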

Re (1) and (2), to the best of my knowledge there are no limits on the number of objects you can store in GCS, nor any performance implications that depend on that number. Google's online docs specifically say "any amount of data".

However, if you need a firm commitment for a Project of Unusual Size (many petabytes, not the mere terabytes mentioned at https://cloud.google.com/storage/docs/overview) you may be best advised to get such a commitment "officially", by contacting Sales at https://cloud.google.com/contact/ .

http://googlecloudplatform.blogspot.com/2013/11/justdevelopit-migrates-petabytes-of-data-to-google-cloud-storage.html specifically interviews a customer using Cloud Storage for "over 10 petabytes [growing] at a rate of 800 terabytes a month", so at least up to such orders of magnitude you should definitely be fine.

Upvotes: 25

Alex

Reputation: 34978

There might be a limit

I am doing backups of a large number of files using HyperBackup from a Synology DiskStation to Google Cloud Storage. While backup jobs with fewer files work well, bigger jobs always fail with the error "Authorization failed".

Synology support told me that this is because of too many files on the Google Cloud Storage side.

I am using the legacy S3-compatible access rather than the native Google access, so maybe it is due to this.

Upvotes: -1
