glux

Reputation: 532

gsutil - copy all object that were uploaded before a specific time

I currently have a cron job that uploads a set of directories/files using rsync. I have versioning enabled, and I have a lifecycle management policy that deletes all versions older than 30 days.

gsutil ls -la gs://rpp-XXXXXXXXX-bkup

I am able to see the generated versions, and each version has a timestamp associated with it. Is it possible to use gsutil to download the entire directory structure and files from GCS to local, selecting only the versions whose timestamps are less than or equal to a specific date? The use case is to recover from the GCS backup to a specific date/time.

Upvotes: 0

Views: 3035

Answers (2)

user9436250

Reputation: 21

gsutil ls -l gs://your-bucket | grep '2022-05-18' | awk '{print $3}' | gsutil -m cp -I ./download_dir

See the -I flag described at https://cloud.google.com/storage/docs/gsutil/commands/cp#description for reference.

Upvotes: 2

Travis Hobrla

Reputation: 5511

There's no way to do this with a single gsutil command, but you could write a simple parser of the list output that filters the object names to the time range you are interested in. Then you could pass that as input to gsutil cp -I.

Note that your parser would need to reconcile the case where there was more than one version of the same file within the time range you specified.
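A minimal sketch of such a parser, in Python: it reads `gsutil ls -la`-style output (size, timestamp, versioned `gs://` URL per line), and for each object keeps only the newest version at or before a cutoff, which handles the multiple-versions case noted above. The bucket names and listing below are made-up sample data, not real output.

```python
import re
from datetime import datetime

# Hypothetical sample of `gsutil ls -la` output: size, timestamp,
# versioned URL (name#generation), plus a trailing TOTAL summary line.
SAMPLE = """\
     11  2022-05-10T09:00:00Z  gs://my-bucket/a.txt#1001  metageneration=1
     22  2022-05-18T21:46:22Z  gs://my-bucket/a.txt#1002  metageneration=1
     33  2022-05-25T08:00:00Z  gs://my-bucket/a.txt#1003  metageneration=1
     44  2022-05-12T12:30:00Z  gs://my-bucket/b.txt#2001  metageneration=1
TOTAL: 4 objects, 110 bytes (110 B)
"""

# size, ISO-8601 timestamp ending in Z, versioned gs:// URL
LINE_RE = re.compile(r"^\s*(\d+)\s+(\S+Z)\s+(gs://\S+#\d+)")

def versions_as_of(listing, cutoff):
    """Return the newest versioned URL per object with timestamp <= cutoff."""
    cutoff_dt = datetime.fromisoformat(cutoff.replace("Z", "+00:00"))
    best = {}  # object name -> (timestamp, versioned URL)
    for line in listing.splitlines():
        m = LINE_RE.match(line)
        if not m:
            continue  # skip the TOTAL: summary line
        ts = datetime.fromisoformat(m.group(2).replace("Z", "+00:00"))
        url = m.group(3)
        name = url.split("#")[0]  # strip the #generation suffix
        if ts <= cutoff_dt and (name not in best or ts > best[name][0]):
            best[name] = (ts, url)
    return [url for _, url in best.values()]

# Recover state as of 2022-05-20: picks a.txt#1002, not the later #1003.
urls = versions_as_of(SAMPLE, "2022-05-20T00:00:00Z")
```

The resulting versioned URLs can then be printed one per line and piped to `gsutil -m cp -I ./restore_dir`, as in the other answer.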

Upvotes: 2
