Reputation: 1372
I have this scenario happening in my bucket: I have a file called red.dat in my storage, and this file is updated regularly by Jenkins. Once the file has been updated, I trigger an event to deploy red.dat. I want to check the MD5 hash of the file before and after the update, and only do the deployment if the values differ.
This is how I upload the file to GCS:
gsutil cp red.dat gs://example-bucket
and I have tried this command to get the hash:
gsutil hash -h gs://example-bucket/red.dat
The result is this:
Hashes [hex] for red.dat:
Hash (crc32c): d4c9895e
Hash (md5): 732b9e36d945f31a6f436a8d19f64671
But I'm a little confused about how to compare the MD5 hash before and after the update, since the file always stays in a remote location (GCS). I would like some advice or a pointer in the right direction; a solution using commands or Ansible is fine.
Upvotes: 1
Views: 2350
Reputation: 12145
You can use the gsutil hash command on the local file, and then compare the output with what you saw from gsutil hash against the cloud object:
gsutil hash red.dat
Upvotes: 3