Reputation: 832
I have three folders in a storage bucket, each containing a test_result_0.xml file:
gs://test-lab-12345-67890/2018-02-23_18:48:45.403202_urtc/Nexus6P-27-en_US-portrait/test_result_0.xml
gs://test-lab-12345-67890/2018-02-23_18:48:45.403202_urtc/Nexus7-21-en_US-portrait/test_result_0.xml
gs://test-lab-12345-67890/2018-02-23_18:48:45.403202_urtc/Nexus6P-23-en_US-portrait/test_result_0.xml
Now I'm looking to copy them into a local folder, and rename them to something like:
test_result_0.xml
test_result_1.xml
test_result_2.xml
What is the best way to go about doing that, using a bash script? I have this so far, but it doesn't work :(
for i in $("`gsutil ls gs://test-lab-12345-67890 | tail -1`*/*.xml"); do
  gsutil -m cp -r -U $i ~/Documents
done
Upvotes: 4
Views: 3702
Reputation: 832
Solved it using something like this:
counter=0
for path in $(gsutil ls -r gs://test-lab-12345-67890/**/*.xml | tail -3); do
  counter=$((counter+1))
  gsutil -m cp -r -U "$path" ~/localpath
  mv ~/localpath/test_result_0.xml ~/localpath/test_result_"$counter".xml
done
Not the prettiest code, but it works for now .. unless someone can offer me a better way to do it? :)
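For reference, the copy-then-rename numbering can be checked locally without touching the bucket. This is a minimal sketch of the same logic using plain files instead of gsutil (the device folder names are made up to mirror the question):

```shell
#!/usr/bin/env bash
set -euo pipefail

# Simulate three device folders, each holding an identically named result file.
workdir=$(mktemp -d)
for device in Nexus6P-27 Nexus7-21 Nexus6P-23; do
  mkdir -p "$workdir/$device"
  echo "<result device=\"$device\"/>" > "$workdir/$device/test_result_0.xml"
done

# Collect every test_result_0.xml and copy it out with a sequential suffix.
out="$workdir/local"
mkdir -p "$out"
counter=0
for path in "$workdir"/*/test_result_0.xml; do
  cp "$path" "$out/test_result_$counter.xml"
  counter=$((counter+1))
done

ls "$out"
```

Copying straight to the numbered name avoids the intermediate fixed filename and the extra mv.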
Edit: It turns out that in the documentation (which I missed) you can specify:
--results-bucket=RESULTS_BUCKET
--results-dir=RESULTS_DIR
That completely solves my problem, since I can now specify the folder names in advance! I will keep the old code up for reference.
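For example, the run command could look something like this (the APK names, device spec, and results dir below are placeholders, not values from the question):

```shell
# Pick the results location up front so the output paths are predictable.
# App/test APK names and the results dir here are hypothetical examples.
gcloud firebase test android run \
  --app app-debug.apk \
  --test app-debug-androidTest.apk \
  --device model=Nexus6P,version=27,locale=en_US,orientation=portrait \
  --results-bucket=test-lab-12345-67890 \
  --results-dir=my-custom-run-dir
```

With --results-dir fixed in advance, the download path is known before the run finishes, so no renaming loop is needed.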
https://cloud.google.com/sdk/gcloud/reference/alpha/firebase/test/android/run
Upvotes: 2
Reputation: 5075
I tried to improve your solution using AWK. See whether you prefer this kind of solution; I don't know if you'll consider it more "elegant":
gsutil ls -r gs://test-lab-12345-67890/**/test_result_* | awk -F"/" '{ system("gsutil cp " $0 " ~/localpath/" $6) }'
In my test environment it works with a file and folder structure similar to yours.
P.S. To improve readability I didn't use many of the gsutil flags from your solution, but you can add them back if you think they are needed.
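To see which path component -F"/" selects, you can swap system() for print on a sample path; with "/" as the separator, $6 is the file name (this just illustrates the field numbering, it doesn't copy anything):

```shell
# Splitting on "/" makes $6 the file-name component of the listed path:
# $1=gs:  $2=(empty)  $3=bucket  $4=run folder  $5=device folder  $6=file
echo 'gs://test-lab-12345-67890/2018-02-23_18:48:45.403202_urtc/Nexus6P-27-en_US-portrait/test_result_0.xml' \
  | awk -F"/" '{ print $6 }'
# prints: test_result_0.xml
```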
Upvotes: 1