Reputation: 526
Is there any way to upload files from a git repository directly to a Google Cloud Storage bucket?
I have tried the command below:
gsutil cp https://Link.git gs://bucketname
But it gives me this error:
InvalidUrlError: Unrecognized scheme "https".
Is there any other way that I can upload content to this?
Any help is much appreciated!
Upvotes: 3
Views: 3745
Reputation: 39834
Peeking under the hood, in google-cloud-sdk/platform/gsutil/gslib/storage_url.py, shows the (likely) trigger for the error message you got:
def _GetSchemeFromUrlString(url_str):
  """Returns scheme component of a URL string."""
  end_scheme_idx = url_str.find('://')
  if end_scheme_idx == -1:
    # File is the default scheme.
    return 'file'
  else:
    return url_str[0:end_scheme_idx].lower()

[...]

def StorageUrlFromString(url_str):
  """Static factory function for creating a StorageUrl from a string."""
  scheme = _GetSchemeFromUrlString(url_str)
  if scheme not in ('file', 's3', 'gs'):
    raise InvalidUrlError('Unrecognized scheme "%s"' % scheme)
Basically, the tool only supports file, s3 and gs URLs; it has no support for generic https URLs.
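For illustration, you can reproduce the check in a few standalone lines (get_scheme below is just a copy of the _GetSchemeFromUrlString logic quoted above, not part of gsutil's public API):

```python
def get_scheme(url_str):
    """Return the scheme component of a URL string, defaulting to 'file'."""
    end_scheme_idx = url_str.find('://')
    if end_scheme_idx == -1:
        # No '://' present: gsutil treats the string as a local file path.
        return 'file'
    return url_str[0:end_scheme_idx].lower()

for url in ('gs://bucketname', 'local/path.txt', 'https://Link.git'):
    scheme = get_scheme(url)
    status = 'supported' if scheme in ('file', 's3', 'gs') else 'unsupported'
    print(url, '->', scheme, '(' + status + ')')
```

Running this shows that the https URL yields the scheme "https", which fails the `scheme not in ('file', 's3', 'gs')` check and raises InvalidUrlError.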
Of course - one could venture into enhancing the tool to actually support direct copy from a git repo. But it should be noted that it would only work in daisy chain mode. From Options:
-D
Copy in "daisy chain" mode, i.e., copying between two buckets by hooking a download to an upload, via the machine where gsutil is run. This stands in contrast to the default, where data are copied between two buckets "in the cloud", i.e., without needing to copy via the machine where gsutil runs.
[...]
Note: Daisy chain mode is automatically used when copying between providers (e.g., to copy data from Google Cloud Storage to another provider).
But since in such a case the data would have to pass through the local machine running gsutil anyway, it is probably simpler to just clone the git repo locally and then use the unmodified gsutil to upload from that local clone to the bucket :)
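A minimal sketch of that clone-then-upload workflow, assuming git and gsutil are installed; the repo URL, bucket name and "local-repo" directory are placeholders:

```python
import subprocess

def upload_git_repo(repo_url, bucket, run=False):
    """Build (and optionally execute) the clone-then-upload commands.

    With run=False this only returns the two command lines, so you can
    inspect them before letting subprocess actually invoke git and gsutil.
    """
    clone = ["git", "clone", repo_url, "local-repo"]
    # -m parallelizes the upload, -r recurses into the cloned directory.
    upload = ["gsutil", "-m", "cp", "-r", "local-repo", "gs://" + bucket]
    if run:
        subprocess.check_call(clone)
        subprocess.check_call(upload)
    return clone, upload
```

Note this also uploads the .git metadata directory; exclude it (e.g. by copying only the files you need) if you want just the working tree in the bucket.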
Upvotes: 1