Reputation: 487
I have an R script in a Docker container so that I can run it on Google Cloud Run.
After fully processing a file X.csv located in the bucket "Input", I want to move it to the bucket "Done". How can I do that with googleCloudStorageR?
`googleCloudStorageR::gcs_upload("myfile.csv")`
does not seem to offer a gs:// syntax:
googleCloudStorageR::gcs_upload("gs://Input/X")
returns an error:
Path 'gs://Input/X.csv' does not exist
Additionally, the bucket name is not among gcs_upload()'s arguments. Do I first have to set the default bucket to "Done"?
googleCloudStorageR::gcs_global_bucket("Done")
Thanks.
Upvotes: 0
Views: 118
Reputation: 160
If you are using the CRAN release (cran.r-project.org), the documentation shows how to upload objects:
## upload a file - type will be guessed from the file extension, or supply type
filename <- "mtcars.csv"
write.csv(mtcars, file = filename)
gcs_upload(filename)
## upload an R data.frame directly - will be converted to csv via write.csv
gcs_upload(mtcars)
## upload an R list - will be converted to json via jsonlite::toJSON
gcs_upload(list(a = 1, b = 3, c = list(d = 2, e = 5)))
## upload an R data.frame directly, with a custom function
## function should have arguments 'input' and 'output'
## safest to supply type too
f <- function(input, output) write.csv(input, row.names = FALSE, file = output)
gcs_upload(mtcars,
           object_function = f,
           type = "text/csv")
If you are using the cloudyr (GitHub) version, the full signature of gcs_upload() is:
gcs_upload(
  file,
  bucket = gcs_get_global_bucket(),
  type = NULL,
  name = deparse(substitute(file)),
  object_function = NULL,
  object_metadata = NULL,
  predefinedAcl = c("private", "bucketLevel", "authenticatedRead",
    "bucketOwnerFullControl", "bucketOwnerRead", "projectPrivate", "publicRead",
    "default"),
  upload_type = c("simple", "resumable")
)
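For the case in the question, note that gcs_upload() takes a local file path rather than a gs:// URI, and the target bucket and object name are passed as separate arguments. A minimal sketch, assuming the processed file has already been written locally as "X.csv" and that authentication is already set up (e.g. via GCS_AUTH_FILE):

library(googleCloudStorageR)

## upload the locally processed file into the "Done" bucket under the same name
gcs_upload("X.csv", bucket = "Done", name = "X.csv")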
## files above this size (in bytes) are uploaded with the resumable method
gcs_upload_set_limit(upload_limit = 5000000L)
If you want to set the default bucket:
## set global bucket so don't need to keep supplying in future calls
gcs_global_bucket("my-bucket")
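None of the calls above move an object between buckets in one step, so for the migration part of the question one approach is to download the object, re-upload it to the other bucket, and then delete the original. This is only a sketch; the bucket and object names come from the question, and it assumes you are already authenticated:

library(googleCloudStorageR)

## download the processed object from the "Input" bucket to a local temp file
tmp <- tempfile(fileext = ".csv")
gcs_get_object("X.csv", bucket = "Input", saveToDisk = tmp)

## re-upload it to the "Done" bucket under the original object name
gcs_upload(tmp, bucket = "Done", name = "X.csv")

## once the upload has succeeded, remove the original from "Input"
gcs_delete_object("X.csv", bucket = "Input")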
You can find this documentation in the package source under:
googleCloudStorageR-master/docs/reference/gcs_upload.html
Upvotes: 1