Reputation: 1237
I'm writing a component of a data pipeline in R (mainly reading in some small CSV files and reshaping them before writing to a database), and I'm going to run the process in a Docker container, probably via AWS Lambda. I've written each part of the process as a function and put the functions in an R package. The main wrapper function takes a few parameters as inputs, which will be passed to the process in JSON format. I want some way of including a script in my package that takes the JSON, converts it to an R list, and passes the elements of that list to the function that executes the process. I expect to run this script with Rscript in the Dockerfile.
What is best practice for managing a script like this? At the moment my plan is to put the script in the inst folder of the package and call it via source(system.file("myscript.R", package = "mypackage")). But I've also read about the exec and demo subdirectories of inst and wonder whether one of those is better. Is there a more formal, standard way of bundling an R script as an executable as part of a package? I'd really appreciate it if anyone has done something similar and can share a neat solution.
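For illustration, here is a minimal sketch of the inst-script approach described above. The script path, the jsonlite dependency, and the mypackage::run_process() wrapper are all assumptions, not part of any standard layout:

```r
# inst/scripts/run_pipeline.R (hypothetical path and names)
# Read a JSON string from the command line, convert it to an R list,
# and splice the list's elements into the package's main wrapper function.
args <- commandArgs(trailingOnly = TRUE)
params <- jsonlite::fromJSON(args[[1]])    # JSON string -> named R list
do.call(mypackage::run_process, params)    # list elements become arguments
```

The Dockerfile could then invoke it with something like Rscript -e 'source(system.file("scripts/run_pipeline.R", package = "mypackage"))' followed by the JSON string; commandArgs() is process-level, so the sourced script still sees the trailing arguments.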
Upvotes: 0
Views: 774
Reputation: 2532
In my opinion, it would be best to wrap the code you need to execute in a function (f) exported by the package, and then call it as
Rscript -e 'package_name::f()' --args "$@"
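A minimal sketch of what such a function could look like, assuming jsonlite for the JSON parsing and a hypothetical run_process() as the package's main wrapper:

```r
# Hypothetical exported function; the real entry point may differ.
f <- function() {
  args <- commandArgs(trailingOnly = TRUE)   # arguments after --args
  params <- jsonlite::fromJSON(args[[1]])    # JSON string -> named R list
  do.call(run_process, params)               # splice list into the wrapper
}
```

This keeps the container entry point as a one-liner, e.g. CMD ["Rscript", "-e", "mypackage::f()", "--args", "{...}"], with no script file to locate inside the installed package at run time.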
Upvotes: 0