lostintranslation

Reputation: 24583

Where to store npm modules in node project source control

Ok, I have a little bit of a weird situation. I have a node application that is going to be delivered to systems that do not have access to the internet. I have all my deps in my package.json file, but when I deliver the server I cannot run npm install.

Currently the node_modules directory is being checked into SVN. So far I hate this because every time I need to get a newer version of a module I have to delete the entire module from SVN, install the newer version, add it to SVN and check it in.

The other option I have thought about is to have some sort of build that does the npm install when packaging up the node application for delivery. Maybe something that checks out from SVN, does the npm install and creates the necessary tarball or rpm.
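Something along these lines is what I am imagining (just a rough sketch; the repo URL and paths are made up):

# check out the app, resolve deps while online, then package everything up
svn checkout https://svn.example.com/myapp/trunk myapp
cd myapp
npm install --production
tar -czf ../myapp.tar.gz .
# ship myapp.tar.gz; on the offline server extract it and run node directly, no npm install needed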

I have used 'bundler' for ruby in the past and that is pretty nice, as you just put all your deps in another dir and it will pull in those deps. Works great if you are offline. Anything like that for node?

Upvotes: 4

Views: 2032

Answers (3)

Jamie Mason

Reputation: 4211

There is a CLI called shrinkpack which can help manage this for you.

It works by reading the dependency graph generated by npm shrinkwrap and repointing the https:// urls for each dependency (and sub-dependency) to instead point to a tarball in a node_shrinkwrap directory in your project.

The node_shrinkwrap directory contains the exact same .tgz files that npm install downloads from the npm registry and — since an npm-shrinkwrap.json file is present (created by npm shrinkwrap and updated by shrinkpack) — npm install knows to install using the tarballs found locally, instead of going over the network to the npm registry.

npm install -g shrinkpack
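The typical workflow is roughly as follows (a sketch; exact behaviour depends on your npm version):

npm install          # resolve dependencies normally while online
npm shrinkwrap       # write npm-shrinkwrap.json
shrinkpack           # save the .tgz files into node_shrinkwrap/ and repoint the shrinkwrap at them
# commit package.json, npm-shrinkwrap.json and node_shrinkwrap/ to source control

On the offline machine, a plain npm install should then resolve everything from the local tarballs instead of the registry.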

Upvotes: 3

Sevenate

Reputation: 6495

While looking for a similar answer I found this article about why it can make sense to keep your full node_modules in source control:

node_modules in git

It is from December 10, 2011 though, so it may be a bit outdated nowadays.

Update: as of January 2014, the advice to store all your node_modules in source control still applies.

Upvotes: 3

dublx

Reputation: 14536

I also face a similar deployment scenario, and the solution I settled on after searching around relies on using make (the Unix tool) and writing my own Makefile. A Makefile is a text file that follows a specific format and in which you create targets, e.g. test, publish, install. Each target is a piece of Bash code that runs when you call it from the command line, e.g. 'make publish', or you can chain targets together, like 'make test publish'.

So, in my scenario I have a 'test' target that executes my tests, then I have a 'publish' target that does several things like calling 'npm install' and then 'npm prune' (to delete old npm dependencies I stopped using). 'publish' then finishes by gzipping the folder into a separate location and pushing the archive to an intranet location my production server can download from and unzip. All node_modules code goes in the zip file. On the production server the Operations team extracts the archive and then calls 'make start'. 'start' is just another target that sets any environment variables and starts my node app. A rough sketch of such a Makefile is below.
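To give an idea of the shape of it, a minimal sketch could look something like this (target names, paths and the intranet host are placeholders, and recipe lines must be indented with a real tab):

APP  = myapp
DIST = /tmp/$(APP).tar.gz

test:
	npm test

publish: test
	npm install
	npm prune
	tar -czf $(DIST) --exclude='.svn' .
	scp $(DIST) deploy@intranet-host:/var/packages/

start:
	NODE_ENV=production node server.js

.PHONY: test publish start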

Summarizing, I have my node_modules in source control, and I found the Makefile approach very customisable, so it fits well on different projects with different needs, while you can keep a convention on the naming of your targets, so it's easy for other team members and DevOps to test/publish/install your app.

Regards, Luis.

Upvotes: 1
