Reputation: 26165
As part of a build script used by an npm package I am tinkering with (installed with -g or required), I need to copy the contents of a directory in a remote GitHub repository into the build output directory.
e.g. an ACE editor build: a bunch of files in a remote repo, https://github.com/ajaxorg/ace-builds/tree/master/src-min-noconflict
I don't want my module to ship the ACE files locally, and there is no npm package for their builds that I can require and use. The build should pull them from the latest remote at build time.
I would prefer not to clone the repo, but perhaps I can grab the zipball and make something out of that.
Does anyone know of a package or script that can already do this without access to git on the shell? I am hesitant to rely on git, since the package can be installed globally and used as a binary...
What else can I do? node-gyp? volo perhaps? component.js? Anything that can reference and deploy from the remote git file system...
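For reference, this is roughly the zipball/tarball idea I have in mind. It is an untested sketch: it assumes the request and tar packages (older node-tar API), and the ./build/ace destination is just a placeholder.
var request = require("request")   // assumed dependency
  , zlib = require("zlib")
  , tar = require("tar");          // assumed dependency (older node-tar API)

// GitHub serves a tarball of any branch at /archive/<branch>.tar.gz;
// request follows the redirect to codeload.github.com automatically.
var url = "https://github.com/ajaxorg/ace-builds/archive/master.tar.gz";

request(url)
  .pipe(zlib.createGunzip())
  .pipe(tar.Extract({ path: "./build/ace" }))
  .on("end", function() {
    // The archive unpacks to an ace-builds-master/ folder, so the wanted
    // src-min-noconflict directory would still have to be copied out of it.
    console.log("Tarball extracted");
  });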
Upvotes: 2
Views: 164
Reputation: 26992
Is this an option? Rolling your own may be simple enough (I'm not aware of anything existing): the GitHub Contents API returns the Base64-encoded content of each file at the content path. Caveat: I have never tried this.
Edit
Here's some rough code. The content query at the folder level doesn't return the encoded content, so you need to iterate over each file and make another API request (this may blow the rate limit for unauthenticated requests).
var GitHubApi = require("github")
,fs = require("fs");
var github = new GitHubApi({
version: "3.0.0"
,timeout: 5000
});
var msg = {
user: "visionmedia"
,repo: "express"
,path: "lib/router"
};
github.repos.getContent(msg, function(err, res) {
delete res.meta; //appended by wrapper, remove for iteration
for(file in res) {
msg.path = res[file].path;
github.repos.getContent(msg, function(err, res) {
fs.writeFile(res.name
,res.content
,res.encoding
,function(err){
console.log("Wrote %s to filesystem!", res.name)
});
});
};
});
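To soften the rate-limit caveat, the client can authenticate before making requests. A minimal sketch, assuming an OAuth token (the GITHUB_TOKEN environment variable name is just an example):
// Raises the API rate limit for subsequent requests
github.authenticate({
  type: "oauth"
  , token: process.env.GITHUB_TOKEN
});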
Upvotes: 1