Reputation: 169
At work, we have a few servers in an internal network (separate from the internet), yet they use projects hosted on github.com. The way we have handled it up to now is to manually clone the repo every once in a while and then manually transfer the project into the local network using a USB device. Sometimes these projects can be big (dozens of GBs), and the transfer is extremely slow. So my question is:
Can I somehow transfer only the necessary commits each time?
Upvotes: 2
Views: 84
Reputation: 42471
Instead of cloning the whole repository of the third-party project every time, consider a "shallow clone", which lets you clone only the most recent commits (to a certain depth, say 1).
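For example, a minimal sketch (the repository URL is just a placeholder):

    # Fetch only the most recent commit of the default branch
    git clone --depth 1 https://github.com/example/project.git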
You can also clone a single branch (say master) from GitHub.
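A sketch of fetching only the master branch, with its full history (again, a placeholder URL):

    # Fetch only the master branch; other branches are not downloaded
    git clone --single-branch --branch master https://github.com/example/project.git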
There are many articles about this; you can read about shallow cloning here, for instance.
Another interesting solution that comes to mind is the git archive command, which creates an archive of the files in a named tree.
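For example, a sketch that packs the current master tree into a tarball (the file name is arbitrary); note that this produces the files only, with no git history:

    # Pack the files of the master tree into a single compressed archive
    git archive --format=tar.gz -o project.tar.gz master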
Upvotes: 1
Reputation: 114310
Git maintains each repo independently and lets you sync repos with one another however you want. The idea of a "central" repo is just a convention among its users. As far as git is concerned, all repos are equally "central".
With this in mind, you can do the following:
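A minimal sketch of the setup, assuming the USB drive is mounted at /mnt/usb and using a placeholder GitHub URL:

    # On an internet-connected machine: put a mirror of the GitHub repo on the USB drive
    git clone --mirror https://github.com/example/project.git /mnt/usb/project.git

    # On the isolated server: clone from the mounted drive; the drive becomes "origin"
    git clone /mnt/usb/project.git project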
To update:
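A sketch of the update step, with the same placeholder paths:

    # On the internet-connected machine: refresh the mirror on the drive
    # (only objects added since the last fetch are downloaded)
    cd /mnt/usb/project.git
    git remote update

    # Carry the drive to the isolated network, mount it, then on the server:
    cd project
    git pull    # copies only the commits the server does not have yet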
The repo on the hard drive will recognize GitHub as its upstream. The repo on the isolated server will recognize the external drive as its upstream.
This is very similar to what you are doing now, but has one crucial difference. Since you are using git to manage the history, only the commits that differ will be transferred, making the update portion very fast.
Of course this has an up-front cost (the initial full clone onto the drive and then onto the isolated server), but that is nothing you haven't already incurred with the manual copies.
Upvotes: 1