Reputation: 29051
I cloned a large Mercurial repo, made some changes and then pushed it to a different server.
Some new developers are coming on to the project and are trying to clone the repo from the new server, using
hg clone -r "branchname" http://ourserver/scm/hg/repo
which successfully downloads the source. But the largefiles are missing, so we try
hg lfpull --rev "all()"
Which results in a whole lot of errors like this:
Foo/Bar/afile.7z: largefile xxxx not available from http://ourserver/scm/hg/repo
If I go into the Mercurial web interface, browse to a file and download it via the "raw" link, I get a file with the correct filename, but it only contains what I am guessing is an ID. For instance, one PDF contains only:
f91476a8c2cc0a164c0880d128ca80776a8a934e
Any suggestions? I have all of these files locally, but they came from the original server. How do I make sure that largefiles are pushed?
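For what it's worth, the 40-character hex string in the "raw" download looks like a largefiles stand-in: the largefiles extension tracks a small file containing the SHA-1 hex digest of the real file's contents, and serves that when the actual largefile is not in the server's store. A quick sanity check, assuming you have the real file locally (the path below is just a placeholder):

```python
import hashlib

def standin_hash(path):
    """Return the SHA-1 hex digest that the largefiles extension
    stores in the tracked stand-in file for this largefile."""
    h = hashlib.sha1()
    with open(path, "rb") as f:
        # Hash in 1 MiB chunks so large files don't need to fit in memory.
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

# If this matches the 40-hex string in the "raw" download, the server is
# serving the stand-in instead of the actual file content:
# print(standin_hash("Foo/Bar/afile.pdf"))
```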
Upvotes: 1
Views: 623
Reputation: 97285
Just an idea:
When you push a changeset that affects largefiles to a remote repository, its largefile revisions will be uploaded along with the changeset. This ensures that the central store gets a copy of every revision of every largefile. Note that the remote Mercurial must also have the largefiles extension enabled for this to work.
It seems that the "different server" doesn't have the largefiles extension enabled. That would explain the mess: a repository using the largefiles extension doesn't keep the large files in the ordinary repository storage; it tracks only "stand-ins" (links to a specific revision of each largefile), while the real content lives in a separate largefile store.
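If that is the cause, enabling the extension on the new server should help. A minimal sketch (the file location depends on your server setup; this assumes a per-repository config):

```ini
; On the new server, in the served repository's .hg/hgrc
; (or in a server-wide hgrc):
[extensions]
largefiles =
```

After enabling it, pushing again from a clone that has every largefile revision locally (run `hg lfpull --rev "all()"` there first, against the original server) should let the uploads populate the new server's largefile store.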
Upvotes: 2