timothyclifford

Reputation: 6969

Git on Windows, "Out of memory - malloc failed"

I've run into a problem with a repository and have tried almost every possible config setting found out there, e.g. pack.windowMemory, etc.

I believe someone has checked a large file into the remote repository, and now each time I try to pull from or push to it, Git tries to pack it and runs out of memory:

Auto packing the repository for optimum performance. You may also
run "git gc" manually. See "git help gc" for more information.
Counting objects: 6279, done.
Compressing objects: 100% (6147/6147), done.
fatal: Out of memory, malloc failed (tried to allocate 1549040327 bytes)
error: failed to run repack

I have tried git gc and git repack with various options, but they keep returning the same error.

I've almost given up and am about to just create a new repo, but I thought I'd ask around first :)

Upvotes: 83

Views: 114643

Answers (8)

Mostafa islami

Reputation: 104

If the repository or file you're working with is large, try increasing Git's buffer size:

git config --global http.postBuffer 524288000
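
If it doesn't help, the setting is easy to check or to undo again with the standard git config queries:

git config --global --get http.postBuffer
git config --global --unset http.postBuffer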

Upvotes: 0

julien-lav

Reputation: 11

I came across this error message on Ubuntu. In my case, the problem was caused by a large committed file that I hadn't noticed initially.

Identify Large Files:

To identify large files in the Git history, you can use the following command:

git rev-list --objects --all | git cat-file --batch-check='%(objecttype) %(objectname) %(objectsize) %(rest)' | sed -n 's/^blob //p' | sort --numeric-sort --key=2 | cut -c 1-12,41- | $(command -v gnumfmt || echo numfmt) --field=2 --to=iec-i --suffix=B --padding=7 --round=nearest

This command lists the objects in the Git history, sorts them by size, and displays the file sizes in a human-readable format.

Source: How to find/identify large commits in Git history
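
If gnumfmt/numfmt is not available, a shorter variant of the same idea should work as well; it prints the ten largest blobs with raw byte sizes:

git rev-list --objects --all | git cat-file --batch-check='%(objecttype) %(objectsize) %(rest)' | sed -n 's/^blob //p' | sort -n | tail -n 10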

Remove Large File:

Once identified, you can remove the large file from the Git history using the following command:

git filter-branch --force --index-filter 'git rm --cached --ignore-unmatch path/to/my/file.jpg' --prune-empty --tag-name-filter cat -- --all

This command filters the branch history and removes the specified file.

Source: Removing large files from Git history
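
As a side note, recent Git versions warn that filter-branch is discouraged in favour of the separate git-filter-repo tool; if that tool is installed, the equivalent removal (same placeholder path as above) would be roughly:

git filter-repo --invert-paths --path path/to/my/file.jpg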

Update .gitignore:

Afterwards, make sure to either delete the file or add it to .gitignore:

echo 'path/to/my/*' >> .gitignore

Remember to add, commit, and push the changes to reflect them in your remote repository.
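
For completeness, those last steps might look like this; the branch and remote names are the usual defaults and may differ in your setup, and a force push is required after rewriting history with filter-branch:

git add .gitignore
git commit -m "Ignore large files"
git push origin --force --all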

Hope this will help.

Upvotes: 1

Denis P

Reputation: 796

In my case the config changes above didn't help. What did help was a simple server reboot (sudo shutdown -r now). It looks like something was eating a lot of memory on the server. Hope this helps someone, too.
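
To confirm that memory pressure really is the culprit before rebooting, a quick look at memory usage on the server can help, for example:

free -h                      # overall memory and swap usage
ps aux --sort=-%mem | head   # processes using the most memory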

Upvotes: 0

Adrian

Reputation: 179

Some of the options suggested in the selected answer seem to be only partially relevant to the issue or not necessary at all.

From looking at https://git-scm.com/docs/git-config, it appears that just setting the following option is sufficient (set only for the project here):

git config pack.windowMemory 512m

From the manual:

pack.windowMemory

The maximum size of memory that is consumed by each thread in git-pack-objects[1] for pack window memory when no limit is given on the command line. The value can be suffixed with "k", "m", or "g". When left unconfigured (or set explicitly to 0), there will be no limit.

With this, I never went over the specified 512m per thread; actual RAM usage was about half of that most of the time. Of course, the amount chosen here is user-specific, depending on the available RAM and number of threads.
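
Since the limit applies per thread, it may also help to cap the number of packing threads if memory is still tight (pack.threads is documented on the same page; the value 1 is just an example):

git config pack.threads 1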

Upvotes: 2

craig Rickett

Reputation: 448

This worked for me, but I had to set the options via the command line using:

git config --global [core|pack].[param] [value]
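
For example, to set from the command line the same pack.windowMemory value used in the highest-voted answer here:

git config --global pack.windowMemory 128m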

Upvotes: -1

oHo

Reputation: 54611

EDIT:  Since Git v2.5.0 (Aug 2015), Git for Windows (formerly MSysGit)
      provides 64-bit versions, as noted by Pan.student.
      In this answer I was advising installing 64-bit Cygwin (which provides a 64-bit Git).


I got a similar Out of memory, malloc failed issue using MSysGit when reaching the 4GB barrier:

> git --version
git version 1.8.3.msysgit.0

> file path/Git/cmd/git
path/Git/cmd/git: PE32 executable for MS Windows (console) Intel 80386 32-bit

> time git clone --bare -v ssh://linuxhost/path/repo.git
Cloning into bare repository 'repo.git'...
remote: Counting objects: 1664490, done.
remote: Compressing objects: 100% (384843/384843), done.
remote: Total 1664490 (delta 1029586), reused 1664490 (delta 1029586)
Receiving objects: 100% (1664490/1664490), 550.96 MiB | 1.55 MiB/s, done.
Resolving deltas: 100% (1029586/1029586), done.
fatal: Out of memory, malloc failed (tried to allocate 4691583 bytes)
fatal: remote did not send all necessary objects

real    13m8.901s
user    0m0.000s
sys     0m0.015s

MSysGit crashed after reaching the 4 GB barrier.

Finally, the 64-bit Git from Cygwin fixed it:

> git --version
git version 1.7.9

> file /usr/bin/git
/usr/bin/git: PE32+ executable (console) x86-64 (stripped to external PDB), for MS Windows

> time git clone --bare -v ssh://linuxhost/path/repo.git
Cloning into bare repository 'repo.git'...
remote: Counting objects: 1664490, done.
remote: Compressing objects: 100% (384843/384843), done.
remote: Total 1664490 (delta 1029586), reused 1664490 (delta 1029586)
Receiving objects: 100% (1664490/1664490), 550.96 MiB | 9.19 MiB/s, done.
Resolving deltas: 100% (1029586/1029586), done.

real    13m9.451s
user    3m2.488s
sys     3m53.234s

The 64-bit Git from Cygwin succeeded.

FYI, on the 64-bit linuxhost:

repo.git> git config -l
user.email=[email protected]
core.repositoryformatversion=0
core.filemode=true
core.bare=true

repo.git> git --version
git version 1.8.3.4

repo.git> uname -a
Linux linuxhost 2.6.32-279.19.1.el6.x86_64 #1 SMP Sat Nov 24 14:35:28 EST 2012 x86_64 x86_64 x86_64 GNU/Linux

Upvotes: 15

VonC

Reputation: 1326776

For reference (you might have already seen it), the msysgit issue dealing with this problem is ticket 292.

It suggests several workarounds:

To disable delta compression for certain files, add the following to .git/info/attributes:

*.zip binary -delta

From the gitattributes man page:

Delta compression will not be attempted for blobs for paths with the attribute delta set to false.
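
To verify that the attribute is actually applied to a given file, git check-attr can be used (the file name is just a placeholder); for matching paths it should report the delta attribute as unset:

git check-attr delta -- path/to/archive.zip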


Maybe a simpler workaround would be to somehow reset the history before that large file commit, and redo the other commits from there.
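
A rough sketch of that second approach, assuming the hash of the commit that introduced the large file is known (abc1234 is a placeholder) and that no later commit re-adds the file, is an interactive rebase in which that commit is dropped:

git rebase -i abc1234^   # mark the offending commit as "drop" in the editor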

Upvotes: 21

git

Reputation: 1214

I found a solution here that worked for me.

In the .git/config file (client and/or server), I added the following:

[core]
  packedGitLimit = 128m
  packedGitWindowSize = 128m

[pack]
  deltaCacheSize = 128m
  packSizeLimit = 128m
  windowMemory = 128m
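
The same settings can also be applied from the command line (run inside the repository) instead of editing the file by hand:

git config core.packedGitLimit 128m
git config core.packedGitWindowSize 128m
git config pack.deltaCacheSize 128m
git config pack.packSizeLimit 128m
git config pack.windowMemory 128m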

Upvotes: 119
