I had a logging process go haywire on one of my servers and I now have tons of files that I can't delete:
➜ logs ls -l | wc -l
11135951
➜ logs rm log*
-bash: fork: Cannot allocate memory
Ideas? I could just blow the server away but I'm genuinely curious about how to actually fix this.
Upvotes: 2
Views: 72
Reputation: 18568
find . -type f -name 'log*' -delete
This would be the most efficient way to do it.

In most cases, replacing -delete with -print will first show you all the files that would be removed, so you can sanity-check the match. In your case, though, with eleven million files, I don't think that will help much.
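A minimal sketch of the dry-run-then-delete workflow (the /tmp/logs-demo directory and file names here are invented for illustration):

```shell
# Set up a throwaway directory with a few sample files
mkdir -p /tmp/logs-demo
touch /tmp/logs-demo/log1 /tmp/logs-demo/log2 /tmp/logs-demo/keep.txt

# Dry run: list what WOULD be deleted, without deleting anything
find /tmp/logs-demo -type f -name 'log*' -print

# Once the list looks right, swap -print for -delete
find /tmp/logs-demo -type f -name 'log*' -delete
```

Unlike rm log*, find never expands the glob in the shell; it matches names one at a time as it walks the directory, so it doesn't hit the memory/argument limits that killed the original rm.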
@biffen points out that it will descend into subdirectories too.

To prevent this, use the -maxdepth option:

-maxdepth 1

A depth of 1 limits the search to the current directory.
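One detail worth knowing: GNU find wants -maxdepth before the other tests like -type and -name, otherwise it prints a warning. A short sketch (the /tmp/logs-demo2 directory and file names are made up for illustration):

```shell
# A top-level log file plus one nested in a subdirectory
mkdir -p /tmp/logs-demo2/sub
touch /tmp/logs-demo2/log-top /tmp/logs-demo2/sub/log-nested

# -maxdepth 1 keeps find out of subdirectories,
# so only the top-level log file is deleted
find /tmp/logs-demo2 -maxdepth 1 -type f -name 'log*' -delete
```

After this, log-top is gone but sub/log-nested is untouched.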
Upvotes: 2