Reputation: 2662
I am running the following code to archive and then delete large log files:
    Dir.foreach(FileUtils.pwd()) do |f|
      if f.end_with?('log')
        File.open(f) do |file|
          if File.size(f) > MAX_FILE_SIZE
            puts f
            puts file.ctime
            puts file.mtime

            # zipping the file
            orig = f
            Zlib::GzipWriter.open('arch_log.gz') do |gz|
              gz.mtime = File.mtime(orig)
              gz.orig_name = orig
              gz.write IO.binread(orig)
              puts "File has been archived"
            end

            # deleting the file
            begin
              File.delete(f)
              puts "File has been deleted"
            rescue Exception => e
              puts "File #{f} can not be deleted"
              puts " Error #{e.message}"
              puts "======= Please remove file manually =========="
            end
          end
        end
      end
    end
Also, the files are pretty heavy, more than 1 GB. Any help would be appreciated.
Upvotes: 3
Views: 3110
Reputation: 160571
If the files you are reading are > 1GB, you have to have that much memory free at a minimum, because IO.binread
is going to slurp that amount in.
You'd be better off loading a known amount and looping over the input until it's completely read, reading and writing in chunks; a sketch of that approach follows the docs excerpt below.
From the docs:
    IO.binread(name, [length [, offset]]) -> string
    ------------------------------------------------------------------------------
    Opens the file, optionally seeks to the given offset, then returns
    length bytes (defaulting to the rest of the file). binread ensures
    the file is closed before returning. The open mode would be
    "rb:ASCII-8BIT".

      IO.binread("testfile")           #=> "This is line one\nThis is line two\nThis is line three\nAnd so on...\n"
      IO.binread("testfile", 20)       #=> "This is line one\nThi"
      IO.binread("testfile", 20, 10)   #=> "ne one\nThis is line "
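Here is a minimal sketch of the chunked approach, keeping the same GzipWriter setup as in your code. CHUNK_SIZE and the archive_in_chunks name are my own choices for illustration, not anything required by Ruby:

    require 'zlib'

    CHUNK_SIZE = 1024 * 1024  # read 1 MB at a time instead of slurping the whole file

    def archive_in_chunks(orig, archive_name)
      Zlib::GzipWriter.open(archive_name) do |gz|
        gz.mtime = File.mtime(orig)
        gz.orig_name = orig
        File.open(orig, 'rb') do |io|
          # IO#read returns nil at EOF, which ends the loop
          while chunk = io.read(CHUNK_SIZE)
            gz.write(chunk)
          end
        end
      end
    end

With a 1 GB log this only ever holds CHUNK_SIZE bytes in memory at a time, instead of the entire file.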
Upvotes: 2