Guenther Schmitz

Reputation: 1999

log file variables in function eat up all memory

I have the following simple function, which is used several times in a script that iterates through a directory and checks the age of the files in it.

function log($Message) {
  $logFilePath = 'C:\logPath\myLog.txt'

  # build the new entry as "<timestamp>_<message>"
  $date = Get-Date -Format 'yyyyMMddHHmmss'
  $logMessage = "{0}_{1}" -f $date,$Message
  # read the existing log (if any) so the newest entry can be written on top
  if(Test-Path -Path $logFilePath) {
    $logFileContent = Get-Content -Path $logFilePath
  } else {
    $logFileContent = ''
  }
  $logMessage,$logFileContent | Out-File -FilePath $logFilePath
}

I figured out that this eats up all RAM, and I don't understand why. I thought the variables scoped to a function were destroyed once the function had run. I fixed the RAM issue by adding Remove-Variable logMessage,logFileContent,logFilePath,date to the very end of the function (sketched below), but I would like to know how this RAM issue could be solved otherwise, and why the variables within the function are not destroyed automatically.
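For reference, this is roughly what the described workaround looks like (a sketch based on the description above, not a recommendation):

function log($Message) {
  $logFilePath = 'C:\logPath\myLog.txt'
  $date = Get-Date -Format 'yyyyMMddHHmmss'
  $logMessage = "{0}_{1}" -f $date,$Message
  $logFileContent = if (Test-Path -Path $logFilePath) { Get-Content -Path $logFilePath } else { '' }
  $logMessage,$logFileContent | Out-File -FilePath $logFilePath
  # the line that, per the question, made the memory growth go away
  Remove-Variable logMessage,logFileContent,logFilePath,date
}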

Upvotes: 3

Views: 117

Answers (3)

js2010

Reputation: 27491

PowerShell (or rather .NET) has a garbage collector, so freeing memory isn't instant; see Garbage Collection in Powershell to Speed Scripts. The memory management is probably also better in PowerShell 7. I tried repeating your function many times, but the memory usage didn't go above a few hundred megs.
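If you want to check whether it is just delayed collection rather than a real leak, you can force a collection and compare the working set yourself. A small sketch (not part of the original answer; note that .NET does not necessarily hand freed pages back to the OS right away):

# compare the current process working set before and after a forced collection
$before = (Get-Process -Id $PID).WorkingSet64 / 1MB
[System.GC]::Collect()
[System.GC]::WaitForPendingFinalizers()
$after = (Get-Process -Id $PID).WorkingSet64 / 1MB
'Working set before/after collection: {0:N0} MB / {1:N0} MB' -f $before, $after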

There's probably some more efficient .NET way to prepend a line to a file: Prepend "!" to the beginning of the first line of a file

I have a weird way to prepend a line. I'm not sure how well this would work with a large file. With a 700 meg file, the Working Set memory stayed at 76 megs.

$c:file
one
two
three

$c:file = 'pre',$c:file

$c:file
pre
one
two
three

ps powershell

Handles NPM(K) PM(K) WS(K) CPU(s)   Id SI ProcessName
------- ------ ----- ----- ------   -- -- -----------
    607     27 67448 76824   1.14 3652  3 powershell
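For readers unfamiliar with the $c:file syntax above: it is PowerShell's variable namespace notation against a provider drive, so reading it behaves like Get-Content and assigning to it behaves like Set-Content. A minimal sketch with a hypothetical full path (the file has to exist already for the read to succeed; the array is flattened explicitly to keep the output one line per element):

# prepend a line by reading and re-assigning the file "variable"
${C:\temp\myLog.txt} = @('new first line') + ${C:\temp\myLog.txt}
${C:\temp\myLog.txt} | Select-Object -First 2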

Upvotes: 3

Ambrose Leung

Reputation: 4225

I just want to rule out that the RAM usage is caused by prepending to the file. Have you tried not storing the log contents in a variable? I.e.:

$logMessage,(Get-Content -Path $logFilePath) | Out-File -FilePath $logFilePath

Edit 5/8/20 - It turns out that prepending (when done efficiently) isn't as slow as I thought; it is on the same order as using -Append. The code (that js2010 pointed to) is long and ugly, but if you really need to prepend to the file, this is the way to do it.

I modified the OP's code a bit to automatically insert a new line.

function log-prepend {
  param(
    $content,
    $filePath = 'C:\temp\myLogP.txt'
  )
  $file = Get-Item $filePath
  if (!$file.Exists) {
    Write-Error "$file does not exist"
    return
  }
  $filePath = $file.FullName
  # write the new line into a temp file first, then stream the old content in after it
  $tmptoken = (Get-Location).Path + "\_tmpfile" + $file.Name
  Write-Verbose "$tmptoken created as buffer"
  $tfs = [System.IO.File]::Create($tmptoken)
  $fs = [System.IO.File]::Open($file.FullName, [System.IO.FileMode]::Open, [System.IO.FileAccess]::ReadWrite)
  try {
    $date = Get-Date -Format 'yyyyMMddHHmmss'
    $logMessage = "{0}_{1}`r`n" -f $date, $content
    $msg = $logMessage.ToCharArray()
    # new log line goes in first, then the existing file is copied after it
    $tfs.Write($msg, 0, $msg.Length)
    $fs.Position = 0
    $fs.CopyTo($tfs)
  }
  catch {
    Write-Verbose $_.Exception.Message
  }
  finally {
    $tfs.Close()
    $fs.Close()
    if ($error.Count -eq 0) {
      Write-Verbose "updating $filePath"
      # swap the buffer file in place of the original
      [System.IO.File]::Delete($filePath)
      [System.IO.File]::Move($tmptoken, $filePath)
    }
    else {
      $error.Clear()
      [System.IO.File]::Delete($tmptoken)
    }
  }
}
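A minimal usage sketch, assuming the default path above (the message text is just an example; the file must exist before the first call):

if (-not (Test-Path 'C:\temp\myLogP.txt')) {
  New-Item -Path 'C:\temp\myLogP.txt' -ItemType File | Out-Null
}
log-prepend -content 'service restarted'
# the newest entry is now the first line of the file
Get-Content -Path 'C:\temp\myLogP.txt' -TotalCount 3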

Here is my original answer, which shows how to test the timing using a stopwatch.

When you prepend to a log file, you're reading the entire log file into memory, then writing it back.

You really should be using append - that would make the script run a lot faster.

function log($Message) {
  $logFilePath = 'C:\logPath\myLog.txt'

  $date = Get-Date -Format 'yyyyMMddHHmmss'
  $logMessage = "{0}_{1}" -f $date,$Message
  $logMessage | Out-File -FilePath $logFilePath -Append
}

Edit: To convince you that prepending to a log file is a bad idea, here's a test you can do on your own system:

function logAppend($Message) {
  $logFilePath = 'C:\temp\myLogA.txt'

  $date = Get-Date -Format 'yyyyMMddHHmmss'
  $logMessage = "{0}_{1}" -f $date,$Message
  $logMessage | Out-File -FilePath $logFilePath -Append
}

function logPrepend($Message) {
  $logFilePath = 'C:\temp\myLogP.txt'

  $date = Get-Date -Format 'yyyyMMddHHmmss'
  $logMessage = "{0}_{1}" -f $date,$Message
  if(Test-Path -Path $logFilePath) {
    $logFileContent = Get-Content -Path $logFilePath
  } else {
    $logFileContent = ''
  }
  $logMessage,$logFileContent | Out-File -FilePath $logFilePath 
}

$processes = Get-Process

$stopwatch =  [system.diagnostics.stopwatch]::StartNew()
foreach ($p in $processes)
{
  logAppend($p.ProcessName)
}

$stopwatch.Stop()
$stopwatch.Elapsed


$stopwatch =  [system.diagnostics.stopwatch]::StartNew()
foreach ($p in $processes)
{
  logPrepend($p.ProcessName)
}

$stopwatch.Stop()
$stopwatch.Elapsed

I ran this several times until I had a few thousand lines in the log file. Going from 1603 to 1925 lines, my results were:

Append:  7.0167008 s
Prepend: 21.7046793 s
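The same comparison can also be written with Measure-Command instead of a manual stopwatch; a sketch that assumes logAppend and logPrepend from above are already defined:

$processes = Get-Process

# each Measure-Command returns a TimeSpan for the whole loop
'Append : {0:N2} s' -f (Measure-Command { foreach ($p in $processes) { logAppend $p.ProcessName } }).TotalSeconds
'Prepend: {0:N2} s' -f (Measure-Command { foreach ($p in $processes) { logPrepend $p.ProcessName } }).TotalSeconds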

Upvotes: 0

Theo

Reputation: 61148

Although, as commented, I can hardly believe that this function gobbles up your memory, you could optimize it:

function log ([string]$Message) {
    $logFilePath = 'C:\logPath\myLog.txt'
    # prefix the message with the current date
    $Message = "{0:yyyyMMddHHmmss}_{1}" -f (Get-Date), $Message
    if (Test-Path -Path $logFilePath -PathType Leaf) {
        # newest log entry on top: append current content
        $Message = "{0}`r`n{1}" -f $Message, (Get-Content -Path $logFilePath -Raw)
    }

    Set-Content -Path $logFilePath -Value $Message
}
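A quick usage sketch (the messages are made up; the path is the one hard-coded in the function):

log 'first entry'
log 'second entry'
# the newest entry is now the first line of the log file
Get-Content -Path 'C:\logPath\myLog.txt' -TotalCount 1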

Upvotes: 0
