Reputation: 312
The process is getting slower after each iteration.
I am trying to open every file (as a temporary workbook) kept in a particular folder, compute the sum of three columns in each file, assign the values to the main sheet, and then close the temporary workbook.
Sub Compute()
    Dim ws As Worksheet, ws2 As Worksheet
    Dim temp As Workbook
    Dim dt As Date
    Dim myFilenm As String
    Dim i As Long

    Set ws = ThisWorkbook.Sheets("DATA")
    Application.ScreenUpdating = False
    i = 1 ' first output row on the DATA sheet (adjust if it has headers)

    For dt = #6/5/2017# To Now
        ' Name of the file is like data02052017.DAT
        myFilenm = "D:\data" & Format(dt, "ddmmyyyy") & ".DAT"
        If Dir(myFilenm) <> "" Then ' skip non-working days, since no data file is generated on those days
            ws.Range("A" & i).Value = myFilenm
            ws.Range("B" & i).Value = dt
            Set temp = Workbooks.Open(Filename:=myFilenm)
            Set ws2 = temp.ActiveSheet
            ' Remove spaces from each cell in column A of the temp workbook
            ws2.Range("A:A").Replace What:=" ", Replacement:="", LookAt:=xlPart, _
                SearchOrder:=xlByRows, MatchCase:=False, SearchFormat:=False, _
                ReplaceFormat:=False
            ' Text to Columns, separating by comma
            ws2.Range("A:A").TextToColumns Destination:=ws2.Range("A1"), DataType:=xlDelimited, _
                TextQualifier:=xlDoubleQuote, ConsecutiveDelimiter:=False, Tab:=False, _
                Semicolon:=False, Comma:=True, Space:=False, Other:=False, _
                FieldInfo:=Array(Array(1, 1), Array(2, 1), Array(3, 1), Array(4, 1), Array(5, 1), _
                Array(6, 1), Array(7, 1), Array(8, 1), Array(9, 1), Array(10, 1), Array(11, 1), _
                Array(12, 1), Array(13, 1), Array(14, 1), Array(15, 1), Array(16, 1), Array(17, 1), _
                Array(18, 1), Array(19, 1), Array(20, 1), Array(21, 1), Array(22, 1), Array(23, 1), _
                Array(24, 1), Array(25, 1), Array(26, 1)), TrailingMinusNumbers:=True
            ' Intended computations: column sums written back to the DATA sheet
            ws.Range("C" & i).Value = Application.WorksheetFunction.Sum(ws2.Range("P:P"))
            ws.Range("D" & i).Value = Application.WorksheetFunction.Sum(ws2.Range("Q:Q"))
            ws.Range("E" & i).Value = Application.WorksheetFunction.Sum(ws2.Range("W:W"))
            ws.Range("F" & i).Value = Format(Now, "HH:MM:SS")
            i = i + 1
            temp.Close SaveChanges:=False
        End If
    Next dt

    Application.ScreenUpdating = True
End Sub
The 1st iteration takes about 5 seconds, whereas the 100th takes 25 seconds, and eventually the processing time grows to minutes per iteration. I have around 500 files, each of roughly the same size, so I want to understand why the iteration time keeps growing and what can be done to resolve the issue.
Upvotes: 1
Views: 111
Reputation: 7951
Office is rubbish at memory management. Or rather, it is being "clever" instead of doing what it is told.
Open a new workbook in Excel, open Task Manager (Ctrl+Shift+Esc), and observe the memory that Excel is using. Then open a saved file and close it.
The memory Excel is using goes up when you open the file and down when you close it - but not back to the level it was at before you opened the file. This is because Office keeps a "temporary" version of the file in memory so that it can be reopened faster.
You can make this "bloat" slightly smaller by opening the file as read-only (Set temp = Workbooks.Open(Filename:=myFilenm, ReadOnly:=True)), but the only way I have found to "reset" it is to completely close Excel and reopen it.
So, by file 100, your Excel application has 99 documents' worth of junk data in memory slowing everything down.
(N.B. This "feature" was, so far as I can tell, introduced in Office 2007, and is completely absent in Office 2003.)
Upvotes: 1