Yaron Levi

Reputation: 13079

Run SSIS package from C# , and limit the memory usage

I am running an SSIS package from C# code like so:

// Requires a reference to Microsoft.SqlServer.ManagedDTS.dll
using Microsoft.SqlServer.Dts.Runtime;

ErrorHandler errorHandler = new ErrorHandler(); // my IDTSEvents implementation
Application app = new Application();
Package pkg = app.LoadPackage(packagePath, null);
DTSExecResult res = pkg.Execute(null, null, errorHandler, null, null);

I have two questions:

  1. What "engine" runs this package ? Does SQL Server runs it, or a smaller SSIS engine is the only thing needed.

  2. When this C# code (it's a console application) runs for long periods, it gradually eats up more and more memory. How can I limit the memory usage of the package being run?

One idea that came to mind is that this behavior is like SQL Server's behavior, i.e. it takes up memory and never releases it (staying under the defined memory limit, of course). So maybe the "engine" that runs the package is set up with a certain memory limit which gradually gets filled.

Upvotes: 1

Views: 1490

Answers (1)

Todd McDermid

Reputation: 1680

That's some reasonable extrapolation - but not accurate.

  1. The "engine" that runs the package is the library that you referenced in your C# application. It's the same code that's in DTExec - the utility that executes SSIS packages (in SQL 2008R2 and below) when you use an Agent SSIS Job Step. (SSIS2012 executes them with the same code inside a service, not an executable.) The SQL Server database service is NOT required to be installed. A full SQL Server license IS REQUIRED on the machine you're running your code on. The part of the SQL install that is needed is just the SSIS "shared" components.
  2. I don't have a good answer for you on this one, because I experience similar issues myself and haven't gotten to the real answer yet. If you watch Task Manager, you'll see that the Private Working Set (the amount of memory your process is actually using) always stays reasonable. It's the "Commit Size" that grows unreasonably. The Commit Size includes memory that your process isn't actually using, but that Windows has allocated to it. In my case, after running a package for quite a while (it has a loop to process a lot of files), I can watch the Private Working Set stabilize at 2GB, but see the Commit Size grow and grow past 4GB until the package fails because it can't get memory. I don't understand why that is, given that there is apparently plenty of memory reserved for my process that isn't actually getting used. I can only assume (without basis) that the process has fragmented the memory internally. It must be using 2GB "spread out" amongst that 4GB in such a way that the empty spaces scattered around the memory block are too small to fill any one request. I don't know of an SSIS API call to "defragment" that memory :) (A small sketch for logging those two counters from the hosting process follows this list.)
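A minimal sketch of how you could log those two counters from inside the hosting console app, so the growth can be correlated with package runs (WorkingSet64 is the full working set rather than just the private portion Task Manager shows, and PrivateMemorySize64 roughly corresponds to the "Commit Size" column):

using System;
using System.Diagnostics;

static void LogMemory(string label)
{
    // A fresh Process object is created on each call, so the values are a
    // current snapshot rather than a cached one.
    using (Process p = Process.GetCurrentProcess())
    {
        Console.WriteLine("{0}: working set = {1:N0} MB, private bytes (commit) = {2:N0} MB",
            label,
            p.WorkingSet64 / (1024 * 1024),
            p.PrivateMemorySize64 / (1024 * 1024));
    }
}

Call it before and after each pkg.Execute and you can watch the two numbers diverge long before the failure.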

What I could suggest is that you create your package to limit itself. In my case, with a loop, I could implement a "20-loop maximum", where the package simply finishes after 20 loops, even if there are still files to process. A return value or database table could signal that more processing remains to be done. Inside your C# app, you could loop on the call to pkg.Execute, possibly with some GC attempts in there. I don't know if that would be sufficient to trigger the process to release whatever memory was acquired and/or contiguously re-allocate it, but that's what I'd try.
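Roughly, that hosting loop could look like the sketch below. The Boolean package variable name (MoreFilesRemain) is just an illustration of the "signal that more work remains" idea - substitute whatever your package actually defines - and packagePath / errorHandler are the same objects as in your snippet:

using System;
using Microsoft.SqlServer.Dts.Runtime;

bool moreWork = true;
Application app = new Application();

while (moreWork)
{
    // Reload the package on every pass so each run starts from a fresh object graph.
    Package pkg = app.LoadPackage(packagePath, null);
    DTSExecResult res = pkg.Execute(null, null, errorHandler, null, null);
    if (res != DTSExecResult.Success)
        break;

    // The package sets this variable before finishing its bounded batch of loops;
    // "MoreFilesRemain" is a hypothetical name, not something SSIS provides for you.
    moreWork = (bool)pkg.Variables["User::MoreFilesRemain"].Value;
    pkg = null;

    // Encourage the CLR to release whatever it can between executions.
    GC.Collect();
    GC.WaitForPendingFinalizers();
    GC.Collect();
}

Whether that actually returns the committed-but-unused pages to the OS I can't promise, but it gives the runtime a clean break between executions to try.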

Upvotes: 3
