nlkasyap

Reputation: 65

Capturing runtime for each task within a dataflow in SSIS 2012

In my SSIS package I have a dataflow that looks something like this.

[image: dataflow diagram]

My requirement is to log the end time of each flat file destination (or the time when each of the flat files is created) in a SQL Server table. To be more clear, there will be one row per flat file in the log table. Is there a simple way (preferably) to accomplish this? Thanks in advance.

Update: I ended up using a script task after the dataflow to read the creation time of each of the files created in the dataflow. I also used the same script task to insert the logs into the table, just to keep things in one place. For details, refer to the post marked as the answer.
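The update above describes the logic, not the code. As a conceptual sketch only (a real SSIS script task would be written in C# or VB.NET, and would write to SQL Server rather than the sqlite3 stand-in used here; the file list and log-table name are hypothetical), the script task's job looks like this:

```python
# Sketch of the script-task logic described in the update: read each output
# file's creation time, then insert one log row per file. Python and sqlite3
# are stand-ins for the real C#/VB.NET script task and SQL Server table.
import os
import sqlite3
from datetime import datetime

def log_file_times(file_paths, conn):
    # Hypothetical log table: one row per flat file.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS FileLoadLog "
        "(FileName TEXT, CreatedTime TEXT)"
    )
    for path in file_paths:
        # On Windows, os.path.getctime returns the file's creation time.
        created = datetime.fromtimestamp(os.path.getctime(path))
        conn.execute(
            "INSERT INTO FileLoadLog (FileName, CreatedTime) VALUES (?, ?)",
            (os.path.basename(path), created.isoformat()),
        )
    conn.commit()
```

Running this once after the dataflow completes gives one log row per flat file, which matches the original requirement.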

Upvotes: 1

Views: 956

Answers (4)

Maverick

Reputation: 1185

The easy solution would be: 1) drag an OLE DB Command from the toolbox after the Flat File destination. 2) Update the script to update the table with the current date when the Flat File destination succeeds. 3) You can create a variable (project scope) with the system datetime as its value. 4) Depending on your package construct, you might have to create another variable for success or failure.

Upvotes: 1

Beth

Reputation: 9607

Wanted to provide more detail on @TabAlleman's approach. For each control flow task with a name like Bene_hic, you will have a source file and a destination file. On the 'Event Handlers' tab for that executable (use the drop-down list), you can create the OnPostExecute event.

In that event, I have two SQL tasks. One generates the SQL to execute for this control flow task, the second executes the SQL.

These SQL tasks are dependent on two user variables scoped in the OnPostExecute event. The EvaluateAsExpression property for both is set to True. The first one, Variable1, is used as a template for the SQL to execute and has a value like:

"SELECT execSQL FROM db.Ssis_onPostExecute
 WHERE stgTable = '" + @[System::SourceName] + "'"

@[System::SourceName] is an SSIS system variable containing the name of the control flow task.

I have a table in my database named Ssis_onPostExecute with two fields, an execSQL field with values like:

    DELETE FROM db.TableStats WHERE TABLENAME = 'Bene_hic';
    INSERT INTO db.TableStats
    SELECT CreatorName, t.tname, CURRENT_TIMESTAMP, rcnt
    FROM (SELECT databasename, TABLENAME AS tname, CreatorName
          FROM dbc.TablesV) t
    INNER JOIN (SELECT 'Bene_hic' AS tname, COUNT(*) AS rcnt
                FROM db.Bene_hic) u
        ON t.tname = u.tname
    WHERE t.databasename = 'db' AND t.tname = 'Bene_hic';

and a stgTable field with the name of the corresponding control flow task in the package (case-sensitive!), such as Bene_hic.

In the first SQL task (named SQL), I have the SourceVariable set to a user variable (User::Variable1) and the ResultSet property set to 'single row'. The Result Set detail includes a Result Name of 0 and, as the Variable name, the second user variable (User::Variable2).

In the second SQL task (named exec), I have the SQLSourceType property set to Variable and the SourceVariable property set to User::Variable2.

Then the package is able to copy the data in the source object to the destination, and whether it fails or not, enter a row in a table with the timestamp and number of rows copied, along with the table name and anything else you want to track.

Also, when debugging, you have to run the whole package, not just one task in the event. The variables won't be set correctly otherwise.

HTH, it took me forever to figure all this stuff out, working from examples on several web sites. I'm using code to generate the SQL in the execSQL field for each of the 42 control flow tasks, meaning I created 84 user variables.
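Beth mentions generating the execSQL values by code rather than writing 42 of them by hand. As a hedged sketch (in Python for illustration; the db/dbc schema names follow the answer's example, and the function name is hypothetical), such a generator only needs the staging-table name:

```python
# Sketch of a generator for the execSQL field in db.Ssis_onPostExecute:
# given a staging table name, emit the DELETE/INSERT pair shown in the
# answer. Schema names (db, dbc.TablesV, db.TableStats) follow the answer;
# everything else is an assumption for illustration.
def build_exec_sql(table_name: str) -> str:
    return (
        f"DELETE FROM db.TableStats WHERE TABLENAME = '{table_name}';\n"
        f"INSERT INTO db.TableStats\n"
        f"SELECT CreatorName, t.tname, CURRENT_TIMESTAMP, rcnt\n"
        f"FROM (SELECT databasename, TABLENAME AS tname, CreatorName\n"
        f"      FROM dbc.TablesV) t\n"
        f"INNER JOIN (SELECT '{table_name}' AS tname, COUNT(*) AS rcnt\n"
        f"            FROM db.{table_name}) u ON t.tname = u.tname\n"
        f"WHERE t.databasename = 'db' AND t.tname = '{table_name}';"
    )
```

Looping this over the list of control flow task names produces one execSQL row per task, keeping the 42 statements consistent.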

-Beth

Upvotes: 1

Tab Alleman

Reputation: 31775

After looking more closely at the toolbox, I think the best way to do this is to move each source/destination pairing into its own dataflow and use the OnPostExecute event of each dataflow to write to the SQL table.

Upvotes: 1

user3662215

Reputation: 526

To get the accurate date and timestamp of each flat file created as a destination, you'll need to create three new global variables and set up a Foreach Loop container in the control flow after your current data flow task. Inside the Foreach Loop, add a script task that reads the date/time information from one flat file at a time and saves it to one of the new global variables. A second SQL task (also in the loop) can then write that information to a database table.
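As a conceptual sketch of what the Foreach Loop plus script task does (Python for illustration only; in SSIS the enumeration is done by the Foreach File enumerator and the script task is C#/VB.NET, and the directory and file pattern here are hypothetical):

```python
# Sketch of the Foreach Loop pattern: enumerate the flat files matching a
# pattern, and for each one capture its date/time into a "variable" (here a
# dict keyed by file name), which a second step would write to the log table.
import glob
import os
from datetime import datetime

def enumerate_file_dates(directory: str, pattern: str = "*.txt") -> dict:
    results = {}
    for path in glob.glob(os.path.join(directory, pattern)):
        # getmtime = last-modified time; a script task could read the
        # creation time instead, as the answer describes.
        results[os.path.basename(path)] = datetime.fromtimestamp(
            os.path.getmtime(path))
    return results
```

Each iteration of the loop corresponds to one entry in the returned dict: one file, one timestamp, one log row.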

The following link provides a good example of the steps you'll need to apply. It includes a few extra steps that don't apply here, which you can easily skip.

http://microsoft-ssis.blogspot.com/2011/01/use-filedates-in-ssis.html

Hope this helps.

Upvotes: 1
