Reputation: 727
My SSIS package needs to read JSON files and store the entire JSON text in a single row. Currently, I can do this without a problem using a flat file source.
However, the column must be DT_NTEXT so I can import more than 4000 characters. My issue arises when I need to include additional data captured from the filename and directory path. Once the row passes through a script component to capture that information, I can no longer write the JSON data back into its original column, because the DT_NTEXT column is read-only. I have also tried reading the files directly in a script component and run into the same issue.
I also tried using two separate data sources and merging them, but the data type does not allow sorting. Lastly, I tried assigning the JSON data to a variable in the script component, but variables can only be written in PostExecute.
The irony of the whole situation is that all the data only needs to be on a single row for each file. The columns are outlined here:
Here is a picture of my current data flow, though it doesn't reflect everything I have tried.
If anyone can help me with these bizarre limitations it would be greatly appreciated.
Upvotes: 2
Views: 2066
Reputation: 1200
In your script component, convert the JSON string to a byte array and store it:
string jsonData = "<Your json string>";

// DT_NTEXT holds Unicode (UTF-16) text, so encode the string accordingly
// before writing it as blob data.
System.Text.UnicodeEncoding encoding = new System.Text.UnicodeEncoding();

OutputBuffer.AddRow();
OutputBuffer.JSON.AddBlobData(encoding.GetBytes(jsonData));
Follow this link for further details.
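For the asker's scenario, the same technique can be used in a script component configured as a source, so the file path columns and the JSON blob land in a single row per file. A minimal sketch, assuming a hypothetical output named `Output0` with columns `FileName` and `DirectoryPath` (DT_WSTR) plus `JSON` (DT_NTEXT), and a folder path supplied through an assumed SSIS variable `User::SourceFolder` mapped in ReadOnlyVariables:

```csharp
public override void CreateNewOutputRows()
{
    // Hypothetical: folder path comes from the SSIS variable User::SourceFolder.
    string folder = Variables.SourceFolder;

    foreach (string path in System.IO.Directory.GetFiles(folder, "*.json"))
    {
        // One output row per file: path metadata plus the full JSON text.
        string jsonData = System.IO.File.ReadAllText(path);

        Output0Buffer.AddRow();
        Output0Buffer.FileName = System.IO.Path.GetFileName(path);
        Output0Buffer.DirectoryPath = System.IO.Path.GetDirectoryName(path);

        // DT_NTEXT is Unicode, so encode as UTF-16 before adding blob data.
        Output0Buffer.JSON.AddBlobData(
            System.Text.Encoding.Unicode.GetBytes(jsonData));
    }
}
```

This avoids the read-only column problem entirely, since the DT_NTEXT column is created by the script component's own output rather than modified downstream.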
Upvotes: 2