Reputation: 990
I have run into this strange SSIS error where my script component on the server keeps failing with the error "The binary code for the script is not found. Please open the script and make sure it builds correctly". The script builds absolutely fine on my local machine. The script component references three additional DLLs apart from the standard references you get when you open a script component: Microsoft.Hadoop.Avro.dll, Newtonsoft.Json.dll and Microsoft.CSharp.
Microsoft.Hadoop.Avro and Newtonsoft.Json are the ones I had to download, and I have registered them in the GAC on both my local machine and the server.
I have tried opening the components, rebuilding them, saving them and deploying them. I have set DelayValidation to False on the Data Flow. I have set "Copy Local" to False for the DLLs in the References section. I have re-added the references by explicitly pointing to the GAC location, i.e. C:\Windows\Microsoft.Net\Assembly\GAC_MSIL.
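For reference, this is a minimal sketch of how the two downloaded assemblies can be registered in the GAC programmatically (the usual route is gacutil /i from an elevated prompt; the file paths below are placeholders, not the actual locations I use):

// Minimal sketch: installs the downloaded assemblies into the GAC via
// System.EnterpriseServices (requires a reference to System.EnterpriseServices.dll
// and an elevated process). The paths are placeholders.
using System.EnterpriseServices.Internal;

class GacRegistration
{
    static void Main()
    {
        var publisher = new Publish();
        publisher.GacInstall(@"C:\Deploy\Microsoft.Hadoop.Avro.dll");
        publisher.GacInstall(@"C:\Deploy\Newtonsoft.Json.dll");
    }
}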
This is the sample code for the script component:
#region Help: Introduction to the Script Component
/* The Script Component allows you to perform virtually any operation that can be accomplished in
 * a .Net application within the context of an Integration Services data flow.
 *
 * Expand the other regions which have "Help" prefixes for examples of specific ways to use
 * Integration Services features within this script component. */
#endregion

#region Namespaces
using System;
using System.Collections.Generic;
using System.Data;
using System.IO;
using Microsoft.Hadoop.Avro;
using Microsoft.Hadoop.Avro.Container;
using Microsoft.Hadoop.Avro.Schema;
using Microsoft.SqlServer.Dts.Pipeline.Wrapper;
using Microsoft.SqlServer.Dts.Runtime.Wrapper;
using Microsoft.CSharp.RuntimeBinder;
using System.Linq;
using System.Reflection;
#endregion

/// <summary>
/// This is the class to which to add your code. Do not change the name, attributes, or parent
/// of this class.
/// </summary>
[Microsoft.SqlServer.Dts.Pipeline.SSISScriptComponentEntryPointAttribute]
public class ScriptMain : UserComponent
{
    #region Help: Using Integration Services variables and parameters
    /* To use a variable in this script, first ensure that the variable has been added to
     * either the list contained in the ReadOnlyVariables property or the list contained in
     * the ReadWriteVariables property of this script component, according to whether or not your
     * code needs to write into the variable. To do so, save this script, close this instance of
     * Visual Studio, and update the ReadOnlyVariables and ReadWriteVariables properties in the
     * Script Transformation Editor window.
     * To use a parameter in this script, follow the same steps. Parameters are always read-only.
     *
     * Example of reading from a variable or parameter:
     *  DateTime startTime = Variables.MyStartTime;
     *
     * Example of writing to a variable:
     *  Variables.myStringVariable = "new value";
     */
    #endregion

    #region Help: Using Integration Services Connection Managers
    /* Some types of connection managers can be used in this script component. See the help topic
     * "Working with Connection Managers Programmatically" for details.
     *
     * To use a connection manager in this script, first ensure that the connection manager has
     * been added to the list of connection managers on the Connection Managers page of the
     * script component editor. To add the connection manager, save this script, close this instance of
     * Visual Studio, and add the connection manager to the list.
     *
     * If the component needs to hold a connection open while processing rows, override the
     * AcquireConnections and ReleaseConnections methods.
     *
     * Example of using an ADO.Net connection manager to acquire a SqlConnection:
     *  object rawConnection = Connections.SalesDB.AcquireConnection(transaction);
     *  SqlConnection salesDBConn = (SqlConnection)rawConnection;
     *
     * Example of using a File connection manager to acquire a file path:
     *  object rawConnection = Connections.Prices_zip.AcquireConnection(transaction);
     *  string filePath = (string)rawConnection;
     *
     * Example of releasing a connection manager:
     *  Connections.SalesDB.ReleaseConnection(rawConnection);
     */
    #endregion

    #region Help: Firing Integration Services Events
    /* This script component can fire events.
     *
     * Example of firing an error event:
     *  ComponentMetaData.FireError(10, "Process Values", "Bad value", "", 0, out cancel);
     *
     * Example of firing an information event:
     *  ComponentMetaData.FireInformation(10, "Process Values", "Processing has started", "", 0, fireAgain);
     *
     * Example of firing a warning event:
     *  ComponentMetaData.FireWarning(10, "Process Values", "No rows were received", "", 0);
     */
    #endregion

    string copiedAddressFile;
    private StreamWriter textWriter;
    private string columnDelimiter = ",";
    List<AvroRecord> DatesRowList;

    public override void AcquireConnections(object Transaction)
    {
        IDTSConnectionManager100 connMgr = this.Connections.Dates;
        copiedAddressFile = (string)connMgr.AcquireConnection(null);
    }

    public static string Schema = "";

    /// <summary>
    /// This method is called once, before rows begin to be processed in the data flow.
    ///
    /// You can remove this method if you don't need to do anything here.
    /// </summary>
    public override void PreExecute()
    {
        base.PreExecute();
        Schema = @"{
            ""type"":""record"",
            ""name"":""Microsoft.Hadoop.Avro.Specifications.Dates"",
            ""fields"":
            [
                { ""name"":""MK_DatesID"", ""type"":""int"" },
                { ""name"":""Date"", ""type"":""string"" },
                { ""name"":""IsTradingDay"", ""type"":""boolean"" }
            ]
        }";
        DatesRowList = new List<AvroRecord>();
    }

    /// <summary>
    /// This method is called after all the rows have passed through this component.
    ///
    /// You can delete this method if you don't need to do anything here.
    /// </summary>
    public override void PostExecute()
    {
        base.PostExecute();
        if (Variables.PushToDataLake == true)
        {
            using (Stream st = new FileStream(copiedAddressFile, FileMode.Create, FileAccess.Write, FileShare.Write))
            {
                using (var w = AvroContainer.CreateGenericWriter(Schema, st, Codec.Deflate))
                {
                    using (var writer = new SequentialWriter<object>(w, 24))
                    {
                        // Serialize the data to the stream using the sequential writer
                        DatesRowList.ForEach(writer.Write);
                    }
                }
            }
        }
    }

    /// <summary>
    /// This method is called once for every row that passes through the component from Input0.
    ///
    /// Example of reading a value from a column in the row:
    ///  string zipCode = Row.ZipCode
    ///
    /// Example of writing a value to a column in the row:
    ///  Row.ZipCode = zipCode
    /// </summary>
    /// <param name="Row">The row that is currently passing through the component</param>
    public override void Input0_ProcessInputRow(Input0Buffer Row)
    {
        var serializer = AvroSerializer.CreateGeneric(Schema);
        var rootSchema = serializer.WriterSchema as RecordSchema;

        // Create a generic record to represent the data
        dynamic DatesRow = new AvroRecord(rootSchema);
        DatesRow.MK_DatesID = Row.MKDatesID;
        DatesRow.Date = Row.Date.ToString();
        DatesRow.IsTradingDay = Row.IsTradingDay;
        DatesRowList.Add(DatesRow);
    }
}
Basically, this script component acts as a destination and writes Avro files.
As mentioned, everything works on my local machine. I have even tried downloading the package from the SSIS catalog on the server back to my local machine and executing it there, and even that works (just to check whether something changed or went awry during deployment). I am at a loss to explain what is missing here. By the way, this is SSIS 2016, deployed to the catalog (project deployment model) and executed through a SQL Agent job.
The target framework of the script component is .NET 4.5.
Does anybody have ideas on how to solve this?
If more information is required, I will be happy to provide it.
Upvotes: 2
Views: 2794
Reputation: 990
So, I found the problem in this case. The proxy account that was running the SQL Agent job step did not have proper access, and that is what made it throw the cryptic error "The binary code for the script is not found". I made the proxy account a local administrator and it worked.
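For anyone hitting the same thing: the account involved is whatever is configured in the job step's "Run as" setting. For reference, this is roughly how such a proxy is wired up for the SSIS subsystem (all names and the secret below are placeholders); the actual file-system and GAC access is granted at the OS level, which in my case meant adding the account to the local Administrators group.

-- Hedged sketch: placeholder names only. Creates a credential for the Windows
-- account, wraps it in a SQL Agent proxy, and grants that proxy the SSIS
-- subsystem so it can be selected in the job step's "Run as" list.
CREATE CREDENTIAL SsisExecCredential
    WITH IDENTITY = N'DOMAIN\ssis_exec', SECRET = N'<password>';
GO
USE msdb;
GO
EXEC dbo.sp_add_proxy
    @proxy_name = N'SsisExecProxy',
    @credential_name = N'SsisExecCredential',
    @enabled = 1;

EXEC dbo.sp_grant_proxy_to_subsystem
    @proxy_name = N'SsisExecProxy',
    @subsystem_name = N'SSIS';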
Upvotes: 1