Patterson

Reputation: 2825

Azure Data Factory Error: SqlColumnNameMismatchByCaseSensitive. Is it possible to ignore case sensitivity with Snowflake?

I'm not sure which group this question should be targeted at.

I have created a copy activity from Snowflake to Azure SQL DB. The connection is working fine. However, the copy activity is giving me an error suggesting there is a column-name mismatch in terms of case sensitivity.

Failure happened on 'Sink' side. ErrorCode=SqlColumnNameMismatchByCaseSensitive,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Column 'CONTENTTYPEID' in DataSet '[OBLIXXXXXX]' cannot be found in physical Sql database. Column matching is case sensitive. Column 'ContentTypeID' appears similar. Check the DataSet(s) configuration to proceed further.,Source=Microsoft.DataTransfer.ClientLibrary,''Type=System.InvalidOperationException,Message=The given ColumnMapping does not match up with any column in the source or destination.,Source=System.Data,'


For example, the columns in Snowflake are upper case by default, see the image below.

However, when copying over to Azure SQL DB, the copy activity doesn't like the fact that I'm copying from a table where the columns are all capitals while the columns in SQL DB are lower case.


So, is there a way of getting the copy activity to ignore case sensitivity?

Updating the question with the mapping screenshot


The sink is parameterized, so I can't hardcode the mappings.
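To see why the copy activity complains, note that ADF's auto-mapping compares column names case-sensitively. A minimal Python sketch of the two comparison modes (the source column name comes from the error message; the sink column list here is hypothetical):

```python
# ADF auto-mapping does a case-sensitive name lookup, so the Snowflake
# column 'CONTENTTYPEID' never matches the SQL DB column 'ContentTypeID'.
source_col = "CONTENTTYPEID"            # Snowflake default: upper case
sink_cols = ["ContentTypeID", "Name"]   # hypothetical SQL DB columns

case_sensitive_hit = source_col in sink_cols
case_insensitive_hit = source_col.lower() in [c.lower() for c in sink_cols]

print(case_sensitive_hit)    # the lookup ADF effectively performs
print(case_insensitive_hit)  # the lookup you'd want it to perform
```

This is why the error says Column 'ContentTypeID' "appears similar": the names differ only in case.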

Upvotes: 0

Views: 371

Answers (1)

Pratik Lad

Reputation: 8382

As per this MS document, ADF auto-mapping is case sensitive.

To work around this,

  1. Match the column-name case between Snowflake and the SQL database.

To convert the column names in the SQL database, you can use the following query. It renames every column of every user table to upper case.

-- Rename every column of every user table to upper case.
-- Builds one sp_rename statement per column and executes it.
DECLARE @sqlstring nvarchar(512)  -- statement to execute
DECLARE @cmd nvarchar(512)

DECLARE db_cursor CURSOR FOR
SELECT 'exec sp_rename ''' + QUOTENAME(b.name) + '.' + QUOTENAME(a.name)
       + ''',''' + UPPER(a.name) + ''',''column''' AS cmd
FROM syscolumns a
JOIN sysobjects b ON a.id = b.id
WHERE b.type = 'U'   -- user tables only
ORDER BY b.name

OPEN db_cursor
FETCH NEXT FROM db_cursor INTO @cmd

WHILE @@FETCH_STATUS = 0
BEGIN
    SET @sqlstring = @cmd
    EXECUTE sp_executesql @sqlstring
    FETCH NEXT FROM db_cursor INTO @cmd
END

CLOSE db_cursor
DEALLOCATE db_cursor

Now you can use these tables to receive the data from Snowflake.

  2. Using a Data Flow activity, you can transform the data as below:
  • Take a Data Flow activity with two sources: the first is the SQL table (the destination table), used only to get the column names as the header; the second is the Snowflake table, used to get the data.
  • After the first source, take a Filter transformation with a false() condition, since we only want the destination table's column names, not its data.
  • After this, take a Union transformation to append the data of the Snowflake table.
  • Now you can transform this data.
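If renaming columns is not an option (as with the parameterized sink in the question), a third route is to compute an explicit mapping at runtime and pass it to the copy activity's dynamic mapping property. A Python sketch, assuming you can fetch the two column lists yourself (for example via Lookup activities); the case-insensitive pairing logic here is an assumption, but the TabularTranslator JSON shape is what ADF's copy activity accepts:

```python
import json

def build_translator(source_cols, sink_cols):
    """Pair source and sink columns case-insensitively and emit the
    TabularTranslator object that a copy activity's dynamic 'mapping'
    property accepts, so nothing is hardcoded in the sink dataset."""
    sink_by_lower = {c.lower(): c for c in sink_cols}
    mappings = [
        {"source": {"name": s}, "sink": {"name": sink_by_lower[s.lower()]}}
        for s in source_cols
        if s.lower() in sink_by_lower  # skip columns with no sink match
    ]
    return {"type": "TabularTranslator", "mappings": mappings}

# Hypothetical column lists, standing in for Lookup activity output:
translator = build_translator(
    ["CONTENTTYPEID", "NAME"],
    ["ContentTypeID", "Name", "Extra"],
)
print(json.dumps(translator, indent=2))
```

In the pipeline you would generate this object and feed it to the copy activity's mapping via an expression, which keeps the sink fully parameterized.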

Upvotes: 1
