Vicki

Reputation: 43

How to insert a BLOB column with non-ASCII characters in Oracle?

I am trying to insert/update data into a BLOB column where a few non-ASCII characters are present. When I retrieve the column values, they do not come back the same as when I inserted them.

// This is a sample data snippet from a very large text where I see the difference after the update.

I tried this update query:

UPDATE Blob_table SET FILE_BLOB = utl_raw.cast_to_raw('00001Ý#01¨ return') where eid = 1;

When I query the table:

SELECT FILE_BLOB FROM Blob_table WHERE eid = 1;

Output of the above query: 00001�#01¨ return

Expected : 00001Ý#01¨ return

Actual : 00001�#01¨ return

Can anyone explain why the data is retrieved incorrectly and how to fix it?

Oracle Database Version:

"Oracle Database 18c Enterprise Edition Release 18.0.0.0.0 - Production Version 18.5.0.0.0"

Character Set Details:

NLS_NCHAR_CHARACTERSET - AL16UTF16

NLS_CHARACTERSET - AL32UTF8

Upvotes: 0

Views: 3394

Answers (1)

Wernfried Domscheit

Reputation: 59557

A BLOB holds binary data, so the expected output should be something like 3030303031C39D233031C2A82072657475726E.
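
As a quick check (a minimal sketch using the table and column names from your question), you can look at the stored bytes in hex:

-- show the first 100 bytes of the BLOB as a hex string
SELECT RAWTOHEX(DBMS_LOB.SUBSTR(FILE_BLOB, 100, 1)) AS first_bytes_hex
FROM Blob_table
WHERE eid = 1;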

If you expect 00001Ý#01¨ return, then you need to convert the binary data back to character data; apparently your tool does this implicitly. Using CAST_TO_VARCHAR2 should return the correct result.
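
For example (a sketch; DBMS_LOB.SUBSTR is used here to pull a RAW chunk out of the BLOB, so it only covers the first 2000 bytes):

-- interpret the first 2000 bytes as text in the database character set
SELECT UTL_RAW.CAST_TO_VARCHAR2(DBMS_LOB.SUBSTR(FILE_BLOB, 2000, 1)) AS file_text
FROM Blob_table
WHERE eid = 1;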

CAST_TO_VARCHAR2 uses the current database character set (AL32UTF8 in your case). If your client uses a different character set, then you have to use DBMS_LOB.CONVERTTOCLOB, which lets you specify the character set explicitly.
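
A sketch of that conversion in an anonymous PL/SQL block (the AL32UTF8 character-set id is an assumption; replace it with whatever character set the bytes were actually written in):

-- run with SET SERVEROUTPUT ON to see the result
DECLARE
  l_blob      BLOB;
  l_clob      CLOB;
  l_dest_off  INTEGER := 1;
  l_src_off   INTEGER := 1;
  l_lang_ctx  INTEGER := DBMS_LOB.DEFAULT_LANG_CTX;
  l_warning   INTEGER;
BEGIN
  SELECT FILE_BLOB INTO l_blob FROM Blob_table WHERE eid = 1;

  DBMS_LOB.CREATETEMPORARY(l_clob, TRUE);

  -- convert the raw bytes to a CLOB, stating which character set they are encoded in
  DBMS_LOB.CONVERTTOCLOB(
    dest_lob     => l_clob,
    src_blob     => l_blob,
    amount       => DBMS_LOB.LOBMAXSIZE,
    dest_offset  => l_dest_off,
    src_offset   => l_src_off,
    blob_csid    => NLS_CHARSET_ID('AL32UTF8'),  -- assumed; use the encoding the data was written in
    lang_context => l_lang_ctx,
    warning      => l_warning
  );

  DBMS_OUTPUT.PUT_LINE(DBMS_LOB.SUBSTR(l_clob, 2000, 1));
  DBMS_LOB.FREETEMPORARY(l_clob);
END;
/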

However, the simplest way would be to convert your column to the proper data type, CLOB; this will make your life much easier.
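
One way to do that migration (a rough outline, not a drop-in script; the temporary column name FILE_CLOB is made up for illustration):

-- add a CLOB column alongside the existing BLOB column
ALTER TABLE Blob_table ADD (FILE_CLOB CLOB);

-- populate FILE_CLOB from FILE_BLOB, e.g. with DBMS_LOB.CONVERTTOCLOB as sketched above

-- drop the old column and reuse its name so existing queries keep working
ALTER TABLE Blob_table DROP COLUMN FILE_BLOB;
ALTER TABLE Blob_table RENAME COLUMN FILE_CLOB TO FILE_BLOB;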

Upvotes: 0
