F0RR

Reputation: 1620

Hardcoded strings are cut in half with node-oracle

I use node-oracle to connect to an Oracle db.

When I select values from tables with cyrillic data, everything is fine, but if I call a procedure like this:

CREATE OR REPLACE PROCEDURE TEST_ENCODING (CUR OUT SYS_REFCURSOR) AS 
BEGIN
  open cur for
    select 'тест' as hello from dual; -- cyrillic hardcoded text
END TEST_ENCODING;

and then call it from node:

// `connection` is an open node-oracle connection; the out parameter is a ref cursor
connection.execute("call TEST_ENCODING(:1)", [new oracle.OutParam(oracle.OCCICURSOR)],
  function (err, result) {
    console.log(result)
  }
);

The result is: [ { HELLO: 'те' } ] (the string is cut in half).

The database is configured as follows:

NLS_LANGUAGE    AMERICAN
NLS_TERRITORY   AMERICA
NLS_CURRENCY    $
NLS_ISO_CURRENCY    AMERICA
NLS_NUMERIC_CHARACTERS  .,
NLS_CHARACTERSET    CL8MSWIN1251
NLS_CALENDAR    GREGORIAN
NLS_DATE_FORMAT DD-MON-RR
NLS_DATE_LANGUAGE   AMERICAN
NLS_SORT    BINARY
NLS_TIME_FORMAT HH.MI.SSXFF AM
NLS_TIMESTAMP_FORMAT    DD-MON-RR HH.MI.SSXFF AM
NLS_TIME_TZ_FORMAT  HH.MI.SSXFF AM TZR
NLS_TIMESTAMP_TZ_FORMAT DD-MON-RR HH.MI.SSXFF AM TZR
NLS_DUAL_CURRENCY   $
NLS_COMP    BINARY
NLS_LENGTH_SEMANTICS    BYTE
NLS_NCHAR_CONV_EXCP FALSE
NLS_NCHAR_CHARACTERSET  AL16UTF16
NLS_RDBMS_VERSION   11.2.0.3.0

In my local environment: NLS_LANG=AMERICAN_AMERICA.UTF8 (I also tried NLS_LANG=RUSSIAN_RUSSIA.UTF8 and RUSSIAN_RUSSIA.AL32UTF8 with the same results)

My configuration:
Mac OS X 10.9
Oracle Client 11.2
node 0.10.22
node-oracle 0.3.4

Upvotes: 6

Views: 1259

Answers (3)

ThinkJet

Reputation: 6735

  1. It seems that for now there is no support for encodings other than UTF8 in node-oracle, because node.js doesn't support native encodings (proof).

  2. To handle strings properly you need to set the NLS_LANG parameter on the client to the same value as in the database (CL8MSWIN1251).

So, you can choose between two options:

A) Migrate the database to UTF8 encoding (a way to check the current database character set from node is sketched after this list).

B) Patch the node-oracle source to convert strings and CLOBs to UTF8 before returning their content to node.js, and to convert from UTF8 to CL8MSWIN1251 before passing data to Oracle. The OCI interface has functions for such conversions. E.g. for your local purposes it's enough to patch the OBJ_GET_STRING macro in utils.h.
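
Not part of the original answer, but a quick way to confirm from node which character sets the database actually uses when weighing these options; a minimal sketch, assuming the same open node-oracle connection as in the question:

connection.execute(
  "select parameter, value from nls_database_parameters " +
  "where parameter in ('NLS_CHARACTERSET', 'NLS_NCHAR_CHARACTERSET')",
  [],
  function (err, result) {
    if (err) { return console.error(err); }
    // Expect CL8MSWIN1251 for NLS_CHARACTERSET on this database; anything other
    // than a UTF8 variant means option A is a real migration.
    console.log(result);
  }
);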

P.S. node-oracle looks very simplistic at the moment, so be prepared for many surprises (e.g. no support for BLOBs and collections, lack of connection settings, and so on).

Upvotes: 1

Galbarad

Reputation: 453

Are you sure that your source code is in the UTF-8 character set?
If the problem occurs only with hardcoded symbols, maybe your GUI for Oracle development does not support UTF-8.
I had a similar problem with special characters like ¥ in my package, and SQL*Plus converted those special characters into something unreadable.
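
One way to check this (not from the answer itself): Oracle's DUMP() function shows the raw bytes stored for the procedure source, and its output is plain ASCII, so it is not affected by the truncation described in the question. A minimal sketch, assuming the same open node-oracle connection:

connection.execute(
  "select line, dump(text, 1016) as bytes " +
  "from user_source where name = 'TEST_ENCODING' order by line",
  [],
  function (err, result) {
    if (err) { return console.error(err); }
    // The line containing the literal shows its stored byte values together with
    // the character set name, so you can see whether 'тест' survived the trip
    // from your development GUI into the database.
    console.log(result);
  }
);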

Upvotes: 0

Eugene Bobkov

Reputation: 1

It could be because your database's primary charset is CL8MSWIN1251, while your local setting specifies UTF8.

NLS_CHARACTERSET    CL8MSWIN1251

The variable NLS_LANG specifies how to interpret your local environment:

NLS_LANG = language_territory.charset

The last part of NLS_LANG describes the local charset; it tells Oracle which character set you are using on the client side, so it can do the proper conversion. Probably the values from tables are converted properly, while the character set of the literal selected from DUAL is not identified correctly.

Please try setting the NLS_LANG variable to AMERICAN_AMERICA.CL8MSWIN1251 (or RUSSIAN_RUSSIA.CL8MSWIN1251; the language and territory parts don't really matter).
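
A minimal sketch of how that might look from node. Assumptions: the Oracle client reads NLS_LANG from the process environment, so it has to be set before the driver initialises (exporting it in the shell before starting node is the safer route), and the connection parameters below are placeholders:

process.env.NLS_LANG = 'AMERICAN_AMERICA.CL8MSWIN1251'; // must be set before connecting

var oracle = require('oracle'); // node-oracle

oracle.connect({ hostname: 'localhost', port: 1521, database: 'orcl',
                 user: 'scott', password: 'tiger' }, // placeholder credentials
  function (err, connection) {
    if (err) { return console.error(err); }
    connection.execute("call TEST_ENCODING(:1)",
      [new oracle.OutParam(oracle.OCCICURSOR)],
      function (err, result) {
        if (err) { return console.error(err); }
        console.log(result); // hopefully [ { HELLO: 'тест' } ] once the charsets agree
        connection.close();
      });
  });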

Upvotes: 0
