Reputation: 877
I have a library which connects to a SQL Server 2012 instance. One of the registered tables contains international addresses, so its nvarchar columns can contain non-Latin characters (Chinese, Russian, etc.). When connecting to the table via Enterprise Guide, the non-Latin characters are missing.
I have tried changing the SAS Foundation sasv9.cfg to point at the u8 NLS folder (default encoding utf-8) instead of the en folder (wlatin1).
However, my users then complain that their datasets written with the wlatin1 encoding are inaccessible. Changing the sasv9.cfg file back to the 'en' NLS folder resolves the issue.
Is there a way users can work around this by starting their individual sessions as utf-8?
Thanks
Upvotes: 2
Views: 2144
Reputation: 12691
You have a few approaches available. One is to use the inencoding=utf8 library option.
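A minimal sketch, assuming a Base SAS library at a hypothetical path (substitute your own location and dataset names):

```sas
/* Hypothetical path - the inencoding= option tells SAS to treat
   datasets in this library as utf-8 when reading them */
libname intl '/data/international' inencoding=utf8;

data work.addresses;
  set intl.addresses; /* values are transcoded on read */
run;
```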
If this fails, you could then try using the encoding=utf8 dataset option.
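This overrides the encoding for a single read rather than the whole library. A sketch, with yourlib.addresses as a placeholder name:

```sas
/* Hypothetical dataset - encoding= applies only to this read */
data work.addresses;
  set yourlib.addresses(encoding='utf-8');
run;
```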
As a last resort, you can use the kcvt function to transcode the specific variable, e.g.:
kcvt(YourVariable,'utf8','wlatin1')
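In a data step, the call above might look like this (YourVariable and yourlib.addresses are placeholder names; the target length is an assumption):

```sas
/* Transcode a single variable from utf-8 to wlatin1.
   Characters with no wlatin1 equivalent cannot survive the
   conversion, so this helps only for Latin-representable text. */
data work.fixed;
  set yourlib.addresses;
  length fixed_var $200; /* assumed width - size to your data */
  fixed_var = kcvt(YourVariable, 'utf8', 'wlatin1');
run;
```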
You can't modify the encoding of a workspace session without updating the config file, although it is possible to 'game' a stored process session by setting the context of your client - see the locale gotcha.
Upvotes: 3