Reputation: 4232
I have the following problem:
I have to store in the database the data entered in an input field inside a form, and I validate that the input doesn't exceed 40 characters.
The problem is, if I insert text that is exactly 40 characters long but one of them is accented, the validation (correctly) reports no errors, yet when the value is stored in the database the following exception is thrown:
ORA-12899: value too large for column "DBUSER"."TABLE"."COLUMN" (actual: 41, maximum: 40)
It seems that the accented character takes up more than one byte in the database.
I guess this has to do with the encoding, but I have no idea where to start looking. Any ideas?
Thanks!!
Upvotes: 3
Views: 2125
Reputation: 36922
Use CHAR length semantics, as @Adam Musch suggested. You really don't want to use NVARCHAR2 or guess at the possible number of bytes.
create table my_table1(small_string varchar2(1 byte));
create table my_table2(small_string varchar2(1 char));
insert into my_table1 values('Þ'); --"ORA-12899: value too large for ..."
insert into my_table2 values('Þ'); --works fine
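You can confirm the byte count directly; LENGTH counts characters while LENGTHB counts bytes (this assumes an AL32UTF8 database, where 'Þ' encodes as two bytes):
select length('Þ') chars, lengthb('Þ') bytes from dual; --1 character, 2 bytes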
You can explicitly set the length semantics to either BYTE or CHAR, but most likely you're using the default value, BYTE. The default is determined by NLS_LENGTH_SEMANTICS. Check the value with this query:
select * from v$parameter where name = 'nls_length_semantics';
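If you can't read V$PARAMETER, the same session-level setting is also exposed through the NLS_SESSION_PARAMETERS view:
select value from nls_session_parameters where parameter = 'NLS_LENGTH_SEMANTICS';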
You can change the default with the statement below. (But since you'll probably forget to set this parameter in every session, it's more reliable to use CHAR explicitly in your DDL.)
alter session set nls_length_semantics = char;
That setting doesn't change existing objects; you'll need to manually change your table with SQL like:
alter table my_table1 modify (small_string varchar2(1 char));
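You can check which semantics each column actually uses in USER_TAB_COLUMNS; CHAR_USED is 'B' for byte semantics and 'C' for character semantics:
select column_name, char_length, char_used from user_tab_columns where table_name = 'MY_TABLE1';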
Upvotes: 3
Reputation: 328770
Use NVARCHAR2 as the type for the column, or make the column at least six times as wide as the longest input (one Unicode character can take up to six bytes in Oracle's UTF8 character set, and up to four in AL32UTF8).
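For example, for a 40-character column like the one in the question (table and column names here are made up), either of these would avoid the error:
create table my_form_data1(my_column nvarchar2(40)); --option 1: national character set, sized in characters
create table my_form_data2(my_column varchar2(240 byte)); --option 2: VARCHAR2 sized for the 6-byte worst case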
Upvotes: 4