Dev Anand

Reputation: 354

How to add ARRAY column to Spark table (using ALTER TABLE)?

I am trying to add a new column of array type to the table with a default value.

%sql
ALTER TABLE testdb.tabname ADD COLUMN new_arr_col ARRAY DEFAULT ['A','B','C'];

But it says that the data type is not supported:

Error in SQL statement: ParseException: 
DataType array is not supported.(line 1, pos 54)

== SQL ==
ALTER TABLE testdb.dim_category ADD COLUMN c_cat_area ARRAY

So, is there no way to add an array column directly to the table? Kindly assist me on this. Thanks in advance!

Upvotes: 3

Views: 3213

Answers (1)

Jacek Laskowski

Reputation: 74669

The reason for this error is that complex types (e.g. ARRAY) require their element type to be specified (cf. SqlBase.g4):

dataType
    : complex=ARRAY '<' dataType '>'                            #complexDataType
    | complex=MAP '<' dataType ',' dataType '>'                 #complexDataType
    | complex=STRUCT ('<' complexColTypeList? '>' | NEQ)        #complexDataType
    | INTERVAL from=(YEAR | MONTH) (TO to=MONTH)?               #yearMonthIntervalDataType
    | INTERVAL from=(DAY | HOUR | MINUTE | SECOND)
      (TO to=(HOUR | MINUTE | SECOND))?                         #dayTimeIntervalDataType
    | identifier ('(' INTEGER_VALUE (',' INTEGER_VALUE)* ')')?  #primitiveDataType
    ;

In your case, it'd be as follows:

ARRAY<CHAR(3)>
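
Applied to the original statement, a minimal sketch of the corrected DDL, assuming STRING elements rather than CHAR(3) and a table format/Spark version that accepts a DEFAULT clause on added columns (if yours does not, drop the DEFAULT and backfill with an UPDATE instead):

%sql
-- The element type is spelled out inside ARRAY<...>; DEFAULT support depends on
-- the table format and Spark version, so treat that clause as optional.
ALTER TABLE testdb.tabname ADD COLUMN new_arr_col ARRAY<STRING> DEFAULT ARRAY('A', 'B', 'C');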

Upvotes: 7
