james kam

Reputation: 49

cassandra dsbulk mapping failed

I am using dsbulk to load a dataset into DataStax Astra.

error message: (screenshot not included)

my table structure:

CREATE TABLE project(
 FL_DATE date, 
 OP_CARRIER text, 
 DEP_DELAY float, 
 ARR_DELAY float, 
 PRIMARY KEY ((FL_DATE), OP_CARRIER)
) WITH CLUSTERING ORDER BY (OP_CARRIER ASC);

my mapping error: (screenshots not included)

I tried changing the data type, but it is still not working. I would appreciate it if anyone could help me.

Upvotes: 1

Views: 295

Answers (1)

Madhavan

Reputation: 649

Assumptions:

  • Both the secure connect bundle and the input CSV are located in the /path/to/ directory

Table Structure:

token@cqlsh:projectjk> DESC TABLE projectjk;

CREATE TABLE projectjk.projectjk (
    fl_date date,
    op_carrier text,
    arr_delay float,
    dep_delay float,
    PRIMARY KEY ((fl_date), op_carrier)
) WITH CLUSTERING ORDER BY (op_carrier ASC)
...;

Starting with an empty table:

token@cqlsh:projectjk> select * from projectjk;

 fl_date | op_carrier | arr_delay | dep_delay
---------+------------+-----------+-----------

(0 rows)

Sample input CSV file contents:

% cat /path/to/projectjk.csv 
fl_date,op_carrier,dep_delay,arr_delay
2020-01-01,WN,44.0,363.0
2020-01-02,AN,42.0,143.42

The DSBulk configuration file contents are:

% cat projectjk.conf 
dsbulk {
  connector {
    name = "csv"
    csv {
      url = "/path/to/projectjk.csv"
      header = true
    }
  }
  schema {
    keyspace = projectjk
    table = projectjk
  }
  log.stmt.level = EXTENDED
}
datastax-java-driver {
  basic {
    cloud.secure-connect-bundle = "/path/to/secure-connect-projectjk.zip"
  }
  advanced.auth-provider {
    username = "CHANGE_ME"
    password = "CHANGE_ME"
  }
}
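
With header=true, DSBulk maps each CSV field to the table column of the same name, so the field order in the file (dep_delay before arr_delay) does not have to match the table definition. For reference, the same settings could also be passed directly on the command line with DSBulk's shortcut options instead of a configuration file; a minimal sketch, assuming the same paths and placeholder credentials:

./dsbulk load \
  -url /path/to/projectjk.csv \
  -header true \
  -k projectjk \
  -t projectjk \
  -b /path/to/secure-connect-projectjk.zip \
  -u CHANGE_ME \
  -p CHANGE_ME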

The DSBulk load command executed is:

./dsbulk load -f projectjk.conf
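
If the CSV header names did not match the table column names, an explicit field-to-column mapping could be supplied with the -m (schema.mapping) option; a sketch, assuming the header row shown above:

./dsbulk load -f projectjk.conf \
  -m "fl_date = fl_date, op_carrier = op_carrier, dep_delay = dep_delay, arr_delay = arr_delay"

Once the load completes, dsbulk count (with the same keyspace, table, and connection options) or a simple SELECT in cqlsh can confirm that both rows were written.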

Upvotes: 0
