Reputation: 4288
I'm using Spark SQL for querying. I'm trying something like:
case class Person(name: String, age: Int)

val sqc = new SQLContext(sc)
import sqc.createSchemaRDD

val p1 = Person("Hari", 22)
val rdd1 = sc.parallelize(Array(p1))
rdd1.registerAsTable("data")

val p2 = Person("sagar", 22)
val rdd2 = sc.parallelize(Array(p2))
rdd2.insertInto("data")
but getting the error
"java.lang.AssertionError: assertion failed: No plan for InsertIntoTable Map(), false"
It seems I'm using insertInto the wrong way?
Upvotes: 2
Views: 2429
Reputation: 5202
I also hit this error with a normal SchemaRDD. When I tried insertInto with a SchemaRDD backed by a Parquet file, it succeeded, so it looks like only Parquet-backed tables are supported as the target of insertInto. The Spark 1.1.0 API documentation on SQLContext says the same about tables created from Parquet files: "This registered table can be used as the target of future insertInto operations."
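A minimal sketch of that approach, assuming the same Person case class as in the question; the file path "people.parquet" and the table name are just examples:

```scala
import org.apache.spark.sql.SQLContext

case class Person(name: String, age: Int)

val sqc = new SQLContext(sc)
import sqc.createSchemaRDD

// Write the initial data out as a Parquet file, then register the
// Parquet-backed table -- this kind of table supports insertInto.
val rdd1 = sc.parallelize(Array(Person("Hari", 22)))
rdd1.saveAsParquetFile("people.parquet")  // example path
sqc.parquetFile("people.parquet").registerTempTable("data")

// Inserting into the Parquet-backed table should now succeed.
val rdd2 = sc.parallelize(Array(Person("sagar", 22)))
rdd2.insertInto("data")
```

Note that insertInto appends rows to the underlying Parquet data, whereas a table registered directly from an in-memory RDD has no storage to insert into, which is why the planner fails with "No plan for InsertIntoTable".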
Upvotes: 2