Reputation: 1557
I have a CSV file sitting on my blob storage that I wish to insert into a SQL database table. Is that achievable using Spark Scala with JDBC?
Update
I followed this blog post, which helped: http://viralpatel.net/blogs/java-load-csv-file-to-database/
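For reference, here is a minimal sketch of the plain-JDBC approach that blog post describes, translated to Scala (the connection string, table name, and column layout are placeholders for illustration, not from the post):

import java.sql.DriverManager
import scala.io.Source

object CsvToSqlJdbc extends App {
  // Placeholder connection details; adjust for your database
  val conn = DriverManager.getConnection(
    "jdbc:sqlserver://myserver.database.windows.net:1433;database=mydb",
    "username", "password")
  // Hypothetical target table with two columns
  val stmt = conn.prepareStatement(
    "INSERT INTO mytable (id, name) VALUES (?, ?)")
  try {
    // Skip the header row, then add one batch entry per CSV line
    for (line <- Source.fromFile("data.csv").getLines().drop(1)) {
      val cols = line.split(",")
      stmt.setInt(1, cols(0).trim.toInt)
      stmt.setString(2, cols(1).trim)
      stmt.addBatch()
    }
    stmt.executeBatch() // send all inserts in one batch
  } finally {
    stmt.close()
    conn.close()
  }
}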
Upvotes: 0
Views: 1818
Reputation: 609
I would suggest using the Scala Slick library if you just need a JDBC connection to a database to load CSV files.
You can find good examples here: https://codequs.com/p/B1IogRLY/scala-tutorial-create-crud-with-slick-and-mysql/
The example covers much more than what you want to do, but it includes the insert part as well; a sketch of that part is below.
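A minimal Slick 3 batch-insert sketch (the table mapping, column layout, and connection details are assumptions for illustration, not taken from the tutorial):

import slick.jdbc.MySQLProfile.api._
import scala.concurrent.Await
import scala.concurrent.duration._

// Hypothetical mapping for a two-column target table
class Users(tag: Tag) extends Table[(Int, String)](tag, "users") {
  def id   = column[Int]("id", O.PrimaryKey)
  def name = column[String]("name")
  def *    = (id, name)
}

object CsvToDbWithSlick extends App {
  val users = TableQuery[Users]
  // Placeholder connection details
  val db = Database.forURL(
    url      = "jdbc:mysql://localhost:3306/mydb",
    user     = "username",
    password = "password",
    driver   = "com.mysql.cj.jdbc.Driver")
  try {
    // Parse each CSV line into a row tuple, then insert them all in one batch
    val rows = scala.io.Source.fromFile("users.csv").getLines()
      .map(_.split(","))
      .map(cols => (cols(0).trim.toInt, cols(1).trim))
      .toSeq
    Await.result(db.run(users ++= rows), 1.minute)
  } finally db.close()
}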
Upvotes: 0
Reputation: 6385
The short answer: YES.
Extended answer: you can find quite a few decent examples of how to read and process a CSV file in Spark. Depending on the Spark version you have, you might want to use DataFrames or Datasets.
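For example, with Spark 2.x you can read the CSV straight into a DataFrame (the path and options below are assumptions; for Azure blob storage the path would typically use the wasbs:// scheme):

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("CsvToSql")
  .getOrCreate()

// Read the CSV into a DataFrame
val dataFrame = spark.read
  .option("header", "true")       // first line contains column names
  .option("inferSchema", "true")  // derive column types from the data
  .csv("wasbs://container@account.blob.core.windows.net/data.csv")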
The Spark documentation also contains samples of how to store data to a DB using JDBC. Again, depending on the Apache Spark version there might be some differences. Here is an example from v2.1:
// Saving data to a JDBC source
dataFrame.write
  .format("jdbc")
  .option("url", "jdbc:postgresql:dbserver")
  .option("dbtable", "schema.tablename")
  .option("user", "username")
  .option("password", "password")
  .save()
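One thing worth noting: by default, save() will fail if the target table already exists, so you will usually want to set a save mode, e.g. .mode("append") to add rows to an existing table. For SQL Server / Azure SQL you may also need the Microsoft JDBC driver on the classpath, passed via .option("driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver").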
Upvotes: 1