Digvijay Chauhan

Reputation: 31

Spark: java.sql.SQLException: No suitable driver found for jdbc:postgresql://localhost/postgres

In my Spark application, I am trying to connect to a local Postgres database using the following line:

val conn = DriverManager.getConnection("jdbc:postgresql://localhost/postgres", "postgres", "*Qwerty#")

The Postgres server is running on port 5432 (the default). I have also tried including the port in the URL. I have also tried Class.forName("org.postgresql.Driver"), but it throws a ClassNotFoundException. I have made sure that the driver is on the classpath.

I am running Spark in local mode.

But I am getting the above exception.

I have included the JDBC driver via sbt as listed here: https://mvnrepository.com/artifact/org.postgresql/postgresql/42.2.2
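For reference, pulling that driver in via sbt would look something like this in build.sbt (the version matches the linked page; the rest of the build file is assumed):

libraryDependencies += "org.postgresql" % "postgresql" % "42.2.2"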

Upvotes: 1

Views: 6469

Answers (2)

Jack

Reputation: 197

You can also try out this code:

import java.sql.DriverManager
import java.util.Properties

val dbProperties = new Properties()
dbProperties.put("driver", "org.postgresql.Driver") // only read by Spark's JDBC data source; DriverManager itself ignores this key
dbProperties.put("user", "postgres")
dbProperties.put("password", "*Qwerty#")

val conn = DriverManager.getConnection("jdbc:postgresql://localhost:5432/postgresDB", dbProperties)

Upvotes: 5

Digvijay Chauhan

Reputation: 31

The problem was that the executors could not access the driver jar.

Passing the driver jar via the spark.jars configuration property solved it.

It's described in the Spark documentation under spark.jars:

Comma-separated list of jars to include on the driver and executor classpaths. Globs are allowed.
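As an illustration, the property can be set when building the SparkSession (the app name and jar path below are just placeholders), or equivalently with --jars on spark-submit:

import org.apache.spark.sql.SparkSession

val spark = SparkSession
  .builder()
  .appName("postgres-example") // placeholder name
  .master("local[*]")
  // ships the JDBC driver jar to both the driver and executor classpaths;
  // the path is illustrative only
  .config("spark.jars", "/path/to/postgresql-42.2.2.jar")
  .getOrCreate()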

Upvotes: 2
