philantrovert

Reputation: 10092

Specifying jceks file with Spark JDBC

I am trying to connect to Oracle via the sqlContext.read.format("jdbc") method. Everything works fine, but when building the JDBC string I have to specify the database username and password in plain text:

val jdbcString = "jdbc:oracle:thin:USERNAME/PASSWORD@//HOSTNAME:PORT/SID"
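For context, this is roughly how that string ends up in the Spark JDBC reader (the table name below is just a placeholder):

val df = sqlContext.read
  .format("jdbc")
  .option("url", jdbcString)
  .option("dbtable", "MY_TABLE")          // placeholder table name
  .option("driver", "oracle.jdbc.driver.OracleDriver")
  .load()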

However, I do have a jceks file on HDFS which contains the password. I was wondering if there is any way to leverage that file for the JDBC connection instead of a plain-text password, like in Sqoop, where we can do:

sqoop import -Dhadoop.security.credential.provider.path=jceks://hdfs/data/credentials/oracle.password.jceks

Thanks.

Upvotes: 1

Views: 3756

Answers (2)

philantrovert

Reputation: 10092

This was achieved using CredentialProviderFactory.

import org.apache.hadoop.security.alias.CredentialProviderFactory

val conf = new org.apache.hadoop.conf.Configuration()
val alias = "password.alias"
val jceksPath = "jceks://hdfs/user/data/alias/MySQL.password.jceks"

// Point the Hadoop configuration at the credential provider (the jceks file on HDFS)
conf.set(CredentialProviderFactory.CREDENTIAL_PROVIDER_PATH, jceksPath)

// getPassword returns Array[Char], so convert it to a String
val password = conf.getPassword(alias).mkString
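From there, a sketch of how the recovered password can be passed to the JDBC reader instead of being embedded in the connection string (the URL, user and table name below are placeholders):

val df = sqlContext.read
  .format("jdbc")
  .option("url", "jdbc:oracle:thin:@//HOSTNAME:PORT/SID")  // placeholder URL
  .option("user", "USERNAME")                              // placeholder user
  .option("password", password)                            // value read from the jceks file
  .option("dbtable", "MY_TABLE")                           // placeholder table
  .option("driver", "oracle.jdbc.driver.OracleDriver")
  .load()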

Upvotes: 5

Jean de Lavarene

Reputation: 3773

The Oracle JDBC thin driver doesn't support extracting the password from a jceks file. It supports Oracle wallets instead (the password can be stored in a wallet).
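As a rough sketch of that approach, assuming a Secure External Password Store wallet has already been created with mkstore and the Oracle PKI jars (oraclepki, osdt_cert, osdt_core) are on the driver and executor classpaths, the connection omits the credentials and points the driver at the wallet instead. The paths, TNS alias and table name here are hypothetical:

// Hypothetical wallet and tnsnames.ora locations
System.setProperty("oracle.net.tns_admin", "/path/to/tns")
System.setProperty("oracle.net.wallet_location",
  "(SOURCE=(METHOD=FILE)(METHOD_DATA=(DIRECTORY=/path/to/wallet)))")

// No username/password in the URL: the driver resolves them from the wallet
// via the DB_ALIAS entry defined in tnsnames.ora
val walletUrl = "jdbc:oracle:thin:/@DB_ALIAS"

val df = sqlContext.read
  .format("jdbc")
  .option("url", walletUrl)
  .option("dbtable", "MY_TABLE")  // placeholder table
  .load()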

Upvotes: 0
