Sandesh Puttaraj

Reputation: 333

Error connecting to a local Oracle database from Apache Spark (Java)

Below is the code I am using to connect to a local Oracle database from an Apache Spark Java program, but I am getting this error:

java.sql.SQLSyntaxErrorException: ORA-00903: invalid table name

Here is my code:

public static void main( String[] args )
{
  String MYSQL_DRIVER = "oracle.jdbc.driver.OracleDriver";
  String MYSQL_CONNECTION_URL = "jdbc:oracle:thin:@//192.168.2.85:1521/stage";

  JavaSparkContext sc = new JavaSparkContext(new SparkConf().setAppName("SparkJdbcDs").setMaster("local[*]"));
  SQLContext sqlContext = new SQLContext(sc);

  Map<String, String> options = new HashMap<String, String>();
  options.put("driver", MYSQL_DRIVER);
  options.put("url", MYSQL_CONNECTION_URL);
  options.put("dbtable", "select LOC_ITEM_ID from loc_item_master");
  options.put("partitionColumn", "LOC_ITEM_ID");
  options.put("user", "ad_tables"); 
  options.put("password", "ad_tables"); 
  options.put("lowerBound", "4467");
  options.put("upperBound", "4749");
  options.put("numPartitions", "10");
  try {
        Dataset<Row> jdbcDF = sqlContext.load("jdbc", options);
        List<Row> employeeFullNameRows = jdbcDF.collectAsList();

        for (Row employeeFullNameRow : employeeFullNameRows) {
            System.out.println(employeeFullNameRow);
        }

        sc.close();

  } catch (Exception e) {
        e.printStackTrace();
  }

}

Below is the dependencies section of my POM file:

<dependencies>

    <dependency>
        <groupId>com.databricks</groupId>
        <artifactId>spark-csv_2.11</artifactId>
        <version>1.5.0</version>
    </dependency>

    <!-- https://mvnrepository.com/artifact/com.univocity/univocity-parsers -->
    <dependency>
        <groupId>com.univocity</groupId>
        <artifactId>univocity-parsers</artifactId>
        <version>2.3.0</version>
    </dependency>

    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-mllib_2.11</artifactId>
        <version>2.0.1</version>
    </dependency>

    <!-- Spark dependency -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>2.0.1</version>
    </dependency>

    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.11</artifactId>
        <version>2.0.1</version>
    </dependency>

    <dependency>
        <groupId>com.oracle</groupId>
        <artifactId>ojdbc6</artifactId>
        <version>11.2.0.3</version>
    </dependency>

</dependencies>

Upvotes: 1

Views: 1660

Answers (2)

BHANUMATHI H M

Reputation: 271

You can try the approach below; it works for me.

import java.util.Properties;

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class OracleDatabaseConnection {

    public static void main(String[] args) {

        // Needed on Windows so Spark can locate winutils.exe
        System.setProperty("hadoop.home.dir", "C:\\AD_classfication\\Apachespark\\winutil");

        SparkSession spark = SparkSession.builder()
                .appName("SparkJdbcDs")
                .master("local[*]")
                .getOrCreate();

        Properties connectionProperties = new Properties();
        connectionProperties.put("user", "ad_tables");
        connectionProperties.put("password", "ad_tables");

        // Wrap the query in parentheses so Spark can use it as a table expression
        String query = "(select LOC_ITEM_ID from loc_item_master)";

        Dataset<Row> jdbcDF2 = spark.read()
                .jdbc("jdbc:oracle:thin:@//192.168.2.85:1522/stage", query, connectionProperties);
        jdbcDF2.show();
    }
}
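
If the partitioned read from the question is still wanted, a minimal sketch could use the DataFrameReader.jdbc overload that takes a partition column. The class name OraclePartitionedRead is just for illustration; the URL, credentials, column and bounds are taken from the question's options and may need adjusting for your environment.

import java.util.Properties;

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class OraclePartitionedRead {

    public static void main(String[] args) {

        SparkSession spark = SparkSession.builder()
                .appName("OraclePartitionedRead")
                .master("local[*]")
                .getOrCreate();

        Properties connectionProperties = new Properties();
        connectionProperties.put("user", "ad_tables");
        connectionProperties.put("password", "ad_tables");

        // The subquery is parenthesised so Spark can embed it in the FROM clause.
        String table = "(select LOC_ITEM_ID from loc_item_master)";

        // Split the read into 10 partitions on LOC_ITEM_ID between the bounds
        // given in the question (4467 to 4749).
        Dataset<Row> df = spark.read().jdbc(
                "jdbc:oracle:thin:@//192.168.2.85:1521/stage",
                table,
                "LOC_ITEM_ID",
                4467L,
                4749L,
                10,
                connectionProperties);

        df.show();
    }
}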

Upvotes: 2

T. Gawęda

Reputation: 16086

Your dbtable value is incorrect; try:

 options.put("dbtable", "(select LOC_ITEM_ID from loc_item_master) loc_item_id");

Everything you put in dbtable must be a valid fragment of a FROM clause.

You cannot write select ... from select LOC_ITEM_ID from loc_item_master, but you can write select ... from (select LOC_ITEM_ID from loc_item_master) loc_item_id (note that Oracle does not accept the AS keyword before a table alias).

Of course the table's schema must also be correct; you can qualify the table with the schema.table notation in the query.
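
For completeness, a minimal sketch of how the corrected dbtable value could be plugged back into the question's options map is shown below. It uses the read().format("jdbc") entry point instead of the sqlContext.load call from the question, which is my assumption rather than part of the original answer; the URL, credentials and bounds are copied from the question.

Map<String, String> options = new HashMap<String, String>();
options.put("driver", "oracle.jdbc.driver.OracleDriver");
options.put("url", "jdbc:oracle:thin:@//192.168.2.85:1521/stage");
// dbtable is pasted into "SELECT ... FROM <dbtable>", so a subquery must be
// parenthesised and aliased (Oracle rejects the AS keyword before a table alias).
options.put("dbtable", "(select LOC_ITEM_ID from loc_item_master) loc_item_id");
options.put("partitionColumn", "LOC_ITEM_ID");
options.put("user", "ad_tables");
options.put("password", "ad_tables");
options.put("lowerBound", "4467");
options.put("upperBound", "4749");
options.put("numPartitions", "10");

Dataset<Row> jdbcDF = sqlContext.read().format("jdbc").options(options).load();
jdbcDF.show();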

Upvotes: 0
