Saurabh Chakraborty

Reputation: 121

How to fix 'Exception: Java gateway process exited before sending its port number' in Eclipse IDE

I am trying to connect to MySQL using PySpark in the PyDev environment of the Eclipse IDE, and I am getting the error below:

Exception: Java gateway process exited before sending its port number

I have checked that Java is properly installed, and I have also set PYSPARK_SUBMIT_ARGS to the value --master local[*] --jars path\mysql-connector-java-5.1.44-bin.jar pyspark-shell under Window -> Preferences -> PyDev -> Python Interpreter -> Environment.

The Java path is also set. I tried setting it via code as well (the commented-out lines below), but no luck.

# import os
from pyspark import SparkConf, SparkContext
from pyspark.sql.context import SQLContext

# os.environ['JAVA_HOME'] = 'C:/Program Files/Java/jdk1.8.0_141/'
# os.environ['PYSPARK_SUBMIT_ARGS'] = '--jars D:/Softwares/mysql-connector-java-5.1.44.tar/mysql-connector-java-5.1.44/mysql-connector-java-5.1.44-bin.jar pyspark-shell'

conf = SparkConf().setMaster('local').setAppName('MySQLdataread')
sc = SparkContext(conf=conf)
sqlContext = SQLContext(sc)

dataframe_mysql = sqlContext.read.format("jdbc") \
    .option("url", "jdbc:mysql://localhost:3306/") \
    .option("driver", "com.mysql.jdbc.Driver") \
    .option("dbtable", "XXXXX") \
    .option("user", "root") \
    .option("password", "XXXX") \
    .load()

dataframe_mysql.show()
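
For what it's worth, a quick sanity check along these lines (a hypothetical test snippet, not part of the job) confirms that Java itself is reachable from the same Python interpreter:

import os
import subprocess

# Print the environment that PySpark will see, and confirm the JVM
# can actually be launched from this interpreter.
print(os.environ.get('JAVA_HOME'))
print(os.environ.get('PYSPARK_SUBMIT_ARGS'))
subprocess.check_call(['java', '-version'])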

Upvotes: 2

Views: 8290

Answers (1)

Frank

Reputation: 154

My problem was slightly different: I am running Spark in Spyder on Windows. When I was using

from pyspark.sql import SQLContext, SparkSession

I hit this exception, and following the links from a Google search did not solve the problem.

Then I changed the import to:

from pyspark.sql import SparkSession
from pyspark import SQLContext

and the error message disappeared.

I am running on Windows with Anaconda3, Python 3.7, and Spyder. I hope this is helpful to someone.

Edit:
Later, I found that the real problem was the following: when any part of the configuration is not valid, this same exception shows up. Previously, I had used 28gb and 4gb instead of 28g and 4g, and that caused all the problems I had.
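
In other words, the failing builder looked roughly like this (reconstructed for illustration; the gb suffixes are what made the Java process exit at startup, before the gateway could report its port):

from pyspark.sql import SparkSession
from pyspark import SQLContext

# Broken: these memory values end up at JVM launch, where '28gb' and
# '4gb' are rejected, so the Java gateway exits before sending its port.
spark = SparkSession.builder \
    .master('local') \
    .appName('muthootSample1') \
    .config('spark.executor.memory', '28gb') \
    .config('spark.driver.memory', '4gb') \
    .config("spark.cores.max", "6") \
    .getOrCreate()

The corrected version: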

from pyspark.sql import SparkSession
from pyspark import SQLContext
spark = SparkSession.builder \
    .master('local') \
    .appName('muthootSample1') \
    .config('spark.executor.memory', '28g') \
    .config('spark.driver.memory','4g')\
    .config("spark.cores.max", "6") \
    .getOrCreate()

Upvotes: 0
