HISI

Reputation: 4797

What's the difference between SparkConf and SparkContext?

I ran into a problem with PySpark when I imported SparkContext from pyspark, and then found that SparkConf can be imported from pyspark as well. What is the difference between these two Spark classes?

Upvotes: 3

Views: 6014

Answers (2)

Reeves

Reputation: 736

SparkConf is a configuration class for setting configuration values as key-value pairs.

SparkContext is the main entry point for establishing a connection to the cluster.

Implementation of SparkConf (abridged):

class SparkConf(object):

    def __init__(self, loadDefaults=True, _jvm=None, _jconf=None):
        """
        Create a new Spark configuration.
        """
        if _jconf:
            self._jconf = _jconf
        else:
            from pyspark.context import SparkContext
            _jvm = _jvm or SparkContext._jvm

Here SparkContext is imported inside the constructor, which is how the configuration gets a handle on the JVM. Conversely, SparkContext takes a SparkConf as a parameter, so you can pass your configuration to it.

Thus the two classes cooperate: SparkConf holds the settings, and SparkContext consumes them when it starts.
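
To make the relationship concrete, here is a minimal sketch of wiring the two together in PySpark. The app name, master URL, and config key are illustrative values, not anything from the question:

from pyspark import SparkConf, SparkContext

# Build the configuration as key-value pairs.
conf = (SparkConf()
        .setAppName("example-app")             # illustrative app name
        .setMaster("local[*]")                 # illustrative master URL
        .set("spark.executor.memory", "1g"))   # arbitrary config key-value pair

# The context consumes the configuration when it starts.
sc = SparkContext(conf=conf)
print(sc.getConf().get("spark.executor.memory"))  # prints "1g"
sc.stop()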

Upvotes: 0

Chandan Ray

Reputation: 2091

SparkContext is the entry point to the Spark environment. Every Spark application needs to create a SparkContext object. In Spark 2.x you can use SparkSession instead of SparkContext (see the sketch after the example below).

SparkConf is the class that gives you various options for providing configuration parameters.

import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf().setMaster("local[*]").setAppName("test")
val sc = new SparkContext(conf)

The Spark configuration is passed to the SparkContext. You can also set other application configuration values in the SparkConf and pass it to the SparkContext.
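
As for the Spark 2.x note above, here is a minimal PySpark sketch, assuming the same illustrative master URL and app name: SparkSession's builder accepts the configuration directly and still exposes the underlying SparkContext.

from pyspark.sql import SparkSession

# SparkSession.builder takes the configuration directly.
spark = (SparkSession.builder
         .master("local[*]")                      # illustrative master URL
         .appName("test")                         # illustrative app name
         .config("spark.executor.memory", "1g")   # arbitrary config key
         .getOrCreate())

sc = spark.sparkContext  # the underlying SparkContext is still available
spark.stop()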

Upvotes: 2
