Subramaniya Sai S

Reputation: 13

Unable to Execute More than One Spark Job: "Initial job has not accepted any resources"

I'm using Java against a standalone Spark cluster to execute the code snippet below, and the application status is always WAITING, with the error below in the logs. It stops working when I add the print statement (which triggers the job). Is there any configuration I might have missed that is needed to run multiple jobs?

15/09/18 15:02:56 INFO DAGScheduler: Submitting 2 missing tasks from Stage 0 (MapPartitionsRDD[2] at filter at SparkTest.java:143)

15/09/18 15:02:56 INFO TaskSchedulerImpl: Adding task set 0.0 with 2 tasks

15/09/18 15:03:11 WARN TaskSchedulerImpl: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources

15/09/18 15:03:26 WARN TaskSchedulerImpl: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources

15/09/18 15:03:41 WARN TaskSchedulerImpl: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources

JavaRDD<String> words = input.flatMap(new FlatMapFunction<String, String>()    // Ln:143
        {
            public Iterable<String> call(String x)
            {
                return Arrays.asList(x.split(" "));
            }
        });
// Count all the words (this action triggers the job)
System.out.println("Total words is " + words.count());

Upvotes: 0

Views: 5872

Answers (1)

Henri Benoit

Reputation: 725

This error message means that your application is requesting more resources than the cluster can currently provide, i.e. more cores or more RAM than are available in the cluster.

One of the reasons for this could be that you already have a job running which uses up all the available cores.

When this happens, your job is most likely waiting for another job to finish and release its resources.

You can check this in the Spark UI.
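If a running job is indeed holding all the cores, one common fix is to cap the resources each application may claim, so the standalone scheduler can serve several applications at once. As a sketch, `spark.cores.max` and `spark.executor.memory` are standard Spark properties, but the values below are only examples and must be sized to your own workers:

```properties
# conf/spark-defaults.conf -- example values, adjust to your cluster
# Cap each application at 2 cores so other jobs can still be scheduled
spark.cores.max        2
# Keep executor memory at or below what each worker actually offers
spark.executor.memory  1g
```

The same properties can also be set programmatically on the `SparkConf` (e.g. `conf.set("spark.cores.max", "2")`) before creating the context, which is useful when different applications need different caps.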

Upvotes: 1
