Forec

Reputation: 1

Spark: Increase the number of tasks/partitions

The number of tasks in Spark is decided by the total number of RDD partitions at the beginning of a stage. For example, when a Spark application reads data from HDFS, the partitioning for the Hadoop RDD is inherited from FileInputFormat in MapReduce, which is affected by the HDFS block size, the value of mapred.min.split.size, the compression codec, and so on.
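To illustrate the mechanism above, here is a minimal Scala sketch (the path and the counts are placeholders, not from my actual job). sc.textFile accepts an optional minPartitions hint that feeds into the FileInputFormat split computation:

```scala
import org.apache.spark.{SparkConf, SparkContext}

val sc = new SparkContext(new SparkConf().setAppName("partitions-demo"))

// Default: the partition count falls out of the FileInputFormat logic
// (HDFS block size, min split size, compression codec).
val byDefault = sc.textFile("hdfs:///data/input")

// Passing minPartitions asks Hadoop to produce at least that many splits.
// This only works for splittable inputs; a gzip file still yields a
// single partition because gzip is not splittable.
val withHint = sc.textFile("hdfs:///data/input", 12)

println(s"default: ${byDefault.getNumPartitions}, hinted: ${withHint.getNumPartitions}")
```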

[Screenshot of my tasks]

The tasks in the screenshot took 7, 7, and 4 seconds, and I would like to make them more balanced. Also, the stage is split into only 3 tasks; is there any way to tell Spark how many partitions/tasks to use?

Upvotes: 0

Views: 2794

Answers (1)

Robin

Reputation: 695

The number of tasks depends on the number of partitions. You can set a partitioner for the RDD, and in the partitioner you can specify the number of partitions.
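For example, a minimal sketch (the path and the partition count 8 here are placeholders). partitionBy takes a Partitioner whose numPartitions controls how many tasks the following stage runs; for a non-pair RDD, repartition has the same effect:

```scala
import org.apache.spark.{HashPartitioner, SparkConf, SparkContext}

val sc = new SparkContext(new SparkConf().setAppName("partitioner-demo"))
val rdd = sc.textFile("hdfs:///data/input")

// Key the records so a partitioner can be applied: HashPartitioner(8)
// hashes each key into one of 8 partitions, so the next stage runs 8 tasks.
val pairs = rdd.map(line => (line.hashCode, line))
val partitioned = pairs.partitionBy(new HashPartitioner(8))

// For a plain (non-pair) RDD, repartition(n) reshuffles into n partitions.
val reshaped = rdd.repartition(8)
```

Note that both approaches trigger a shuffle, so they trade some extra I/O for better parallelism and balance downstream.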

Upvotes: 0
