MetallicPriest

Reputation: 30765

Why does Spark always use a single core on my PC?

Even when I pass a parallelism argument to groupByKey, for example groupByKey(4), the top command still shows Spark using only one core. I run my script like this:

spark-submit --master local[4] program.py

So why does Spark use only one core when I tell it to use 4?

Upvotes: 1

Views: 287

Answers (1)

Francois G

Reputation: 11985

You're running this on Linux, if the tags on your question are to be trusted. Under Linux, top does not show individual threads by default; it shows one row per process. local[4] tells Spark to run locally with 4 threads (not 4 processes), so all of that work is attributed to a single process row.

Run top -H to display the individual threads.
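You can see the same effect without Spark at all. The sketch below (plain Python threads, not Spark; Linux-only, since it reads /proc) spawns 4 busy worker threads inside one process. A default top would show a single process row, but /proc/&lt;pid&gt;/task, which is what top -H walks, has one entry per thread:

```python
# Illustration (not Spark itself): one process, several threads.
# On Linux, each thread appears as a "task" under /proc/<pid>/task,
# which is the per-thread view that `top -H` displays.
import os
import threading

def spin(stop):
    # Busy-loop until asked to stop, so the thread stays alive.
    while not stop.is_set():
        pass

stop = threading.Event()
workers = [threading.Thread(target=spin, args=(stop,)) for _ in range(4)]
for w in workers:
    w.start()

# One directory entry per thread, main thread included: at least 5 here.
n_threads = len(os.listdir(f"/proc/{os.getpid()}/task"))
print(n_threads)

stop.set()
for w in workers:
    w.join()
```

So when spark-submit --master local[4] runs, the 4 executor threads live inside the one driver process; plain top rolls them up into a single row, while top -H lists each thread with its own CPU usage.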

Upvotes: 2

Related Questions