Reputation: 590
I'm trying to understand how Spark transforms the logical execution plan into a physical execution plan.
I do two things:
So I was expecting the DAG scheduler to execute only 2 jobs.
Why does this create 3 jobs in total? And why does it need 3 different stages?
Upvotes: 0
Views: 64
Reputation: 590
I even went as far as removing the header from the file and disabling inferSchema, but it still runs 3 jobs:
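For reference, a minimal sketch of the experiment described above, assuming the input is a CSV file read through the DataFrame API (the file path and the final `count()` action are assumptions, not taken from the question):

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("job-count-test").getOrCreate()

# Read the CSV with header and schema inference both disabled,
# so Spark should not need an extra job to scan the file for a schema.
df = (spark.read
      .option("header", "false")
      .option("inferSchema", "false")
      .csv("data.csv"))  # hypothetical path

# A single action; the Spark UI still shows more jobs than expected.
df.count()
```

With `header=false` and `inferSchema=false`, Spark falls back to string-typed columns and, in principle, should not launch a separate schema-inference job over the file.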
Upvotes: 0