Alejandro A

Reputation: 1190

Best practices for SparkSessions in PySpark

I have a Flask API. Every time an endpoint is called, it triggers a job in Spark: I open a session and close it when the job is done. The job itself takes less than 2 minutes.

I start and close the session every time the method is called.

Is it better to start a single session and keep it alive, or to open and kill one every time?

Thanks

Upvotes: 1

Views: 247

Answers (1)

Shubham Jain

Reputation: 5526

This will cause issues if you receive bulk requests: multiple Spark sessions will be created, all competing for the same resources.

Better to keep the Spark session running and serve every request using the same one.
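A minimal sketch of that shared-session pattern: a lazily created, thread-safe singleton that every endpoint reuses instead of building its own session. The `_create_session` factory is a placeholder standing in for the real `SparkSession.builder.getOrCreate()` call; the Flask wiring and app name are assumptions, not something from the question.

```python
import threading

def _create_session():
    # Placeholder for the real Spark call, which would be something like:
    #   from pyspark.sql import SparkSession
    #   return SparkSession.builder.appName("flask-jobs").getOrCreate()
    return object()  # stands in for a SparkSession

_session = None
_lock = threading.Lock()

def get_session():
    """Return the shared session, creating it only on first use."""
    global _session
    if _session is None:
        # Double-checked locking so concurrent first requests
        # don't each build their own session.
        with _lock:
            if _session is None:
                _session = _create_session()
    return _session
```

Each Flask endpoint would then call `get_session()` rather than opening and stopping a session per request. Note that `SparkSession.builder.getOrCreate()` already returns the existing session within the same process, so the lock mainly guards the first initialization under concurrent traffic.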

Upvotes: 1

Related Questions