Lazykiddy

Reputation: 1535

Python server handling simultaneous SQL-requests efficiently

I have a server that should be able to handle hundreds of requests at the same time. I wrote it in Python, and it is currently up and running. Now I want to optimize the code, since I want to make it as lightweight as possible.

The server has the following layout:

Incoming connection
       ||
       \/
 _________________
| Handling server |  <|
|_________________|   |
       ||             |
       \/             |
      FORKING --------/
       ||
       \/
 _________________  30 Queries  ______________
| Dataprocessing  | ---------->| MySQL-Server |  
|_________________| <----------|______________| 
       ||             
       \/       
    RESPONDING & closing connection

What happens now is that with x simultaneous incoming connections, the MySQL server becomes very slow for a short period of time.

Now I am looking for a way to 'merge' different MySQL requests dynamically, or to otherwise reduce the workload as much as possible.

One way to phrase the question is:

How can I handle many MySQL requests in a given timeframe (5 seconds?) while keeping the workload on the MySQL server as low as possible?

Upvotes: 0

Views: 165

Answers (1)

pcalcao

Reputation: 15990

There is no easy, one-size-fits-all way to achieve this, so what I suggest are some approaches you might adapt to your particular case.

The best way to relieve load on your MySQL server is to put a caching layer in between, if that applies to your case.

  • Scenario 1: Lots of select queries

You can use caching if the information you're fetching doesn't get changed very frequently compared to the rate of select queries.
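A minimal sketch of such a caching layer, assuming a single-process server. The names `cached_select` and `run_query` are hypothetical placeholders, not from the original post; `run_query` stands in for whatever actually executes the SELECT against MySQL (e.g. a cursor `execute` plus `fetchall`):

```python
import time

_cache = {}          # query string -> (timestamp, rows)
CACHE_TTL = 5.0      # seconds to serve a cached result before re-querying

def cached_select(query, run_query):
    """Return cached rows for `query` if fresh enough, otherwise hit MySQL."""
    now = time.time()
    hit = _cache.get(query)
    if hit is not None and now - hit[0] < CACHE_TTL:
        return hit[1]                 # fresh enough: skip MySQL entirely
    rows = run_query(query)           # cache miss: one real round-trip
    _cache[query] = (now, rows)
    return rows
```

With a TTL of 5 seconds, a query that arrives a hundred times in that window costs MySQL one execution instead of a hundred. Note that in a forking server each child has its own `_cache`; a shared cache (e.g. memcached) would be needed to benefit across processes.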

  • Scenario 2: Lots of insert/update statements

This is trickier. If several statements change the same tuple, you can buffer them in memory for a short time (summing them up, or whatever your application logic allows), and send the aggregated result to MySQL instead of each statement individually.

Another possible approach is to check whether the time is spent processing the actual queries or establishing the connections. If it's the latter, you can pool several connections and reuse them, which will probably speed things up as well.
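A minimal pooling sketch, assuming the requests are served from one process (with a forking design, the pool would have to live in a pre-fork worker or a proxy, since forked children cannot share connections). `ConnectionPool` is an illustrative name, and `connect` is a placeholder for your MySQL driver's connect call (e.g. `mysql.connector.connect`):

```python
import queue

class ConnectionPool:
    """Hand out already-open connections instead of opening one per request."""

    def __init__(self, connect, size=10):
        self._pool = queue.Queue(maxsize=size)
        for _ in range(size):
            self._pool.put(connect())   # pay the connection cost once, up front

    def acquire(self):
        return self._pool.get()         # blocks if all connections are in use

    def release(self, conn):
        self._pool.put(conn)            # return the connection for reuse
```

The `size` parameter also acts as a natural throttle: at most `size` queries run against MySQL concurrently, which by itself can smooth out the load spikes described in the question.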

Upvotes: 3
