Pig

Reputation: 2122

Fastest way to send multiple HTTP requests

I have an array of about 2000 user objects (maps). For each one I need to call an API to get the user detail, process the response, and update my local DB as quickly as possible. I used Go's WaitGroup and goroutines to send the requests concurrently, but issuing all 2000 requests takes about 24 seconds on my 2014 MacBook Pro. Is there any way to make it faster?

var wg sync.WaitGroup

json.Unmarshal(responseData, &users)
wg.Add(len(users))

for i := 0; i < len(users); i++ {
    go func(userid string) {
        defer wg.Done()
        url := "https://www.example.com/user_detail/" + userid
        response, err := http.Get(url)
        if err != nil {
            return // handle the error properly in real code
        }
        defer response.Body.Close()
        data, err := ioutil.ReadAll(response.Body)
        if err != nil {
            return
        }
        _ = data // process the response here
    }(users[i]["userid"])
}

wg.Wait()

Upvotes: 1

Views: 6914

Answers (1)

Jonathan Hall

Reputation: 79784

This sort of situation is very difficult to address in general. Performance at this level depends very much on the specifics of your server, API, network, etc. But here are a few suggestions to get you going:

  1. Try limiting the number of concurrent connections.

    As mentioned by @JimB in comments, trying to handle 2000 concurrent connections is likely inefficient, for both the server and client. Try limiting to 10, 20, 50, 100 simultaneous connections. Benchmark each value, and tweak accordingly until you get the best performance.

    On the client side, limiting concurrency also allows the client to re-use connections (reducing the average per-request overhead). That re-use is currently impossible, since you initiate all 2000 connections before any of them complete.

  2. If the server supports HTTP/2, make sure you're using HTTP/2, which can be more efficient when multiplexing multiple requests over one connection (so this also depends on #1 above). See the documentation about debugging HTTP/2.

  3. If the API supports bulk requests, take advantage of this, and request multiple users in a single request.

Upvotes: 4

Related Questions