Reputation: 149
We are using the gorilla/mux framework for handling web requests, which I suppose automatically runs on all the CPU cores. Is there a benefit to using goroutines in such a case for CPU-intensive processes, e.g. looping through a large object?
Upvotes: 3
Views: 2890
Reputation: 79536
I suppose automatically runs on all the CPU cores.
You suppose wrong. Sort of.
As of Go 1.5, Go uses all of your cores by default (GOMAXPROCS now defaults to the number of CPUs), by scheduling goroutines onto different cores. But if you don't use goroutines, there's no way for it to take advantage of this.
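As a minimal sketch (not part of the original answer; the slice size and chunking scheme are made up for illustration), this is the kind of work the scheduler can only spread across cores if you actually start goroutines: the same loop, split into one chunk per core.

```go
package main

import (
	"fmt"
	"runtime"
	"sync"
)

func main() {
	// Since Go 1.5, GOMAXPROCS defaults to the number of CPU cores,
	// so goroutines can be scheduled onto all of them.
	fmt.Println("cores available:", runtime.NumCPU())

	nums := make([]int, 1000000)
	for i := range nums {
		nums[i] = i
	}

	// Split the slice into one chunk per core and sum each chunk in its
	// own goroutine. Without the goroutines, this loop runs on a single
	// core no matter how many are available.
	workers := runtime.NumCPU()
	chunk := (len(nums) + workers - 1) / workers
	sums := make([]int, workers)

	var wg sync.WaitGroup
	for w := 0; w < workers; w++ {
		start := w * chunk
		end := start + chunk
		if end > len(nums) {
			end = len(nums)
		}
		wg.Add(1)
		go func(w, start, end int) {
			defer wg.Done()
			for _, n := range nums[start:end] {
				sums[w] += n
			}
		}(w, start, end)
	}
	wg.Wait()

	total := 0
	for _, s := range sums {
		total += s
	}
	fmt.Println("total:", total)
}
```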
Is there a benefit to using goroutines in such a case for CPU-intensive processes, e.g. looping through a large object?
There can be. But you're asking the wrong question.
You don't use goroutines primarily to take advantage of multiple CPU cores (although that can also be a benefit). You use goroutines to keep your program from blocking while it does something that takes a while.
In the case of a web application, most requests are not CPU-intensive at all. But they usually spend a lot of time (in computer terms) waiting around for things to happen. They wait for DNS lookups on the request hostname, they wait for the database to look up user credentials to establish a session, they wait for the database to store or return the rows that make up the HTTP response, etc.
Without goroutines, your server would be unable to do anything else while doing these things. So if your typical HTTP request took, say, 1 second to look up DNS, validate an authorization cookie, fetch results from a database, and send a response, no other HTTP client could be served at the same time.
Fortunately, the net/http package, which Gorilla (and practically every other Go web framework) is built on, already uses goroutines to handle requests. So you're already getting (at least) one goroutine per HTTP request.
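As a minimal sketch of that point (the route and response are made up for illustration), every request that reaches the gorilla/mux router below is already served concurrently by net/http, without the application starting any goroutines of its own:

```go
package main

import (
	"fmt"
	"log"
	"net/http"

	"github.com/gorilla/mux"
)

func main() {
	r := mux.NewRouter()

	r.HandleFunc("/hello/{name}", func(w http.ResponseWriter, req *http.Request) {
		// net/http runs this handler concurrently with other requests, so a
		// slow database call or DNS lookup here does not block other clients.
		vars := mux.Vars(req)
		fmt.Fprintf(w, "hello, %s\n", vars["name"])
	})

	// ListenAndServe spawns a goroutine per incoming connection, so each
	// client is served without waiting for the others.
	log.Fatal(http.ListenAndServe(":8080", r))
}
```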
Whether it makes sense to use additional goroutines depends more on your application design than on "using more CPU cores."
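For example, one design where additional goroutines inside a handler do pay off is when the handler has to wait on several independent slow operations. A hedged sketch, with the lookups (fetchProfile, fetchOrders) and their timings invented for illustration:

```go
package main

import (
	"fmt"
	"log"
	"net/http"
	"time"
)

// fetchProfile and fetchOrders stand in for slow, independent lookups
// (database queries, RPC calls, etc.). Both names are made up.
func fetchProfile(user string) string {
	time.Sleep(100 * time.Millisecond)
	return "profile of " + user
}

func fetchOrders(user string) string {
	time.Sleep(100 * time.Millisecond)
	return "orders of " + user
}

func handler(w http.ResponseWriter, r *http.Request) {
	user := r.URL.Query().Get("user")

	// Run the two independent lookups in their own goroutines so the handler
	// waits roughly 100ms instead of 200ms. Channels carry the results back
	// to the request goroutine.
	profileCh := make(chan string, 1)
	ordersCh := make(chan string, 1)

	go func() { profileCh <- fetchProfile(user) }()
	go func() { ordersCh <- fetchOrders(user) }()

	fmt.Fprintln(w, <-profileCh)
	fmt.Fprintln(w, <-ordersCh)
}

func main() {
	http.HandleFunc("/dashboard", handler)
	log.Fatal(http.ListenAndServe(":8080", nil))
}
```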
Some suggestions:
Upvotes: 11