Reputation: 175
I come across many blogs that say using RabbitMQ improves the performance of microservices due to the asynchronous nature of RabbitMQ.
I don't understand, in that case, how the HTTP response is sent back to the end user. I am elaborating my question below more clearly.
The user sends an HTTP request to microservice1 (the user-facing service).
microservice1 sends a message to RabbitMQ because it needs some service from microservice2.
microservice2 receives the message, processes it, and sends the response back via RabbitMQ.
microservice1 receives the response from RabbitMQ.
NOW, how is this response sent to the browser? Does microservice1 wait until it receives the response from RabbitMQ? If yes, then how is it asynchronous?
Upvotes: 11
Views: 3441
Reputation: 1238
I asked a similar question to Chris Richardson (www.microservices.io). The result was:
Either you use something like WebSockets, so microservice1 can push the response to the client when it's done.
Or microservice1 responds immediately (OK - request accepted), and the client polls the server repeatedly until the state changes. The important part is that microservice1 stores some state about the request (i.e. an initial state of "accepted", so the client can show a spinner), which is updated when the response finally arrives (i.e. the state is set to "complete").
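Here is a minimal sketch of that second option (respond immediately, let the client poll), assuming Spring Boot; the endpoint paths, class name and in-memory status map are made up for illustration:

```java
// Hypothetical Spring Boot controller illustrating the "accepted, then poll" pattern.
import java.util.Map;
import java.util.UUID;
import java.util.concurrent.ConcurrentHashMap;
import org.springframework.http.HttpStatus;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.*;

@RestController
public class RequestStatusController {

    // requestId -> current state ("accepted" or "complete")
    private final Map<String, String> statusStore = new ConcurrentHashMap<>();

    @PostMapping("/requests")
    public ResponseEntity<String> createRequest(@RequestBody String payload) {
        String id = UUID.randomUUID().toString();
        statusStore.put(id, "accepted");           // client can show a spinner
        // ... hand the payload off to microservice2 (e.g. via RabbitMQ) here ...
        return ResponseEntity.status(HttpStatus.ACCEPTED).body(id);
    }

    @GetMapping("/requests/{id}")
    public ResponseEntity<String> getStatus(@PathVariable String id) {
        String state = statusStore.get(id);
        if (state == null) {
            return ResponseEntity.notFound().build();
        }
        return ResponseEntity.ok(state);           // client polls until "complete"
    }

    // Called when the response from microservice2 finally arrives
    // (e.g. from a RabbitMQ listener).
    public void markComplete(String id) {
        statusStore.put(id, "complete");
    }
}
```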
Upvotes: 1
Reputation: 2103
You could also use events and make the whole flow asynchronous. In this scenario, microservice1 creates an event representing the user request and then returns a "request created" response to the user immediately. You can then notify the user later, once the request has finished processing.
I recommend the book Designing Event-Driven Systems written by Ben Stopford.
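For illustration, a rough sketch of that event-creation flow, assuming Spring AMQP; the exchange, routing key and "requestId|payload" event format are invented for the example:

```java
// Hypothetical controller: publish an event representing the user request and
// return immediately; the user is notified later when processing is finished.
import java.util.UUID;
import org.springframework.amqp.rabbit.core.RabbitTemplate;
import org.springframework.http.HttpStatus;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class OrderController {

    private final RabbitTemplate rabbitTemplate;

    public OrderController(RabbitTemplate rabbitTemplate) {
        this.rabbitTemplate = rabbitTemplate;
    }

    @PostMapping("/orders")
    public ResponseEntity<String> createOrder(@RequestBody String order) {
        String requestId = UUID.randomUUID().toString();
        // Event representing the user request; exchange and routing key are illustrative.
        rabbitTemplate.convertAndSend("order-events", "order.requested",
                requestId + "|" + order);
        // "Request created" response goes back to the user right away.
        return ResponseEntity.status(HttpStatus.ACCEPTED).body(requestId);
    }
}
```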
Upvotes: 1
Reputation: 1224
I would take another look at your architecture. In general, with microservices (especially user-facing ones that must be essentially synchronous), it's an anti-pattern to have ServiceA make a call to ServiceB (which may, in turn, call ServiceC and so on...) to return a response. That condition indicates those services are tightly coupled, which makes them fragile. For example: if ServiceB goes down or is overloaded in your example, ServiceA also goes offline due to no fault of its own. So, probably one or more of the following should occur:
- Have ServiceB publish events and have ServiceA subscribe to pick up what it needs, so that it can fulfill the user request without explicitly calling another service
- Combine ServiceA and ServiceB into a single service that can handle requests without having to make external calls
- You can also send the HTTP request from the client to the service, set the application state to waiting or similar, and have the consuming application subscribe to an eventSuccess or eventFail integration message from the bus (a rough sketch of this follows below)
The main point of this idea is that you let daisy-chained services (which, again, I don't like) take their turns, and whichever service "finishes" the job publishes an integration event to let anyone who's listening know. You can even do things like pass webhook URIs with the initial request to have services call the app back directly on completion (or use SignalR, or gRPC, or...).
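Here is a rough sketch of the completion side of that third option, assuming Spring AMQP; the queue name, exchange, event names and the "requestId|payload" message format are all invented for illustration:

```java
// Hypothetical worker in the service that "finishes" the job: it consumes the
// request, does the work, and publishes eventSuccess or eventFail to the bus.
import org.springframework.amqp.rabbit.annotation.RabbitListener;
import org.springframework.amqp.rabbit.core.RabbitTemplate;
import org.springframework.stereotype.Component;

@Component
public class JobWorker {

    private final RabbitTemplate rabbitTemplate;

    public JobWorker(RabbitTemplate rabbitTemplate) {
        this.rabbitTemplate = rabbitTemplate;
    }

    @RabbitListener(queues = "job-requests")       // illustrative queue name
    public void handle(String message) {
        // message is assumed to look like "requestId|payload"
        String requestId = message.split("\\|", 2)[0];
        try {
            doTheWork(message);
            rabbitTemplate.convertAndSend("integration-events", "eventSuccess", requestId);
        } catch (Exception e) {
            rabbitTemplate.convertAndSend("integration-events", "eventFail", requestId);
        }
    }

    private void doTheWork(String message) {
        // ... actual processing ...
    }
}
```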
The way we use RabbitMQ is to integrate services in real time, so that each service always has the info it needs to be responsive all by itself. To use your example: in our world, ServiceB publishes events when its data changes. ServiceA only cares about, and subscribes to, a small subset of those events (and typically only a field or two of the event data), but it knows within seconds (usually less) when ServiceB has changed, and it has all the information it needs to respond to requests. Each service literally has no idea what other services exist; it just knows that events it cares about (and that conform to a contract) arrive from time to time, and it needs to pay attention to them.
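As a minimal illustration of that subscription side (not the actual setup from this answer), assuming Spring AMQP; the queue name, "customerId|displayName" event format and cached field are made up:

```java
// Hypothetical listener in ServiceA: it keeps a local copy of the one or two
// fields it cares about from ServiceB's change events, so it can answer
// requests without calling ServiceB at request time.
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import org.springframework.amqp.rabbit.annotation.RabbitListener;
import org.springframework.stereotype.Component;

@Component
public class CustomerNameCache {

    // customerId -> displayName (the only field ServiceA cares about)
    private final Map<String, String> names = new ConcurrentHashMap<>();

    // Queue bound to the small subset of ServiceB's events ServiceA subscribes to.
    @RabbitListener(queues = "servicea.customer-changed")
    public void onCustomerChanged(String event) {
        // event is assumed to look like "customerId|displayName"
        String[] parts = event.split("\\|", 2);
        names.put(parts[0], parts[1]);
    }

    public String displayNameFor(String customerId) {
        return names.get(customerId);
    }
}
```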
Upvotes: 2
Reputation: 11469
Good question, let's discuss these one by one.
Synchronous behavior:
The client sends the request and the same thread waits (blocks) on the socket until the response comes back.
Asynchronous behavior:
The client sends the request, and another thread waits on the socket for the response. Once the response arrives, the original sender is notified (usually via a callback-like structure).
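For instance, a small sketch of that callback style using the JDK's HttpClient (Java 11+); the URL is just a placeholder:

```java
// Asynchronous call: the sending thread is not blocked; a callback runs
// on another thread when the response arrives.
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class AsyncCallExample {
    public static void main(String[] args) throws InterruptedException {
        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder(
                URI.create("http://localhost:8080/some-endpoint")) // placeholder URL
                .build();

        client.sendAsync(request, HttpResponse.BodyHandlers.ofString())
              .thenAccept(response ->                    // callback-like notification
                      System.out.println("Got response: " + response.statusCode()));

        System.out.println("Request sent, this thread is free to do other work");
        Thread.sleep(2000); // keep the demo JVM alive long enough for the callback
    }
}
```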
Now we can talk about blocking vs. non-blocking calls.
When you use Spring's blocking REST support (e.g. RestTemplate), each call occupies a thread that sits waiting for the response; with a non-blocking client, all calls can go through a single thread (or a small pool) and the response is pushed back to you asynchronously, without blocking a thread per request.
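A small sketch of the difference, assuming spring-web and spring-webflux are on the classpath; the URL is a placeholder:

```java
// Blocking vs. non-blocking in Spring (illustrative only).
import org.springframework.web.client.RestTemplate;
import org.springframework.web.reactive.function.client.WebClient;

public class BlockingVsNonBlocking {
    public static void main(String[] args) throws InterruptedException {
        // Blocking: the calling thread waits here until the response arrives.
        String blocking = new RestTemplate()
                .getForObject("http://localhost:8080/data", String.class);
        System.out.println("blocking result: " + blocking);

        // Non-blocking: the call returns immediately; the response is pushed
        // to the subscriber later, without holding a thread per request.
        WebClient.create("http://localhost:8080")
                .get().uri("/data")
                .retrieve()
                .bodyToMono(String.class)
                .subscribe(body -> System.out.println("non-blocking result: " + body));

        Thread.sleep(2000); // only needed in this demo so the JVM doesn't exit early
    }
}
```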
Now, coming to your question:
"Using RabbitMQ improves the performance of microservices due to the asynchronous nature of RabbitMQ."
No. Performance depends on your TPS (transactions per second) requirements, and RabbitMQ on its own is not going to improve performance.
Messaging gives you two different messaging models (point-to-point queues and publish/subscribe).
Using messaging, you get loose coupling and fault tolerance.
In short, all of the above are architectural styles; it's up to you how you architect your application, and performance depends on scalability.
You can combine REST with messaging, and non-blocking calls with messaging, in the same application.
In your scenario, microservice1 could expose a blocking REST endpoint, call the other API using RestTemplate or WebClient (or go through a messaging queue), and once it gets the response, return the JSON back to your web app.
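For example, a hedged sketch of the queue-based variant of that scenario (blocking REST endpoint, request/reply over RabbitMQ), assuming Spring AMQP; the endpoint path, exchange and routing key are illustrative:

```java
// Hypothetical microservice1 endpoint: it accepts the HTTP request, does a
// request/reply over RabbitMQ to microservice2, and returns the result as the
// HTTP (JSON) response. convertSendAndReceive blocks until the reply arrives
// (or times out), so the REST call itself stays synchronous.
import org.springframework.amqp.rabbit.core.RabbitTemplate;
import org.springframework.http.HttpStatus;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class Microservice1Controller {

    private final RabbitTemplate rabbitTemplate;

    public Microservice1Controller(RabbitTemplate rabbitTemplate) {
        this.rabbitTemplate = rabbitTemplate;
    }

    @GetMapping("/users/{id}/details")
    public ResponseEntity<String> userDetails(@PathVariable String id) {
        // Names are invented; microservice2 listens on this exchange/routing key.
        Object reply = rabbitTemplate.convertSendAndReceive(
                "microservice2-exchange", "user.details", id);
        if (reply == null) {                       // no reply within the timeout
            return ResponseEntity.status(HttpStatus.GATEWAY_TIMEOUT).build();
        }
        return ResponseEntity.ok(reply.toString());
    }
}
```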
Upvotes: 2
Reputation: 11085
It's a good question. To answer, you have to imagine the server running one thread at a time. Making a request to a microservice via RestTemplate is a blocking request. The user clicks a button on the web page, which triggers your spring-boot method in microservice1. In that method, you make a request to microservice2, and the microservice1 does a blocking wait for the response.
That thread is busy waiting for microservice2 to complete the request. Threads are not expensive, but on a very busy server, they can be a limiting factor.
RabbitMQ allows microservice1 to queue up a message to microservice2 and then release the thread. Your message receiver will be triggered by the system (Spring Boot / RabbitMQ) when microservice2 processes the message and provides a response. That thread in the thread pool can be used to process other users' requests in the meantime. When the RabbitMQ response comes, an idle thread from the pool processes the remainder of the request.
Effectively, you're making the server running microservice1 have more threads available more of the time. The blocking approach only becomes a problem when the server is under heavy load.
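A rough sketch of that thread-release pattern, assuming Spring MVC and Spring AMQP; the queue names, correlation scheme and "correlationId|payload" message format are made up:

```java
// Hypothetical microservice1: the HTTP thread queues a message and is released
// immediately (DeferredResult); a RabbitMQ listener thread completes the HTTP
// response later, when microservice2's reply arrives.
import java.util.Map;
import java.util.UUID;
import java.util.concurrent.ConcurrentHashMap;
import org.springframework.amqp.rabbit.annotation.RabbitListener;
import org.springframework.amqp.rabbit.core.RabbitTemplate;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RestController;
import org.springframework.web.context.request.async.DeferredResult;

@RestController
public class NonBlockingUserController {

    private final RabbitTemplate rabbitTemplate;
    private final Map<String, DeferredResult<String>> pending = new ConcurrentHashMap<>();

    public NonBlockingUserController(RabbitTemplate rabbitTemplate) {
        this.rabbitTemplate = rabbitTemplate;
    }

    @GetMapping("/users/{id}")
    public DeferredResult<String> getUser(@PathVariable String id) {
        DeferredResult<String> result = new DeferredResult<>(5000L); // 5s timeout
        String correlationId = UUID.randomUUID().toString();
        pending.put(correlationId, result);
        // Fire the request to microservice2 and release this servlet thread.
        rabbitTemplate.convertAndSend("user-requests", "user.get",
                correlationId + "|" + id);
        result.onTimeout(() -> pending.remove(correlationId));
        return result;
    }

    // Reply from microservice2 arrives on a listener thread and completes
    // the waiting HTTP response.
    @RabbitListener(queues = "user-replies")
    public void onReply(String message) {
        String[] parts = message.split("\\|", 2);  // "correlationId|payload"
        DeferredResult<String> result = pending.remove(parts[0]);
        if (result != null) {
            result.setResult(parts[1]);
        }
    }
}
```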
Upvotes: 4