Reputation: 81
I am talking to an API server that uses single-use auth tokens which are re-issued (i.e. a new token is generated) on every request. If two API requests are issued at the same time using the same token, one fails due to an expired token and error handlers can be triggered unnecessarily.
What I would like to do is build in a waiting mechanism so only one HTTP request is issued at a time, and subsequent requests are queued until the previous ones are completed.
I would like to build this in a generic way, so that I have a single service layer that issues the requests and the queuing is transparent to the upper layers. However, I am struggling to find a neat way to do this. Any suggestions?
-- edit --
I should have mentioned that all subsequent API call URLs / request bodies depend directly on the response of the previous API call. For example, GET /books?apiKey=ABC returns:
{
  "apiKey": "XYZ",
  "names": [...]
}
The next request that was queued must wait for this response and append the new apiKey: GET /authors?apiKey=XYZ
In the above example, performing GET /authors?apiKey=ABC would result in an error.
If there is a strong dependence between two HTTP requests (e.g. request a book, then request the authors for that particular book based on its id), they can be serialised using flatMap, nested subscriptions, etc. I need similar functionality to this, but in a generic way where I can add requests onto a queue that is serialised at runtime.
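For illustration, serialising two directly dependent calls with mergeMap (flatMap is an alias for it) could look like the following; getBooks and getAuthors are just stand-ins for real HTTP calls, not part of the actual code:

import { of } from 'rxjs';
import { delay, mergeMap } from 'rxjs/operators';

// stand-ins for real HTTP requests; each response re-issues the apiKey
const getBooks = (apiKey: string) =>
  of({ apiKey: 'XYZ', names: ['book1'] }).pipe(delay(500));
const getAuthors = (apiKey: string) =>
  of({ apiKey: 'DEF', authors: ['author1'] }).pipe(delay(500));

// the second request only starts once the first response (and its new key) has arrived
getBooks('ABC')
  .pipe(mergeMap(res => getAuthors(res.apiKey)))
  .subscribe(result => console.log(result));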
-- edit two --
I have two components like the following; let's say both make an API call on startup.
class ComponentA {
  // BookApi extends CommonApi and is injected by Angular
  constructor(private api: BookApi) {}

  ngOnInit() {
    this.api.list().subscribe(x => { /* ... */ });
  }
}

class ComponentB {
  // CityLookupApi extends CommonApi and is injected by Angular
  constructor(private api: CityLookupApi) {}

  ngOnInit() {
    this.api.list().subscribe(x => { /* ... */ });
  }
}
All the API services extend a common API class that hides the apiKey handling, so the components can use higher-level calls to aid comprehension. The issue is that if both components are initialised at nearly the same time, both calls can use the same apiKey and one will fail. However, I cannot create these Observable batches up front using e.g. forkJoin, because the Observables are created and subscribed to asynchronously at run-time in their separate components.
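For context, a rough sketch of what such a base class might look like; the question does not show the real implementation, so the HttpClient usage, endpoint, and field names below are assumptions based on the /books example above:

import { HttpClient } from '@angular/common/http';
import { Observable } from 'rxjs';
import { map, tap } from 'rxjs/operators';

// Hypothetical sketch of the shared base class described in the question.
abstract class CommonApi {
  protected abstract path: string;
  private apiKey = 'ABC';

  constructor(protected http: HttpClient) {}

  list(): Observable<string[]> {
    return this.http
      .get<{ apiKey: string; names: string[] }>(`${this.path}?apiKey=${this.apiKey}`)
      .pipe(
        // every response re-issues the key, so remember it for the next call
        tap(res => (this.apiKey = res.apiKey)),
        map(res => res.names)
      );
  }
}

class BookApi extends CommonApi {
  protected path = '/books';
}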
Upvotes: 1
Views: 6674
Reputation: 17762
I would consider using mergeMap together with its concurrency parameter.
This is a snippet of code that simulates what I intend:
import { Subject, of } from 'rxjs';
import { delay, mergeMap } from 'rxjs/operators';

const requestsRemoteService = new Subject<string>();
const requestStream = requestsRemoteService.asObservable();

// this function simulates the execution of the request
const requestExecution = (input: string) => {
  return of(input + ' executed').pipe(delay(2000));
};

// this is what the service should do to queue the requests
requestStream
  .pipe(
    // the second parameter, set to 1, is the level of concurrency
    mergeMap(requestInput => requestExecution(requestInput), 1)
  )
  .subscribe(
    data => console.log(data),
    err => console.error(err),
    () => console.log('DONE')
  );

// these are the various requests coming to the service
setTimeout(() => requestsRemoteService.next('First request'), 1);
setTimeout(() => requestsRemoteService.next('Second request'), 2);
setTimeout(() => requestsRemoteService.next('Third request'), 3);
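Building on that, here is a sketch of how the same one-at-a-time queue could be wrapped in a shared Angular service so each caller still receives its own response. The service and its names (RequestQueueService, enqueue) are hypothetical and not part of the original answer:

import { Injectable } from '@angular/core';
import { EMPTY, Observable, ReplaySubject, Subject } from 'rxjs';
import { catchError, concatMap, finalize, tap } from 'rxjs/operators';

interface Job<T> {
  run: () => Observable<T>;
  result: ReplaySubject<T>;
}

// Hypothetical wrapper around the queue above; names are made up for illustration.
@Injectable()
export class RequestQueueService {
  private jobs = new Subject<Job<any>>();

  constructor() {
    this.jobs
      .pipe(
        // concatMap behaves like mergeMap with a concurrency of 1:
        // the next job is not started until the previous one completes
        concatMap(job =>
          job.run().pipe(
            tap(value => job.result.next(value)),
            catchError(err => {
              job.result.error(err); // report the failure to this caller only
              return EMPTY;          // keep the queue itself alive
            }),
            finalize(() => job.result.complete())
          )
        )
      )
      .subscribe();
  }

  // Callers hand over a factory so the request is not created (or sent)
  // until it is this job's turn; the ReplaySubject buffers the response
  // in case the caller subscribes after the request has already finished.
  enqueue<T>(run: () => Observable<T>): Observable<T> {
    const job: Job<T> = { run, result: new ReplaySubject<T>(1) };
    this.jobs.next(job);
    return job.result.asObservable();
  }
}

A component would then call something like queue.enqueue(() => this.http.get('/books')) instead of issuing the HTTP request directly, and subscribe to the returned observable as before.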
Upvotes: 1
Reputation: 7875
You can use Observable.concat(...myArrayOfAjaxRequestObservable) to have each observable consumed one by one.
Only Observable.concat should subscribe to your ajax request observables; that is the only way to be 100% sure a request will not be triggered by another part of your code.
More information here: https://www.learnrxjs.io/operators/combination/concat.html
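For illustration, a minimal sketch of this approach using the newer pipeable-style concat import; the request observables below are simulated stand-ins for real AJAX calls:

import { concat, defer, of } from 'rxjs';
import { delay } from 'rxjs/operators';

// cold observables: nothing is executed until concat subscribes to each in turn
const requestBooks = defer(() => of('books response').pipe(delay(1000)));
const requestAuthors = defer(() => of('authors response').pipe(delay(1000)));

// concat subscribes to each observable only after the previous one completes,
// so the requests run strictly one after another
concat(requestBooks, requestAuthors).subscribe(
  response => console.log(response)
);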
---UPDATE---
Sample : https://stackblitz.com/edit/angular-66ic6c?file=app%2Fapp.component.ts
Upvotes: 2
Reputation: 16837
Yanis-git's answer covers the queuing problem. If you instead want to ignore any request made while another request is still in flight, use exhaustMap.
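For illustration, a small sketch of that behaviour; the trigger subject and the simulated request are made up:

import { Subject, of } from 'rxjs';
import { delay, exhaustMap } from 'rxjs/operators';

const trigger = new Subject<string>();

trigger
  .pipe(
    // while the simulated request is still running, further emissions are
    // ignored instead of being queued
    exhaustMap(name => of(name + ' handled').pipe(delay(2000)))
  )
  .subscribe(result => console.log(result));

trigger.next('first');  // handled
trigger.next('second'); // ignored: 'first' is still in flight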
Upvotes: 2