Reputation: 3633
I've got a script that fires thousands of requests at a specific API. The API only allows 5 requests per second (and it probably measures this differently than I do). To make the requests I'm using the request-promise library, and I've replaced the normal request-promise function with this:
const request_promise = require('request-promise')

function waitRetryPromise() {
  var count = 0 // keeps count of requests
  function rp(options) {
    const timedCall = (resolve) => setTimeout(() => resolve(rp(options)), 1000) // recursive call
    count += 1
    if (count % 3 == 0) { // re-schedules every third request after a second
      return new Promise(timedCall)
    } else {
      return request_promise(options)
    }
  }
  return rp
}
const rp = waitRetryPromise()
Once around 300 requests (give or take) are fired off in short succession, they start to interfere with each other. Does anyone have a better solution? I thought the recursive call to the same function would help, and it did, but it didn't solve the problem. Maybe there is a pattern to queue requests and run them a few at a time? A library, perhaps?
Thanks!
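Something like this hypothetical helper is the kind of pattern I mean: a queue that spaces out request starts (makeThrottle is a name I made up, not part of request-promise):

```javascript
// Hypothetical sketch: a queue that starts at most one task every
// `1000 / maxPerSecond` ms. All names here are made up.
function makeThrottle(maxPerSecond) {
  const interval = 1000 / maxPerSecond
  const queue = []
  let draining = false

  async function drain() {
    draining = true
    while (queue.length > 0) {
      const { task, resolve, reject } = queue.shift()
      task().then(resolve, reject) // start the request, don't await it
      await new Promise(r => setTimeout(r, interval)) // space out the starts
    }
    draining = false
  }

  return function throttled(task) {
    return new Promise((resolve, reject) => {
      queue.push({ task, resolve, reject })
      if (!draining) drain()
    })
  }
}

const throttle = makeThrottle(5) // 5 starts per second
// usage: const throttledRp = options => throttle(() => rp(options))
```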
Upvotes: 8
Views: 8477
Reputation: 5066
You can use a mutex to serialize concurrent calls and limit their rate with a timer:
import { Mutex } from 'async-mutex';

const wait = (ms: number): Promise<void> => new Promise((res) => setTimeout(res, ms));

export class RateLimiter {
  private readonly mutex = new Mutex();
  private delay = 1000 / 2; // two calls per second

  public async run<T>(cb: () => Promise<T>): Promise<T> {
    const release = await this.mutex.acquire();
    try {
      return cb();
    } finally {
      // you can add `await` before this to also delay the result
      wait(this.delay).then(release);
    }
  }
}
An example usage:
// ...inside a NestJS controller
private readonly rateLimiter = new RateLimiter();

@Get('random')
public async random() {
  const result = await this.rateLimiter.run(async () => Math.random());
  return result;
}
📍 The mutex lib documentation
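If you'd rather avoid the dependency, a pending promise can stand in for the mutex. A plain-JavaScript sketch of the same acquire/run/delayed-release idea (the class name is mine, not from async-mutex):

```javascript
// Sketch only: the `lock` promise plays the role of the mutex.
// A resolved lock means the "mutex" is free; each run() starts its
// task when the lock resolves, and the next lock resolves `delay` ms
// later, so task starts are spaced out just like in the class above.
class SimpleRateLimiter {
  constructor(perSecond) {
    this.delay = 1000 / perSecond;
    this.lock = Promise.resolve(); // resolved lock = the "mutex" is free
  }
  run(task) {
    const result = this.lock.then(() => task()); // acquire, then start the task
    // release `delay` ms after the task starts, like the finally block above
    this.lock = this.lock.then(
      () => new Promise((r) => setTimeout(r, this.delay))
    );
    return result;
  }
}
```

Usage would look like `new SimpleRateLimiter(5).run(() => rp(options))`; note a task's failure does not block later calls, since the lock chains off the timer, not the task.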
Upvotes: 0
Reputation: 7746
My code runs the TimedQueue as long as there is work to be done. The process() method resolves when all work is finished:
class Queue {
  constructor() {
    this.queue = [];
  }
  enqueue(obj) {
    return this.queue.push(obj);
  }
  dequeue() {
    return this.queue.shift();
  }
  hasWork() {
    return (this.queue.length > 0);
  }
}

function t_o(delay) {
  return new Promise(function (resolve, reject) {
    setTimeout(function () {
      resolve();
    }, delay);
  });
}

class TimedQueue extends Queue {
  constructor(delay) {
    super();
    this.delay = delay;
  }
  dequeue() {
    return t_o(this.delay).then(() => {
      return super.dequeue();
    });
  }
  process(cb) {
    return this.dequeue().then(data => {
      cb(data);
      if (this.hasWork())
        return this.process(cb);
    });
  }
}

var q = new TimedQueue(500);
for (var request = 0; request < 10; ++request)
  q.enqueue(request);
q.process(console.log).then(function () {
  console.log('done');
});
Upvotes: 4
Reputation: 1
OK, rather than recursing the call to rp, just make sure you delay between requests by an appropriate amount: for 5 per second, that's 200 ms.
function waitRetryPromise() {
  let promise = Promise.resolve();
  return function rp(options) {
    return promise = promise
      .then(() => new Promise(resolve => setTimeout(resolve, 200)))
      .then(() => request_promise(options));
  }
}
const rp = waitRetryPromise();
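A self-contained variant of the same chaining idea, with request_promise stubbed out so it runs stand-alone, plus a catch so one failed request doesn't stall every later call (the stub and helper name are mine):

```javascript
// Stand-alone sketch: request_promise is stubbed here; the real one
// would come from require('request-promise').
const request_promise = async (options) => ({ echoed: options });

function makeSpacedRequester(spacingMs) {
  let chain = Promise.resolve();
  return function rp(options) {
    const result = chain
      .then(() => new Promise((resolve) => setTimeout(resolve, spacingMs)))
      .then(() => request_promise(options));
    chain = result.catch(() => {}); // a rejection must not break the chain
    return result;
  };
}

const rp = makeSpacedRequester(200); // 200 ms apart ≈ 5 per second
```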
Upvotes: 5
Reputation: 2192
I'm not sure, but maybe the example below gives you an idea:
function placeAnOrder(orderNumber) {
  console.log("customer order:", orderNumber)
  cookAndDeliverFood(function () {
    console.log("food delivered order:", orderNumber);
  });
}

// takes 1 second to cook
function cookAndDeliverFood(callback) {
  setTimeout(callback, 1000);
}

// users' web requests
placeAnOrder(1);
placeAnOrder(2);
placeAnOrder(3);
Upvotes: -2