Reputation: 110
I am making a web app that handles a large amount of data. I have inserted around 10 million documents into a MongoDB database. When I fetch all the data at once through AJAX, the request takes too long. What can I do to make the data load faster?
router.get('/ekcroredata', async (req, res) => {
  const data = await gameBids.find();
  res.json(data);
});
Upvotes: 1
Views: 2968
Reputation: 28
Actually, Node.js is not well suited to handling that much data in a single response, so you have two options
Upvotes: -3
Reputation: 1152
You are not supposed to send all the data to the client in one go; response time grows with the size of the payload. Instead, implement pagination: fetch the records in fixed-size batches by calling the same API repeatedly.
router.get('/ekcroredata', async (req, res) => {
  // Client keeps increasing page_number on each successive call.
  // Query params arrive as strings, so convert them to numbers first.
  const page_number = parseInt(req.query.page_number, 10) || 0;
  const page_size = parseInt(req.query.page_size, 10) || 20;
  const data = await gameBids.find()
    .skip(page_number * page_size)
    .limit(page_size);
  res.json(data);
});
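One gotcha worth calling out: values in `req.query` are always strings, so `(page_number)*page_size` only works by accident and malformed input can produce `NaN`. A small helper can parse and clamp the pagination params before they reach the query. This is just a sketch; the parameter names follow the route above, while the defaults (page 0, size 20) and the 100-item cap are illustrative choices, not anything mandated by Express or Mongoose:

```javascript
// Parse and clamp pagination params from an Express-style query object.
// Returns the skip/limit values to pass to the MongoDB query.
function parsePagination(query) {
  // Invalid or missing values fall back to page 0 / size 20 (assumed defaults).
  const page = Math.max(0, parseInt(query.page_number, 10) || 0);
  // Cap page size at 100 so a client cannot request the whole collection.
  const size = Math.min(100, Math.max(1, parseInt(query.page_size, 10) || 20));
  return { skip: page * size, limit: size };
}

// Usage inside the route:
//   const { skip, limit } = parsePagination(req.query);
//   const data = await gameBids.find().skip(skip).limit(limit);
```

Also note that `skip()` itself gets slower as the offset grows, because MongoDB still walks past the skipped documents; for 10 million records, range-based pagination on an indexed field (e.g. `find({ _id: { $gt: lastSeenId } }).limit(n)`) scales better than large skips.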
Upvotes: 3