Reputation: 191
How can I send large data from Node.js to the client? I tried these two approaches: 1. Socket.io 2. Ajax GET request. But both ways are slow. I am using MongoDB. The data size will be 1,000,000+ objects (1-2 GB), but even sending 10,000 records is slow. How can I make it faster? (Reading the data from MongoDB into Node.js is not the problem.)
This is the Socket.io code:
=> NodeJS
io.sockets.on('connection', function(socket) {
    socket.on('getData', function() {
        // Fetch every document, then emit them one at a time
        TestModel.find({}, function(err, obj) {
            for (var i = 0; i < obj.length; i++) {
                socket.emit('responseData', obj[i]);
            }
            socket.emit('complete', "Item length : ");
        });
    });
});
If I use socket.emit('responseData', obj) instead, I get an overflow error.
=> index.html
<html>
<head>
    <meta charset="utf-8"/>
    <script src="/socket.io/socket.io.js"></script>
    <title></title>
</head>
<body>
    Socket.io Test
    <script>
    window.onload = function() {
        var socket = io.connect();
        socket.emit('getData');
        var i = 0;
        socket.on('responseData', function(data) {
            i++; // count documents as they arrive
        });
        socket.on('complete', function(data) {
            alert(data + i);
        });
    };
    </script>
</body>
</html>
This is the Ajax GET code:
=> NodeJS
app.get('/load', function(req, res) {
    console.log("Model loaded");
    // Fetch every document and serialize the whole array at once
    TestModel.find({}, function(err, obj) {
        res.send(JSON.stringify(obj));
        console.log("Sent Models");
    });
});
=> Ajax
$(function(){
$(document).ready(function(){
$.get('/load', {}, function(data){
...
});
});});
Upvotes: 2
Views: 1776
Reputation: 11781
Check and/or enable compression; you could get up to a 10x gain, depending on the data and format. There is a tradeoff between compression ratio and server CPU usage. In general, start with the best compression algorithm the browser (or client) supports.
If the client is under your control, you could even try xz or something more exotic :)
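A minimal sketch of the common case, assuming an Express server with the compression middleware (npm install compression); the encoding is negotiated with the browser via Accept-Encoding:

var express = require('express');
var compression = require('compression');
var app = express();
app.use(compression()); // gzip/deflate all responses from here on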
Tag your responses with an ETag (https://en.wikipedia.org/wiki/HTTP_ETag) and add hooks for the If-None-Match: ... request header field, so a client that already holds the current version gets a 304 instead of the full payload.
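A hedged sketch of such a hook, assuming Express 4; datasetVersion() is a hypothetical helper returning any cheap version string or hash of your data:

app.get('/load', function(req, res) {
    // datasetVersion() is hypothetical: any cheap version/hash of the data
    var etag = '"' + datasetVersion() + '"';
    if (req.headers['if-none-match'] === etag) {
        return res.status(304).end(); // client copy is still current
    }
    res.set('ETag', etag);
    // ... send the full payload ...
});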
It depends on what the client is using the data for: is it saved, processed, displayed? In many cases the client can start performing actions on part of the data. If that part is at the start of the response, streaming it makes a lot of sense, as in the sketch below.
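For example, instead of buffering all documents like the code in the question, the server could stream them as newline-delimited JSON. A sketch assuming a Mongoose version that provides Query#cursor():

app.get('/load', function(req, res) {
    res.set('Content-Type', 'application/x-ndjson');
    var cursor = TestModel.find().lean().cursor();
    cursor.on('data', function(doc) {
        res.write(JSON.stringify(doc) + '\n'); // one document per line
    });
    cursor.on('end', function() { res.end(); });
    cursor.on('error', function() { res.end(); });
});

The client can then parse each line as it arrives instead of waiting for the whole 1-2 GB body.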
Clients could use the Range: ... request header field to download segments of the data, in which case a single client could issue multiple simultaneous requests against segments of a given dataset. This typically helps when the client's network is poor (e.g. on another continent).
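An illustrative client-side sketch, assuming the server honors Range requests and the total size is already known (e.g. from a prior HEAD request); the byte boundaries here are made up:

var ranges = ['bytes=0-536870911', 'bytes=536870912-'];
Promise.all(ranges.map(function(range) {
    return fetch('/load', { headers: { 'Range': range } })
        .then(function(res) { return res.arrayBuffer(); });
})).then(function(parts) {
    // reassemble parts[0] and parts[1] in order
});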
Perhaps the client can process or show a simplified representation of the data first. If there is a human user behind the client, they may wish to quickly check your data before deciding to download it. This is pretty common for images and design documents, where a thumbnail or preview is generated server-side.
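For this dataset, a preview could be as simple as a second endpoint that returns only a few fields and a capped number of documents; the field names below are made-up examples:

app.get('/preview', function(req, res) {
    // Projection + limit keep the preview payload tiny
    TestModel.find({}, 'name updatedAt', { limit: 100 }, function(err, docs) {
        if (err) return res.status(500).send(err.message);
        res.json(docs);
    });
});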
Upvotes: 1