Reputation: 332
A complex situation here!
The situation now: we have a main server doing only its own work, and the data on it changes every second. We need a web widget (HTML data) to share with other websites. The widget must be refreshed every minute, while the underlying data changes every second. All of the other websites' visitors must see that information, and we can't handle such high traffic ourselves. The main server needs to stay online 24/7, and those visitors should not be connecting to it every minute. I'm talking about a million impressions per month.
The solution we're working on: get several hosting plans. Each hosting account stores the HTML data that will be shown to the visitors, and runs a cronjob against our main server every minute to fetch the HTML and keep it until the next run (a sketch of such a cron entry follows the script below). That's how we move the traffic away from our main server. The visitors of the other websites then load the HTML stored on our hostings. The code below connects to the first hosting server; if it doesn't answer within a timeout, it connects to the next one, and loops until one of them returns the HTML data. Of course, if they reach 100% load, we'll simply add another hosting.
<script>
// Mirror servers that each hold a cached copy of the widget HTML.
var servers = [
    'http://hostingserver_one.com/',
    'http://hostingserver_two.com/'
];
var currentServer = 0;

var WAIT_FOR_RESPONSE = 5000; // give a server 5 seconds to answer
var ONE_MINUTE = 60 * 1000;   // normal refresh interval
var RIGHT_AWAY = 1;           // retry delay after a failed server

function ajaxRequestInfo() {
    $.ajax({
        type: 'GET',
        url: servers[currentServer],
        timeout: WAIT_FOR_RESPONSE,
        // NOTE: the hosting servers must send CORS headers for this
        // cross-origin request to succeed in the browser.
        success: function(data) {
            $('.data_for_refresh').html(data);
            // Schedule the next regular refresh only on success, so a
            // failed request cannot spawn a second timer chain.
            window.setTimeout(ajaxRequestInfo, ONE_MINUTE);
        },
        error: function() {
            // Fail over to the next server (wrapping around) and retry.
            currentServer = (currentServer + 1) % servers.length;
            window.setTimeout(ajaxRequestInfo, RIGHT_AWAY);
        }
    });
}

$(document).ready(ajaxRequestInfo);
</script>
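For the cron side on each hosting account, a minimal sketch (the file path and the main-server URL are hypothetical):

# Hypothetical crontab entry: once a minute, pull the latest widget
# HTML from the main server into this hosting account's web root.
* * * * * wget -q -O /var/www/html/widget.html http://mainserver.example.com/widget.html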
The question is: is this the best way to do it?! If not, what's better? I'm sure many of you have already been through this kind of situation, but it's my first :)
Upvotes: 1
Views: 180
Reputation: 13801
Talking about one million impressions served from HTML files looks strange to me; of course, you can handle it more precisely.
Two things you need to consider are load balancing and memory consumption beyond what one web server can provide.
If your account is under high load, you should move to another server instead of bearing the slowness of a single one. On the other hand, if you want to host many applications on a single server, you need more memory than one server can provide.
Basically, I installed a standard WordPress on the server that does the ProxyPass. Then I configured the site and installed the extensions and templates. I configured a SQL DB on this server; in our case it is also the proxy, but it would be ideal to isolate it on its own server, or to use an external database service like Xeround. In my WordPress, Apache, MySQL, and memcached configurations, I always specify the internal private-network IPs, since all my servers at iWeb are Smart Servers. This eliminates traffic on the public network and makes the setup much safer.
I read about it here; you can find more ideas Here.
ultraking is the software you should look into.
Now, in your case you have multiple HTML files, and you go server by server to find the user's particular file. But searching across all the servers like that is not a good idea. Instead, build one JSON object that records which server contains which information. The scenario then becomes: the user hits your website (which holds the information object about each server) > the user requests a file > you filter the JSON object > you hit the one unique server that has it.
This will reduce the traffic to your server even further. Thanks
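A minimal sketch of that routing idea, with hypothetical server URLs and file names:

// Hypothetical routing object: which mirror holds which widget file.
var fileIndex = {
    'widget_scores.html':  'http://hostingserver_one.com/',
    'widget_tickers.html': 'http://hostingserver_two.com/'
};

// Look up the one server that has the file, then fetch it directly
// instead of trying every server in turn.
function fetchWidget(file, onDone) {
    var server = fileIndex[file];
    if (!server) { return; } // unknown file, nothing to fetch
    $.get(server + file, onDone);
}

fetchWidget('widget_scores.html', function(html) {
    $('.data_for_refresh').html(html);
});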
[edit]
It all depends on the planning strategy you adopt for handling users. You cannot handle even a single user if your strategy and your choice of tools don't stand up to a benchmark.
Upvotes: 3
Reputation: 630
1 million impressions might sound like a lot of traffic, but in reality most web servers can handle a load like this under normal conditions. If you are on a shared server or VPS, I can understand that 1 million might be too much load; in that case you should look into dedicated hosting or increasing your VPS specs. You might also have scaling problems in your code, database, etc. that are slowing things down.
While there is nothing wrong with your approach, this could be better solved with a better setup. If you want to go the two-server route, you should look into load balancers and hosting in the cloud with elastic resources (like Moo recommended: AWS or Azure, for example) rather than relying on client-side JavaScript code to send traffic to the correct server.
Upvotes: 0
Reputation: 710
A million per month isn't too bad. You can handle it even with the clients connecting to you directly to fetch the data. Just host the bulk of your widget externally and have it download a tiny text file from your main server with a JavaScript XMLHttpRequest, then render it into a user-friendly design. Just for comparison, my server gets 6 million GET hits per month, each around 15 kB on average. That's much more than you will need. And here is the surprise: this "beast" of a server is a SOHO DD-WRT router under my table. :)
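A minimal sketch of that approach (the endpoint URL, payload format, and element IDs are all hypothetical):

// Download a tiny text payload from the main server and render it
// client-side, so the heavy widget assets can live elsewhere.
var xhr = new XMLHttpRequest();
xhr.open('GET', 'http://mainserver.example.com/widget.txt', true);
xhr.onload = function() {
    if (xhr.status === 200) {
        // e.g. a sub-100-byte payload like "42,17"
        var parts = xhr.responseText.split(',');
        document.getElementById('widget-value').textContent = parts[0];
        document.getElementById('widget-count').textContent = parts[1];
    }
};
xhr.send();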
EDIT: You could still manage that with on-the-fly gzip enabled. Not that it makes much sense with the sub-100-byte output I'm suggesting.
Upvotes: 0
Reputation: 29854
Your architecture looks strange to me: it should not be up to the client (your website visitors) to "load balance" between servers. This is not their concern and, worse, it will not make things much better on your server side, since they will attempt to create the connections anyway and thus generate some load.
You should put some form of load balancing in front of an array of web servers that serve your (HTML) content.
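One possible shape of that, sketched as an nginx config (the backend IPs are hypothetical; HAProxy or a cloud load balancer would do the same job):

# Hypothetical nginx front end: one public entry point spreading
# requests across the web servers that hold the widget HTML.
upstream widget_servers {
    server 10.0.0.11;
    server 10.0.0.12;
}
server {
    listen 80;
    location / {
        proxy_pass http://widget_servers;
    }
}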
The web servers should also be isolated from your "data/main" server using some shared cache to avoid loading the latter. See Memcached for instance.
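A rough sketch of that isolation, assuming Node.js web servers and the npm memcached client (the cache IP and key name are hypothetical); only the main/data server would ever write the cached key:

var http = require('http');
var Memcached = require('memcached');        // npm "memcached" client
var cache = new Memcached('10.0.0.5:11211'); // shared cache on the private network

http.createServer(function(req, res) {
    // Serve the widget straight from the shared cache; the main/data
    // server refreshes the key (e.g. once per second), so web traffic
    // never touches it directly.
    cache.get('widget_html', function(err, html) {
        if (err || !html) {
            res.writeHead(503);
            return res.end('widget temporarily unavailable');
        }
        res.writeHead(200, { 'Content-Type': 'text/html' });
        res.end(html);
    });
}).listen(8080);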
It is hard to get into details in a Stack Overflow answer. Also, the numbers you are citing do not look dramatically high to me, and I have a feeling (and a feeling only) that a minimum number of reasonably sized servers, with caching properly enabled, should easily cope with these values.
Upvotes: 0