RachelD

Reputation: 4089

How to process massive data-sets and provide a live user experience

I am a programmer at an internet marketing company that primarily makes tools. These tools have certain requirements:

Currently we are writing in PHP, MySQL and Ajax.

My question is: how do I process LARGE quantities of data and still provide a live user experience while the tool is running? Currently I use a custom queue system that sends Ajax calls and inserts rows into tables or data into divs.

This method is a huge pain in the ass and couldn't possibly be the correct approach. Should I be using a templating system, or is there a better way to refresh chunks of the page with A LOT of data? And I really mean a lot of data, because we come close to maxing out PHP's memory limit, which is something we always have to watch out for.

Also, I would love to make it so these tools could run on the server by themselves: upload a .csv, close the browser window, and then have an email sent to the user when the tool is done.

Does anyone have any methods (programming standards) for me that are better than using .ajax() calls? Thank you.


I wanted to update with some notes in case anyone has the same question. I am looking into the following to see which is the best solution:

These are in no particular order, and the one I choose will be based on what works for my issue and what can be used by the rest of my department. I will update when I pick the golden framework.

Upvotes: 2

Views: 814

Answers (3)

Nick Zinger

Reputation: 1174

I think you can run what you need in the background with some kind of queue manager. I use something similar with CakePHP, and it lets me run time-intensive processes in the background asynchronously, so the browser does not need to stay open.

Another plus is that this approach scales well, since it's easy to increase the number of queue workers running.

Basically, with PHP you just need a cron job that runs every once in a while and starts a worker; the worker checks a queue table in the database for pending tasks. If none are found, it keeps polling in a loop until one shows up.
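For illustration, here is a minimal sketch of that cron-driven worker, assuming a hypothetical `jobs` table (columns `id`, `payload`, `status`) and a placeholder `process_csv()` function for the actual work:

    <?php
    // worker.php - started by cron, e.g.: * * * * * php /path/to/worker.php
    $db = new PDO('mysql:host=localhost;dbname=tools', 'user', 'pass');

    while (true) {
        // Grab the oldest pending job, if any.
        $job = $db->query("SELECT id, payload FROM jobs WHERE status = 'pending' ORDER BY id LIMIT 1")
                  ->fetch(PDO::FETCH_ASSOC);

        if ($job === false) {
            sleep(5);      // nothing queued yet; wait and check again
            continue;
        }

        $db->prepare("UPDATE jobs SET status = 'running' WHERE id = ?")->execute([$job['id']]);

        process_csv($job['payload']);   // the long-running work goes here

        $db->prepare("UPDATE jobs SET status = 'done' WHERE id = ?")->execute([$job['id']]);

        // Let the user know it finished, so the browser never has to stay open.
        mail('user@example.com', 'Your import is finished', 'Job ' . $job['id'] . ' is done.');
    }

In practice you would also want to limit how long each worker lives and guard against two workers grabbing the same job (for example with a locking UPDATE), but the basic shape is just this loop.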

Upvotes: 1

Strixy

Reputation: 578

Since you're already kind of pinched for time, migrating to Node.js may not be something you can do right away. It would, however, help with notifying users when their results are ready, since it can push notifications to the browser without polling. And because it uses JavaScript, you might find some of your client-side code is reusable.

Upvotes: 1

seferov

Reputation: 4161

First of all, you cannot handle big data via Ajax alone. To let users watch the process live, you can use WebSockets. As you are experienced in PHP, I can suggest Ratchet, which is quite new.
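For reference, a minimal Ratchet push server looks roughly like this (a sketch based on Ratchet's basic server setup; the `Progress` class name and port 8080 are placeholders):

    <?php
    // composer require cboden/ratchet
    require 'vendor/autoload.php';

    use Ratchet\MessageComponentInterface;
    use Ratchet\ConnectionInterface;
    use Ratchet\Server\IoServer;
    use Ratchet\Http\HttpServer;
    use Ratchet\WebSocket\WsServer;

    // Relays every message it receives to all connected browsers,
    // so a background process can push progress updates live.
    class Progress implements MessageComponentInterface {
        protected $clients;

        public function __construct() {
            $this->clients = new \SplObjectStorage;
        }

        public function onOpen(ConnectionInterface $conn) {
            $this->clients->attach($conn);
        }

        public function onMessage(ConnectionInterface $from, $msg) {
            foreach ($this->clients as $client) {
                $client->send($msg);   // forward progress to every open page
            }
        }

        public function onClose(ConnectionInterface $conn) {
            $this->clients->detach($conn);
        }

        public function onError(ConnectionInterface $conn, \Exception $e) {
            $conn->close();
        }
    }

    $server = IoServer::factory(new HttpServer(new WsServer(new Progress())), 8080);
    $server->run();

The browser then connects with plain JavaScript (new WebSocket('ws://yourhost:8080')) and receives updates without any polling.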

On the other hand, for heavy calculations and storing big data, I would use a NoSQL database instead of MySQL.
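As a rough illustration of what that swap looks like from PHP, here is a sketch using the mongodb/mongodb Composer library (the database, collection, and field names are made up):

    <?php
    require 'vendor/autoload.php';

    // Store processed rows as documents instead of MySQL rows.
    $client  = new MongoDB\Client('mongodb://localhost:27017');
    $results = $client->tools->results;

    $results->insertOne([
        'job_id'     => 42,
        'row'        => ['keyword' => 'example', 'volume' => 1200],
        'created_at' => new MongoDB\BSON\UTCDateTime(),
    ]);

No schema migration is needed when the shape of the data changes, which is part of the appeal for this kind of workload.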

Upvotes: 1
