Reputation: 10219
I have a Drupal site with a lot of calculations and database requests on each page load (running on an Amazon EC2 server). I am curious how my site would hold up if it became popular or in some other way received heavy traffic. Perhaps the most important thing for me is to locate potential bottlenecks in my code.
What are the best tools for stress testing and finding bottlenecks in a Drupal site? Right now I'm not using any cache module. I've read about the MemCached module and some about Varnish.
Anyone who can share their experiences?
Upvotes: 2
Views: 1011
Reputation: 49
Apache Benchmark (ab). Send no cookies to simulate anonymous traffic and stress the Varnish caching layer. Send a random cookie to stress the Drupal caching layer (where you are hopefully using memcache). Send a login cookie to stress the DB layer.
We run massive Drupal sites on PHP. Scaling does cost in resources for multiple web heads, database clustering, and separate memcache and file servers, but you have to balance that cost against hiring developers to refactor your code into a different language, plus the ongoing code maintenance.
Upvotes: 0
Reputation: 2587
Generally, PHP does not do great in really large-scale projects. That is why Google did not support PHP in their App Engine.
Facebook, which was made in PHP, had to compile it to C++ (via HipHop) for better performance.
Having said that, here are some of the tools (I have not used them myself):
http://www.webload.org/ - load testing
http://xdebug.org/ - to profile your PHP code besides debugging
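As a sketch of the Xdebug route: its profiler is switched on through php.ini. The directive names below are for Xdebug 2.x, and the extension path and output directory are placeholders; check your own install.

```ini
; php.ini - enable Xdebug's profiler (Xdebug 2.x directive names)
zend_extension=xdebug.so
xdebug.profiler_enable = 0
; profile only requests that carry XDEBUG_PROFILE (as GET/POST param or cookie)
xdebug.profiler_enable_trigger = 1
xdebug.profiler_output_dir = /tmp/xdebug
```

Each profiled request writes a cachegrind file to the output directory, which you can open in KCachegrind or Webgrind to see where your Drupal page loads actually spend their time.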
cheers, Vishal
Upvotes: 1