Informative articles on various technologies ...
A client contacted us to help find a solution for slow page load times on their site.
All the pages of the site were slow, taking 2.9 to 3.3 seconds to load.
Upon investigation, we found that one view was responsible for most of that time.
However, the query execution itself was fast, around 11 ms.
The view's rendering time, on the other hand, was obscenely high: 2,603.48 ms!
So, when editing the view, you would see this at the bottom:
Query build time 2.07 ms
Query execute time 11.32 ms
View render time 2,603.48 ms
If your Drupal site suffers occasional slow downs or outages, check if crawlers are hitting your site too hard.
We've seen several clients complain, and upon investigation we found that the culprit is Google's own crawler.
The telltale sign is that you will see lots of queries executing with high offsets in the LIMIT clause. Depending on your site's specifics, these may be slow queries as well.
This means that crawlers are accessing very old content, hundreds of pages back in the pager.
Here is an example from a recent client:
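The client's actual log is not reproduced here, but the pattern is easy to spot programmatically. The following sketch (with fabricated sample queries, and an arbitrary offset threshold) flags queries whose LIMIT offset is suspiciously high:

```python
import re

# Hypothetical query log lines; the large LIMIT offset is the telltale sign
# of a crawler paging hundreds of pages deep into old content.
queries = [
    "SELECT n.nid, n.title FROM node n WHERE n.status = 1 ORDER BY n.created DESC LIMIT 0, 10",
    "SELECT n.nid, n.title FROM node n WHERE n.status = 1 ORDER BY n.created DESC LIMIT 29740, 10",
]

# Match "LIMIT <offset>, <count>" and capture the offset.
LIMIT_RE = re.compile(r"LIMIT\s+(\d+)\s*,\s*\d+", re.IGNORECASE)

def high_offset_queries(lines, threshold=1000):
    """Return (offset, query) pairs whose LIMIT offset exceeds the threshold."""
    flagged = []
    for q in lines:
        m = LIMIT_RE.search(q)
        if m and int(m.group(1)) > threshold:
            flagged.append((int(m.group(1)), q))
    return flagged

for offset, q in high_offset_queries(queries):
    print(f"offset {offset}: {q[:60]}...")
```

A real deployment would feed this the MySQL slow query log rather than an in-memory list.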
Khalid of 2bits.com, Inc. was interviewed by Modules Unraveled on Drupal Performance.
Recently a client complained that their site had been suffering from slowdowns that were not there before.
Upon investigating the history of resource utilization, we found that memory usage has increased, and at the same time, the load average has increased as well.
To cut a long story short: initially, the site had neither the dblog nor the syslog module enabled. Both were enabled later, and the slowdowns started when syslog was enabled.
Together with Alan Dixon of Black Fly Solutions, Khalid Baheyeldin of 2bits.com, Inc. gave a presentation on Web Site Performance, Optimization and Scalability at Drupal Camp 2011.
The slides from the presentation are attached below.
By design, the Drupal CAPTCHA module disables page caching for pages it is enabled on.
So if you enable CAPTCHA for user login and/or registration forms, those pages will not be cached. This is often acceptable.
However, if you enable CAPTCHA for comments, and have the comment form visible at the bottom of each node, then a big portion of your site's pages will not be cached in the page cache at all.
We recently conducted a performance assessment for a client, and the main problem was something really simple, but also very detrimental to the site.
The site used the Service Links module, version 6.x-2.x.
The following are the performance figures for the site, in seconds, before any tuning.
Quick Tabs is a widely used Drupal module. Site builders like it because it improves usability in some cases by reducing clutter.
Incidentally, the way this module works has caused us to run into performance issues in certain uses. See our previous articles on how Quick Tabs can sure use more caching, and a case study involving Quick Tabs.
Aggressive crawlers that hit your web site a lot can cause performance problems.
There are many ways to identify aggressive crawlers, including writing custom scripts that analyze your web server logs.
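One such custom script can be sketched as follows, assuming the common Apache/Nginx "combined" log format, where the user agent is the last quoted field (the sample log lines below are fabricated):

```python
import re
from collections import Counter

# Hypothetical access log lines in "combined" format.
log_lines = [
    '66.249.66.1 - - [10/May/2012:06:25:01 +0000] "GET /node?page=2974 HTTP/1.1" 200 8123 "-" "Googlebot/2.1 (+http://www.google.com/bot.html)"',
    '66.249.66.1 - - [10/May/2012:06:25:02 +0000] "GET /node?page=2975 HTTP/1.1" 200 8101 "-" "Googlebot/2.1 (+http://www.google.com/bot.html)"',
    '10.0.0.5 - - [10/May/2012:06:25:03 +0000] "GET / HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (X11; Linux x86_64)"',
]

# The user agent is the last double-quoted field on the line.
UA_RE = re.compile(r'"([^"]*)"\s*$')

def hits_per_agent(lines):
    """Count requests per user-agent string."""
    counts = Counter()
    for line in lines:
        m = UA_RE.search(line)
        if m:
            counts[m.group(1)] += 1
    return counts

# Print agents, busiest first.
for agent, hits in hits_per_agent(log_lines).most_common():
    print(f"{hits:6d}  {agent}")
```

Run against a day's worth of real logs, the top few entries quickly show whether a single crawler dominates your traffic.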
One tool that we found useful for analyzing which crawlers hit the site the most today or yesterday is GoAccess.
GoAccess is available for Ubuntu Natty Narwhal (11.04) only, not for earlier LTS releases.
Earlier today, I presented at DrupalCamp Toronto on serving 3.4 million page views a day (92 million per month) with one server and Drupal.
As promised, the slides are attached, as a PDF for everyone's reference.
Is your Drupal, Backdrop CMS or WordPress site slow?
Is it suffering from server resources shortages?
Is it experiencing outages?
Contact us for Drupal, Backdrop CMS and WordPress Performance Optimization and Tuning Consulting
Do you use any of our Drupal modules?
Did you find our Drupal, WordPress, and LAMP performance articles informative?
Follow us on Twitter @2bits for tips and tricks on Drupal and WordPress Performance.