
Overcoming long Views rendering time on Drupal sites

A client contacted us to assist them in finding a solution for slow page load times on their site.

All the pages of the site were slow, taking 2.9 to 3.3 seconds each.

Upon investigation, we found that one view was responsible for most of that time.

However, the query execution itself was fast, around 11 ms.

Yet the Views rendering time was obscenely high: 2,603.48 ms!

So, when editing the view, you would see this at the bottom:

Query build time        2.07 ms
Query execute time     11.32 ms
View render time    2,603.48 ms
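To put the figures above in perspective, a quick calculation (using only the numbers from the edit screen) shows just how dominant the rendering phase was:

```python
# The Views statistics above, in milliseconds (from the view edit screen).
query_build = 2.07
query_execute = 11.32
view_render = 2603.48

total = query_build + query_execute + view_render
share = view_render / total * 100
print(f"Rendering accounts for {share:.1f}% of the view's time")  # 99.5%
```

So the query itself was harmless; virtually all the time was spent rendering the results.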

Google Crawler hitting your site too aggressively?

If your Drupal site suffers occasional slowdowns or outages, check if crawlers are hitting your site too hard.

We've seen several clients complain, and upon investigation we found that the culprit was Google's own crawler.

The telltale sign is that you will see lots of queries executing with high numbers in the LIMIT clause. Depending on your site's specifics, these queries can be slow queries too.

This means that crawlers are accessing very old content (hundreds of pages back).
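As an illustration of what to look for, here is a minimal sketch that flags queries with suspiciously high LIMIT offsets. The log lines, table names, and threshold below are made up for the example, not taken from any client's site:

```python
import re

# Hypothetical excerpt in the style of a MySQL query log; the queries
# themselves are invented for illustration.
query_log = """
SELECT node.nid FROM node ORDER BY node.created DESC LIMIT 0, 10;
SELECT node.nid FROM node ORDER BY node.created DESC LIMIT 4980, 10;
SELECT node.nid FROM node ORDER BY node.created DESC LIMIT 12340, 10;
"""

# MySQL's "LIMIT offset, count" form: a large offset means something is
# paging hundreds of pages deep into old content.
LIMIT_RE = re.compile(r"LIMIT\s+(\d+)\s*,\s*\d+", re.IGNORECASE)

def high_offset_queries(log_text, threshold=1000):
    """Return the LIMIT offsets that meet or exceed the threshold."""
    return [int(m.group(1))
            for m in LIMIT_RE.finditer(log_text)
            if int(m.group(1)) >= threshold]

print(high_offset_queries(query_log))  # [4980, 12340]
```

Offsets in the thousands, repeated across many queries, usually point at a crawler walking your pagers rather than at real visitors.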

Here is an example from a recent client:

Avoid excessive disk writes by avoiding PHP errors in your code

Recently a client complained that their site had been suffering from slowdowns that were not there before.

Upon investigating the history of resource utilization, we found that memory usage had increased, and at the same time the load average had increased as well.

To cut a long story short, we found out that initially the site had neither the dblog nor the syslog module enabled. The issues started when syslog was enabled.
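To see why this matters, it helps to measure how much of the log volume is PHP notices and warnings rather than real traffic: every such line is a disk write. The excerpt and its pipe-delimited format below are simplified and invented for illustration:

```python
# Hypothetical syslog excerpt; the messages are made up for illustration.
syslog_excerpt = """\
Jan 10 12:00:01 web1 drupal: http://example.com|php|Notice: Undefined index: foo in template.php
Jan 10 12:00:01 web1 drupal: http://example.com|php|Notice: Undefined index: bar in template.php
Jan 10 12:00:02 web1 drupal: http://example.com|access|node/123
Jan 10 12:00:02 web1 drupal: http://example.com|php|Warning: Invalid argument in theme.inc
"""

def php_error_ratio(log_text):
    """Count how many log lines are PHP errors versus all entries."""
    entries = [line for line in log_text.splitlines() if line.strip()]
    errors = [line for line in entries if "|php|" in line]
    return len(errors), len(entries)

errors, total = php_error_ratio(syslog_excerpt)
print(f"{errors} of {total} syslog lines are PHP errors")  # 3 of 4
```

When most of the log is PHP errors from your own code, fixing those errors eliminates the writes at the source, rather than just disabling logging.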

Remember that the CAPTCHA module disables page caching

By design, the Drupal CAPTCHA module disables page caching for pages it is enabled on.

So if you enable CAPTCHA for user login and/or registration forms, those pages will not be cached. This is often acceptable.

However, if you enable CAPTCHA for comments, and have the comment form visible at the bottom of each node, then a large portion of your site's pages will not be page cached at all.

Do not configure the Service Links module with TinyURL.com

We recently conducted a performance assessment for a client, and the main problem was something really simple, but also very detrimental to the site.

The site used the Service Links module, version 6.x-2.x.

The following are the performance figures for the site, in seconds, before any tuning.

 3.67 http://example.com/product/215
 2.64 http://example.com/product/572
68.32 http://example.com/list1
65.11 http://example.com/list2
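A back-of-envelope calculation shows how listing pages can balloon like this when each rendered teaser triggers a synchronous call to an external URL-shortening service. The per-call latency and node count below are assumptions chosen for illustration, not measurements from the client's site:

```python
# If a listing page renders N teasers and each one makes a blocking HTTP
# call to an external shortener, page time grows linearly with N.
def page_time(base_seconds, nodes_per_page, seconds_per_api_call):
    return base_seconds + nodes_per_page * seconds_per_api_call

# e.g. a 2.6 s base page plus 100 teasers at ~0.65 s per external call
print(round(page_time(2.6, 100, 0.65), 2))  # 67.6
```

A few hundred milliseconds per call is invisible on a single node page, but multiplied across every teaser on a listing page it adds up to a minute or more.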

How Google and Bing crawlers were confused by quicktabs

Quick Tabs is a widely used Drupal module. Site builders like it because it improves usability in some cases by reducing clutter.

Incidentally, the way this module works has caused us to run into performance issues with certain uses. See our previous article on how Quick Tabs could use more caching, and a case study involving Quick Tabs.

Identifying aggressive crawlers using Go Access

Aggressive crawlers that hit your web site a lot can cause performance problems.

There are many ways to identify aggressive crawlers, including writing custom scripts that analyze your web server logs.
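A minimal sketch of such a custom script is shown below: it tallies hits per user agent from an access log in Apache combined format. The log lines here are invented for illustration:

```python
import re
from collections import Counter

# Made-up lines in Apache combined log format, for illustration only.
access_log = '''\
1.2.3.4 - - [10/Jan/2012:12:00:01 +0000] "GET /node/1 HTTP/1.1" 200 1234 "-" "Googlebot/2.1"
1.2.3.4 - - [10/Jan/2012:12:00:02 +0000] "GET /node/2 HTTP/1.1" 200 1234 "-" "Googlebot/2.1"
5.6.7.8 - - [10/Jan/2012:12:00:03 +0000] "GET /node/1 HTTP/1.1" 200 1234 "-" "Mozilla/5.0"
1.2.3.4 - - [10/Jan/2012:12:00:04 +0000] "GET /node/3 HTTP/1.1" 200 1234 "-" "Googlebot/2.1"
'''

# The user agent is the last double-quoted field on each line.
UA_RE = re.compile(r'"([^"]*)"\s*$')

def hits_per_agent(log_text):
    """Count requests per user agent string."""
    counts = Counter()
    for line in log_text.splitlines():
        m = UA_RE.search(line)
        if m:
            counts[m.group(1)] += 1
    return counts

print(hits_per_agent(access_log).most_common())
```

A user agent that dwarfs everything else in this tally is your aggressive crawler, and tools like Go Access give you the same breakdown without writing any code.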

One tool that we found to be useful in analyzing which crawlers hit the site the most today or yesterday is Go Access.

Getting Go Access

Go Access is available in Ubuntu Natty Narwhal (11.04), but not in earlier LTS releases.


Is your Drupal or Backdrop CMS site slow?
Is it suffering from server resources shortages?
Is it experiencing outages?
Contact us for Drupal or Backdrop CMS Performance Optimization and Tuning Consulting