A few months ago, we saw something strange at a client. They were seeing slow cache queries, such as the following:

2507.25 1 cache_get SELECT data, created, headers, expire FROM cache_menu WHERE cid = '1:en'
1303.68 1 cache_get SELECT data, created, headers, expire FROM cache WHERE cid = 'content_type_info' 

They run MySQL on a separate box from the one running PHP. Running the same SQL locally on the MySQL box did not show the slow performance seen when running it from the PHP box.

Upon closer investigation, we found that the link between the boxes had negotiated a speed of only 10 Mbps, instead of the full 1000 Mbps (gigabit) it was capable of. Once the servers were configured for the proper speed, performance improved considerably.

What is interesting is that MySQL offers a way to work around such a problem at the software level: a client flag for PHP's mysql_connect() function that compresses the data sent over the wire. See MySQL Performance Blog: Large result sets vs. compression protocol.
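As a sketch, enabling the compressed client/server protocol from PHP looks like the following. The host name and credentials are placeholders; MYSQL_CLIENT_COMPRESS is the real PHP constant for this flag.

```php
<?php
// Connect to a remote MySQL server with the compressed protocol enabled.
// The fifth argument to mysql_connect() is a bitmask of client flags;
// MYSQL_CLIENT_COMPRESS asks the client library and server to compress
// traffic on the wire. Host and credentials below are placeholders.
$link = mysql_connect(
  'db.example.com',      // remote MySQL box
  'drupal_user',
  'secret',
  FALSE,                 // $new_link: reuse an existing connection if any
  MYSQL_CLIENT_COMPRESS  // enable protocol-level compression
);

if (!$link) {
  die('Could not connect: ' . mysql_error());
}
?>
```

Compression trades CPU for bandwidth, so it helps most when result sets are large and the link between the boxes is slow.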

This was also reported for Drupal more than three years ago in issue #11891. So, I created a patch for Drupal 7 (it applies to Drupal 6 RC1 as well) that you can download and apply. Ideally, this would be a settings.php option that large sites spanning more than one box could turn on.
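A hypothetical settings.php switch for this might look like the fragment below. The variable name $conf['mysql_client_compress'] is purely an illustration of the idea, not something defined by Drupal core or by the patch.

```php
<?php
// Hypothetical example only: neither Drupal core nor the patch defines
// this variable name. The idea is a per-site switch, read at connect
// time, that adds MYSQL_CLIENT_COMPRESS to the client flags when the
// database lives on another box.
$conf['mysql_client_compress'] = TRUE;
?>
```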

Comments

Thu, 2009/04/09 - 04:14

That's very good information.
