We had a need from a client who wanted to check certain complex conditions from a relatively big decision matrix. Without going into specifics, they wanted to check a progressive set of rules for users, and to take certain actions when all the conditions were met.

Checking one or two conditions is not a problem on a modern-day web site, but because the rules are progressive and all have to be checked from the start every time, the work involves a lot of processing, a lot of database queries, and basically a lot of time.

So, if a user triggers one of the conditions, the page would take some time to respond back to the user. Obviously not something we would like our users to experience.

That means we had to think of a way to delay the processing somehow. We thought of several options, which we will discuss here and explain the pros and cons of each.

1. Do the processing on the spot, immediately.

The pro of this approach is that the user sees the results immediately (e.g. gaining user points).

The con is that the page may take several seconds to respond back. An impatient user may repeat the action one more time, which would cause problems.

So, we decided against this option.

2. Do the processing at the end of the page.

For this option, we stored the user IDs in a static variable that holds an array of users, and delayed the processing to the end of the page request using hook_exit().

The code to do this is as follows:

function mymodule_nodeapi(&$node, $op, $a3 = NULL, $a4 = NULL) {
  if ($op == 'insert') {
    mymodule_schedule($node->uid);
  }
}

function mymodule_schedule($arg) {
  static $users = array();
  if ($arg == 'get') {
    return $users;
  }
  // Store the user ID in the static array, keyed by uid to avoid duplicates.
  $users[$arg] = $arg;
}

function mymodule_exit() {
  // Runs after the page has been delivered to the browser.
  foreach (mymodule_schedule('get') as $uid => $x) {
    mymodule_process($uid);
  }
}

function mymodule_process($uid) {
  // Process the conditions for $uid here.
}

What this does is delay the "heavy" processing until after the page has displayed its results to the browser, and hook_exit() is called.
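Outside of Drupal, plain PHP provides the same "run it after the response" capability via register_shutdown_function(). The following is a framework-agnostic sketch of the pattern above; the example_* function names and the sample user IDs are illustrative, not part of any API.

```php
<?php
// Sketch of deferred processing in plain PHP. register_shutdown_function()
// plays the role Drupal's hook_exit() plays in the module code above:
// the callback runs after the script's normal output is complete.

function example_schedule($uid = NULL) {
  static $users = array();
  if ($uid === NULL) {
    // Called with no argument: return the collected user IDs.
    return $users;
  }
  // Keying by $uid deduplicates repeat requests for the same user.
  $users[$uid] = $uid;
}

function example_process_all() {
  foreach (example_schedule() as $uid) {
    // The "heavy" per-user processing would go here.
    echo "processing user $uid\n";
  }
}

// Collect user IDs cheaply during the request ...
example_schedule(5);
example_schedule(9);
example_schedule(5); // Duplicate; stored only once.

// ... and defer the heavy work until after the page is delivered.
register_shutdown_function('example_process_all');
```

The same caveat applies as with hook_exit(): the scheduled IDs live only in memory for the duration of the request.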

The pro is that the processing is now delayed and the user experience is not affected. Also, the results are near-immediate.

The con is that we are using memory to hold the user IDs to process. If the page request does not complete for any reason, that information in memory is lost, and it cannot be reprocessed.

3. Do the processing at cron.

The third option is to delay the processing until cron is being run.

The pro is that load on the server is minimized, and the user experience is unaffected.

The con is that the results are delayed until cron runs, which can be up to an hour on most sites.

The ideal helper module for this is a little-known gem: job_queue.

In a previous post we described how to reduce page load times by sending emails from cron rather than on page submission.

Here is the code to do the same using the job_queue module:

function mymodule_nodeapi(&$node, $op, $a3 = NULL, $a4 = NULL) {
  if ($op == 'insert') {
    // Queue the processing so it runs at the next cron run.
    job_queue_add('mymodule_process', 'Process matrix', array($node->uid), '', TRUE);
  }
}

function mymodule_process($uid) {
  // Process the conditions for $uid here.
}

The TRUE argument prevents adding duplicate requests to process the same user ID.
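Conceptually, a "no duplicates" flag like this just checks whether an identical job is already queued before adding another. Here is a minimal plain-PHP sketch of that idea; the ExampleJobQueue class is purely illustrative and is not the job_queue module's actual implementation (which stores jobs in a database table and runs them at cron).

```php
<?php
// Illustrative in-memory job queue with a no-duplicates flag, similar in
// spirit to the last argument of job_queue_add(). Not real module code.

class ExampleJobQueue {
  private $jobs = array();

  public function add($function, array $args, $no_duplicate = FALSE) {
    if ($no_duplicate) {
      // Skip the job if an identical one is already queued.
      foreach ($this->jobs as $job) {
        if ($job === array($function, $args)) {
          return FALSE;
        }
      }
    }
    $this->jobs[] = array($function, $args);
    return TRUE;
  }

  // Run and clear all queued jobs; on a real site this happens at cron.
  public function run() {
    foreach ($this->jobs as $job) {
      call_user_func_array($job[0], $job[1]);
    }
    $this->jobs = array();
  }

  public function count() {
    return count($this->jobs);
  }
}
```

With the flag set, queueing the same function with the same arguments twice results in only one job, which is exactly what we want when an impatient user repeats an action.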

Is your Drupal or Backdrop CMS site slow?
Is it suffering from server resources shortages?
Is it experiencing outages?
Contact us for Drupal or Backdrop CMS Performance Optimization and Tuning Consulting