What is the best approach to limiting database traffic on a large query?
I have a database that is updated remotely every few hours (not on a fixed schedule), and I have no control over its administration. I have web clients connecting to it to view the information it contains. These clients (coded using PHP and JavaScript) may check the database frequently (impatient users) even when nothing in it has changed, and each check involves a rather lengthy query with many lookups, cross-references, etc.
So, to cut down on database load, what would be the best way to limit the number of times clients actually run the full query against the database?
To make things crystal clear: I cannot change the database, I can only query it. But I have full control over the source of the web client.
I would look at Zend_Cache.
Each query is executed against the database only once per cache lifetime (which is easy to configure). Sessions are not a viable solution here, because with per-session storage the same query would still run at least once for every user; a shared cache is populated once for everyone and can also be easily refreshed.
You check whether the result set already exists in the cache; if not, you run the query and cache the result set under a unique identifier:
// Use a hash of the SQL string as the cache ID
$query = "SELECT * FROM really_big_table";
$cache_query = md5($query);

// Only hit the database when there is no cached copy (or it has expired)
if (!($result = $cache->load($cache_query))) {
    $result = $db->fetchAll($query);
    $cache->save($result, $cache_query);
}
By using the cache, you control the interval at which the data is refreshed for all users.
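For completeness, here is a minimal sketch of how such a cache could be set up using Zend_Cache's File backend. The lifetime and cache directory are assumptions; you would tune the lifetime to roughly match how often the remote database is actually updated:

```php
<?php
// Frontend options: how long entries live and how they are stored
$frontendOptions = array(
    'lifetime' => 3600,               // assumed: refresh at most once per hour
    'automatic_serialization' => true // lets you cache the result-set array directly
);

// Backend options: where the File backend writes cache entries
$backendOptions = array(
    'cache_dir' => '/tmp/query_cache/' // assumed: any writable directory
);

$cache = Zend_Cache::factory('Core', 'File', $frontendOptions, $backendOptions);
```

The resulting `$cache` object is what the `load()`/`save()` calls above operate on; every client shares the same cached result until the lifetime expires.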