Author: Joseluis Laso
Package: PHP Preemptive Cache
Read this article to learn how to apply the Preemptive Cache class in practice to optimize the performance of scripts that retrieve data from databases.
Contents
Introduction
The Preemptive Cache class
How to Use the Class in Our Scripts
Other Uses
Conclusion
Introduction
The first part of this article presented techniques to profile your code and detect slow spots. It also described how caching can often provide great performance improvements in the many cases where repeated access to expensive resources makes application code too slow.
In this part of the article, a solution is presented that uses a caching class to solve a real problem in practice, in this case accessing database records.
The Preemptive Cache class
The Preemptive Cache class was created to fetch any type of data and keep frequently accessed data in memory in a private array with some extra features.
The way the class works is pretty simple. When we create an object, we pass a closure (callback function) with the code that retrieves the data the application needs. The idea of passing a closure is to keep the class generic, so it is independent of the origin of the data.
The class simply stores the retrieved data in memory in a private array, so it becomes immediately available in case the application needs the same data record again.
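To make the idea concrete, here is a minimal sketch of a closure-backed cache, written only to illustrate the concept. It is not the code of the PreEmptiveCache class itself, which adds features such as a limit on the number of cached records, eviction modes and debugging.

<?php
// Simplified illustration only: a cache that delegates misses to a closure.
// The real PreEmptiveCache class adds options such as maxRecordsCached,
// eviction modes and debug output.
class SimpleClosureCache
{
    private $fetcher;           // closure that knows how to load one record
    private $records = array(); // in-memory cache, keyed by record id

    public function __construct(callable $fetcher)
    {
        $this->fetcher = $fetcher;
    }

    public function get($id)
    {
        if (!array_key_exists($id, $this->records)) {
            // Cache miss: run the closure and remember the result.
            $this->records[$id] = call_user_func($this->fetcher, $id);
        }
        return $this->records[$id];
    }
}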
How to Use the Class in Our Scripts
The best way to explain how it works in practice is to show an example. A small example is presented here that applies the concepts presented in the first part of this article.
First of all, this is not a very realistic example. Usually we will not find cases as simple as the one presented here. The main goal of this example is to explain how to use the class in practice and demonstrate the concepts presented in the article.
In real world situations you will find more complex problems that involve intricate relations and conditions.
The idea is to offer a starting point that demonstrates the theory presented so far. You, the reader, as a developer, can certainly fill the gaps and relate this solution to real problems in your own projects, if not now, then at some point in the future.
The base of the example is:
- We have families with a name and ratio value
- We have products with a name and a ratio value, and each product belongs to a family
- We have a lot of data about products and costs
- We need to generate an XML document with a record for each row of the data table, where the computed value is calculated as cost * ratio_product * ratio_family (see the sketch after this list)
- The XML has to summarize the total cost of all the involved records
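To make the required output concrete, here is a rough sketch of the calculation and of the XML document. The row structure and field names used here are assumptions made for illustration, not necessarily the ones used by the sample scripts.

<?php
// Rough sketch of the required output. $dataRows stands for the joined rows of
// the data, product and family tables; the field names are assumed for the example.
$dataRows = array(
    array('product_name' => 'A', 'cost' => 10.0, 'product_ratio' => 1.2, 'family_ratio' => 0.9),
    array('product_name' => 'B', 'cost' => 5.5,  'product_ratio' => 0.8, 'family_ratio' => 1.1),
);

$xml = new SimpleXMLElement('<records/>');
$total = 0.0;
foreach ($dataRows as $row) {
    // Computed value for each record: cost * ratio_product * ratio_family.
    $value = $row['cost'] * $row['product_ratio'] * $row['family_ratio'];
    $record = $xml->addChild('record');
    $record->addChild('product', $row['product_name']);
    $record->addChild('value', (string) $value);
    $total += $value;
}
$xml->addChild('total', (string) $total);
echo $xml->asXML();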
All the example scripts accept the database connection parameters on the command line:
php script.php -h"host" -u"user" -p"password" -d"database"
This is an easy way to avoid needing configuration files and allows running the scripts again and again with different values if needed, so we can focus on the important matters.
The default values for these parameters are: host=localhost, user=root, password="", database=test. You can omit the parameters that match the default values of your MySQL installation.
Obviously it is advisable to use less common credentials, but that is up to you to decide, because this is just an example to demonstrate how the package works.
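As an illustration of how the scripts can read these parameters, here is a minimal sketch using PHP's built-in getopt() function with the defaults mentioned above. It is a sketch of the approach, not necessarily the exact code of the sample scripts.

<?php
// Minimal sketch: read the connection parameters from the command line with
// getopt(), falling back to the defaults mentioned above.
$options  = getopt('h:u:p:d:');
$host     = isset($options['h']) ? $options['h'] : 'localhost';
$user     = isset($options['u']) ? $options['u'] : 'root';
$password = isset($options['p']) ? $options['p'] : '';
$database = isset($options['d']) ? $options['d'] : 'test';

$conn = mysqli_connect($host, $user, $password, $database);
if (!$conn) {
    die('Connection failed: ' . mysqli_connect_error() . PHP_EOL);
}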
Then, the order in which to run things is:
- mysql -hlocalhost -uuser -ppassword test < creation.sql
- php create-sample-data.php -uuser -ppassword -dtest -hlocalhost -n1000 -m10000 -o1000 # n is the number of fake family records, m the number of data records and o the number of products
- time php sample-unoptimized.php -uuser -ppassword -dtest -hlocalhost # write down the number of seconds reported by this command
- time php sample-analysis.php -uuser -ppassword -dtest -hlocalhost # will give us a little analysis of the DB reads
In my tests, in order to obtain interesting values, I have used 100,000 records for the data table, 10,000 products and 1,000 families.
The idea is to try to simulate realistic behaviour. To be honest, reading 1,000 records probably does not need a cache like this.
All the records generated for the data, products and family tables are randomized, so your results and mine may differ for this reason.
The important thing is that you understand the process and that you may encounter a similar problem in the future.
The whole process takes a long time with these premises, so I have implemented a little progress bar to avoid getting desperate while the process runs.
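A small command-line progress bar along these lines can be used for that. This is a generic sketch, not necessarily the implementation used by the sample scripts.

<?php
// Generic sketch of a console progress bar for a long-running loop.
function showProgress($done, $total, $width = 50)
{
    $ratio = $total > 0 ? $done / $total : 0;
    $bars  = (int) round($ratio * $width);
    printf("\r[%s%s] %3d%%",
        str_repeat('=', $bars),
        str_repeat(' ', $width - $bars),
        (int) ($ratio * 100));
    if ($done >= $total) {
        echo PHP_EOL;
    }
}

// Usage inside the main loop:
// foreach ($rows as $i => $row) {
//     // ... process the row ...
//     showProgress($i + 1, count($rows));
// }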
The results of the execution of the unoptimized script were already shown in the first part of this article, including the number of seconds that the script takes to run. When we run the optimized script, we can compare its execution time with that figure.
Keep in mind that if we limit the quantity of records the cache keeps in memory too much, it will most likely increase our execution time instead of decreasing it. In this simple example you can see that we are assigning 1000 records in the cache constructor.
In real life, when the script has to do a lot of things and connect to an external server, the slowdown probably does not matter. In my local environment, with any limit lower than the total number of records to read, the optimized script takes more time than the unoptimized one.
Here is the relevant part of the constructor call:
$familyCache = new PreEmptiveCache(
    function ($id) use ($conn) {
        return mysqli_query(
            $conn,
            sprintf(
                'SELECT `p`.`name` AS `product_name`, `p`.`ratio` AS `product_ratio`, ' .
                '`p`.`family_id` AS `family_id`, `f`.`name` AS `family_name`, `f`.`ratio` AS `family_ratio` ' .
                'FROM `product` AS `p` ' .
                'LEFT JOIN `family` AS `f` ON `p`.`family_id`=`f`.`id` ' .
                'WHERE `p`.`id`= %d',
                $id
            )
        )->fetch_assoc();
    },
    array(
        'maxRecordsCached' => 99999,
        'mode' => PreEmptiveCache::LESS_OLDEST_MODE,
        'debug' => $debug,
    )
);
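Once the cache object is created, the main loop asks the cache for each product record instead of querying the database for every row. The following sketch is only an illustration: the get() accessor, the data table name and its column names are assumptions made for the example, so check the actual method and schema provided with the package and sample scripts.

// Illustrative usage only: get() is an assumed accessor name, and the
// `data` table columns (`product_id`, `cost`) are assumed for the example.
$result = mysqli_query($conn, 'SELECT `product_id`, `cost` FROM `data`');
$total = 0.0;
while ($row = $result->fetch_assoc()) {
    // The cache runs the closure only on a miss; repeated products are served
    // from the in-memory array.
    $product = $familyCache->get($row['product_id']);
    $total += $row['cost'] * $product['product_ratio'] * $product['family_ratio'];
}
echo 'Total: ' . $total . PHP_EOL;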
Other Uses
We can use this simple cache system to improve performance not only when accessing data from a database, but also when accessing a remote service, like weather forecasts, currency exchange rates, the latest tweets, etc.
Any data that can be stored to avoid repeated access to an expensive resource can be handled with this method, improving the performance of the script because we do not need to go back to the data source so frequently. Obviously, we need to be certain that the data does not change while our script is running.
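For instance, the same pattern shown earlier for database records could wrap a remote service call. In this sketch the URL is a placeholder and the get() accessor is again an assumed name used only for illustration.

// Sketch only: the URL is a placeholder and get() is an assumed accessor name.
$rateCache = new PreEmptiveCache(
    function ($currency) {
        // Expensive remote call, executed only when the record is not cached yet.
        $response = file_get_contents('https://example.com/rates/' . urlencode($currency));
        return json_decode($response, true);
    },
    array(
        'maxRecordsCached' => 100,
    )
);

// Repeated lookups for the same currency are served from memory.
$eur = $rateCache->get('EUR');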
Conclusion
In this article we have learned how to increase the performance of scripts by using caching to avoid repeatedly accessing expensive resources that do not change frequently. This is a general solution that can be applied to many performance problems, including ones you may be having in your current projects.
If you liked this article or have questions regarding performance optimization through the caching approach presented here, post a comment on this article.