Is there a way to store a 200MB immutable data structure in memory and access it from within a script?
The interesting thing about this question is the number of directions you can go.
I'm not sure if caching is your best option simply because of the large dataset and the relatively low number of requests on it. Here are some ideas.
1) Create a RAM disk (e.g. a tmpfs mount) and point your MySQL table at the ramdisk partition. I've never tried this, but it would be fun to try.
2) Linux filesystems are usually very fast. Create a structured directory layout that splits records into files, then simply call file_get_contents() or file_exists(). Of course, this solution requires you to create and maintain that layout, which is also fun. rsync can help keep it up to date.
Example:
/002/209/001/299.txt
<?php
// build_file_from_ip() maps the request IP to a path such as /002/209/001/299.txt
$file = build_file_from_ip($_GET['ip']);
if (file_exists($file)) {
    // Execute your code.
}
?>
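For illustration, here is one possible build_file_from_ip() helper. The zero-padded-octet mapping and the base directory are assumptions inferred from the sample path above, not a definitive implementation:

```php
<?php
// Hypothetical helper: map an IPv4 address to a nested path by
// zero-padding each octet, e.g. "2.209.1.99" -> "/002/209/001/099.txt".
// The $base directory is an assumption for this sketch.
function build_file_from_ip(string $ip, string $base = '/var/data/records'): string {
    $parts = array_map(
        fn($octet) => str_pad($octet, 3, '0', STR_PAD_LEFT),
        explode('.', $ip)
    );
    // The first three octets become directories, the last one the filename.
    $file = array_pop($parts) . '.txt';
    return $base . '/' . implode('/', $parts) . '/' . $file;
}
```

Keeping each directory to at most 1000 entries this way avoids the slowdown some filesystems hit with very large flat directories.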
Have you tried a NoSQL solution like Redis? The entire dataset is kept in memory.
Here are some benchmarks.
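As a rough sketch of how that could look with the phpredis extension (a local Redis server, the `record:<ip>` key scheme, and the function names are all assumptions for illustration):

```php
<?php
// One-time bulk load of the immutable records into Redis.
// Pipelining batches the writes so the load isn't bound by round-trips.
function load_dataset(Redis $redis, array $records): void {
    $pipe = $redis->multi(Redis::PIPELINE);
    foreach ($records as $ip => $payload) {
        $pipe->set("record:$ip", $payload); // hypothetical key scheme
    }
    $pipe->exec();
}

// Per-request lookup: a single in-memory GET instead of a disk read.
function lookup_record(Redis $redis, string $ip): ?string {
    $value = $redis->get("record:$ip");
    return $value === false ? null : $value;
}
```

Since the data is immutable, you load it once and every request after that is a constant-time in-memory lookup.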