Is there a way to store a 200MB immutable data structure in memory and access it from within a script?

I have a list of 9 million IP addresses and, with a set of hash tables, I can build a constant-time function that returns whether a particular IP address is on that list. Can I do this in PHP? If so, how?
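Roughly what I have in mind, as a sketch (ips.txt is just a stand-in for wherever the real list lives):

<?php
// Load the list once into an associative array, then isset() gives an
// O(1) membership test. The catch is how much memory 9 million array
// keys need in PHP, hence the question.
$set = array();
foreach (file('ips.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES) as $ip) {
    $set[$ip] = true;
}
var_dump(isset($set['192.168.1.1']));
?>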

+2




5 answers


The interesting thing about this question is the number of directions you can take it.

I'm not sure caching is your best option, simply because of the large dataset and the relatively low number of requests against it. Here are some ideas.

1) Create a ramdisk. Point your MySQL table at the ramdisk partition. I've never tried it, but it would be fun to try.



2) Linux generally has a very fast filesystem. Create a structured directory tree that splits the records into files, then simply call file_get_contents() or file_exists(). Of course, this solution requires you to create and maintain that file tree, which is also fun. rsync can help keep it up to date.

Example:

/002/209/001/299.txt

<?php
// Map an IP such as 2.209.1.299 to the path /002/209/001/299.txt
function build_file_from_ip($ip) {
    $octets = array_map(function ($o) { return sprintf('%03d', $o); }, explode('.', $ip));
    return '/' . implode('/', $octets) . '.txt';
}

if (file_exists(build_file_from_ip($_GET['ip']))) {
    // The IP is on the list; execute your code.
}
?>

      

+2




This sounds like the perfect application for a Bloom filter to me. Check out the links; they can help you get it done quickly.
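To make the idea concrete, here is a minimal, untuned sketch of a Bloom filter in plain PHP. The sizing numbers and the crc32 double-hashing scheme are my own assumptions, not something from the links:

<?php
// Bloom filter sketch. For n = 9M entries at a ~1% false-positive rate,
// roughly m = 86M bits (~11 MB) and k = 7 hash functions are reasonable.
class BloomFilter
{
    private $bits;
    private $m;
    private $k;

    public function __construct($m, $k)
    {
        $this->m = $m;
        $this->k = $k;
        $this->bits = str_repeat("\0", (int) ceil($m / 8));
    }

    // Derive k bit positions via double hashing: h_i = h1 + i * h2 (mod m).
    private function positions($key)
    {
        $h1 = crc32($key);
        $h2 = crc32(strrev($key)) | 1; // force odd
        $pos = array();
        for ($i = 0; $i < $this->k; $i++) {
            $pos[] = abs($h1 + $i * $h2) % $this->m;
        }
        return $pos;
    }

    public function add($key)
    {
        foreach ($this->positions($key) as $p) {
            $byte = $p >> 3;
            $this->bits[$byte] = chr(ord($this->bits[$byte]) | (1 << ($p & 7)));
        }
    }

    // False positives are possible; false negatives are not.
    public function mightContain($key)
    {
        foreach ($this->positions($key) as $p) {
            if ((ord($this->bits[$p >> 3]) & (1 << ($p & 7))) === 0) {
                return false;
            }
        }
        return true;
    }
}

$bf = new BloomFilter(86000000, 7);
$bf->add('192.168.1.1');
var_dump($bf->mightContain('192.168.1.1')); // bool(true)
var_dump($bf->mightContain('10.0.0.1'));    // almost certainly bool(false)
?>

At a ~1% false-positive rate you would re-check the rare hits against the authoritative list; the filter itself fits in roughly 11 MB instead of 200 MB.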



+4




I think throwing it into memcache will probably be your best/fastest method.
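A minimal sketch of that, assuming the pecl Memcached extension and a local server (the "ip:" key prefix is my own convention):

<?php
// Store each listed IP under its own key; a lookup is then a single get().
$mc = new Memcached();
$mc->addServer('127.0.0.1', 11211);

// One-time load (loop this over all 9 million addresses):
$mc->set('ip:192.168.1.1', 1);

// Constant-time check per request:
$onList = ($mc->get('ip:' . $_GET['ip']) !== false);
?>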

+2




If loading the data into SQLite is an option, you could take advantage of indexes and thereby speed up the lookups.
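A quick sketch of what that could look like with PDO (the table and column names, ip_list and ip, are just placeholders):

<?php
// PRIMARY KEY creates an index, so the membership test is an indexed lookup.
$db = new PDO('sqlite:/path/to/ips.sqlite');
$db->exec('CREATE TABLE IF NOT EXISTS ip_list (ip TEXT PRIMARY KEY)');

$stmt = $db->prepare('SELECT 1 FROM ip_list WHERE ip = ? LIMIT 1');
$stmt->execute(array($_GET['ip']));
$onList = (bool) $stmt->fetchColumn();
?>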

Otherwise memcached is an option, but I don't know how fast the existence check would be if you did it as a pure PHP lookup (pretty slow is my guess).

+2




Have you tried a NoSQL solution like Redis? The entire dataset is kept in memory.
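For example, a sketch using the phpredis extension: keep the whole list in a Redis set, where SISMEMBER is an O(1) membership test (the key name blocked_ips is my own):

<?php
$redis = new Redis();
$redis->connect('127.0.0.1', 6379);

// One-time load:
$redis->sAdd('blocked_ips', '192.168.1.1');

// Constant-time check:
$onList = $redis->sIsMember('blocked_ips', $_GET['ip']);
?>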

Here are some benchmarks.

+1








