What's the best way to cache an archive file?

We have a project page that consists of user files, media, etc., and we want to let the user export it all as a single ZIP file. We are currently on Unix with MySQL storing this data, and our main goal is to minimize the download time lost to processing and compiling all the files into a ZIP archive.

My idea was to cache the ZIP file in a temporary directory and store the CRC checksum of each file in the archive in a separate text file. Every time a user tries to export, I first compute each file's current CRC, compare it to that list, and only add or remove files in the ZIP where the checksums differ.
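
Here's roughly what I mean, as a Python sketch; the cache path, the JSON checksum list, and the function names are just made up for illustration:

    import json
    import zlib
    from pathlib import Path

    CACHE_DIR = Path("/tmp/export-cache")  # hypothetical cache location

    def file_crc(path: Path) -> int:
        """CRC-32 of a file's contents, computed in chunks."""
        crc = 0
        with path.open("rb") as f:
            for chunk in iter(lambda: f.read(65536), b""):
                crc = zlib.crc32(chunk, crc)
        return crc

    def stale_files(project_dir: Path, crc_list: Path) -> list[Path]:
        """Return the project files whose CRC differs from the cached list."""
        cached = json.loads(crc_list.read_text()) if crc_list.exists() else {}
        return [p for p in project_dir.rglob("*")
                if p.is_file() and cached.get(str(p)) != file_crc(p)]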

But my other concern is the disk space the cached ZIP files will take up, since we can have many users.

IMHO this is probably the dumbest way to do it, so can any of you please suggest a better way to deal with this problem?

thanks ~ CodeNoobian


3 answers


This smells like premature optimization: just use a very light compression level, aka "fastest", and worry about speed only if it actually turns out to be a problem.
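
For example, with Python's zipfile module you can ask for the fastest deflate setting (the file names here are just placeholders):

    import zipfile

    # compresslevel=1 is the fastest deflate setting (Python 3.7+).
    # For no compression at all, pass zipfile.ZIP_STORED instead.
    with zipfile.ZipFile("export.zip", "w",
                         compression=zipfile.ZIP_DEFLATED,
                         compresslevel=1) as zf:
        zf.write("media/photo.jpg")  # hypothetical project file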




If transfer / download size is not a concern, I recommend an uncompressed tar file. TAR is a very simple format, so it is easy to write code that updates individual sections of the archive when files change. Also, leaving it uncompressed is a huge win in server CPU time.
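
As one illustration, Python's tarfile module can append new members to an uncompressed archive in place; replacing an existing member would still need custom code on top of this (the file name is a placeholder):

    import tarfile

    # Append mode ("a") works only on uncompressed tar archives.
    with tarfile.open("export.tar", "a") as tar:
        tar.add("docs/new-report.pdf")  # hypothetical changed file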



Of course, leaving it uncompressed will take up a lot of space on your server. On the other hand, if you can build the archive fast enough, being uncompressed may remove the need for a file cache altogether: you can just create it on the fly as each export is requested. Then you don't have to worry about keeping the CRCs and updating the TAR at all.
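
As a sketch of that on-the-fly approach: Python's tarfile can stream an uncompressed archive straight to a pipe, socket, or HTTP response body without ever writing it to disk (the directory name is a placeholder):

    import sys
    import tarfile

    # Stream mode ("w|") writes sequentially with no seeking, so the
    # target can be stdout or a web framework's response stream.
    with tarfile.open(fileobj=sys.stdout.buffer, mode="w|") as tar:
        tar.add("project/")  # hypothetical per-user project directory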



Typical sound and image files are already in compressed formats, so they won't shrink much further. It might be worth looking at your actual payload to see how much compression really buys you.
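
One quick way to measure that, sketched in Python with a made-up media directory: compress each file once and look at the ratio, where values near 1.0 mean compression is buying you almost nothing:

    import zlib
    from pathlib import Path

    def compression_ratio(path: Path) -> float:
        """Compressed size / original size; near 1.0 means little gain."""
        data = path.read_bytes()
        return len(zlib.compress(data, 6)) / max(len(data), 1)

    for f in Path("media").glob("*"):  # hypothetical media directory
        if f.is_file():
            print(f"{f.name}: {compression_ratio(f):.2f}")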
