Moving data from DB to static files, runtime variables

I am using playframework 1.2.4 and a mysql database.

I have several tables in the DB that are pretty static: about 100-300 records with 4-7 fields each. I want to take them out of the DB.

In such cases, what would be the best place to store the information?

  • New game config file?
  • XML of some kind with a parser?
  • JSON / YAML file, accessed (I don't know how yet :)) from Play framework / Java?
  • Java file that stores all information in a class?

What's the best solution, and what role does Play framework play in it?



3 answers

I would group your options into the following:

  • Relational database
  • XML or JSON (whether a configuration file or a separate document)
  • External in-memory caching like Memcached (also available from Play).
  • Store the data as a Java object, or use a plain, unreplicated in-process cache that keeps the data on the Java heap

You have two issues to deal with: 1) the performance of your data access and 2) the maintainability of your dataset. We can handle them separately.


You have two performance considerations. The first is access/latency time, the second is the amount of data held in memory. When it comes to access time, the database will certainly have the highest latency. Options #2 and #3 will vary depending on your hardware and OS; you will need to do some testing on your particular setup to really know which is best. Fetching an XML/JSON file from disk will certainly be much slower than reading it from an in-memory cache, but operating systems are usually pretty smart about keeping frequently accessed files in memory, so the overhead of the caching system may well end up slower than just using a static file.

Your data must eventually become a Java object so you can work with it from your code. So #1, #2 and #3 all require some deserialization of the external data into Java objects, which takes time. Direct Java variable access (#4) will be the fastest, as it requires no additional deserialization.

In terms of the amount of data held in memory, #2 and #4 require you to keep the entire dataset in memory every time you run the code (i.e. load any of the relevant pages). This might not be a big problem for 7 x 300 values (about 8 kB if they were all floats), but when you expect one server to serve thousands of clients, this extra memory consumed on every page load can become a real bottleneck. You can avoid that cost by holding the data statically. Option #3 stores the data in memory only once and shares it across all requests, so it shouldn't carry a big memory penalty. Option #1 will, in the worst case, behave like #3, as it keeps one global copy in memory. Both of those methods hold one global copy of the data and then deserialize the relevant portion (perhaps one row of the dataset) into Java objects as needed.
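To illustrate the "one shared copy" point: a dataset held in a static field is loaded once and shared by every request, instead of being rebuilt per page load. A minimal sketch; the class name, keys and values here are invented for illustration, and in practice the static block would parse your XML/JSON file rather than hard-code values:

```java
import java.util.Collections;
import java.util.HashMap;
import java.util.Map;

// One copy of the static dataset, shared across all requests.
public class GameConfig {

    // Loaded once when the class is first used, then shared globally.
    private static final Map<String, Double> LOOKUP;

    static {
        Map<String, Double> m = new HashMap<String, Double>();
        // In practice, parse this from your XML/JSON file instead;
        // hard-coded here to keep the sketch self-contained.
        m.put("gravity", 9.81);
        m.put("maxSpeed", 120.0);
        LOOKUP = Collections.unmodifiableMap(m);
    }

    public static Double get(String key) {
        return LOOKUP.get(key);
    }

    public static void main(String[] args) {
        // Every caller reads the same in-memory map; no per-request copy.
        System.out.println(GameConfig.get("gravity"));
    }
}
```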

Finally, note that the downside of using an in-process Java cache is that it complicates scaling across multiple servers. Be sure to read up on the details there.


The other issue is how hard the data is to maintain. It sounds like you are against an RDBMS-based solution because of maintenance and deployment concerns. Memcached probably won't offer you much over the database, as you still need to ensure cache consistency across your deployment environments. That leaves options #2 and #4. In my opinion, it is much easier to maintain a dataset stored in XML or JSON than one stored in Java code, so I would say the advantage goes to an XML-based dataset in terms of maintainability.


So XML is probably the best solution for maintenance and deployment, while Java objects are best for performance. Luckily, you can use a tool to get the best of both worlds. PojoXML lets you convert POJOs (plain Java objects) to XML and vice versa. That way, you can maintain your data in XML and use PojoXML to turn that document into a Java object whenever your dataset changes. You get the performance benefit of static data in compiled Java code, with the maintainability of XML. Just remember to hold the data statically so you don't consume 8 kB of memory on every page load.



I would still store it in the DB. The "static" data you describe is just data that rarely changes, and the best place for any data is the database.

You wouldn't gain anything by moving it out of the DB into a static file. You would then have two places for your data, the database and the static file, which ultimately means more maintenance.



Since your data volume is not large and the data is static, the easiest way, and I think the most efficient one, is to keep the data in the Java project as a Hashtable instance. Initialize the table when the application starts and read from it when needed.
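A minimal sketch of that suggestion; the keys and values are invented examples, and in Play 1.x the one-time initialization could also live in a job annotated with `@OnApplicationStart` (a plain static initializer is used here to keep the sketch framework-free):

```java
import java.util.Hashtable;

// Sketch: fill a Hashtable once at startup, read from it everywhere after.
public class StaticTables {

    public static final Hashtable<String, String[]> TABLES =
            new Hashtable<String, String[]>();

    // Runs once when the class is loaded. In Play 1.x this could
    // instead be done from an @OnApplicationStart job.
    static {
        TABLES.put("difficulties", new String[] { "easy", "normal", "hard" });
        TABLES.put("modes", new String[] { "arcade", "story" });
    }

    public static void main(String[] args) {
        System.out.println(TABLES.get("difficulties").length);
    }
}
```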


