Serving raw data from memory (not a database)

I'm thinking of writing a small SPA that serves as a kind of "base" viewer for assets in a video game (vehicles, weapons, etc. that players can use). I want users to be able to build advanced queries about what types of vehicles / weapons / etc. they want to see, based on a number of properties each of these objects has (similar to how you can filter items on Amazon or Newegg by property values).

My question has to do with where and how to store this data.

The data takes the form of a 10MB JSON file, too big to bundle into the SPA client code, which means I'll need to serve it up somehow (maybe an Express app). I could easily throw it into MongoDB, but I'm wondering whether that would be overkill.

In a case like this, when the data is on the order of 10MB (an array of ~10,000 elements), can an Express application just load the JSON file into an in-memory array of objects, and have the REST endpoints run plain JavaScript array operations to filter and sort the main collection into a subarray (or a single item) to return? Is there a reason not to do this?
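Roughly, I'm picturing something like this (just a sketch; the asset fields and query parameters here are made up):

```js
// server.js — sketch of the "load it all into memory" idea.
// The field names ("type", "maxSpeed") and query params are hypothetical.
const express = require('express');
const assets = require('./assets.json'); // ~10MB file, parsed once at startup

const app = express();

app.get('/api/assets', (req, res) => {
  let results = assets;

  // Apply filters from the query string, if present
  if (req.query.type) {
    results = results.filter(a => a.type === req.query.type);
  }
  if (req.query.minSpeed) {
    results = results.filter(a => a.maxSpeed >= Number(req.query.minSpeed));
  }

  res.json(results);
});

app.listen(3000);
```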

Obviously, once your data reaches a certain size you'd want to put it in a database, but I'm very fuzzy on where that cutoff point is, or how you'd figure it out (assuming what I'm suggesting isn't completely ridiculous).


1 answer


As long as you don't change the contents of the data much, this is certainly possible and easy to do, and there are fairly solid options for querying it. If you do need to update the data, keeping it all in memory limits your scalability and increases the risk of data loss.

Update



For the queries, I'm assuming you're on a recent version of Node that supports the ES5/ES6 array prototype methods. If so, Array.filter covers basic querying nicely. If you need something more complex, or can't rely on modern JavaScript, the next option is Lodash, which has pretty rich collection operations.
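As a rough illustration (the asset properties used here are hypothetical):

```js
// Plain ES5 array methods handle straightforward filtering and sorting:
const fastVehicles = assets
  .filter(a => a.type === 'vehicle' && a.maxSpeed > 200)
  .sort((a, b) => b.maxSpeed - a.maxSpeed);

// Lodash adds richer collection operations, e.g. grouping and aggregation:
const _ = require('lodash');
const byType = _.groupBy(assets, 'type');
const fastestPerType = _.mapValues(byType, group => _.maxBy(group, 'maxSpeed'));
```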
