Parsing JSON from a CLOB versus a dedicated table for JSON nodes
We are currently evaluating various data-tier designs for a project. We have JSON data (nothing more than a few MB, and fewer than 500 nodes in total) coming from different sources, and we expect no more than 500 concurrent requests. We have completed work on two approaches:
- Save the JSON as a CLOB in the database and use the Python `json` module to parse it and retrieve the fields required by the web application.
- Create a table for the JSON nodes, save them in Oracle tables, and query them for use in the web application.
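A minimal sketch of the first approach, assuming the CLOB contents have already been fetched from the database (e.g. via a cx_Oracle query); the payload shape and field names here are purely illustrative:

```python
import json

# Hypothetical CLOB contents as fetched from the database;
# in production this string would come from an Oracle query.
clob_value = """
{
    "source": "feed-a",
    "nodes": [
        {"id": 1, "name": "alpha", "value": 42},
        {"id": 2, "name": "beta",  "value": 7}
    ]
}
"""

doc = json.loads(clob_value)                      # one parse per request
names = [node["name"] for node in doc["nodes"]]   # pick out required fields
print(names)  # ['alpha', 'beta']
```

The whole document is parsed on every request, so the parse cost scales with payload size, not with how many fields the page actually needs.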
Any kind of suggestion or third-party input would be appreciated before our POC and performance testing.
Thank you for your help. I know the question is broad in scope, but I think it is specific enough to be addressed here.
UPDATE:
- We are using Oracle 11g
- The requirement is not for reporting, but for displaying content in a web application.
- The web app is expected to receive 10k-30k requests/day, and we will parse the JSON for each request, so 10k-30k parses per day.
- This is just one part of a larger web application.
What we are trying to decide is whether it is worth having 500-odd columns spread across many tables, versus storing JSON that can be handled with Python dicts, which are fast anyway.
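One quick way to sanity-check the per-request parse cost before the POC is to time `json.loads` on a synthetic payload of roughly the stated size. The node count and field shapes below are illustrative, not a benchmark of the real data:

```python
import json
import time

# Synthetic payload: ~500 nodes, as described in the question.
payload = {"nodes": [{"id": i, "name": f"node-{i}", "value": i * 1.5}
                     for i in range(500)]}
raw = json.dumps(payload)

runs = 100                               # stands in for request traffic
start = time.perf_counter()
for _ in range(runs):
    doc = json.loads(raw)                # one parse per simulated request
elapsed = time.perf_counter() - start

print(f"payload size: {len(raw)} bytes")
print(f"avg parse time: {elapsed / runs * 1000:.3f} ms")
```

At 10k-30k requests/day the parse cost is likely to be dwarfed by the database round-trip for the CLOB itself, which is worth measuring in the same test.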