Parsing a CLOB as JSON versus a dedicated table for JSON nodes

We are currently evaluating various data-tier designs for a project.

We have JSON data (nothing more than a few megabytes, with fewer than 500 nodes all-inclusive) coming from different sources. We have narrowed the design down to two approaches. We also expect no more than 500 concurrent requests.

  • Save the JSON as a CLOB in the database and use the Python `json` module to parse it and retrieve the fields needed by the web application.
  • Create a table for the JSON nodes, save the data in a relational Oracle table, and retrieve it from there for use with the web application.
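A minimal sketch of the first approach, assuming the CLOB text has already been fetched from Oracle (e.g. by reading the LOB column with a driver such as cx_Oracle); the payload shape and field names here are hypothetical stand-ins for the real document:

```python
import json

# Hypothetical payload; in production this string would be the CLOB
# column value read back from the database.
clob_payload = '{"customer": {"name": "Acme", "orders": [{"id": 1, "total": 9.5}]}}'

def fields_for_webapp(raw_clob):
    """Parse the stored JSON and pull out only the fields the page needs."""
    doc = json.loads(raw_clob)
    return {
        "name": doc["customer"]["name"],
        "order_count": len(doc["customer"]["orders"]),
    }

print(fields_for_webapp(clob_payload))
```

The second approach would replace the `json.loads` call with a plain `SELECT` of the needed columns, at the cost of maintaining the node-to-column mapping.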

Any kind of suggestion or third-party input would be appreciated before we begin our POC and performance testing.

Thank you for your help. I know the question has a broader scope, but I think it is specific enough to be addressed here.

UPDATE:

  • We are using Oracle 11g
  • The requirement is not for reporting, but for displaying content in a web application.
  • The web app is expected to serve 10k-30k requests/day, and we will parse the JSON on each request, so 10k-30k parses per day.
  • This is just one part of a larger web application.

What we are trying to decide is whether it is worth having 500-odd columns (spread across many tables) versus JSON that can be handled with Python dicts, which are fast anyway.
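One cheap way to put a number on the "parse per request" cost before the POC is a micro-benchmark with the standard `timeit` module; the 500-field document below is a synthetic stand-in for the real payload:

```python
import json
import timeit

# Hypothetical ~500-node document, roughly the stated upper bound.
doc = json.dumps({f"field_{i}": i for i in range(500)})

# Average cost of one full parse, as approach 1 would pay on every request.
per_parse = timeit.timeit(lambda: json.loads(doc), number=1000) / 1000
print(f"json.loads of a 500-field document: {per_parse * 1e6:.1f} microseconds")
```

If one parse costs tens of microseconds, 30k parses/day is negligible CPU, and the comparison comes down to query cost and schema maintenance rather than parsing speed.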
