Why do we need a materialized view in Oracle when Oracle Data Pump can update the data?

We have two schemas on a separate Oracle server. We plan to use one schema as a transactional database and the other schema for reporting.

Since the transaction database is the entry point for all user data, we want that data replicated to the reporting schema periodically. Our idea was to create materialized view logs in the transaction schema and materialized views in the reporting schema, and then run a scheduled fast refresh over a database link.
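Roughly, the setup we had in mind looks like this (table, user, and link names below are placeholders, not our real objects):

```sql
-- On the transaction database: record row-level changes so a fast
-- refresh can pull only the deltas.
CREATE MATERIALIZED VIEW LOG ON orders
  WITH PRIMARY KEY;

-- On the reporting database: a link back to the transaction database...
CREATE DATABASE LINK txn_link
  CONNECT TO report_reader IDENTIFIED BY "a_password"
  USING 'txndb';

-- ...and a fast-refreshable copy of the table, refreshed on demand.
CREATE MATERIALIZED VIEW orders_mv
  REFRESH FAST ON DEMAND
  AS SELECT * FROM orders@txn_link;
```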

However, our DBA suggested that we use Data Pump instead, exporting and importing the entire schema. The refresh only needs to happen once a day. Which is the better solution in terms of performance and network utilization?



1 answer


Summary

Both approaches have their pros and cons; there is no universal answer. You have to weigh the trade-offs for your situation.

More details

Materialized views over a database link

You have two options to make this work.

The first option is fast refresh. With this feature, Oracle applies only the changes to the materialized view, using an efficient delta mechanism. This can save a lot of data transfer when only small portions of the tables change each day, and the data stays fresh. But this mechanism does not work for all kinds of materialized views (there are restrictions, for example, with certain joins and with LOB columns). And when one of the databases goes down, the materialized view can get out of sync and has to be completely rebuilt. Last but not least, tracking the delta changes adds overhead to every write operation on the source tables.
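Because of those restrictions, it is worth checking up front whether a given view is actually fast-refreshable. A minimal sketch using Oracle's DBMS_MVIEW.EXPLAIN_MVIEW procedure (the view name ORDERS_MV is a placeholder; the MV_CAPABILITIES_TABLE must first be created by running the utlxmv.sql script shipped with the database):

```sql
-- Analyze the refresh capabilities of an existing materialized view.
BEGIN
  DBMS_MVIEW.EXPLAIN_MVIEW('ORDERS_MV');
END;
/

-- Inspect which refresh modes are possible and why others are not.
SELECT capability_name, possible, msgtxt
FROM   mv_capabilities_table
WHERE  capability_name LIKE 'REFRESH_FAST%';
```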



The second possibility is a full refresh, e.g. with materialized view refresh groups (the DBMS_REFRESH package). This always performs a complete refresh, but it adds no extra bookkeeping overhead to writes on the source tables.
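A refresh group also lets you refresh several related views together in one transactionally consistent pass, once a day. A sketch with made-up view names and a made-up schedule:

```sql
-- Create a refresh group that refreshes both views nightly at 02:00.
BEGIN
  DBMS_REFRESH.MAKE(
    name             => 'daily_reporting',
    list             => 'orders_mv, customers_mv',
    next_date        => TRUNC(SYSDATE) + 1 + 2/24,
    interval         => 'TRUNC(SYSDATE) + 1 + 2/24',
    implicit_destroy => FALSE);
END;
/

-- Or trigger the whole group manually:
EXEC DBMS_REFRESH.REFRESH('daily_reporting');
```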

In both cases, the two databases are now tightly coupled: changes in one database also trigger work in the other. You can no longer operate the databases independently: they always need a fast, low-latency connection with no firewalls in between. And both databases must be available at the same time, or you risk an out-of-date materialized view that has to be rebuilt.

ETL tool

Alternatively, you can use some kind of ETL tool that extracts data from one database, transforms it according to given rules, and loads the result into the other database. You can (mis)use Oracle Data Pump for this task, or use any third-party tool. An ETL tool can cache data, transfer it over any WAN connection, and transform it when the source and target schemas differ. Usually an ETL tool also provides some kind of delta mechanism.
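A daily Data Pump round trip might be sketched like this (schema name, directory object, and credentials are placeholders; treat this as an outline, not tuned advice):

```
# On the transaction server: export the whole schema each night.
expdp system schemas=TXN_APP directory=DP_DIR \
      dumpfile=txn_%U.dmp logfile=exp_txn.log reuse_dumpfiles=yes

# Copy the dump files to the reporting server, then import there,
# replacing yesterday's tables.
impdp system schemas=TXN_APP directory=DP_DIR \
      dumpfile=txn_%U.dmp logfile=imp_txn.log \
      table_exists_action=replace
```

Note that this always moves the entire schema over the network, which is exactly the performance/utilization trade-off the question is about.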

With an ETL tool you are more flexible, but you depend on a component outside your databases that has to be maintained.







