Limitations of Data Mart

...

Client data that does not reside in the Eagle warehouse cannot normally be built into Data Mart tables. There are limited opportunities to load data directly to Data Mart tables and bypass the operation of the PACE engines; clients should pursue these only after consulting with Eagle. Contact your Eagle Relationship Manager for additional information.

Eagle’s best practice is to load all data into the warehouse first, and then load the pertinent data into the Data Mart tables. If you must introduce data from outside the warehouse to supplement Mart tables, follow the same approach: load the data into the warehouse and then into the Data Mart tables. As a last resort, use views, either materialized or non-materialized, of the required databases. Common fields between views and Mart tables enable joins between the two (see the sketch after this list). Keep the following issues in mind when using views:

  • Performance implications. Views can severely impact the performance of the Mart. You should not use them without being fully aware that performance degradation can occur.

  • Data congruency. Data in the warehouse, such as transactions, may be updated while positions in the Mart are not, so the two may no longer match. When you use a view, you pull data directly from the database rather than from the Mart, which may contain out-of-sync data.
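
The sketch below illustrates the join pattern in a hedged way: it assumes an Oracle back end reached through the python-oracledb driver, and every object name (the client_region_vw view, the fund_summary Mart table, the entity_id join key, the connection details) is a made-up placeholder rather than part of the Eagle data model.

```python
# Minimal sketch: supplementing Mart data with a non-materialized view and
# joining the two on a common field. All object names and connection details
# below are hypothetical placeholders, not Eagle-defined tables or views.
import oracledb  # assumes an Oracle back end and the python-oracledb driver

conn = oracledb.connect(user="mart_reader", password="...", dsn="eagle-db/PACE")

# A plain (non-materialized) view over data that never entered the warehouse.
CREATE_VIEW = """
CREATE OR REPLACE VIEW client_region_vw AS
SELECT entity_id, region_code
FROM   client_reference_data
"""

# Join the view to a hypothetical Mart table through the shared ENTITY_ID field.
# The view reads its source tables at query time, so results may not be
# congruent with Mart contents built earlier, and the extra join can slow
# Mart queries down.
REPORT_SQL = """
SELECT m.entity_id, m.market_value, v.region_code
FROM   fund_summary m
JOIN   client_region_vw v ON v.entity_id = m.entity_id
"""

with conn.cursor() as cur:
    cur.execute(CREATE_VIEW)
    cur.execute(REPORT_SQL)
    for row in cur:
        print(row)
```

A materialized view trades freshness for query speed: it behaves more like a local table, but it must be refreshed explicitly or on a schedule, which reintroduces the congruency question above.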

When you use Data Mart, you can use views to supplement your Eagle data. This practice offers an opportunity to use the Data Mart schema as an integration point for enterprise information delivery. See “How to Move Non-OLAP Data to the Mart.” Clients use the Eagle Portal Access SQL Query to create queries and dashboards that source data from non-Eagle sources and integrate those sources with their Eagle data. See the Eagle Portal Administration section for more information.

...

Field attributes are used to build all of the data in the Mart, so Mart data tables contain no values that cannot be produced by field attributes. However, there are some steps you can take to compensate for this limitation (see the sketch after this list):

  • You can build views against warehouse fields, and then build field attributes against those views. These field attributes can underlie fields in Mart tables. However, prior to V11.0 there is a step you must take to support view-based field attributes in Data Mart.

  • Some warehouse tables are structured in ways that make OLAP processing impossible. For these, a view may be created directly against the warehouse table data and included in the Data Mart database.

  • Consider performance when using views, as described in the “Ties to the Eagle Data Model” section.
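
As a hedged illustration of the first two bullets: a keyword/value-shaped warehouse table cannot feed OLAP processing directly, but a view can pivot it into one row per security, and field attributes can then be defined against the view’s named columns. The table, view, and column names below are invented for this sketch, which assumes the same Oracle/python-oracledb setup as the earlier example.

```python
# Minimal sketch: reshaping a hypothetical keyword/value warehouse table
# through a view so that field attributes (and therefore Mart fields) can be
# built against named columns. All object names are illustrative only.
import oracledb

DDL = """
CREATE OR REPLACE VIEW security_analytics_vw AS
SELECT security_alias,
       MAX(CASE WHEN analytic_code = 'DUR'  THEN analytic_value END) AS duration,
       MAX(CASE WHEN analytic_code = 'CONV' THEN analytic_value END) AS convexity
FROM   security_analytics_kv
GROUP BY security_alias
"""

conn = oracledb.connect(user="dba_user", password="...", dsn="eagle-db/PACE")
with conn.cursor() as cur:
    cur.execute(DDL)  # field attributes could now point at DURATION / CONVEXITY
```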

Use Cases Not Supported by Data Mart

...

Most Eagle clients design their data workflow to follow a daily cycle (the ordering is sketched after this list):

  • Once markets close and regular-hours trading ends, the daily accounting update cycle begins: trades update positions, prices are applied to create market values, and reference and analytical data are updated. Accounts are reconciled and submitted for automated and manual error checking.

  • When this activity is complete, end-of-day processing such as entity builds commences, with performance measurement calculations usually coming last. Further data QA and approvals take place.

  • With all warehouse contents in a validated state, clients build their Mart and use it to deliver enriched information during the ensuing business day.
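
The point of the cycle is its ordering rather than any particular tooling. The placeholder sketch below simply encodes that sequence; the step names are invented stand-ins for whatever engines or Process Manager events a site actually runs.

```python
# Minimal sketch of the daily-cycle ordering described above. Each step is a
# placeholder; only the sequence (accounting update, QA, entity builds,
# performance, approval, then the Mart build) is being illustrated.
def post_trades_and_prices(): ...  # accounting update: trades, prices, reference data
def reconcile_and_check(): ...     # automated and manual error checking
def run_entity_builds(): ...       # end-of-day processing
def calculate_performance(): ...   # usually the last calculation step
def approve_results(): ...         # further data QA and sign-off
def build_mart(): ...              # runs only once warehouse contents are validated

DAILY_CYCLE = [
    post_trades_and_prices,
    reconcile_and_check,
    run_entity_builds,
    calculate_performance,
    approve_results,
    build_mart,  # delivers enriched information during the ensuing business day
]

for step in DAILY_CYCLE:
    step()
```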

In an ideal world, there would be no reason to make further changes to data values updated as of past completed business days. However, this is not the case in the real world. Vendor data corrections impact previously loaded security information. Late trades and trade reversal/re-entry activity affect accounting data values in the open accounting period.

It is physically possible to use Process Manager to build a workflow that automatically detects every such intra-day data change and updates the Mart accordingly soon afterward. However, such a “real-time” Mart is likely to produce undesirable results, primarily for two reasons:

  • A real-time approach may allot insufficient time for data QA, especially when the consumer is a client of the firm. Proper evaluation of updated information must allow for at least the possibility of human intervention in exception-management mode. It is unlikely that this type of discipline can be applied to high-frequency intra-day changes.

  • If updates take place on a continuous, random basis, there is no “snapshot stability” of information. Unstable data is likely to lead to more questions about how recent, or how clean, the numbers on the page or screen are.

Two approaches to incorporating data changes are considered best practices:

...