Limitations of Data Mart

...

Client data that does not reside in the Eagle warehouse cannot normally be included in Data Mart tables. There are limited opportunities to load data directly into Data Mart tables, bypassing the PACE engines. Contact your Eagle Relationship Manager for additional information.

Eagle’s best practice is to load all data into the warehouse first, and then load the pertinent data into the Data Mart tables.

...

When you use Data Mart, you can use views to supplement your Eagle data. This practice offers an opportunity to use the Data Mart schema as an integration point for enterprise information delivery. See “How to Move Non-OLAP Data to the Mart.” Clients use the Eagle Portal Access SQL Query to create queries and dashboards that source data from non-Eagle sources and integrate those sources with their Eagle data. See the Eagle Portal Administration section for more information.
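
As an illustration of a view used as an integration point, the following sketch joins a Mart table to an externally loaded table. All table and column names here are hypothetical and for illustration only; they are not actual Eagle schema objects.

    -- Hypothetical sketch: supplement Mart data with non-Eagle data.
    -- None of these object names are actual Eagle schema objects.
    CREATE VIEW v_position_with_ratings AS
    SELECT p.entity_id,
           p.security_alias,
           p.market_value,
           r.external_rating              -- loaded from a non-Eagle source
    FROM   mart_positions p               -- hypothetical Mart table
    LEFT JOIN ext_credit_ratings r        -- hypothetical externally loaded table
           ON r.security_alias = p.security_alias;

A view like this lets queries read Eagle and non-Eagle data through a single schema, subject to the view performance considerations noted below.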

...

Consider performance when using views as described in the “Ties to the Eagle Data Model” section.

Use Cases Not Supported by Data Mart

...

  • Account reconciliation. You can reconcile data between sources using source-specific snapshots. However, for most reporting purposes, the best practice is to reconcile first and then load only validated, reconciled data into the warehouse.

  • Accounting and Performance operational reports. Accountants and Performance Analysts engaged in intra-day operational and data QA activities may send pre-approval data to the Eagle warehouse, where it is relayed to Data Mart as part of automated processes such as Process Manager. You should weigh the costs and benefits of such a workflow, and guard against unintended reporting of unapproved, in-flight accounting and performance fields.

  • Ad hoc reporting. Investigational queries into the database may go against fields that would otherwise support few, if any, reporting needs. Such fields may not be a good use of Data Mart “real estate,” making reporting against them a very appropriate use of OLAPs. Note that “data mining” against more reportable fields with the aid of a cubing application (for example) is a very appropriate use of Data Mart.

Real-Time Reporting

Most Eagle clients design their data workflow to follow a daily cycle:

Once markets close and regular-hours trading ends, the daily accounting update cycle begins: trades update positions, prices are applied to create market values, and reference and analytical data are updated. Accounts are reconciled and submitted for automated and manual error checking.

When this activity is complete, end-of-day processing such as entity builds commences, with performance measurement calculations usually coming last. Further data QA and approvals take place.

With all warehouse contents in a validated state, clients build their Mart and use it to deliver enriched information during the ensuing business day.

In an ideal world, there would be no reason to change data values for past, completed business days. In the real world, however, vendor data corrections impact previously loaded security information, and late trades and trade reversal/re-entry activity affect accounting data values in the open accounting period.

It is technically possible to use Process Manager to build a workflow that automatically detects every such intra-day data change and updates the Mart shortly afterward. However, such a “real-time” Mart is likely to produce undesirable results, for two primary reasons:

A real-time approach may allot insufficient time for data QA, especially when the consumer is a client of the firm. Proper evaluation of updated information must allow for at least the possibility of human intervention in exception-management mode. It is unlikely that this type of discipline can be applied to high-frequency intra-day changes.

If updates take place on a continuous, random basis, there is no “snapshot stability” of information. Unstable data is likely to prompt more questions about how recent, and how clean, the numbers on the page or screen are.

Two approaches to incorporating data changes are considered best practices:

  • Batch each day’s corrections and accounting restatements during the day, and evaluate them using standard data QA procedures. As part of the overnight Mart build, incorporate these changes by selectively rebuilding the Mart for all affected past dates (see the sketch after this list).

  • Take the same approach, but do so at one or two regular times during the day as well as at end of day.
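
A minimal sketch of the batching idea follows. It assumes a hypothetical staging table, correction_batch, that accumulates the day’s corrections along with the effective dates they touch; the rebuild itself would be performed by your normal Mart build process.

    -- Hypothetical sketch; none of these object names are actual Eagle schema objects.
    -- 1. Find the past effective dates touched by the batched corrections.
    SELECT DISTINCT effective_date
    FROM   correction_batch
    WHERE  processed_flag = 'N';

    -- 2. For each date returned, re-run the Mart build for that date only,
    --    rather than rebuilding the full history.

    -- 3. Mark the batch as incorporated once the selective rebuilds complete.
    UPDATE correction_batch
    SET    processed_flag = 'Y'
    WHERE  processed_flag = 'N';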

Both of these approaches allow for proper data validation and promote a stable Mart whose “vintage” can be understood by users, while keeping reportable information quite current with ongoing changes to securities and accounting.