Develop a Data Mart Implementation Plan

...

Use the Data Mart Planning Template

Eagle Global Professional Services has a template that is very useful for storing all of the Data Mart and regular metadata elements involved in a Mart, and it can serve as the basis of an implementation plan.

...

When more than one type of data must be reported in more than one source hierarchy, the number of required source rules can proliferate, leading to several snapshots and heavy duplication of data. For this reason, you should make design choices that limit the required list of snapshots. See Sharing of Security Details Among Snapshots for additional information.

...

System performance in building Mart data is enhanced when your design takes advantage of parallel processing by multiple concentration (OLAP) engine instances. Each Data Mart table extension is populated under the direction of a separate Model Manager instance that generates its own set of underlying OLAP processes. This means that the use of extensions can increase parallel processing and speed population of the Mart.

A Global Professional Services study has recorded the advantages of extensions in populating performance measurement data in particular. See the Data Mart White Paper in Knowledge Base article 8459. Performance Analysis OLAP processes underlying a Data Mart model are among the most I/O- and compute-intensive of any, so extensions can be particularly helpful in minimizing run time as you build performance-linked returns, risk fields, and other complex calculations. When you assign performance fields to an extension, it is best to group together fields that you would logically assign to the same Performance Analysis OLAP report, such as a single attribution group or a series of performance link analysis fields that cover either a short-term or a long-term horizon.

...

If you use Process Manager to build your Mart, pay special attention to the settings described in the following sections.

Avoid the Sequential Processing Setting

...

Performance returns are typically stored in the database at the security, group and fund levels. When a Data Mart performance OLAP process runs to build performance analysis fields, by default it computes returns for all levels represented in the performance dictionary that is linked to the model. For the Fund Summary and group models, you can use the Performance Report Options link in the Filters/Mappings tab of the model user interface to limit computing of returns to just what the model requires (for example, the Entity level for Fund Summary). Note the following:

  • If you are building performance attribution fields where your fund and/or group returns are based on rollup of lower-level contributions, you must not limit the rollup process or you will not get the expected attribution data at a rollup level. Note that this does not apply to models where you use dynamic performance rather than a performance dictionary.

  • If you plan to report security-level performance returns linked to your groups, you can use the Create Mapping for Details option on the group model’s Filters/Mappings tab to create a convenient table of pointers from groups in the group table to rows in the detail table. However, if you do this, you cannot also limit the OLAP build to exclude the detail level. You must choose one or the other. If you choose not to build the OLAP down to the security level, you can still associate groups with their underlying detail members by including grouping fields as data fields in the detail model, and using reporting to sort on those fields shared by groups and detail, for example, Country and Sector.
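The second approach above, associating group rows with detail rows through shared grouping fields rather than a mapping table, can be sketched as a join on those fields. This is a minimal illustration using hypothetical, simplified table and column names; actual Data Mart table layouts vary by site.

```python
import sqlite3

# Hypothetical stand-ins for a Data Mart group table and detail table.
# Both carry the same grouping fields (country, sector), as described above.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE group_rows  (entity_id TEXT, country TEXT, sector TEXT,
                          group_return REAL);
CREATE TABLE detail_rows (entity_id TEXT, security_id TEXT, country TEXT,
                          sector TEXT, sec_return REAL);
INSERT INTO group_rows  VALUES ('FUND1', 'US', 'Tech', 1.8);
INSERT INTO detail_rows VALUES ('FUND1', 'SEC1', 'US', 'Tech', 2.1),
                               ('FUND1', 'SEC2', 'US', 'Tech', 1.5);
""")

# Associate each group row with its detail members by joining and sorting
# on the grouping fields shared by both tables.
rows = conn.execute("""
SELECT g.country, g.sector, g.group_return, d.security_id, d.sec_return
FROM group_rows g
JOIN detail_rows d
  ON d.entity_id = g.entity_id
 AND d.country   = g.country
 AND d.sector    = g.sector
ORDER BY g.country, g.sector, d.security_id
""").fetchall()

for r in rows:
    print(r)
```

Because no mapping table is involved, this pattern works even when the OLAP build excludes the security level from the group model itself.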

Economize on Associated-Benchmark Fields where Possible

You can associate one or more index entities with a fund as its custom list of benchmarks. Then you can define performance analysis field attributes for the fund that point to any of its associated benchmarks. In Data Mart, all of these associated-benchmark fields are built on the same table rows as the fund, which is convenient for reporting purposes. Further, if the benchmark is of the custom blended variety and fund-specific, or if the field links benchmark returns since the inception date of the fund, only these associated-benchmark fields will do the job.

However, you might need to report benchmark returns whose values do not depend upon association with a particular fund. For example, the QTD return of the “vanilla” S&P 500 index as of a given date is the same number for all funds. Obviously you can reduce your Data Mart build requirements by building that number just once rather than once per fund. Benchmark returns may be built on their own rows in Fund Summary. Then for each individual fund you can populate Fund Summary with the entity IDs of its associated benchmarks, and design your SQL accordingly.
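The build-once approach above can be sketched as a self-join: the benchmark's return is stored on its own row, each fund row carries the entity ID of its associated benchmark, and the report joins the two. Table and column names here are illustrative, not Eagle's actual schema.

```python
import sqlite3

# Hypothetical Fund Summary layout: the vanilla benchmark's QTD return is
# built once on its own row, and each fund row stores the entity ID of its
# associated benchmark (NULL on the benchmark's own row).
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE fund_summary (
    entity_id    TEXT,
    qtd_return   REAL,
    benchmark_id TEXT
);
-- One row for the benchmark itself, built just once...
INSERT INTO fund_summary VALUES ('SP500', 2.40, NULL);
-- ...and one row per fund pointing at it.
INSERT INTO fund_summary VALUES ('FUNDA', 2.10, 'SP500'),
                                ('FUNDB', 2.75, 'SP500');
""")

# A self-join picks up each fund's benchmark return without the benchmark
# field having been built once per fund.
rows = conn.execute("""
SELECT f.entity_id, f.qtd_return, b.qtd_return AS benchmark_qtd
FROM fund_summary f
JOIN fund_summary b ON b.entity_id = f.benchmark_id
ORDER BY f.entity_id
""").fetchall()
print(rows)
```

The trade-off is one extra join in reporting SQL in exchange for building each fund-independent benchmark return once instead of once per fund.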

Engage Eagle Technical Services

Database tuning can almost always make a well-designed Mart run even better. In addition, if your Data Mart requirements involve hundreds of fields or thousands of funds, hardware sizing also deserves expert attention. Eagle Technical Services offers the benefits of their experience with a variety of significant Data Mart implementations, and can add substantial value to your Mart planning process.

...

Mart configuration is usually considered an administrator function, not a business-user function. Data Mart’s privileges in User Manager are granular enough to support the differentiation among configuration roles that most organizations need. Where more than one Data Mart role is required, the roles usually fall under the following categories:

  • Administration, with all privileges.

  • Model management, with Maintain, Configuration, Migrate and View privileges.

  • Production, with Submit and View privileges.

  • Data quality assurance, with Audit, View and Validate privileges.

Adopt an Archive Policy

Archiving Data Mart tables can improve index rebuilding and query performance simply by reducing the number of rows to be navigated. A logical policy is one that preserves each data frequency (daily and monthly) for the time span that reporting may need it. Many clients preserve month-end data indefinitely, but keep daily data for a few months or a year.
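A policy like the one above can be expressed as a single purge statement: delete daily rows older than a cutoff unless they fall on a month-end date. This is a minimal sketch using an illustrative table name and a date-typed column; production archiving would typically move rows to an archive table rather than delete them outright.

```python
import sqlite3

# Illustrative table: one row per entity per effective date.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE fund_summary (effective_date TEXT, entity_id TEXT, nav REAL);
INSERT INTO fund_summary VALUES
  ('2023-01-15', 'FUNDA', 100.0),  -- old daily row: purged
  ('2023-01-31', 'FUNDA', 101.0),  -- old month-end row: kept indefinitely
  ('2024-06-14', 'FUNDA', 110.0);  -- recent daily row: kept
""")

cutoff = "2024-01-01"  # e.g. keep one year of daily history

# A row is month-end when the following day falls in a different month.
conn.execute("""
DELETE FROM fund_summary
WHERE effective_date < ?
  AND strftime('%m', effective_date)
      = strftime('%m', date(effective_date, '+1 day'))
""", (cutoff,))

remaining = [r[0] for r in conn.execute(
    "SELECT effective_date FROM fund_summary ORDER BY effective_date")]
print(remaining)
```

Running the purge on a schedule keeps daily history bounded while month-end rows accumulate indefinitely, matching the retention pattern described above.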

...