Protect Data Mart Field Attributes

You should protect the field attributes used to build your Data Mart fields against unintended changes by PACE users who are unaware of their use in your Mart models. Any change could produce invalid data in your reports, and some changes, especially changes of field type, could cause a model to fail. A field attribute change is difficult to detect except by checking each field used in a model for updates.

The Usage capability, which shows which components use a field attribute, includes Data Mart among those components. Because of this, if a field attribute underlies a Data Mart field, you cannot delete that field attribute until you either delete the Mart field or use the Modify feature in the Fields tab of the model user interface to change the Mart field's underlying field attribute to another one. Usage also indicates the Data Mart schema of each Data Mart model where the field attribute is used.

In addition, field attributes used by Data Mart might be inadvertently changed for some other purpose in a way that undermines their intended use in the Mart. The following sections describe some simple ways to minimize this risk.

Restricting Access to Data Mart Fields

You can restrict the field attributes so that only one user or business group is authorized to change them. This approach requires that your Data Mart field attributes are not also used for OLAP reports or other purposes.

Limit the Number of Entities per Model Manager

Just as you can control the number of entities per OLAP process, you can also control the number of entities per model manager. You can increase or decrease the default value of 1,000.

Using a Naming Convention to Clarify Mart Usage

You can reduce the use of your Data Mart field attributes for unintended purposes by prefixing their names with "dm" or following a similar convention.

Using Comments to Identify Mart Fields

Each field attribute has a Comments section that is displayed in the Data Mart field selector. You can display a standard comment during model field selection, guiding users to select only the designated Mart field attributes.

Usage Indicator in the Mart Field Add Window

A blue dot next to a field in the Add window, which is linked from the Fields tab of the model user interface, indicates that the field is used in another model. Your field attribute list may contain multiple field attributes with the same description. If so, the blue dot indicates which of those same-named fields you have already used elsewhere in the Mart, and you can reselect the blue-dot fields to maintain a consistent business definition across models.

Clarifying the Identity of Grouping Fields

You may find it useful to use a naming convention to distinguish the Mart table column names of your grouping fields. When these fields appear in a group-level table, report developers may find it difficult to quickly identify which columns are grouping fields and which are regular non-grouping fields. One suggestion is to prefix them with "GRP_" or suffix them with "_GRP", as in the sketch below.
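
With such a convention, the grouping columns are immediately recognizable in report queries. The following sketch uses hypothetical table and column names purely for illustration; your Mart table and column names will differ.

  -- Grouping columns carry the GRP_ prefix, so they stand out from
  -- regular measure columns when report developers read the query.
  SELECT GRP_COUNTRY,
         GRP_SECTOR,
         MARKET_VALUE,
         TOTAL_RETURN
  FROM   EQUITY_GROUP_MODEL
  WHERE  ENTITY_ID      = :entity_id
    AND  EFFECTIVE_DATE = :report_date;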

Converting Group Models to Dynamic Performance

You can convert a group-level model from using a performance dictionary to the dynamic performance approach. This requires that the Mart model's grouping is not based on the performance dictionary and that the dictionary is mapped to the Mart model only in the Fields tab. The restriction applies because you cannot change the grouping structure of a group-level model once it is populated with data. Use the Remove link opposite the Performance Dictionary Mapping link to cancel use of the dictionary and use dynamic performance going forward. You can change back to using the dictionary if necessary.

The most common use case is an organization that has decided to stop running performance calculations for a particular performance dictionary and to use a dynamic approach instead. In that case, you can apply the same policy to the Data Mart.

Using Filtering

Deciding When to Use Filtering

It is not likely that you will want to filter data in your Fund Summary or detail models, though you may do so if necessary. The most likely use case for filtering is when you have certain types of securities, trades, or positions that you must always exclude from reporting and that you can identify with a simple filter.

However, it often makes sense to filter group-level data. Group models are designed to deliver information analytically, and a filter can omit information that is not pertinent to the analysis the model provides. An example is a grouping on ranges of equity beta, where it is natural to filter out all but equity securities.

Filtering for Specific Versus General Purposes

Any filter in a Data Mart model is applied to just one type of data at a time, such as positions, cash, trades, or securities. This means that in a group model you may, for example, filter trades for purchases only while leaving your position data unfiltered. It also means that an across-the-board filter such as "supervised assets only" must be applied to each type of data separately and identically. So, to create a fixed-income-only group model featuring position, performance, and transaction fields, you would need to set up a filter for "investment type = fixed income" three times in the Filters and Mappings tab.

When you filter data in a group model, map details to the group, and use the map table "pointers" in your report SQL query to retrieve data, only the detail rows that pass the filter are retrieved. This means that a group model filtered for fixed income reports only bond positions under a set of bond groups. However, the Position Detail table does not lose any non-bond positions; only the map table from groups to positions is missing rows for non-bonds as a result of the filter. The sketch below illustrates such a query.
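
The following is a minimal sketch of a group-to-detail query, assuming hypothetical names for the group, map, and detail tables and their key columns; actual Data Mart table and column names will differ.

  -- The map table links each group row to the detail rows that passed the
  -- model's filter, so only the filtered (fixed-income) positions are
  -- returned under each group.
  SELECT g.GRP_DURATION_BAND,
         d.SECURITY_ID,
         d.MARKET_VALUE
  FROM   FI_GROUP_MODEL      g
  JOIN   FI_GROUP_TO_POS_MAP m ON m.GROUP_ROW_ID = g.ROW_ID
  JOIN   POSITION_DETAILS    d ON d.ROW_ID       = m.DETAIL_ROW_ID
  WHERE  g.ENTITY_ID      = :entity_id
    AND  g.EFFECTIVE_DATE = :report_date;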

Filtering Performance Data with Dynamic Performance

Regular Performance Analysis OLAP reports based upon performance dictionaries do not support filtering because returns and other performance data stored for a dictionary are pre-computed based upon an unfiltered portfolio.

Performance Analysis reports can also be generated without a performance dictionary. This is the Dynamic Performance approach, which Data Mart supports. Because Dynamic Performance computes security-level returns first and then aggregates and links them, filtering can take place during the Mart build process to exclude any investment category from the analysis, as required.

Simplifying Entity Hierarchies

Entity Hierarchy field attributes allow you to report data for groups of entities that have parent-child relationships such as members of a composite, where composite members can themselves be composites, etc. You can maintain more than one such hierarchy, if necessary, to express all of the composite relationships among your entities.

Best practice for reporting Entity Hierarchies in Data Mart is to set up and regularly submit, for each distinct entity hierarchy structure you support, a dedicated group-level model whose groups are the Entity Hierarchy field attributes of the hierarchy. You must select at least one field for any model, but you can simply choose a single position field, such as market value, just to enable the model to populate its table. The model should be built for every entity at the topmost level of the hierarchy, which might be a single entity if the top level is "total firm".

Your SQL queries in reports and other downstream applications can use the hierarchy table as a directory to find the up- and down-hierarchy entities for any given entity, as in the sketch below. You must also populate the Fund Summary model for all entities.
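
The sketch below assumes a hypothetical layout in which the hierarchy model's table carries one column per hierarchy level; the table and column names are illustrative only.

  -- Use the hierarchy table as a directory: find every entity sitting below
  -- a given composite, then pull its figures from the Fund Summary table.
  SELECT f.ENTITY_ID,
         f.MARKET_VALUE
  FROM   FUND_SUMMARY f
  WHERE  f.EFFECTIVE_DATE = :report_date
    AND  f.ENTITY_ID IN (
           SELECT h.GRP_LEVEL3_ENTITY
           FROM   ENTITY_HIERARCHY_MODEL h
           WHERE  h.GRP_LEVEL2_ENTITY = :composite_id
         );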

Supporting Portfolio Look-through

Look-through reporting at the detail level is supported by a Position Details look-through model and table. The granularity of a row in this table is the look-through position piece, that is, the position in the portfolio or in any child fund at whatever level of nesting. This is the same granularity as in a position OLAP report that has look-through fields. The data is returned flattened for look-through in Data Mart.

Look-through in Data Mart Group Level Models

Group-level models support look-through when look-through processing is enabled. By default, the Enable Look thru Processing checkbox in the Model dialog box is cleared. When you select it, the model supports only look-through processing.

Supporting Long-Term Summaries of Trade Activity

You may need to report summaries of trade or cash activity over long periods of time, such as the sum of buys year-to-date. Any generic trade field attribute can build this type of data. While it is possible to rebuild such field attributes over a long time period every day in the Mart, this is inefficient. A more efficient strategy is to build the summary field on a month-to-date basis daily, then sum over the monthly values at the report level, perhaps with the use of a general-purpose stored procedure, as in the sketch below.
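
A minimal sketch of the report-level summation follows, assuming a hypothetical Fund Summary table with a month-to-date BUYS_MTD column populated daily; it sums the last value stored for each month of the year to date.

  -- Derive a year-to-date buy total from a month-to-date field by summing
  -- the final value stored for each month, including the current month.
  SELECT fs.ENTITY_ID,
         SUM(fs.BUYS_MTD) AS BUYS_YTD
  FROM   FUND_SUMMARY fs
  WHERE  fs.ENTITY_ID = :entity_id
    AND  fs.EFFECTIVE_DATE IN (
           SELECT MAX(EFFECTIVE_DATE)
           FROM   FUND_SUMMARY
           WHERE  ENTITY_ID = :entity_id
             AND  EFFECTIVE_DATE >= :year_start
           GROUP  BY EXTRACT(YEAR FROM EFFECTIVE_DATE),
                     EXTRACT(MONTH FROM EFFECTIVE_DATE)
         )
  GROUP  BY fs.ENTITY_ID;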

Supporting General Ledger Data

Data Mart supports General Ledger data. Although Eagle Accounting was enhanced to facilitate building ledger data in Data Mart, other accounting systems may send ledger data to Data Mart if they load the gl_detail_posting table of the ledger database in the same way as Eagle Accounting does.

Eagle Accounting sends ledger data under different sources for different accounting bases and for daily versus monthly balances. If you want to support more than one basis or periodicity in your reporting, it is best practice to use a separate snapshot for each.

Supporting NAV Data

Data Mart does not directly support fields from the Net Asset Value (NAV) table. However, it does support Dynamic Mutual Fund field attributes. A series of field categories and about 20 different effects were added to the Dynamic Mutual Fund field attribute type to allow a wide range of NAV fields to be supported in Data Mart.

Limitations of Selective Fields

The Build Selective Fields option has a number of significant benefits, such as selective back-filling. For example, when you add one or more new fields to a model that you have been populating for a while, you can use this option to populate values for the new fields only; otherwise, you have to submit the model to populate every field of each row. However, when you re-populate fields that already have data values, those values are not deleted but re-generated, which can lead to an unintended restatement of reported data values.