Assign Benchmarks to Entities

You can assign any number of benchmarks to an entity, and you can use the same benchmark for different reporting purposes. Benchmark assignment is available for any of the core entity types and their custom entity types, including Aggregate, Benchmark, Composite, Performance Composite, Portfolio, and Sub-Portfolio.

Define Benchmark Definitions

A code list allows you to store a short description that represents the benchmark priority and a long description that names the benchmark assignment. For example, suppose the primary benchmark has a short description of "1" and a long description of Primary Comparison Portfolio. As a result, you can both limit the number of benchmark options and customize the name of each benchmark number.

The internal code list is called Benchmark Definitions. The following table lists the scripted values; you can add to or change these values. For example, you can rename the third index to Client Reporting Benchmark.

Short Description    Long Description
1                    Primary Comparison Index
2                    Secondary Comparison Index
3                    Comparison Index 3
4                    Comparison Index 4
5                    Comparison Index 5
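
The code list can be thought of as a simple mapping from the short description (the benchmark priority) to the long description (the display name). The following is a minimal, hypothetical sketch that mirrors the table above; the structure and names are illustrative only, not how Eagle stores the code list internally.

```python
# Hypothetical sketch of the Benchmark Definitions code list as a simple mapping:
# short description (benchmark priority) -> long description (display name).
benchmark_definitions = {
    "1": "Primary Comparison Index",
    "2": "Secondary Comparison Index",
    "3": "Comparison Index 3",
    "4": "Comparison Index 4",
    "5": "Comparison Index 5",
}

# You can change a long description; for example, renaming the third index:
benchmark_definitions["3"] = "Client Reporting Benchmark"
```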

A fund can have any number of benchmark assignments. These are usually used for performance reporting: for example, a primary benchmark, a marketing benchmark, or an attribution benchmark. The idea is that firms want to be able to say "The fund we manage gained 10% this year while its primary benchmark, the Dow Jones, gained only 7%." Because a fund can change its benchmarks over time, Eagle stores both a running history and a current imprint of the assignments.

Assume today is 12/31/2019 and a fund has just been assigned its benchmarks. Note that the current table reflects the latest history; because there is only one history record, the assignments in both tables are the same. The Entity Type column reflects what type of assignment each row represents.

The entity detail table also stores composite memberships, peer group memberships, and similar relationships. For example, when the table is used for composite membership, the entity type is set to COMP (not shown in the figures below). A composite such as COMP1 might have three member entities and also have three benchmark assignments; in that case you would see six rows (three with entity type COMP and three with entity type INDX).

 Entity Detail Table example
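
As a purely hypothetical illustration of the composite example above (the field names below are illustrative, not the actual ENTITY_DETAIL columns), the six rows for COMP1 might look like this:

```python
# Hypothetical sketch only: illustrative field names, not the actual ENTITY_DETAIL schema.
comp1_detail_rows = [
    # Composite membership rows (entity type COMP)
    {"entity_id": "COMP1", "detail_entity_id": "PORT1", "entity_type": "COMP"},
    {"entity_id": "COMP1", "detail_entity_id": "PORT2", "entity_type": "COMP"},
    {"entity_id": "COMP1", "detail_entity_id": "PORT3", "entity_type": "COMP"},
    # Benchmark assignment rows (entity type INDX)
    {"entity_id": "COMP1", "detail_entity_id": "SP500",  "entity_type": "INDX", "list_order": 1},
    {"entity_id": "COMP1", "detail_entity_id": "RU2K",   "entity_type": "INDX", "list_order": 2},
    {"entity_id": "COMP1", "detail_entity_id": "LEHAGG", "entity_type": "INDX", "list_order": 3},
]
```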

Now assume some time has passed and the fund changes its attribution benchmark as of 6/30/2020, while the other two assignments do not change. This makes 6/30 what we informally call a change date. As shown below, all three assignments must be stored for the change date. In addition, because there is now an updated Most Recent History Date, the current table must also be updated to reflect it.

 Entity Detail Table example

To complete the picture, assume that some time later (10/31) the fund decides it no longer needs a marketing benchmark. As of change date 10/31, the fund has only two assignments, and the history table should reflect that (note there is no longer a row with list order 2, which was associated with the marketing benchmark). Also, because 10/31 is now the latest history date, the current table should reflect this as well.

 Entity Detail Table example
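
To make the history and current behavior concrete, here is a minimal, hypothetical sketch of the scenario above. The structures are assumed for illustration, the 10/31 change is assumed to fall in 2020, and the benchmark names other than the Dow Jones are invented; the point is only that each change date stores a complete snapshot of the assignments, and the current view mirrors the snapshot for the most recent history date.

```python
from datetime import date

# Hypothetical sketch: each change date stores a complete snapshot of all assignments.
history = {
    date(2019, 12, 31): [
        {"list_order": 1, "benchmark": "DOWJONES", "assignment": "Primary BM"},
        {"list_order": 2, "benchmark": "SP500",    "assignment": "Marketing BM"},
        {"list_order": 3, "benchmark": "RU2K",     "assignment": "Attribution BM"},
    ],
    # 6/30/2020: only the attribution benchmark changes, but all three rows are stored again.
    date(2020, 6, 30): [
        {"list_order": 1, "benchmark": "DOWJONES", "assignment": "Primary BM"},
        {"list_order": 2, "benchmark": "SP500",    "assignment": "Marketing BM"},
        {"list_order": 3, "benchmark": "LEHAGG",   "assignment": "Attribution BM"},
    ],
    # 10/31: the marketing benchmark is dropped, so this snapshot has only two rows.
    date(2020, 10, 31): [
        {"list_order": 1, "benchmark": "DOWJONES", "assignment": "Primary BM"},
        {"list_order": 3, "benchmark": "LEHAGG",   "assignment": "Attribution BM"},
    ],
}

# The current view always mirrors the snapshot for the most recent history date.
most_recent_history_date = max(history)
current = history[most_recent_history_date]
```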

Entity detail table

The following figure shows the Russell 3000 Index used as both a primary comparison index and a marketing benchmark. You can display all data where the effective date is the current date.


Additionally, you can display all data where the effective date is as of a specified date. See the following figure.

Editing Entity Dialog Box - Benchmark Tab

Maintain Benchmark Assignments

You can manually maintain benchmarks using the Entity Maintenance window Benchmark tab, which displays the benchmark assignments in a grid.

You can also use Eagle's generic Benchmark Assignment interface to assign benchmarks. For more information, see the Supported Generic Interfaces V17 Wiki. If you are automating the assignment of benchmarks, the benchmark-to-portfolio relationships are stored in the ENTITY_DETAIL table.

To assign benchmarks to an entity:

  1. From any Eagle window, click the Eagle Navigator button to access the Eagle Navigator.

  2. Enter Benchmarks in the Start Search text box.

  3. Click Benchmarks (Performance Center).
    You see the Performance Center and the Benchmarks workspace with a list of entities.

  4. Select the entity you want to maintain, and click the Edit option on the Home tab ribbon.
    The Edit Entity workspace appears with the Details option open by default in the lower part of the page.

  5. Click the Benchmark option under the Entity Details tab.
    You see the Benchmark Details for the selected entity.

  6. To assign a benchmark to the entity, click the Add option on the Home tab ribbon.
    You see the Add Benchmark Assignments dialog box.

  7. At the top of the dialog box, use the As of Date selector to select a specific date for the benchmark assignment. The default is the current date.

  8. For each benchmark you want to assign, select the appropriate row in the Benchmark Classification column and then select a value from the Benchmark Name drop-down list.

  9. You can select the Process Across Change check box to link the benchmark history across benchmark changes, or clear the check box to use the benchmark definition as of the report date.
    For more information about selecting an approach, see Link Data Across Benchmark Assignment Changes.

  10. Click Save.
    The Add Benchmark Assignments dialog box closes. In the As of Date column, you see the assigned benchmark’s number from the benchmark code list.
    You can assign more benchmarks as needed for specific dates. When you have completed assignments, click View and Submit Changes.

Link Data Across Benchmark Assignment Changes

Benchmark assignments for an entity can change over time, and Eagle Performance supports assigning a benchmark as of any date historically. When a benchmark changes, you may need to link the benchmark data to reflect the benchmark changes. For example, a portfolio changes style and requires a benchmark that is a better fit. You use the original benchmark up to the date of the style change and use the new benchmark going forward. When you calculate the inception-to-date return, you want it to include the returns of the original benchmark up to the style change and the returns of the new benchmark after the style change.
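
As a hedged sketch of the arithmetic involved (illustrative code, not Eagle's internal calculation), the linked inception-to-date benchmark return is produced by geometrically chaining per-period returns taken from whichever benchmark was assigned during each period:

```python
# Illustrative only: geometric linking of benchmark returns across an assignment change.
def linked_return(period_returns):
    """Chain per-period returns (as decimals, e.g. 0.02 for 2%) into one cumulative return."""
    cumulative = 1.0
    for r in period_returns:
        cumulative *= 1.0 + r
    return cumulative - 1.0

# Hypothetical monthly returns: the original benchmark before the style change,
# the new benchmark after it.
returns_before_change = [0.010, 0.005, -0.002]   # original benchmark
returns_after_change = [0.008, 0.012]            # new benchmark
inception_to_date = linked_return(returns_before_change + returns_after_change)
```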

Choose an Approach

If you need to link benchmark data to reflect the benchmark changes, Eagle Performance provides two general approaches. Because your approach affects the way you set up benchmark entities, consider in advance whether you prefer to:

  • Process Across Changes. Configure benchmark assignments so that the benchmark data reflects how the benchmark changes over time and links the benchmark data. This approach does not require you to create a unique Custom Benchmark for each portfolio. If you use Eagle's:

  • Portfolio Performance solution, you can select this approach for benchmarks used in Performance Analysis reports for Portfolio Performance with Performance Analysis, Performance Link Analysis, Performance Risk Analysis, Global Attribution, and/or Performance Attribution fields. You can also use this approach for benchmark data used in Performance Query Tool queries based on Performance Analysis fields.

  • Retail Fund Performance solution, you can use this approach for Performance Analysis reports and Dynamic NAV Returns reports with Dynamic Mutual Fund Returns fields. For more information about using benchmarks for retail funds, refer to the Retail Fund Performance User Guide.

  • GIPS Composite Management solution, you can select this approach for benchmarks used in Performance Analysis and Composite Analysis reports for composites with Composite Performance Analysis fields. For more information, refer to the GIPS Composite Management User Guide.

  • Create Custom Benchmarks. Create a unique Custom Benchmark for each portfolio that has a change for a benchmark assignment. In other instances where there is no benchmark assignment change for the entity, the system retrieves the benchmark as of the report date and uses it to provide benchmark data for the entire reporting period presented. For example, you can use a linked custom benchmark for this purpose.

The advantage to using this approach is that it allows you to save the definition of the benchmark to the database rather than have the system create it "on the fly." You can then create a Commit Journal entry for the custom benchmark and use a system-generated description of the custom benchmark's composition at a point in time for verification and reporting purposes.

A drawback to this approach is that you must create one custom benchmark per entity if the benchmark changes and you need to link across benchmark assignments. For this reason, this process can become cumbersome if you add many portfolios.

Set the System Default Value for Process Across Changes

The Process Across Changes check box in the entity Benchmark tab identifies whether to link benchmark history across benchmark changes. The Sys Item 26 setting, located on the Performance System Parameters page, determines the default value that appears in the Process Across Changes check box. You can change the default value that appears each time you create a new entity and add a benchmark assignment.

To change system settings related to the Processing Across Changes check box:

  1. From any Eagle window, click the Eagle Navigator button to access the Eagle Navigator.

  2. Enter System in the Start Search text box and click System Parameter (Performance Center).
    You see the Performance Center and the Performance System Parameters workspace.

  3. In the row for Sys Item 26, click the Edit link and select one of the following values:
    U (Unchecked). The Process Across Change check box is cleared, displaying no check mark.
    C (Checked). The Process Across Change check box is selected, displaying a check mark.

  4. Click Save.
    The new value appears in the corresponding Sys Value cell.
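
The U and C values map directly to the default state of the check box. A trivial sketch, assuming the stored value is the single character shown above:

```python
# Hypothetical mapping from the stored Sys Item 26 value to the default check box state.
SYS_ITEM_26_DEFAULTS = {"U": False, "C": True}  # U = Unchecked, C = Checked

def default_process_across_changes(sys_item_26_value: str) -> bool:
    return SYS_ITEM_26_DEFAULTS.get(sys_item_26_value.upper(), False)
```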

Select the Process Across Changes Option to Link Benchmark Assignments

You can select a value for the Process Across Changes check box in the entity Benchmark tab when you add or edit a benchmark assignment, as described in Maintain Benchmark Assignments. See the following figure.



If you set the Process Across Changes option to:

  • Selected. Links benchmark history across benchmark changes for the specified benchmark definition. Retrieves benchmark data from different benchmarks.

If you select this option for the benchmark assignment, and select the Use Entity History option in the Performance Analysis report profile, the report reads benchmark data historically for the report run date.

  • Cleared. Does not link benchmark history across benchmark changes for the specified benchmark definition. Retrieves the benchmark assignment as of the report date and uses it to provide benchmark data for the entire reporting period. You must define a Custom Benchmark per portfolio if the benchmark changes and you need to link benchmark data.

You can configure the default value for this check box.
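
The difference between the two settings can be sketched as follows (hypothetical helper names; the actual retrieval logic is internal to Eagle): when the check box is cleared, the single assignment in effect on the report date supplies benchmark data for the whole period; when it is selected, the assignment in effect during each sub-period supplies that sub-period's data, so data from different benchmarks is linked across changes.

```python
# Hypothetical sketch of the two retrieval behaviors described above.
def benchmark_for_period(assignments, period_start, report_date, process_across_changes):
    """assignments: list of (effective_date, benchmark_id) tuples, sorted by effective_date."""
    if not process_across_changes:
        # Cleared: use the assignment in effect on the report date for the entire period.
        return latest_on_or_before(assignments, report_date)
    # Selected: use the assignment in effect at the start of this sub-period.
    return latest_on_or_before(assignments, period_start)

def latest_on_or_before(assignments, as_of):
    candidates = [benchmark for effective, benchmark in assignments if effective <= as_of]
    return candidates[-1] if candidates else None
```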

Use the Performance Analysis Report to Link Benchmark Assignments

The following describes how to set up the Performance Analysis report to link benchmark assignments and how the report processes that information.

Set Up the Performance Analysis Report to Link Across Benchmark History

If you plan to use the Performance Analysis report to link data across benchmark assignment changes, you must perform the benchmark setup tasks described in Set the System Default Value for Process Across Changes and Select the Process Across Changes Option to Link Benchmark Assignments.

Also consider the following when you set up the Performance Analysis report profile:

  • Use Entity History Option. In the Performance Analysis report profile, select the Use Entity History option to allow the report to read the benchmark assignments associated with the entity history.

Otherwise, if you do not select the Use Entity History option, the system uses the current entity definition to identify the benchmark assignment and the historic benchmark data to process across changes.

  • Submit with Override Benchmark Assignments. If you run the Performance Analysis report in Submit with Override mode and use the report profile's Select the Benchmark tab to change a benchmark assignment, the report uses the Process Across Changes check box setting of the overriding benchmark assignment to determine whether to link benchmark history.

For example, if you override the primary benchmark with the secondary benchmark and the secondary benchmark has the Process Across Changes check box selected, the report links across the benchmark history for the secondary benchmark. If you override the primary benchmark with the secondary benchmark and the secondary benchmark has the Process Across Changes check box cleared, the report does not link across benchmark history for the secondary benchmark. Similarly, if you override the primary benchmark with a specific entity such as SP500, the report does not link across benchmark history for the specific entity.

How the Performance Analysis Report Uses the Process Across Changes Option

The Process Across Changes setting you define for a given benchmark definition can vary from benchmark assignment to benchmark assignment. For example, if you reassign the primary comparison index for an entity several times, it is possible that one benchmark assignment has the Process Across Changes check box selected for the primary comparison index, and another benchmark assignment for the primary comparison index has the Process Across Changes check box cleared. For this reason, it is important to understand how the system identifies the appropriate benchmark assignment data and corresponding Process Across Changes setting when you run a report, and determines whether and how to link benchmark data.

The Performance Analysis report can link benchmark changes over time when you configure the benchmark assignment to do so. During report processing, the system performs benchmark processing once for each portfolio on the report, even if the portfolios share the same benchmark. It does this because even though the portfolios share the same benchmark, they do not have the same assignment history.
During Performance Analysis report processing, the system does the following:

  1. Based on the fields in the report, the system selects the benchmark assignments and Process Across Changes indicator for any benchmark definitions (primary, secondary, and so on) from entity history if you selected the Use Entity History option, or from the entity record if you did not. Note that if you selected the Process Across Changes option but did not select the Use Entity History option, the system must still use the historic benchmark data to process across changes.

  2. For all the assignments with the Process Across Changes option selected, the system creates a new temporary benchmark entity for the combinations of each portfolio and each benchmark definition. For example, PORT1-BM1, PORT2-BM1, PORT3-BM1.

  3. For all the assignments where you did not select the Process Across Changes option, the system creates a temporary entity with the benchmark entity ID. For example, SP500, LEHAGG, RU2K.

  4. For all the assignments with the Process Across Changes option selected, the system selects all the benchmark assignments that occurred between the begin and end dates of the fields that reference each benchmark. This is similar to the performance data fetch except instead of returns, all the benchmark assignments between two dates are returned.

The system additionally selects the closest assignment with an effective date prior to the start date of this date range, in case that assignment was in effect at the start of the period. It also selects the first assignment with an effective date after the end date of the range.
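
A minimal sketch of that assignment fetch, under the assumption that assignments are available as (effective date, benchmark) pairs (the actual fetch runs against Eagle's database): it returns every assignment dated inside the field's begin/end window, plus the closest assignment before the window and the first one after it.

```python
# Hypothetical sketch of selecting the benchmark assignments relevant to a date range.
def assignments_for_range(assignments, begin_date, end_date):
    """assignments: list of (effective_date, benchmark_id) tuples, sorted by effective_date."""
    selected = [a for a in assignments if begin_date <= a[0] <= end_date]

    before = [a for a in assignments if a[0] < begin_date]
    if before:
        # Closest prior assignment: it may have been in effect at the start of the period.
        selected.insert(0, before[-1])

    after = [a for a in assignments if a[0] > end_date]
    if after:
        # First assignment after the end of the range.
        selected.append(after[0])

    return selected
```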

The system determines the sources and performance models to query by issuing one query for all the benchmark entities against the performance model and sources in the report rule. A second query selects the Default Dictionary and source. (For information about default dictionaries, see Use Default Dictionaries for Reporting.) Based on the data returned for each of the benchmarks for the entire period, the source and performance model are selected. This same source and performance model are used for all the portfolios on the report that use that benchmark, regardless of the time period.

For example, if PORT1 uses SP500 for two years ago, RU2K for the prior year, and SP500 for the current year, then SP500 and RU2K are queried for the entire three-year period; SP500 is not queried only for two years ago, RU2K only for the prior year, and SP500 again only for the current year. This optimizes the process by sharing the source/performance model determination across all the portfolios in the report rather than making it portfolio specific.

  5. After the system determines the sources and performance models to use for each entity, it forms the benchmark query to retrieve the benchmark data for all the benchmarks over the longest period on the report.

  6. The system queries and retrieves benchmark data for each benchmark over the longest period from the report.

  7. For all the assignments where you did not select the Process Across Changes option, the system uses the returns for each of these benchmarks as it does from the benchmark data fetch.

  8. For each temporary benchmark entity created (Process Across Changes selected), the system stores the associated returns from each of the corresponding benchmarks and assignment periods with that temporary benchmark.

  9. Any Performance Analysis, Performance Link Analysis, Performance Risk Analysis, Performance Attribution, or Global Attribution fields use the temporary portfolio specific benchmarks for assignments with the Process Across Changes option selected.
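
Putting these steps together, here is a hedged sketch of how a temporary portfolio-specific benchmark such as PORT1-BM1 might be assembled. The names come from the examples above; the structures and function are assumptions for illustration, not Eagle's internal implementation.

```python
# Hypothetical sketch: assemble returns for a temporary benchmark such as "PORT1-BM1"
# by taking, for each assignment period, the returns of whichever benchmark was assigned then.
def build_temporary_benchmark(assignment_periods, benchmark_returns):
    """
    assignment_periods: list of (start_date, end_date, benchmark_id) covering the report period.
    benchmark_returns: dict of benchmark_id -> {date: return} over the longest report period.
    """
    linked = {}
    for start, end, benchmark_id in assignment_periods:
        for day, ret in benchmark_returns[benchmark_id].items():
            if start <= day <= end:
                linked[day] = ret
    return linked  # stored against the temporary PORT1-BM1 style entity
```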

Use EagleEye Analysis to Identify Benchmark Assignments Used

If you use EagleEye Analysis to research information in Performance Analysis report results, be aware that the report Input section provides information about any fields that use benchmark data over time. An additional column shows the underlying benchmark assignments. For more information about using EagleEye Analysis, see Performance Analysis and Reporting.

Use the Performance Query Tool to Link Benchmark Assignments

The following describes how to set up the Performance Query Tool to link benchmark assignments and how the queries process that information.

Set Up the Performance Query Tool to Link Across Benchmark History

If you plan to use the Performance Query Tool to use Performance Analysis fields to link data across benchmark assignment changes, you must perform the benchmark setup tasks described in Set the System Default Value for Process Across Changes and Select the Process Across Changes Option to Link Benchmark Assignments. You must also select the Use Benchmark History check box in the Query report profile to allow Performance Analysis fields to link data across benchmark assignment changes. For details, see the Performance Query Tool User Guide.

How the Performance Query Tool Uses the Process Across Changes Option

The Performance Query Tool uses the Process Across Changes setting in a similar manner as the Performance Analysis report uses it.

The Process Across Changes setting you define for a given benchmark definition can vary from benchmark assignment to benchmark assignment. For example, if you reassign the primary comparison index for an entity several times, it is possible that one benchmark assignment has the Process Across Changes check box selected for the primary comparison index, and another benchmark assignment for the primary comparison index has the Process Across Changes check box cleared. For this reason, it is important to understand how the system identifies the appropriate benchmark assignment data and corresponding Process Across Changes setting when you run a report using the Performance Query Tool, and determines whether and how to link benchmark data.

The Performance Query Tool report can link benchmark changes over time when you configure the benchmark assignment to do so and select the Use Benchmark History check box in the Query report profile. During report processing, the system performs benchmark processing once for each portfolio on the report, even if the portfolios share the same benchmark. It does this because even though the portfolios share the same benchmark, they do not have the same assignment history.

When processing the report with the Use Benchmark History check box selected, the system does the following:

  1. Based on the fields in the report, the system selects the benchmark assignments and Process Across Changes indicator for any benchmark definitions (primary, secondary, and so on) from entity history if you selected the Use Entity History option, or from the entity record if you did not. Note that if you selected the Process Across Changes option but did not select the Use Entity History option, the system must still use the historic benchmark data to process across changes.

  2. For all the assignments with the Process Across Changes option selected, the system creates a new temporary benchmark entity for the combinations of each portfolio and each benchmark definition. For example, PORT1-BM1, PORT2-BM1, PORT3-BM1.

  3. For all the assignments where you did not select the Process Across Changes option, the system creates a temporary entity with the benchmark entity ID. For example, SP500, LEHAGG, RU2K.

  4. For all the assignments with the Process Across Changes option selected, the system selects all the benchmark assignments that occurred between the begin and end dates of the fields that reference each benchmark. This is similar to the performance data fetch except instead of returns, all the benchmark assignments between two dates are returned.

The system additionally selects the closest assignment with an effective date prior to the start date of this date range, in case that assignment was in effect at the start of the period. It also selects the first assignment with an effective date after the end date of the range.

The system determines the sources and performance models to query by issuing one query for all the benchmark entities against the Query Tool's performance model and source. A second query selects the Default Dictionary and source. (For information about default dictionaries, see Use Default Dictionaries for Reporting.) Based on the data returned for each of the benchmarks for the entire period, the source and performance model are selected. This same source and performance model are used for all the portfolios on the report that use that benchmark, regardless of the time period.

For example, if PORT1 uses SP500 for two years ago, RU2K for the prior year, and SP500 for the current year, then SP500 and RU2K are queried for the entire three-year period; SP500 is not queried only for two years ago, RU2K only for the prior year, and SP500 again only for the current year. This optimizes the process by sharing the source/performance model determination across all the portfolios in the report rather than making it portfolio specific.

  5. After the system determines the sources and performance models to use for each entity, it forms the benchmark query to retrieve the benchmark data for all the benchmarks over the longest period on the report.

  6. The system queries and retrieves benchmark data for each benchmark over the longest period from the report.

  7. For all the assignments where you did not select the Process Across Changes option, the system uses the returns for each of these benchmarks as it does from the benchmark data fetch.

  8. For each temporary benchmark entity created (Process Across Changes selected), the system stores the associated returns from each of the corresponding benchmarks and assignment periods with that temporary benchmark.

  9. Any Performance Analysis fields use the temporary portfolio specific benchmarks for assignments with the Process Across Changes option selected.