Understand PDC Workflow

The Portfolio Data Center workflow broadly resembles traditional workflows for security reference data, but PDC handles entity reference data differently from security reference data.

Vendors create and own security reference data, and each vendor has its own interpretation of what a given characteristic value should be. These variances force firms to use a gold copy process to blend the vendor feeds into their own final version of a security.

Entity reference data, by contrast, does not originate with vendors and therefore does not carry the same gold copy processing requirements.

Entity data follows a different path. An investment firm decides when to add new accounts and is the ultimate owner of the data. Vendors do not instruct firms to open new accounts and do not dictate to firms how to specify the reference data for new accounts.

PDC omits gold copy processing because entity data does not require it: PDC does not composite multiple sources of vendor data. Instead, it enables you to store and view multiple sources of vendor-supplied analytics and ratings for an entity.

PDC enables you to validate and cleanse entity reference data from internal and external sources. Vendors do not own a firm's core entity reference data; however, most firms have an upstream CRM system that is the point of origination for new accounts. PDC supports workflows in which the system loads upstream data files with full or partial entity setup data.

PDC provides for multiple validation checks throughout the workflow. For example, the system can validate the source upstream CRM system data and any enrichments applied to the original data. You also can validate the data when you use policies (workflow templates) on the PDC user interface to create a new entity.

The typical workflow consists of the following steps:

  1. Load the (CRM-based) entity reference data.

  2. (Optional) Validate the source system data.

  3. (Optional) Enrich the source system record.

  4. Resolve the exceptions (release or override exceptions).

  5. Release the final entity record.

  6. Distribute the final entity record.
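The steps above can be pictured as a simple pipeline. The following sketch is illustrative only; the function name, stage names, and data shapes are assumptions, not PDC interfaces.

```python
# Illustrative sketch of the typical PDC entity workflow; stage names
# and the record shape are assumptions, not actual PDC APIs.

def run_entity_workflow(record, validate=True, enrich=True):
    """Carry a CRM-sourced entity record through the typical PDC stages."""
    stages = ["loaded"]                  # 1. load the entity reference data
    if validate:
        stages.append("validated")       # 2. (optional) validate source data
    if enrich:
        stages.append("enriched")        # 3. (optional) enrich the record
    stages += ["exceptions_resolved",    # 4. release or override exceptions
               "released",               # 5. release the final entity record
               "distributed"]            # 6. distribute the final record
    return {"record": record, "stages": stages}

result = run_entity_workflow({"entity_id": "FUND001"})
```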

The following topics describe each step in further detail.

Load Entity Reference Data

Entity reference data sometimes comes in the form of an upload file from a firm's internal system. The system loads the data to the RULES database tables in the Eagle data warehouse, creating a centralized data repository.

Best Practices

  • When you load a new entity, the data goes into both the master tables, the Entity table (ENTITY) and the Entity Extension table (ENTITY_EXTENSION), and the history tables, the Entity History table (ENTITY_HIST) and the Entity Extension History table (ENTITY_EXTENSION_HIST).

  • When you load entity reference data for funds already in the system, the data goes into only the history tables.
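The routing described in these best practices can be sketched as follows. The in-memory dictionaries stand in for the RULES database tables; the function and field names are illustrative assumptions.

```python
# Sketch of the load routing: a new entity writes to both the master
# tables (ENTITY, ENTITY_EXTENSION) and the history tables (ENTITY_HIST,
# ENTITY_EXTENSION_HIST); a load for a fund already in the system writes
# to the history tables only. Dicts/lists stand in for the tables.

master, history = {}, []

def load_entity(entity_id, data):
    """Route a load: new entities hit master and history; updates, history only."""
    is_new = entity_id not in master
    history.append((entity_id, dict(data)))   # history tables always get the row
    if is_new:
        master[entity_id] = dict(data)        # master tables only on first load
    return "master+history" if is_new else "history"

first = load_entity("FUND001", {"name": "Core Bond Fund"})
second = load_entity("FUND001", {"name": "Core Bond Fund"})
```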

A typical first-stage setup for an entity includes the following data:

  • Identity codes, such as core and reference system identifiers

  • Key account attributes, which vary according to client and business unit

  • Key dates, such as inception, performance inception, fiscal end, and termination

  • Name values, such as common name, long name, and legal name

You also can load and view the following related information, which is not subject to validation and enrichment:

  • Account-level performance (total returns)

  • Benchmark assignments

  • Lot-level holdings

  • Peer group assignments

  • Performance transactions

  • Security-level holdings

  • Share class statistics, including net asset value (NAV), market price, distributions, subscription and redemption shares, and subscription and redemption dollars

  • Transactions

  • Vendor-provided entity analytics and rankings

  • Vendor-provided peer group analytics and rankings

Validate Source (CRM) Data

To validate source (CRM) data, the system uses validation rules that assign a pass or fail status to each field. Validations are optional, but you can apply them to any field. In addition, you can use the rules that Eagle provides or create your own customized rules. Validation rules can perform simple data integrity checks, such as ensuring that the Currency Code field does not contain a value for a sanctioned country, as well as more complex checks.
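A field-level validation rule of this kind can be sketched as below. The rule shape, the status values, and the sanctioned-code list are assumptions based on the example in the text, not Eagle's rule format.

```python
# Illustrative field-level validation: each rule assigns a pass or fail
# status to one field. The sanctioned codes here are hypothetical.

SANCTIONED_CURRENCIES = {"AAA", "BBB"}   # hypothetical sanctioned codes

def validate_currency_code(record):
    """Return a pass/fail status for the Currency Code field."""
    value = record.get("currency_code")
    passed = value is not None and value not in SANCTIONED_CURRENCIES
    return {"field": "currency_code", "status": "pass" if passed else "fail"}

ok = validate_currency_code({"currency_code": "USD"})
bad = validate_currency_code({"currency_code": "AAA"})
```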

Maintain Most Recent Master Records

Although PDC does not use gold copy processing, each entity still has a master record. Instead of a gold copy composited from multiple vendors, the master record is the latest version of the entity's historical imprint. When you create a historical imprint of the core characteristics for an entity, Portfolio Data Center maintains it in the master record.
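One way to picture this is as a list of dated imprints whose most recent entry serves as the master record. This is an illustrative sketch under that assumption, not PDC's storage model.

```python
# Sketch of "master record = latest historical imprint": each save appends
# a dated imprint, and the master record is simply the most recent one.

from datetime import date

imprints = []

def save_imprint(as_of, core_fields):
    """Record a dated historical imprint of the entity's core characteristics."""
    imprints.append({"as_of": as_of, **core_fields})
    imprints.sort(key=lambda row: row["as_of"])   # keep imprints in date order

def master_record():
    """The master record is the latest imprint, if any exist."""
    return imprints[-1] if imprints else None

save_imprint(date(2024, 6, 1), {"name": "Core Bond Fund II"})
save_imprint(date(2024, 1, 1), {"name": "Core Bond Fund"})
latest = master_record()
```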

Enrich Master Entity Records

Optionally, you can use the fields in the master entity record to create an additional enriched field. For example, to form an enriched Entity Legal Name field, combine the Entity Name field and the Product Type field.
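The enrichment example above can be sketched like this. The concatenation format and field names are assumptions; firms define their own enrichment rules.

```python
# Sketch of the enrichment example: deriving an enriched Entity Legal Name
# field by combining the Entity Name and Product Type fields. The exact
# formatting is an assumption for illustration.

def enrich_legal_name(record):
    """Add an enriched entity_legal_name field built from two source fields."""
    record["entity_legal_name"] = f"{record['entity_name']} {record['product_type']}"
    return record

enriched = enrich_legal_name({"entity_name": "Core Bond", "product_type": "Mutual Fund"})
```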

Validate Master Entity Records

Optionally, you can have the system validate master entity records to ensure the completeness and correctness of the entity reference data. You also can enable or disable validation for individual fields as needed. The system updates only valid data to the master record.

Resolve Exceptions

When you create or edit an entity, the system creates a list of exceptions. You can use PDC to review the invalid entity reference data to resolve the exceptions. The system does not post fields that contain exceptions to the master record until you fix or release the exception.
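The posting rule described here, where fields with open exceptions are withheld from the master record, can be sketched as follows. The function and data shapes are illustrative assumptions, not PDC structures.

```python
# Sketch of exception handling: fields with exceptions are withheld from
# the master record until each exception is fixed or explicitly released.

def post_to_master(fields, exceptions, released):
    """Post only fields that are exception-free or explicitly released."""
    posted, held = {}, {}
    for name, value in fields.items():
        if name in exceptions and name not in released:
            held[name] = value           # withheld pending resolution
        else:
            posted[name] = value         # safe to post to the master record
    return posted, held

posted, held = post_to_master(
    {"entity_name": "Core Bond", "currency_code": "AAA"},
    exceptions={"currency_code"},       # currency_code failed validation
    released=set(),                     # nothing released yet
)
```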

Release Entities

After all the validations pass and no errors or pending authorizations exist, PDC releases the master entity record to the RULES database in the Eagle data warehouse. To enable targeted downstream processes, you can partially release an entity. To do so, release certain fields (fields that have passed validation or were not subject to validation) before resolving the remaining validation errors. The release level determines the extent to which the master record is ready for use by downstream systems.

The system sets the release status for the overall entity to the lowest release level for which all fields associated with that level have been released. For example, if all Level 1 fields have passed, the system releases them to the Eagle data warehouse and sets the entity's release level to Released for Trading.

If some Level 2 and all Level 3 fields also pass, the system releases those Level 2 and Level 3 fields (that have passed validation) to the Eagle data warehouse. However, the release level of the entity remains Released for Trading. Every remaining Level 2 field must pass before the Release Level for the entity can change to Released with No Exceptions.
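The release-level rule in the two paragraphs above can be sketched as follows. The level numbers and field names are assumptions based on the example; the logic returns the highest contiguous level at which every field has been released.

```python
# Sketch of the release-level rule: an entity's overall release status is
# the lowest level whose fields have all been released, even if some
# higher-level fields have also been released individually.

def entity_release_level(field_levels, released_fields):
    """Return the highest contiguous level at which every field is released."""
    achieved = 0
    for level in sorted(set(field_levels.values())):
        fields_at_level = [f for f, lv in field_levels.items() if lv == level]
        if all(f in released_fields for f in fields_at_level):
            achieved = level
        else:
            break   # a gap at this level caps the overall release status
    return achieved

levels = {"name": 1, "currency": 1, "benchmark": 2, "peer_group": 3}
# All Level 1 fields are released; a Level 2 field is still outstanding,
# so the entity stays at Level 1 even though a Level 3 field passed.
status = entity_release_level(levels, released_fields={"name", "currency", "peer_group"})
```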

Distribute Master Entity Records

The system releases Master Entity records for internal activity, and you can export them to downstream systems.

No internal Eagle modules depend on a fund's release level; an impact analysis will determine later how to integrate this new variable. For example, you still can run OLAP reports, DataMart reports, and performance reports for an entity even if the entity is only partially released. In the short term, you must customize the system to achieve workflows that require such dependencies.