32 Parking Plaza
Suite 506
Ardmore, PA 19003

Insight: Effectively Overseeing Your Exposure/Transparency Dataset


Topic: Data Management & Migration
Date: Jun 27, 2024

“The precise statement of any problem is the most important step in its solution.” – Edward Lydston Bliss, Jr., journalist

The problem with an exposure/transparency dataset is the sheer volume of unstructured datapoints collected. There can be well over 100 distinct datapoints per fund per period, and the sources of data vary drastically, ranging from holdings-level data in Excel files to blended top-level exposures in PDF tearsheets.

The challenge for any investment office is ensuring that all of these data points are accurate and current.

Our recommended approach for overcoming this challenge includes a well-documented process and a change-based review.  Below you’ll find our three core tenets for overseeing your exposure/transparency dataset so that it is an asset – not a headache – for your team.

Tenet 1: Documentation, Documentation, Documentation

The starting point for overseeing your exposure/transparency dataset is to create a table that documents your process for each fund, including:

  • Fund name and identifier
  • Frequency of update
  • Transparency data type
  • Notes
  • Source document(s)
  • Subscribed for services (yes/no)
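This documentation table can live in a spreadsheet, but keeping it in a machine-readable form makes it easier to script reviews against later. A minimal sketch in Python; the field names and example values are illustrative, not prescriptive:

```python
from dataclasses import dataclass, field

@dataclass
class FundDocEntry:
    """One row of the exposure/transparency documentation table."""
    fund_name: str
    fund_id: str                  # internal or external identifier
    update_frequency: str         # e.g. "monthly", "quarterly"
    transparency_type: str        # e.g. "holdings-level", "top-level exposures"
    source_documents: list = field(default_factory=list)
    subscribed_service: bool = False   # subscribed for services (yes/no)
    notes: str = ""

# Example entry for a hypothetical fund
entry = FundDocEntry(
    fund_name="Example Growth Fund III",
    fund_id="EGF-III",
    update_frequency="quarterly",
    transparency_type="holdings-level (Excel)",
    source_documents=["quarterly holdings file", "PDF tearsheet"],
    subscribed_service=True,
)
```

One row per fund, serialized to CSV or a small database table, is enough for the freshness and invoice reviews described below to be automated rather than eyeballed.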

This documentation is the starting point for more easily reviewing data freshness, data quality, and, if you use a service provider, the accuracy of their invoice.
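For example, once each fund's update frequency is documented, a data-freshness review can be scripted. A sketch, assuming each record carries its documented frequency and the date of its last update (the day counts per frequency are illustrative):

```python
from datetime import date, timedelta

# Documented update frequency mapped to a maximum allowed age, in days (illustrative)
FREQUENCY_DAYS = {"monthly": 31, "quarterly": 92, "annually": 366}

def stale_funds(records, as_of):
    """Return funds whose data is older than their documented frequency allows."""
    stale = []
    for rec in records:
        allowed = timedelta(days=FREQUENCY_DAYS[rec["frequency"]])
        if as_of - rec["last_updated"] > allowed:
            stale.append(rec["fund"])
    return stale

records = [
    {"fund": "Fund A", "frequency": "quarterly", "last_updated": date(2024, 1, 15)},
    {"fund": "Fund B", "frequency": "quarterly", "last_updated": date(2024, 5, 30)},
]
print(stale_funds(records, as_of=date(2024, 6, 27)))  # Fund A is past its quarterly window
```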

Your documentation can also be used by your service provider (if applicable) as a playbook for how the dataset should be maintained.

Tenet 2: Implement Data Quality Checks Tailored to Your Use of the Data

While you may check every balance and transaction in your accounting data, it is simply not feasible to check every datapoint in your exposure/transparency dataset. That means you must “work smarter, not harder” when reviewing it.

The key is to be proactive and thoughtful about how you perform quality checks.  Strive to find a happy medium between deploying an army to check every datapoint and checking no datapoints at all.  We recommend applying a scheduled, change-based review process such as the one outlined below.

Parameter: Period-over-period change in unrealized value greater than X%
What to review: Unrealized value

Parameter: New holdings in the period
What to review: Metadata assigned to the new holding
  • Security type
  • Geography
  • Sector

Parameter: IPOs that occurred in the last quarter
What to review: Security type is marked as public

Parameter: Top X holdings across the portfolio
What to review: Unrealized value, realized value to date, etc.
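A change-based screen like this can be scripted once holdings carry current and prior-period values. A minimal sketch, with illustrative field names (`prior_unrealized` is `None` for a holding that is new this period):

```python
def review_flags(holdings, change_threshold=0.25, top_n=10):
    """Flag holdings for manual review per change-based parameters."""
    flags = []
    for h in holdings:
        prior, current = h["prior_unrealized"], h["unrealized"]
        if prior is None:
            # New holding this period: review its assigned metadata
            flags.append((h["name"], "new holding - check security type/geography/sector"))
        elif prior > 0 and abs(current - prior) / prior > change_threshold:
            flags.append((h["name"], "period-over-period change exceeds threshold"))
    # Always review the largest positions regardless of change
    for h in sorted(holdings, key=lambda h: h["unrealized"], reverse=True)[:top_n]:
        flags.append((h["name"], "top holding - verify unrealized/realized values"))
    return flags

holdings = [
    {"name": "Alpha Co", "unrealized": 150.0, "prior_unrealized": 100.0},
    {"name": "Beta Co",  "unrealized": 98.0,  "prior_unrealized": 100.0},
    {"name": "Gamma Co", "unrealized": 40.0,  "prior_unrealized": None},
]

for name, reason in review_flags(holdings, top_n=1):
    print(name, "->", reason)
```

The thresholds (`change_threshold`, `top_n`) are the knobs to iterate on over time as you learn which parts of the dataset are prone to inaccuracies.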

Iterate the review process over time so your checks continue to be focused on areas of the dataset that are prone to inaccuracies or staleness.

Tenet 3: Take Your Exposure/Transparency Data Beyond Reporting

Once your dataset is current and accurate, exposure/transparency data can be so much more than just a reporting tool for geographic, sector, and security type exposure in your portfolio.  Consider the potential of these sample use cases:

  1. Analyze drivers and detractors of fund performance by comparing manager-reported attribution against system-calculated attribution.
  2. Track style drift for each investment by reviewing historical exposures.
  3. Review pro forma exposures against policy targets prior to a new investment by aggregating prospective fund data with existing portfolio exposure/transparency data.
  4. Calculate your exposure to market risk (beta) versus manager skill (alpha) by using the beta of underlying holdings or proxies.
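The last use case can be sketched directly: weight the betas of underlying holdings (or their public proxies) by their share of portfolio value to estimate overall market exposure. The holdings and beta figures below are illustrative:

```python
def portfolio_beta(holdings):
    """Exposure-weighted beta of underlying holdings (or their proxies)."""
    total = sum(h["value"] for h in holdings)
    return sum(h["value"] / total * h["beta"] for h in holdings)

# Illustrative holdings; a private name would borrow the beta of a public proxy
holdings = [
    {"name": "Public SaaS Co",               "value": 60.0, "beta": 1.3},
    {"name": "Private Industrial Co (proxy)", "value": 40.0, "beta": 0.8},
]
print(round(portfolio_beta(holdings), 2))  # 1.1
```

Comparing this figure against realized fund returns is one simple way to separate what the market delivered (beta) from what the manager added (alpha).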

The use cases and types of data collected for exposure/transparency are only increasing.  Technology is now becoming available to automate the collection of some of this data from manager reporting – so that you can be your own data steward.  However, the dataset still requires normalization of geography, sector, and security type values and mapping to a security master, so manual work remains.  This should not deter you from pursuing such technology, but it should clarify what technology can automate and what still requires services or internal resources.

Regardless of how you decide to create and maintain your exposure/transparency dataset, it is imperative that you have procedure documentation, routine data quality checks, and a champion on your team who will push for its use.  With these tenets in place, you will not only be able to oversee this important dataset more effectively but also extract more value from it.

In other news, we would like to introduce you to Brian Thomas who recently joined us from PwC’s asset and wealth management audit practice.  We are very excited to have him join the Union Park team!

Thank you for your continued support and engagement.  We hope you have a wonderful summer.

Union Park Consulting
