As our financial world continues to evolve, every company – public and private – faces increasing scrutiny of its financial statements. This scrutiny extends not only to financial performance but also to the quality of the underlying data. In this post, I’m going to show you how FDM (Financial Data Quality Management) can be used to proactively identify data quality issues.
The FDM workflow includes 4 steps – Import, Validate, Export & Check. Technically there are 5 steps, but for the sake of this post I’ll limit the discussion to 4; come back soon and I’ll discuss that elusive 5th step. We know that during Import, data is imported into the FDM application. During Validate, FDM ensures that all data has been mapped (and, in the case of HFM integrations, that the resulting intersection is able to accept data). Finally, during Export, data is loaded to the target application. So what is this Check step and why should you pay attention to it? Well, let’s explore.
An often overlooked point about FDM is that FDM does not include any hierarchical information. FDM does not know that the base level Cash account rolls up to Current Assets, or that Current Assets rolls up to Total Assets. That relationship is maintained in the target system.
The Check step produces a report within the FDM interface which displays target system (ex: HFM, Essbase) values. This point cannot be overstated: FDM is retrieving values that may have been impacted by a calculation, aggregation or translation performed in the target system.
The Check report has 2 primary functions:

1. Display balances
2. Test balances
The display balances functionality allows you to display key indicators (ex: Net Sales, Total Assets) to the end-user immediately following the Export step. This enables the user to visually validate the balances in the target system against the source system. While this is useful for proactive users, if one of your users is not as proactive (gasp), this functionality will not help you or them actively identify issues.
As with the display balances functionality, the test balances functionality retrieves data from the target system. Next, the balance is compared against an administrator-defined condition to which the data must conform. If the data does not adhere to the condition, the data quality check is flagged as failing and the location does not “pass” the Check workflow step. The user is immediately aware of their data quality issue. Just as important, everyone upstream of the data, including administrators, can monitor data quality across each of the FDM locations to which they have access. I’ll take a moment to shamelessly plug my posting on FDM reports for more information on the Process Monitor reports.
One of the most common questions I get asked is: What data quality checks are/should be included on the Check report? The answer is: whatever information is important to ensure that the data is accurate. Remember, data quality checks can be performed at base or parent level members. At a base level, you may check that the balance sheet is in balance. At a parent level, you may check that intercompany activity has been fully eliminated. There might be expenses that can only be booked to certain account and cost center combinations. All of these examples are valid Check report rules.
If you can articulate the rule as having a boolean (true/false) outcome, you can likely create a Check report rule to test the data. Take the most common example: is my balance sheet in balance; that is, do Total Assets equal Total Liabilities & Shareholder Equity? The check rule asks whether (Total Assets – Total Liabilities & SE) = 0. This result can only be true or false. The next question I get asked is: Can I have a threshold? Meaning, maybe I want my balance sheet to be in balance within +/- 100. If you’re familiar with my blog, you won’t be surprised to hear me say yes, this is certainly possible and is actually implemented quite a lot in the real world.
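To make the boolean-versus-threshold distinction concrete, here is a minimal sketch of the balance sheet rule in Python. The function name, account values and tolerance are purely illustrative – this is not FDM’s actual rule syntax, just the logic the rule expresses.

```python
def balance_sheet_in_balance(total_assets, total_liab_and_se, tolerance=0.0):
    """Return True when |Total Assets - Total Liabilities & SE| is within tolerance.

    A tolerance of 0 reproduces the strict (Total Assets - Total Liabilities & SE) = 0
    rule; a positive tolerance implements the +/- threshold variant.
    """
    return abs(total_assets - total_liab_and_se) <= tolerance

# Strict rule: out of balance by 75, so the check fails
print(balance_sheet_in_balance(1_000_075.0, 1_000_000.0))                   # False
# Threshold rule: the same data passes with a +/- 100 tolerance
print(balance_sheet_in_balance(1_000_075.0, 1_000_000.0, tolerance=100.0))  # True
```

Either way, the outcome is a single true/false result, which is exactly what the Check step needs to flag a location as passing or failing.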
If you’re still reading, you likely want to know how to utilize this functionality. The great news is that the Check report is out-of-the-box functionality; there is no incremental software cost to using it. Multiple versions of the Check report are delivered by Oracle. To populate the report with data, FDM uses 2 additional components:
| Component | Description |
| --- | --- |
| Validation Entity | Defines the target system entities which will be displayed on the Check report; also controls which entities are calculated and/or consolidated following a data load. |
| Validation Rule | Defines the balances to display and the balances to evaluate/test. |
Each entry in the validation rule generally runs for each entry in the validation entity. If any of the balances in the Test section of the validation rule fails for any of the entries in the validation entity, the location fails the Check step.
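The rule-by-entity evaluation described above can be sketched as a simple nested loop. Everything here is hypothetical – the entity names, account names and balances are made up, and FDM retrieves its balances from the target system rather than a dictionary – but the pass/fail mechanics are the same: one failing test for one entity fails the whole location.

```python
validation_entities = ["East", "West"]   # target system entities to check
validation_rules = [                     # (rule name, test over retrieved balances)
    ("BS in balance", lambda b: abs(b["TotalAssets"] - b["TotalLiabSE"]) <= 100),
    ("IC eliminated", lambda b: b["IntercoElim"] == 0),
]

# Hypothetical balances retrieved from the target system for each entity
balances = {
    "East": {"TotalAssets": 500.0, "TotalLiabSE": 500.0, "IntercoElim": 0},
    "West": {"TotalAssets": 750.0, "TotalLiabSE": 600.0, "IntercoElim": 0},
}

# Every rule runs for every validation entity; collect any failures
failures = [
    (rule_name, entity)
    for entity in validation_entities
    for rule_name, test in validation_rules
    if not test(balances[entity])
]

location_passes_check = not failures
print(failures)               # [('BS in balance', 'West')]
print(location_passes_check)  # False
```

Here the West entity is out of balance by 150, beyond the +/- 100 tolerance, so that single failure is enough to fail the location’s Check step even though every other test passes.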
Importantly, failing the Check step does not prevent data from being loaded to the target system. For the Check report to be produced, the data must already be loaded to the target from which the report will retrieve data. This may seem obvious, since I’ve repeatedly noted that the Check report retrieves target system data, but it is one of the questions I am often asked. It should also come as no surprise that a custom FDM process can be implemented to roll back data in the event of a failed validation rule.
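That load-then-check-then-roll-back sequence can be sketched as simple control flow. In FDM this would be built with custom event scripts; the function names, callbacks and messages below are hypothetical and show only the ordering: the load always happens first, and a compensating step runs only if the checks fail.

```python
def load_with_rollback(load_data, run_checks, rollback):
    """Load data, run Check-style validations, and roll back on failure."""
    load_data()                # data reaches the target system first...
    if not run_checks():       # ...then the Check step evaluates target values
        rollback()             # custom process: undo the load on failure
        return "Load rolled back: Check step failed"
    return "Load committed: Check step passed"

loaded = []
result = load_with_rollback(
    load_data=lambda: loaded.append("period data"),
    run_checks=lambda: False,  # simulate a failed validation rule
    rollback=loaded.clear,
)
print(result)  # Load rolled back: Check step failed
print(loaded)  # []
```

The key point survives the sketch: the Check step observes the target system after the load, so "failing" is a signal to react to, not a gate that blocks the load itself.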
If you are not currently using the Check report, I hope this post has given you reasons to consider implementing it. The FDM administrator guide (see Validation Rules & Validation Entities sections) provides solid information that will help you enable the Check workflow step.
I hope this post has introduced you to another out of the box feature that increases the return on investment of your EPM suite (not just FDM!). If you have any questions, please feel free to post a comment.