Over the course of my various FDM implementations as well as participation on the OTN FDM message board, I’ve encountered a number of requirements that FDM addresses natively – no custom coding required. Recently, I had the highly original idea to compile a Top 10 list (regardless of your late night television allegiance, the Letterman Top 10 list likely resonates on some level) of the best out of the box functionality that FDM offers.
In compiling this list, I took a number of factors into account, including:
- Frequency of use
- Value/Benefit to the implementation
- Ease of Use/Maintenance
- Performance Impacts
Without further ado, here is my FDM Top 10 (OK, 11) list. Click on each item in the list to get an overview of the functionality as well as the factors that led me to assign that ranking.
There is a common misconception that FDM is intended for, or best suited to, loading only HFM or Hyperion Enterprise. This couldn’t be further from the truth. The FDM adaptor suite includes a number of adaptors for loading to the various EPM applications, including Essbase and Strategic Finance. These adaptors are prewritten by Oracle and provide the same workflow process that you are accustomed to using when loading a consolidation system.
Even though these adaptors are used frequently, I did not rank their existence higher because this list is really intended to focus on product functionality that is either little known or potentially underutilized. This 11th entry in the top 10 is intended to simply educate.
Kicking off the Top 10 list is FDM’s multiload functionality. Many users know FDM as a way to load monthly data to their EPM application. However, it should not be overlooked that FDM has the ability to process multiple periods of data within a single file – albeit requiring a highly specific format.
The multiload capability does have some opportunities for enhancement. Most notably, import formats and scripts do not run when processing a multiload file. Two types of multiload files are valid – Excel (.xls) or text (.txt) – with the former able to process a maximum of only 12 periods of data. While this may push a user toward an Excel format, keep in mind that Excel-based multiload files process more slowly than their text counterparts. Lastly, the Intersection Validation that currently runs as part of the HFM adaptor’s Validate process does not execute when processing a multiload file.
While multiload does offer value, the opportunities for improvement forced me to rank this functionality at #10.
Oracle provides over 100 reports out of the box. These reports include detailed information about:
- Data residing in FDM
- Process monitoring
- Mapping definitions as well as changes
- System events
- Users & security
- Journal entries
- Batch processes
- Application metadata & settings
The reports are extremely valuable. I debated ranking them higher than 9th, but I couldn’t justify dropping the ranking of other functionality that can streamline the data integration process.
For anyone who has undertaken the task of creating a large number of journals in HFM, this functionality is for you. FDM can create a true HFM journal entry in any of the valid adjustment value dimension members (<Entity Curr Adjs>, <Parent Curr Adjs>, [Parent Adjs], [Contribution Adjs]) from a specifically formatted Excel template.
This functionality is especially useful during an application upgrade from Enterprise to HFM. I recently had a client requirement to maintain all of their journal detail from Enterprise when implementing HFM. Rather than extracting Enterprise journals (.jaf) and modifying them to the HFM journal format (.jlf) – and don’t forget the dimensionality & mapping – I created a utility that converted the Enterprise journal extract into the FDM-required Excel format. In excess of 200 journals per time period were processed through FDM and created in HFM, including entity & parent currency adjustments.
I am clearly a big fan of the journal functionality of FDM; however, like multiload, there are some opportunities for enhancement. Most notably, if attempting to load a journal a second or subsequent time, the journal must first be deleted from HFM. While this is not an FDM flaw, an option for FDM to perform this deletion as part of the load process would further streamline things. This is something that I will continue to advocate for during customer advisory board meetings.
Does your process require a user to log into FDM, process data, and then log into HFM (or Essbase) and run a consolidation? It doesn’t need to!
FDM will consolidate for you. FDM can be configured to consolidate one or more parents following a data load. The consolidation executed by FDM is as efficient as one executed by a user directly in the application.
The one concern that has been noted by several of my clients is: if I am loading 5 locations that all roll up to an HFM Europe parent, isn’t it a waste of time to consolidate Europe after each load? The simple answer is yes. To address this weakness, a custom process can be created that executes a consolidation only when all data has been loaded, but that is beyond the scope of this post.
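The gating idea behind such a custom process can be sketched in a few lines. This is illustrative logic only – not the FDM API – and the parent/child names are invented for the example: track which child locations have loaded, and only signal a consolidation once the last one arrives.

```python
# Illustrative sketch only -- not the FDM API. Tracks which child
# locations have loaded and fires a single consolidation per parent.
PARENT_CHILDREN = {"Europe": {"UK", "France", "Germany", "Spain", "Italy"}}

loaded = {parent: set() for parent in PARENT_CHILDREN}

def record_load(location, parent):
    """Mark a location as loaded; return True when all children are in."""
    loaded[parent].add(location)
    return loaded[parent] == PARENT_CHILDREN[parent]

# Five locations roll up to Europe; consolidate once, after the last load.
for loc in ["UK", "France", "Germany", "Spain", "Italy"]:
    if record_load(loc, "Europe"):
        print("Consolidate Europe")  # fires exactly once, on the fifth load
```

In a real implementation the state would live in a table rather than memory, but the decision point is the same: consolidate only when the parent’s full set of locations has been processed.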
Does a single data point need to be loaded to multiple intersections in the target application? Then logic groups are for you. Logic groups can be used to create new records in the source data that can then be used to load data to multiple locations or perform allocations. I tend to shy away from the latter as I believe those are better suited to be defined and executed in the target application.
One note of caution: logic groups can drag down performance during import or any time maps are reapplied (e.g., through Process Logic/Maps), so use them only when truly needed.
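The essence of a logic group is easy to see in a small sketch. This is not FDM’s logic-group syntax – the account numbers and the derived member name are made up – but it shows the core idea: a new record is derived from the imported source records so it can flow through mapping and load like any other row.

```python
# Illustrative sketch of the logic-group idea (not FDM syntax):
# derive a new source record from existing ones so it can be
# mapped and loaded alongside the original data.
source = [
    {"account": "1000", "amount": 250.0},   # petty cash
    {"account": "1010", "amount": 1750.0},  # operating cash
    {"account": "4000", "amount": 9000.0},  # revenue
]

def apply_logic_group(records, members, new_account):
    """Append a derived record summing the listed source accounts."""
    total = sum(r["amount"] for r in records if r["account"] in members)
    return records + [{"account": new_account, "amount": total}]

enriched = apply_logic_group(source, {"1000", "1010"}, "LG_TotalCash")
# The derived LG_TotalCash record (2000.0) now flows through mapping
# like any imported row.
```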
We’ve hit the Top 5. Any of these features could easily be reordered based on the implementation requirements. Data protection is one of the most valuable features when integrating with HFM.
Bear with me as I give a brief overview of HFM behavior. When loading to HFM in replace mode (from FDM or directly in HFM), HFM will clear the subcube for each point of view being loaded. The subcube, at a high level, is the intersection of the accounts, ICP & custom dimensions. So when I load a file that contains entity A, every account, ICP & custom dimension intersection is cleared for that entity, scenario, period & year. This is HFM behavior – pure & simple.
The problem arises when a process exists where a user must manually enter data in HFM. Input via a grid or webform will not impact the loaded data, but when data is reloaded from FDM, the manually entered data is cleared by default.
Enter data protection. Data protection will reach into HFM, extract records that it recognizes (based on dimensionality that you define) and append them to the data file to be loaded by FDM. While the subcube is still cleared, the data has been “added” to the FDM data file and is essentially “protected”.
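A toy model makes the interplay between replace-mode clearing and data protection concrete. Everything here is invented for illustration – the dimension names, the “protected” account, and the dictionary standing in for the HFM subcube – but the sequence matches the mechanism described above: extract the protected records first, append them to the load file, then clear and reload.

```python
# Toy model of HFM replace-mode clearing and FDM data protection
# (illustrative only; entity and account names are made up).
hfm = {  # (entity, account) -> amount
    ("A", "Sales"): 100.0,
    ("A", "ManualTopside"): 5.0,   # keyed directly into HFM by a user
    ("B", "Sales"): 40.0,
}

load_file = [("A", "Sales", 120.0)]

# Data protection: pull records FDM recognizes as protected (here, by
# account name) and append them to the load file before the load.
protected = [(e, a, v) for (e, a), v in hfm.items()
             if e == "A" and a == "ManualTopside"]
load_file += protected

# Replace mode: the whole entity-A subcube is cleared, then reloaded.
hfm = {k: v for k, v in hfm.items() if k[0] != "A"}
for entity, account, amount in load_file:
    hfm[(entity, account)] = amount
```

After the load, the manual topside entry survives even though the entire entity-A subcube was cleared – precisely because it rode along in the FDM data file.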
I have used this feature in at least 75% of my implementations. While it does have some weaknesses and a slight performance impact, the value it adds far outweighs those negatives.
Do you have multiple EPM applications in your Oracle/Hyperion environment? There’s no need to have multiple FDM apps!
Back in the “old” days of Upstream, only one adaptor could exist in an application. Multi-adaptor functionality was added right around the time that Hyperion acquired Upstream. In the 9.2-branded version of FDM, multiple adaptors could be imported & managed within a single application. This has allowed a single FDM application to be used to integrate with HFM & Essbase concurrently.
The value of this functionality cannot be overstated. As you enhance your implementation, concurrent loading to multiple systems – including leveraging/sharing mapping – can be readily achieved using batch processing. Further discussion of batch processing will follow.
Have you ever had to map to different entities based on the G/L account? What about mapping to a different account based on the balance which is somewhat common for intercompany activity?
FDM has the ability to map 1 dimension based on the source or target value of another dimension on the record. For example, an operating expense account may map to a different entity based on the cost center on the record.
This is extremely valuable and a very common need when defining FDM mapping. One note of caution: if conditional mapping is not properly designed, performance can degrade significantly.
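The logic of a conditional map is simple to sketch. Real FDM conditional maps are written as scripted map rules inside the mapping tables; the function, cost-center codes, and entity names below are invented purely to illustrate the decision: the entity target depends on another field on the same source record.

```python
# Illustrative sketch of conditional mapping (not FDM's scripted map
# syntax; cost-center codes and entity names are invented).
def map_entity(cost_center, default_entity):
    """Map the entity dimension based on the source cost center."""
    overrides = {
        "CC-900": "SharedServices",
        "CC-910": "SharedServices",
    }
    return overrides.get(cost_center, default_entity)

record = {"account": "60100", "cost_center": "CC-900"}
target = map_entity(record["cost_center"], "Operations")
# -> "SharedServices": the operating-expense record lands in a
#    different target entity because of its cost center.
```

The performance caution above follows directly from this shape: a scripted condition must be evaluated per record, so a poorly designed one is paid for on every row of the import.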
Do you have Oracle eBusiness Suite (eBS), PeopleSoft or SAP as general ledger systems? Are your source systems storing data in a relational database like Oracle or SQL Server?
If the answer to either of these questions is yes, then there’s great news for you: a flat file extract may not be required. The introduction of ERPi and the SAP source adaptor, as well as import integration scripts, provides a method to connect directly to those sources and retrieve data natively. ERPi & the SAP adaptor easily merit an entire post of their own (hint, hint – keep an eye out).
The benefit of this functionality is that the process can be further streamlined. The data integration is no longer dependent on when a source file is generated; FDM simply retrieves the data when told to do so. Additionally, the approach ensures that data is unedited by an end user – lending the process to better controls.
While there can be performance considerations, the benefits generally outweigh these.
At long last we’re at #1. Does this sound familiar: A company is headquartered in California. The European (let’s say London) user has loaded data through FDM and has gone home for the day. Corporate books an adjustment in the general ledger on behalf of the European user but does not have access to kick off the extract program to get the flat file out of the ledger. So an email is sent, and the European user reprocesses data in the morning when they are back in the office. And the cycle continues throughout the close. Ever worry about that European user taking a vacation?
FDM’s batch loader process – often in conjunction with the functionality highlighted at #2 – can drive these inefficiencies out of the data integration portion of the close process.
FDM batch loader is proven functionality that provides lights-out automation of the workflow (Import, Validate, Export, Check) steps. The batch loader even has the ability to map new unmapped members to a suspense member to ensure that the data set can be processed into the target application.
I leverage the batch loader process in at least 75% of the FDM implementations that I have led. The workflow performance is generally better with the batch loader because the web processes do not need to execute, and, like any automated process, a server can always process faster than a human behind the keyboard.
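The lights-out loop can be sketched at a high level. This is not the FDM batch loader itself: the file-naming convention below is invented for the sketch (real batch files encode the location, category, and period in the file name, with details that vary by configuration), and the mapping table is a stand-in. The point is the shape – each file is routed by its name, run through the workflow steps, and unmapped members fall to suspense instead of failing the load.

```python
# Illustrative lights-out loop (not the FDM batch loader itself).
# The "~"-delimited file-naming convention is invented for this sketch.
SUSPENSE = "SUSPENSE"
ENTITY_MAP = {"100": "UK", "200": "France"}

def run_workflow(filename, records):
    """Import -> Validate -> Export -> Check for one batch file."""
    location, period = filename.rsplit(".", 1)[0].split("~")[1:3]
    mapped, unmapped = [], 0
    for entity, amount in records:
        target = ENTITY_MAP.get(entity)
        if target is None:           # map new members to suspense so
            target = SUSPENSE        # the load can still complete
            unmapped += 1
        mapped.append((location, period, target, amount))
    return mapped, unmapped

data, misses = run_workflow("1~London~Jan-2012.txt",
                            [("100", 500.0), ("999", 25.0)])
# One record maps cleanly; the unknown member 999 lands in suspense.
```

In practice the loop would watch a drop directory, process every file it finds, and report the suspense count back to the administrator – which is exactly where the email status reporting mentioned below comes in.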
Generally, I enhance the process to include detailed email status reports that provide status monitoring and advanced error handling. But as with targeted consolidations as well as ERPi & the SAP source adaptor, that discussion will follow in another post.
Ladies & gentlemen, those are my FDM Top 11 out-of-the-box features that I believe add the most value and that I see used most often. Feel free to comment on this post about how you would rank each of these and any additional functionality you think should have made the list.