Streamline Your Close

I’ve talked about FDM’s automation capabilities before, but in this post I want to drill in a bit further. Rather than focusing on the technical details, I’m going to look at how FDM’s automation can help your organization streamline the close process, increase confidence in the financial results, and reduce the burden on your staff.

As I discussed in a previous post, FDM includes an out-of-the-box automation component called the Batch Loader. This component allows the FDM workflow steps (Import, Validate, Export & Check) to be executed without end-user interaction. The batch loader requires files to be named according to a specific convention and saved to the OpenBatch folder of the FDM application NTFS structure. When the batch loader is executed, files found in this directory are processed through FDM and loaded to the target EPM application, such as HFM or Essbase. Any location that integrates directly with a relational data source, using import integration scripts or the ERPi source system adapter, can also be triggered to process.
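To make the naming convention concrete, here is a small Python sketch that parses a batch file name. The `FileID~Location~Category~Period~RR` pattern and the sample values are assumptions for illustration only; check the FDM administrator's guide for your release for the exact convention your version expects.

```python
from pathlib import Path

# Illustrative only: parses a batch file name of the assumed form
# FileID~Location~Category~Period~RR.txt, where the final segment
# is assumed to encode the load method (e.g. R=replace, A=append).
# Verify the exact convention in the FDM admin guide for your release.
def parse_batch_filename(name: str) -> dict:
    stem = Path(name).stem
    parts = stem.split("~")
    if len(parts) != 5:
        raise ValueError(f"Unexpected batch file name: {name}")
    file_id, location, category, period, method = parts
    return {
        "file_id": file_id,      # free-form ID, controls processing order
        "location": location,    # FDM location (POV)
        "category": category,    # FDM category (POV)
        "period": period,        # FDM period (POV)
        "load_method": method,   # assumed load-method code, e.g. "RR"
    }

print(parse_batch_filename("10~Texas~Actual~Jan-2013~RR.txt"))
```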

Great, so we know about the technical functionality, but let’s talk about real-world implementations. Here are two scenarios that I have seen over the course of my time as a consultant.

Scenario 1:

The FDM application is using direct integration to a source system like Oracle eBusiness Suite (eBS).  Throughout the close cycle data needs to be loaded every hour on the hour.

Each of the business units (or an administrator) needs to perform the FDM workflow. Someone is literally clicking Import, then Validate, then Export. On average this process takes anywhere from 3 to 15 minutes per business unit. Let’s go with the low end and say it is 3 minutes. Since the load runs every hour, that’s 5% of a resource’s day. At first blush you might think that’s nothing. Let’s do a little math and find out.

Let’s assume that the resource loading data makes $100K per year and that 5 days of the month are dedicated to loading data during the close. There are typically 19-23 business days per month; for the sake of this analysis, let’s go with 20. So 25% of the month is spent on the close, and of that 25%, 5% of the time is spent executing the FDM data load process. A little quick math ($100,000 × 25% × 5%) tells us that annually, the cost of manually loading data is $1,250 – for each resource performing the load.
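The arithmetic above can be checked in a few lines; the salary and day counts are the assumed figures from this post, not measured values.

```python
salary = 100_000           # assumed annual cost of the resource
close_days = 5             # close days per month
business_days = 20         # assumed business days per month
load_minutes_per_hour = 3  # low-end estimate per hourly load

close_fraction = close_days / business_days  # 0.25 of the month
load_fraction = load_minutes_per_hour / 60   # 0.05 of each close day
annual_cost = salary * close_fraction * load_fraction

print(f"${annual_cost:,.0f} per resource per year")  # $1,250 per resource per year
```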

Cost aside, the requirement that someone execute the data load manually every hour means a person needs to divert their attention from their current task to perform it. This is distracting and also introduces variability in when the data load actually happens. For example, if a person is working on an important analysis, do you want them to break from that activity to perform the data load? Or consider a person with a 2-hour meeting. Are they late to the meeting because they were running the data load? Do they leave the meeting at the halfway point to go execute it? Clearly I am using exaggerated examples, but hopefully my point resonates.

Scenario 2:

Let’s consider a distributed FDM deployment. Some business units use direct integration while others load flat files. For certain flat-file locations, the source system generates the file automatically – either on a schedule or when a user executes an extract process in the source system. The file is dropped to a shared network directory, and an end user then needs to log in to FDM and process the file.

This is the situation where I find the most inefficiency. The lag between when a file is produced and when it is actually processed through FDM can be minutes, hours, or even days in some instances (e.g., vacation, out of office). The corporate consolidation of data ends up held hostage by one or two business units that have not yet loaded their data. While FDM includes process monitor reports to identify that a particular business unit has not yet loaded, the process design itself does not mitigate this risk.

Great Tony, my process needs help. What’s your answer? If it’s not obvious where this is going, I’ll spell it out: the FDM batch loader can streamline both of these processes. In scenario 1, the data load can be scheduled to run every hour using the FDM Task Manager or an external scheduling application like Control-M. Admittedly, scenario 2 is a bit tougher to address, since there will be business units for whom the data file generation is coupled to an individual user. However, even in this instance, if the user simply saves the file to a standard location (it does not have to be the OpenBatch directory!), a process can be designed to sweep for files as frequently as every 60 seconds. In both scenarios, the FDM batch loader reduces the amount of work an end user needs to perform.
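As a sketch of what that sweep might look like, the following Python moves files from a shared drop directory into the OpenBatch folder on a fixed interval. The paths and file pattern are placeholders, not real FDM defaults; in practice this could equally be a scheduled task, a PowerShell script, or logic inside the batch script itself.

```python
import shutil
import time
from pathlib import Path

# Placeholder paths -- adjust to your environment and your FDM
# application's actual NTFS structure.
DROP_DIR = Path(r"\\share\fdm_drop")
OPEN_BATCH = Path(r"D:\FDMData\AppName\Inbox\Batches\OpenBatch")

def sweep_once(drop_dir: Path, open_batch: Path) -> list:
    """Move any waiting files from the drop directory into OpenBatch."""
    moved = []
    for f in sorted(drop_dir.glob("*.txt")):
        shutil.move(str(f), str(open_batch / f.name))
        moved.append(f.name)
    return moved

def sweep_forever(interval_seconds: int = 60) -> None:
    """Poll the drop directory on a fixed interval."""
    while True:
        for name in sweep_once(DROP_DIR, OPEN_BATCH):
            print(f"Queued {name} for the batch loader")
        time.sleep(interval_seconds)
```

Once a file lands in OpenBatch, the scheduled batch loader run picks it up on its next pass, so the end user's only remaining task is saving the file.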

Still have doubts? One argument I sometimes hear against automation is that the organization wants the business units to own the financial results. The fear is that if the data load is automated, the business units will not care what the numbers in the system are. To that I offer several rebuttals.

First and foremost, any errors encountered in the automated process (missing maps, invalid intersections, data load errors) can be trapped and reported to the appropriate user. Second, the FDM check report can be used to test data quality. With FDM’s automation, the check report can be output to PDF and emailed to the business unit controller, accountant, or whomever is deemed responsible for the financial results – they don’t even need to be an FDM user to get the status email. Third, HFM’s validation accounts and process management can prevent data from being promoted when certain defined metrics are not met. Finally, the FDM automation process can be configured to generate Financial Reporting (FR) reports and/or books and email those to the appropriate users. All of these mechanisms provide an active feedback loop that the data “owners” can use to ensure their business unit’s data is accurate.
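To illustrate the notification step, here is a sketch that packages a check-report PDF into an email. The addresses, server name, and function names are hypothetical; an actual FDM implementation would typically drive this from its event scripts, but the mechanics are the same.

```python
import smtplib
from email.message import EmailMessage
from pathlib import Path

def build_check_report_email(pdf_path: str, recipient: str) -> EmailMessage:
    """Package a check-report PDF as an email message."""
    pdf = Path(pdf_path)
    msg = EmailMessage()
    msg["Subject"] = f"FDM Check Report: {pdf.stem}"
    msg["From"] = "fdm-noreply@example.com"  # placeholder sender
    msg["To"] = recipient
    msg.set_content(
        "Attached is the FDM check report for the latest data load."
    )
    msg.add_attachment(pdf.read_bytes(), maintype="application",
                       subtype="pdf", filename=pdf.name)
    return msg

def send_check_report(pdf_path: str, recipient: str, smtp_host: str) -> None:
    """Relay the report through a (placeholder) SMTP host."""
    with smtplib.SMTP(smtp_host) as smtp:
        smtp.send_message(build_check_report_email(pdf_path, recipient))
```

Building the message separately from sending it keeps the formatting testable without a mail server, and the recipient only needs an inbox, not an FDM login.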

With automation, the onus of the data load is transferred from an individual end user to a computer. Computers don’t need bathroom or lunch breaks, vacations, or sleep (OK, maybe every once in a while they take a nap). The process is standardized and repeatable, and the removal of variability improves data quality and streamlines the process. And, likely most important for any FDM user currently loading data manually, the batch loader reduces the amount of tedious work that needs to be done on a daily and monthly basis. I hope you will consider this technology and reap the benefits of this out-of-the-box component.
