Solution Highlight: Creating HFM Journals From Planning

In today’s post, I wanted to take some time to talk about a recent solution I implemented for a customer.  The customer implemented Planning, HFM and FDM several years ago.  Very recently, a business realignment was announced.  Several product lines were being reorganized and would now be reported under new groups.  Sounds easy enough for the Oracle Hyperion suite: just move the base members under a new parent, right?  Well, that would make for one pretty short blog post.  Fortunately for the purposes of this post, it’s not that easy.

At this customer, the Planning product dimension is more granular than the product dimension in HFM.  While we could simply move the level 0 members in Planning, the base members in HFM represent a combination of the products to be moved and those that will continue to be reported under their legacy reporting parent.  Given this, we needed to find a way to move a portion of the data from one base member to another base member.  So what are our options?  There are a number of ways to address this, but the options we entertained were:

  • Input data directly into HFM using an HFM web form
  • Load data to HFM using SmartView
  • Create journal entries in HFM

Well, if you were paying attention to the title of this post, you already know that the customer elected for option 3.  Before we jump into the details of the solution, let’s quickly address why this was the right option for the customer.  HFM journal entries are easily identifiable and thereby auditable.  While this customer has a data type dimension that allows them to identify the “source” of the data, journal entry reports allow the customer to quickly produce a supporting detail report that can be provided to internal and external auditors.  Additionally, internal HFM customers do not need to comb through data audit logs to identify the person that “booked” the adjustment, as they would need to in the case of a SmartView or Data Form entry.

Ok, let’s talk about the solution.  The customer was planning to manually create the journal entries in HFM.  After I wiped away a tear, I took a few minutes to share some knowledge about FDM’s ability (insert shameless prior post plug here) to create HFM journal entries.  The customer is pretty sharp, and before I could even finish sharing my proposed solution, he said something along the lines of: “Great, we’ll create a SmartView template, refresh from Essbase into the FDM Excel template and load through FDM to create the journals.”  I’m impressed, dear customer, but I see your SmartView and I raise you a flat file.

So you might be thinking: FDM can only create journals from Excel, Tony’s bluffing, and if this customer calls him, he’s going to lose his stack.  Well, I’m not a gambler.  When I am sharing knowledge about FDM, it’s usually based on my previous experience or my tinkering with the application.  So I told the customer that his idea would work, but I could take it a step further; FDM can generate an Essbase extract, process the extract through FDM and load the data as a journal to HFM.  Well, as I said before, this customer is sharp (gratuitous compliment here in case the customer reads this post) and he said: “Let’s do it, that will save us a ton of time!”  So off I go.

My first task is to generate an extract from Planning.  Thankfully, I am surrounded by an all-star Planning and Essbase team.  The Planning resource threw together a DATAEXPORT calc script in short order.  So now I move on to setting up FDM to execute the calc script and pick up the resulting data file.  Having FDM run the calc turns out to be the easy part; the challenge is that Essbase is running on a Linux box.  You might be thinking, that’s not tough, Tony, just FTP the file after the extract completes.  And you would be right, it’s not that hard, but for a guy that never worked in DOS and lives mainly in the Windows world (FDM & HFM), I had a little learning on the fly to do.  And one of the things I learned is that FTP against a Linux box is CASE SenSitIVe.  Well, after a few choice words and some Googling, I got past that hurdle.  So FDM is kicking off the data extract and FTPing the resulting file to the FDM box.
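
I won’t paste the actual scripts here, but a minimal sketch of the extract-and-retrieve step gives you the flavor.  Everything below is hypothetical (server names, credentials, the PlanApp/Plan1/ProdMove calc script and the file paths), and it assumes the MaxL client (essmsh) is installed on the FDM server and that the Windows command-line FTP client handles the transfer.

    ' Sketch: launch the Planning/Essbase DATAEXPORT calc and pull the
    ' extract to the FDM server.  All names, paths and credentials below
    ' are hypothetical.
    Option Explicit

    Dim objShell, objFSO, objFile

    Set objShell = CreateObject("WScript.Shell")
    Set objFSO   = CreateObject("Scripting.FileSystemObject")

    ' 1) Run the DATAEXPORT calc via a MaxL script.  RunExport.msh would
    '    contain something along the lines of:
    '      login admin password on essbaseserver;
    '      execute calculation 'PlanApp'.'Plan1'.'ProdMove';
    '      logout;
    objShell.Run "essmsh C:\FDM\Scripts\RunExport.msh", 1, True   ' True = wait for completion

    ' 2) Build an FTP command file.  Remember: paths on the Linux Essbase
    '    box are CASE sensitive.
    Set objFile = objFSO.CreateTextFile("C:\FDM\Scripts\GetExtract.ftp", True)
    objFile.WriteLine "open essbaseserver"
    objFile.WriteLine "user ftpuser ftppassword"
    objFile.WriteLine "cd /Essbase/app/PlanApp/Plan1"
    objFile.WriteLine "lcd C:\FDM\Inbox"
    objFile.WriteLine "get ProdMove.txt"
    objFile.WriteLine "bye"
    objFile.Close

    ' 3) Pull the extract down with the Windows command-line FTP client
    '    (-n suppresses auto-login so the user command above is honored).
    objShell.Run "%comspec% /c ftp -n -s:C:\FDM\Scripts\GetExtract.ftp", 1, True

The case sensitivity lesson lives in the cd and get commands; one wrong capital letter in that remote path and the transfer quietly fails.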

Now the easy part…or so one would think.  I created a simple import format to read the Essbase data extract file.  I created my maps to transform the Essbase dimensionality to HFM dimensionality.  So my data is ready to load to HFM, right?  Not yet.  The data that came from Planning represents the “new” product home.  There is no offsetting entry in Planning, so the FDM-generated journal would be unbalanced in HFM.  Sure, unbalanced journals are sometimes fine in HFM, but in this instance we needed to generate the offsetting entry.  Well, we are in luck because FDM has logic groups, which can be used to duplicate data records.  So I created a complex logic group that creates a “new” product data line representing the offsetting entry.  A small mapping exercise to map the new data records and now I have a balanced journal.  Next, I had to update a few integration and location settings to enable the load of journals from a flat file.  Finally, a quick script to make sure that the journal for the period is posted as a single journal entry with a consistent name.  This allowed the client to quickly identify the FDM-generated journals.  As an additional benefit, the client can unpost a single journal quickly in the event of a modification to the data in the Planning application – not that a set of restatements would ever be modified.
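
I won’t reproduce the customer’s logic group here, but the core of the offsetting entry is easy to illustrate.  The snippet below is not FDM’s complex logic group API (logic items are configured in the Workbench); it is just a sketch of the transformation, with a hypothetical record layout and an “_OFFSET” tag of my own choosing for the mapping tables to key on.

    ' Illustration only: duplicate a source record, flip the sign, and tag
    ' the product so the maps can route it to the legacy base member.
    Function CreateOffsetRecord(strRecord)
        Dim arrFields, dblAmount

        ' Assume a comma-delimited extract record: Product,Account,Entity,...,Amount
        arrFields = Split(strRecord, ",")

        ' Negate the amount so the duplicate record offsets the original.
        dblAmount = CDbl(arrFields(UBound(arrFields))) * -1
        arrFields(UBound(arrFields)) = CStr(dblAmount)

        ' Tag the product; the dimension maps send "_OFFSET" records to the
        ' base member under the legacy reporting parent.
        arrFields(0) = arrFields(0) & "_OFFSET"

        CreateOffsetRecord = Join(arrFields, ",")
    End Function

The tag simply gives the product maps something distinct to recognize; once both the original record and its offset are mapped and exported, the journal balances.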

Great, FDM is generating the data file and producing a journal in HFM.  Everything is complete, right?  Not so fast.  The next step is putting a nice wrapper around the process.  When I design a solution for a client, I like to minimize the number of steps they need to complete in order to leverage the solution.  So while I could create a task flow that allows them to generate & FTP the Essbase extract, this would mean that they would need to kick off this process, wait for it to complete and then manually execute the workflow steps.  Sure, this would work, but the client has to restate 24 months of history, and two distinct processes simply to generate HFM journals is not an overly efficient process.  So back to scripting I go.  I created a simple event script that automatically kicks off the Essbase extract and FTP process, takes the resulting file and imports it into FDM.  All of these steps happen in the background when the user imports a simple dummy trigger file at the FDM Import Workflow step.
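
The event script itself is short.  Here is a rough sketch of what the body of a before-import event script (BefImport in FDM Classic) could look like, reusing the same hypothetical names and paths as the earlier sketch; the trigger file convention is an assumption as well, not the customer’s actual configuration.

    ' Sketch of the before-import wrapper: run the extract, fetch it, and
    ' swap it in place of the dummy trigger file the user selected.
    Option Explicit

    Dim objShell, objFSO

    Set objShell = CreateObject("WScript.Shell")
    Set objFSO   = CreateObject("Scripting.FileSystemObject")

    ' 1) Kick off the DATAEXPORT calc and FTP the extract (see the earlier sketch).
    objShell.Run "essmsh C:\FDM\Scripts\RunExport.msh", 1, True
    objShell.Run "%comspec% /c ftp -n -s:C:\FDM\Scripts\GetExtract.ftp", 1, True

    ' 2) Overwrite the dummy trigger file with the real Planning/Essbase
    '    extract so the import format parses actual data.
    objFSO.CopyFile "C:\FDM\Inbox\ProdMove.txt", "C:\FDM\Inbox\Trigger.txt", True

From the user’s perspective nothing changes: they still select a file and click Import; the heavy lifting happens behind that one click.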

So let’s recap.  A user imports a dummy trigger file at the Import Workflow step.  In the background, FDM launches an Essbase DATAEXPORT calc and then FTPs the resulting data to the FDM server.  This export file overwrites the dummy trigger file, and the resulting data set is imported to FDM.  The import screen is populated with the Planning/Essbase data extract.  The remaining workflow (Validate & Export) continues and a single journal is posted in HFM that contains all of the restatement adjustments.

Can you guess what the final bow (think Lexus’s December to Remember giant red bow on top of a car) on the solution is?  Well, if you’re a regular visitor to my blog, you likely correctly guessed batch automation.  That’s right.  I provided a table for the customer to specify the time periods that should be moved from Planning/Essbase to HFM.  Once the list of periods is specified, they kick off a single Task Flow.  The end-to-end process is executed and a status email is sent indicating that the process is complete.
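
The Task Flow wiring and the status email are beyond the scope of this post, but the period-driven loop at the heart of it is simple.  The sketch below assumes the period table is just a text file with one period label per line and that the MaxL script accepts the period as a positional parameter; both of those are illustrative assumptions rather than the actual configuration.

    ' Sketch: loop over a simple period table and run the extract per period.
    Option Explicit

    Dim objShell, objFSO, objFile, strPeriod

    Set objShell = CreateObject("WScript.Shell")
    Set objFSO   = CreateObject("Scripting.FileSystemObject")
    Set objFile  = objFSO.OpenTextFile("C:\FDM\Scripts\Periods.txt", 1)   ' 1 = ForReading

    Do While Not objFile.AtEndOfStream
        strPeriod = Trim(objFile.ReadLine)
        If Len(strPeriod) > 0 Then
            ' Pass the period to the MaxL script as a positional parameter
            ' ($1) so the DATAEXPORT calc extracts only that period.
            objShell.Run "essmsh C:\FDM\Scripts\RunExport.msh " & strPeriod, 1, True
            ' ...the FTP, import, validate and export steps for the period
            ' would follow here, driven by the FDM workflow/batch routines.
        End If
    Loop

    objFile.Close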

So there you have it: a process that could have consumed weeks of the customer’s time runs in a matter of a few minutes per period and under an hour for an entire year’s worth of data.  As a customer, would you think that the cost of a consultant was worth the time savings?  If you’re on the fence, please bear in mind that this process is reusable for future restatements and that all of the email status reporting associated with the lights-out process is completely reusable by any FDM automation process.


I hope this solution highlight helps you understand just one of the many powerful things that you can do with your FDM application.  FDM (and FDMEE) continues to be a pivotal application within the Oracle Hyperion EPM stack.  Extending what the application does is an excellent way to relieve the burden on your staff, increase confidence in your financial data and improve the return on your technology and consulting investment.  As always, questions and comments are welcome.


One Response to Solution Highlight: Creating HFM Journals From Planning

  1. Curro says:

    Hi,

    nice post.

    I did not test this with Essbase on Linux (it works successfully on Windows), but you could also export data from Essbase using a Report Script and run it with the Essbase Adapter API.

    The mListDataQueryFile function receives the extract path as its third parameter.  You can use your Inbox folder as the path.  If it works with Linux, then FTP would not be needed.

    DATAEXPORT may perform better if a large volume of data is to be extracted.

    Regards
