In my years of implementing FDM I have noticed that many customers aren’t overly interested in FDM at the beginning of the sales cycle or even the project. FDM is not sexy. While it’s great to hear that you can load data easily, it’s not all that exciting. People are intrigued by colorful graphs, trended P&Ls and automated Cash Flow statements, not little orange fish. As a result, I think the criticality of FDM is sometimes overlooked.
But data and data integration are absolutely paramount to the success of your project and the long-term usage of your EPM system. And that brings me to the point of this post: what makes someone great at FDM? Anyone with a little bit of experience can build a location, import format or validation rule. Sure, these are important parts of an application, but the critical elements (IMHO) are mapping and scripting.
Accurate and efficient mapping is critical to data quality as well as performance. My 3 part series on mapping is an excellent resource and a great primer for anyone interested in learning or honing their skills. In this post I am going to focus on FDM scripting.
Before we delve in too deeply, let’s talk about what FDM scripting is and why it is important. FDM scripts are used to extend the capabilities of the application. They use VBScript as their language but heavily leverage the FDM API as well as SQL. While FDM scripts use the VBScript language, they are not .VBS files; they cannot be executed natively outside of the FDM application, the Task Manager or UPSShell.
There are 3 types of scripts:
| Type | Description |
| --- | --- |
| Event | Executes during defined application events, such as logging into the application (Startup) or after loading a file to the target system (AftLoad) |
| Import | Executes when a file is imported into FDM and the import script is assigned to the import format |
| Custom | All other scripts that extend the application’s functionality, for example batch loading; custom scripts can be executed on an ad-hoc basis or scheduled |
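To make the first type concrete, here is a minimal, hypothetical sketch of an event script. The subroutine name matches the event it handles (AftLoad here); a real script would typically use the FDM API objects for context, but to keep the sketch self-contained it only uses plain VBScript file I/O, and the log path is an assumption:

```vbscript
Sub AftLoad()
    ' Hypothetical example: append a timestamped entry to a log file
    ' after each load to the target system
    Const ForAppending = 8
    Dim objFSO, objLog
    Set objFSO = CreateObject("Scripting.FileSystemObject")
    Set objLog = objFSO.OpenTextFile("\\Server\Share\Logs\FDMEvents.log", ForAppending, True)
    objLog.WriteLine Now & " - AftLoad event fired"
    objLog.Close
End Sub
```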
So why is FDM scripting important? I think of FDM scripting as being akin to HFM rules or Essbase calc scripts. Yes, HFM can consolidate financial data without rules, but the balance sheet won’t have current retained earnings without rules and would obviously be incorrect. The same applies to FDM: yes, FDM can load data to HFM or Essbase, but scripting is what enables you to make the process lights-out or to ensure that data is cleared from Essbase properly. While scripting is technically not required, it’s highly likely that your application will use it at some level.
So we’ve laid some groundwork. Let’s talk about the good, the bad and the downright ugly of FDM scripting. Here are some basic principles to which I try to adhere with my scripting/coding.
- K.I.S.S. – No that’s not a reference to the awesome rock band. Keep It Simple Stupid.
- Be efficient & dynamic
- Employ basic coding “best practices”
Principle 1: K.I.S.S.
This principle is first and foremost. If your script is 1200+ lines, you likely have room for improvement. While there is no magic number for how many code lines a script should be, if you can’t summarize what the script is doing in 3 or 4 high level bullet points, it’s likely doing too much.
One guiding principle that I have learned over the years is to use variable names that have meaning. A variable called f1 likely means nothing, but something like strSrcFile tells the person reading the script that the variable holds a string and that it is the source file. This approach lends itself to principle 3 because in some ways the code becomes self-documenting.
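The difference is easy to see side by side. A small sketch (the file path here is hypothetical):

```vbscript
' Unclear: what does f1 hold, and why?
f1 = "\\Server\FDMData\Inbox\GL_Actuals.csv"

' Clear: the str prefix signals a string, and the name says what it holds
Dim strSrcFile
strSrcFile = "\\Server\FDMData\Inbox\GL_Actuals.csv"
```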
Continuing with the KISS principle, don’t recreate out-of-the-box application functionality. For example, I saw a script where the writer was making a copy of the file that was to be processed by the batch loader, literally moving the file into a new folder based on the date and time the batch ran. If the author had been more familiar with how the application worked, he/she would have known that the batch loader automatically archives data files into a new directory. That entire section of code was redundant. Worse, it increased the amount of file storage that the application used, since each batch-processed file was saved twice. Learn what the application can do before determining whether a script is really needed.
Principle 2: Be Efficient & Dynamic
If you find yourself copying and pasting code within your script and making small changes, then you might want to look at creating a loop or an embedded subroutine or function. Repetitive code makes the script larger and more maintenance-intensive. In the below screenshot, the very same block of code was executed multiple times: nearly 25 lines that did almost exactly the same thing were repeated 3-4 times, resulting in roughly 75 lines of unnecessary code. Replacing this repetitive code with a simple loop would have produced a script with 6% fewer lines. And that was just one change.
Additionally, should the logic change for this block of code it will require updating in 3-4 blocks as opposed to a single block. The drawback seems fairly obvious.
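As a sketch of the fix, the repeated block moves into a subroutine and a loop feeds it. The file names and the ProcessFile routine are hypothetical stand-ins for whatever the repeated logic actually did:

```vbscript
Dim arrFiles, i
arrFiles = Array("FileA.csv", "FileB.csv", "FileC.csv")

' One loop replaces three or four pasted copies of the same block
For i = LBound(arrFiles) To UBound(arrFiles)
    ProcessFile arrFiles(i)
Next

Sub ProcessFile(strFile)
    ' The shared logic lives in exactly one place, so a change
    ' here updates every "copy" at once
    WScript.Echo "Processing " & strFile
End Sub
```

(WScript.Echo works when run under the Windows Script Host; inside FDM you would log through the application instead.)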
On the topic of efficiency, this can also refer to code execution (time to process) efficiency. While this is a very important topic, tuning is part science and part art (bonus points if you know when I have said that before). I would love to delve more into this topic, but a blog post simply won’t do it justice.
The idea of writing dynamic code is a guiding principle in my development. I never want a client to have to update a path in a script or create a folder by hand. Requiring this manual intervention means that the script has room for improvement. In the case below, a simple file system command could have created the directories that the author documented as needing to be created before the script could be executed.
To illustrate my point, here is a truncated snippet of code to manage the folder creation:

```vbscript
If FSO.FolderExists(strFolder) = False Then
    FSO.CreateFolder strFolder
End If
```

Finally, use UNC paths in scripts. Pointing to the D: or E: drive is a recipe for disaster. By using the Universal Naming Convention, you ensure that the script is location independent and therefore portable across environments.
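To show what that portability looks like in practice, a small sketch (the server and share names are hypothetical):

```vbscript
' Fragile: breaks if the application lives on a different drive
' in the next environment
strDataDir = "E:\FDMData\Outbox"

' Portable: a UNC path resolves the same way from any server
strDataDir = "\\FDMServer\FDMData\Outbox"
```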
Principle 3: “Best Practices”
One of the things I loved in Peter Fugere’s HFM book was his discussion of best practices. The term “best practice” is often used as a sledge-hammer to silence critics or end a conversation when one does not want to address the question more fully. As such, I try to avoid that hammer but this is a case where I think there will be few arguments.
I consider the above principles my best practices, but there are also basic coding techniques which are widely considered best practice:
- Document your code – to assist future users of the code
- Declare your variables at the beginning of the script – this prevents duplicate declaration which would lead to script execution failure
- Indent your code – this is critical for readability, especially in large scripts. This is one of my biggest pet peeves, as it becomes very difficult to read a script that includes a block of code 50-100 lines long with multiple nested conditional blocks (If…Then, For…Next loops, etc.)
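On the second point, VBScript will enforce up-front declaration for you if you add Option Explicit at the top of the script. A quick sketch (the variable names are hypothetical):

```vbscript
Option Explicit   ' every variable must be declared before use

Dim strSrcFile, intRowCount   ' declared once, at the top of the script
strSrcFile = "Actuals.csv"
intRowCount = 0

' A second "Dim strSrcFile" in this scope would fail with "Name redefined"
```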
This is an example of what to avoid as it pertains to point 3:
And the same code with proper indentation
A few simple indents and the logic of this section of the script is much more understandable.
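In the same spirit as the screenshots, here is a small hypothetical sketch of the difference:

```vbscript
' Hard to follow: every line starts at the same column
If objFSO.FolderExists(strFolder) Then
For Each objFile In objFSO.GetFolder(strFolder).Files
If LCase(objFSO.GetExtensionName(objFile.Name)) = "csv" Then
intCount = intCount + 1
End If
Next
End If

' Same logic, indented: the nesting is obvious at a glance
If objFSO.FolderExists(strFolder) Then
    For Each objFile In objFSO.GetFolder(strFolder).Files
        If LCase(objFSO.GetExtensionName(objFile.Name)) = "csv" Then
            intCount = intCount + 1
        End If
    Next
End If
```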
You may have noticed that this post did not spend any time on the process that happens before you sit down to actually write a script, or on the subsequent steps after development is complete. Requirements gathering, design, testing and roll-out are all extremely critical elements of the overall application, FDM scripts included. The intent of this post was to discuss the activities of the development cycle, but please do not underestimate the importance of the other phases of the project life cycle. It doesn’t matter how simple and efficient a script is; if it doesn’t address the needs of the end user, it’s not a good script.
Throughout the time I have maintained this blog, I’d classify my visitors into a few categories (in no particular order):
- Oracle customers that are in the sales cycle and are researching what exactly FDM is and how it can be used in their organization
- Oracle customers (new and existing) that are searching for an implementation partner
- Oracle customers that are in the middle of an implementation, upgrade or are simply looking for ways to more fully utilize the software
- Consultants that are researching an issue or looking to expand their skill set
Whichever bucket you may fall into, I hope this post resonates on some level for you and helps you think about ways to evaluate your current scripts and develop a plan for improving them. As always, if you have questions or comments, please feel free to post. One disclaimer, I will not answer questions that relate to a specific script in your application. It would be a disservice to you for me to comment on your script without having the full context of the intended usage as well as the full body of the script.