
HFM, Planning and Essbase with a side of DRM:

A Case Study of EPM Implementation

Angie Caruthers

Manager, Hyperion Systems

McAfee, a wholly owned subsidiary of Intel Corporation (NASDAQ:INTC), is the world's largest dedicated security technology company. Backed by global threat intelligence, our solutions empower home users and organizations by enabling them to safely connect to and use the Internet, prove compliance, protect data, prevent disruptions, identify vulnerabilities, and monitor and improve their security. McAfee is relentlessly focused on constantly finding new ways to keep our customers safe.

About McAfee

● Founded in 1987; the world's largest dedicated security company

● Global research for real-time threat intelligence

● Integrated solutions and services

● Compliance processes built into solutions

● Single management platform for optimized security

Topics

● Overview
  ● Where We Were
  ● Current State
  ● Future Challenges

● Implementation Challenges
  ● HFM
  ● DRM
  ● Planning

● Q&A

Where We Were


● Implemented Hyperion Planning and Essbase in 2008

● Transactional financial data was captured via the SAP 4.6c ERP system

● Financials were consolidated using SAP ECCS

● Accounting prepared GAAP financials via intensive Excel spreadsheets; these took at least half a day to produce and tie out

● Actuals were ported from SAP 4.6c to SAP BW to Hyperion Essbase for managerial reporting and forecasting; porting data between systems took almost 8 hours

● Disparate sources of reporting resulted in time-consuming reconciliation between Accounting's manual Excel spreadsheets and Hyperion Essbase

● Hyperion metadata was managed via SQL and CSV files

Current State

● We have implemented:

  ● Hyperion Planning – used for annual budget and monthly forecasting

  ● Hyperion Essbase – used for reporting employee and SKU level detail

  ● Hyperion HFM – used for consolidations of GAAP and Managerial financials; automated external allocations

  ● Hyperion DRM – used to manage metadata within Hyperion systems (HFM and Planning)

  ● Hyperion FDM – used to transfer financial data from SAP extracts to HFM

  ● Star Analytics – used to control and automate batch processes

● Actuals are fed directly from the SAP SPL (Special Purpose Ledger) to Hyperion HFM; the total process, including consolidations, takes about 20-25 minutes

● GAAP and Managerial Non-GAAP financials are produced out of Hyperion HFM and ported to Planning for full-year forecasting

● Planning applications have been rebuilt to mirror HFM dimensionality

● Metadata for the HFM and Planning applications is maintained in DRM

● Metadata request/approval processes remain manual

● Metadata and data load processes are automated via Star Command Center


Future Challenges

● Modifications to align Planning dimensionality with HFM have created some challenges in aggregation timing

● Planners have requested the ability to input forecast data in either local or US$ currency while maintaining automated currency conversions

● We need to implement a workflow solution and define more effective request/routing/approval processes to take advantage of DRM's full metadata management capabilities

● Defining and agreeing upon SOX requirements and testing methodologies for DRM

HFM Implementation Challenges


● Special Purpose Ledger – getting data out of SAP

● Local Currency / Group Currency issues

● Managerial Rules

● Automation

● Time Stamp

● SOX Challenges

Special Purpose Ledger


Requirement:

● The ability to load GL Account balances by cost center from SAP into HFM.

Issue:

● We were on an old version of SAP (SAP 4.6c).

● Both of the available FDM adapters were compatible with SAP 4.6c, but only from a GL Account perspective; they were not configured for cost center level information.

Resolution:

● Created a special purpose ledger (SPL) within SAP to combine GL Account and cost center level data.

● As transactions are posted to SAP, they are also posted to the special purpose ledger.

● Created a program that produces an extract file from the SPL, which is then loaded into FDM (see the format sketch below).
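The extract program itself lives inside SAP, so it would not look like this in practice; purely as an illustration of the file shape FDM then maps – one row per company/account/cost-center balance – here is a minimal Python sketch. All field names, values, and the filename are hypothetical.

```python
import csv

# Hypothetical SPL balances: each row keyed by company code, GL account,
# AND cost center -- the combination the standard FDM adapters could not pull.
spl_rows = [
    {"company": "US01", "account": "600100", "cost_center": "CC1234",
     "currency": "USD", "amount": 125000.00},
    {"company": "DE01", "account": "600100", "cost_center": "CC5678",
     "currency": "EUR", "amount": 83000.00},
]

# Write a flat, delimited extract in a shape an FDM import format can map.
with open("spl_extract.csv", "w", newline="") as f:
    writer = csv.DictWriter(
        f, fieldnames=["company", "account", "cost_center", "currency", "amount"])
    writer.writeheader()
    writer.writerows(spl_rows)
```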

Local / Group Currency Issues

Requirement:

● We needed the capability to preserve historical Group Currency (GC) values for certain GL Accounts (intercompany and investment assets), while maintaining the ability to tie back to SAP and to quickly load override data to HFM.

Issue:

● The initial approach was to load only LC values from SAP, but this did not satisfy the requirement to tie to SAP. A secondary approach, using FDM to implement a 'find and replace' function for the affected GL accounts, took an excessive amount of time.

Resolution:

● Created one extract file from SAP that contained both LC and GC currencies.

● Because the FDM automation process via Star Command Center required unique filenames for each FDM load step, we created a script that made a copy of the SAP extract file, renamed it, and saved it to be used to load the GC data (see the sketch after this list).

● Created a secondary script in FDM that utilized an alternate "USD override" account hierarchy to identify the GL Accounts that needed the historical GC values, and to load only those accounts with the GC value.

● Load the original SAP extract file for the full LC load, and load the second file using the secondary FDM script to load only the GC values for the USD override accounts.

● Utilize an HFM consolidation rule to copy the GC values from the USD Override accounts into the main account hierarchy and then perform currency translation.
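Since Star Command Center required a unique filename per FDM load step, the GC load file is simply a renamed copy of the LC extract. A minimal Python sketch of that copy/rename step, with hypothetical filenames (the production step may equally be a shell or PowerShell script):

```python
import shutil
from datetime import datetime

# Original SAP extract, used for the full LC load (filename hypothetical).
lc_file = "SAP_EXTRACT_LC_GC.dat"

# Copy and rename so the GC override load step has its own unique filename,
# as the Star Command Center / FDM automation required.
stamp = datetime.now().strftime("%Y%m%d_%H%M%S")
gc_file = f"SAP_EXTRACT_GC_OVERRIDE_{stamp}.dat"
shutil.copyfile(lc_file, gc_file)
print(f"Created GC load copy: {gc_file}")
```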

Managerial Rules


Requirement:

● The ability to automate the adjustments between GAAP financials and Managerial Non-GAAP results.

Issue:

● Final consolidated GAAP financials were produced in Excel by Accounting, while Non-GAAP financials were produced in Hyperion Planning by FP&A; adjustments for managerial purposes were therefore manually top-sided into Hyperion Planning.

Resolution:

● We have unified the source of truth for both GAAP and Managerial results in HFM.

● We use the "Scenario" dimension in HFM to capture GAAP vs. Managerial results.

● Created a script that takes the actuals from the GAAP scenario and copies them to the Managerial scenario (a sketch of the logic follows this list).

● Applied a consolidation rule that automates most of the managerial adjustments, such as exclusions and reclasses, and posts those adjustments to a specific member in the Custom 2 dimension called "Managerial Adj".

● A few adjustments are posted via manual journals in HFM, but these are still segregated as "Managerial Adj" in Custom 2.
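HFM consolidation rules are written in HFM's own rules language, not Python, so the following is only a language-neutral illustration of the copy-and-adjust logic: copy GAAP actuals to the Managerial scenario, then reverse the excluded accounts through a "Mng_Adj"-style adjustment. Account names and values are made up.

```python
# Illustration only: real HFM rules run against the HS object model.
gaap = {                     # consolidated actuals in the GAAP scenario
    "Revenue": 1000.0,
    "StockComp": 50.0,       # hypothetical account excluded from managerial results
}
excluded_accounts = {"StockComp"}

managerial = dict(gaap)      # step 1: straight copy of GAAP actuals

# Step 2: automated adjustments, posted against Custom 2 member "Mng_Adj"
# so they stay segregated from the copied SAP data.
mng_adj = {acct: -val for acct, val in gaap.items() if acct in excluded_accounts}

for acct, adj in mng_adj.items():
    managerial[acct] += adj

print(managerial)            # {'Revenue': 1000.0, 'StockComp': 0.0}
```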

Custom 2 dimension – Data Source

Custom 2 is used for Data Sources in order to facilitate tracking of data changes from the initial SAP upload through each manipulation of the data within HFM. The definition of each member is as follows:

● Adj – Stores data that is manually adjusted in HFM through HFM manual journals.

● SAP_Upload – Stores data coming directly from SAP.

● Mng_Adj – Holds all business-rule-driven changes to the managerial scenario.

● Mng_Jrnl – Stores manual journal entries applicable only in the managerial scenario.

● None – The default member in the Custom 2 dimension; used in some of the business rules where the other members are not a valid choice.

● Alloc_Source – Holds all of the allocated data. At the consolidated level this intersection should be zero.

● DataForm – Currently used to store the headcount data.

Automation


Requirement:

● The ability to automate the data load process, from data extraction out of SAP to final consolidations in HFM, while maintaining ad-hoc data load (consolidation) capability.

Resolution:

● We were already using Star Analytics Command Center with another system within Finance.

● Utilized the built-in HFM/FDM adapters in Star Command Center, along with custom processes, to initiate and control the sequencing and error notification of the complete consolidation process – from metadata load to FDM processing to final HFM consolidation and reporting.

● Star Analytics simply takes the jobs that already exist and links them together, so that when one job ends, the next one starts seamlessly (a conceptual sketch follows).
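Star Command Center itself is configured through its UI rather than code, but the chaining pattern described above – run each job in sequence, stop and send notification on the first failure – looks conceptually like this sketch. Job commands, addresses, and the mail relay are all hypothetical.

```python
import subprocess
import smtplib
from email.message import EmailMessage

# Hypothetical job commands, in the order they must run.
jobs = [
    ("Load metadata", ["load_metadata.cmd"]),
    ("FDM processing", ["run_fdm.cmd"]),
    ("HFM consolidation", ["consolidate_hfm.cmd"]),
]

def notify(subject: str, body: str) -> None:
    # Hypothetical SMTP relay and addresses.
    msg = EmailMessage()
    msg["Subject"], msg["From"], msg["To"] = subject, "hyperion@example.com", "admins@example.com"
    msg.set_content(body)
    with smtplib.SMTP("mailrelay.example.com") as smtp:
        smtp.send_message(msg)

for name, cmd in jobs:
    result = subprocess.run(cmd, capture_output=True, text=True)
    if result.returncode != 0:
        notify(f"Batch failed at step: {name}", result.stderr)
        break  # abort the chain; later jobs never start
else:
    notify("Batch completed", "All consolidation steps finished successfully.")
```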

Star Command Center - Overview

Star Command Center processes

Star Command Center - Process Detail

● Pick up SAP FX rate and data extracts, and derive substitution variables based on the filenames

● FDM ETL and data load to HFM

● Execute consolidations in HFM

● Extended Analytics extracts to the SQL database

● Load HFM actuals into the Planning/Reporting application

● Timestamp

● Error handling/email notifications and file archiving

Time Stamp


Requirement:

● The ability for worldwide users to know when actuals were last loaded from SAP into Hyperion HFM for consolidations.

Issue:

● HFM did not provide a time stamp capability showing when results were last loaded to the system for consolidations.

Resolution:

● Created a customized HTML report within Hyperion Workspace that shows when the last extract file was received from SAP and loaded into Hyperion HFM.

● Integrated the custom HTML report into the Star Command Center process via a PowerShell script (an illustrative sketch follows).

● The HTML report includes multiple time zones so worldwide users do not have to convert from US Central Standard Time.
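The production report is generated by PowerShell; the sketch below re-creates the idea in Python only to show the mechanics – render the last load time in several time zones into a static HTML page. The timestamp, zone list, and output filename are hypothetical.

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # Python 3.9+

# Load time of the last SAP extract, captured by the batch process (hypothetical).
loaded_utc = datetime(2014, 3, 7, 14, 30, tzinfo=timezone.utc)

# Time zones for worldwide users, so no one has to convert from US Central.
zones = ["US/Central", "US/Pacific", "Europe/London", "Asia/Singapore"]

rows = "".join(
    f"<tr><td>{z}</td><td>{loaded_utc.astimezone(ZoneInfo(z)):%Y-%m-%d %H:%M}</td></tr>"
    for z in zones
)
html = f"<html><body><h3>Actuals last loaded from SAP</h3><table>{rows}</table></body></html>"

with open("hfm_timestamp.html", "w") as f:
    f.write(html)
```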

HFM Timestamp

SOX Challenges


Requirement:

● The consolidation system must have ITGC SOX-compliant access, change management, and operational processes, with evidentiary documentation retained.

Things to Consider:

● There were many discussions around defining the scope of SOX inclusion for IT control testing: HFM/FDM, Star Command Center, and Financial Reports were included; DRM and Essbase were excluded.

● This is a new paradigm for ITGC controls in that IT retains responsibility for server maintenance and access control reporting, yet the business takes responsibility for application-related access and change control.

● We disabled all native Admin user access to components defined as in-scope for SOX.

DRM Implementation Challenges


● Lack of a built-in workflow solution

● Reconciliation of SAP level 0 metadata

● Ensuring metadata continuity with versioning capability

● Using derived properties

● SOX requirements

Metadata Workflow Solution


Issue:

● Stakeholders and the implementation team were not clearly informed that the DRM version we purchased (11.1.2.1) did not contain a built-in workflow solution.

Things to Consider:

● Though Oracle does offer a bolt-on workflow solution that integrates with DRM, there are several third-party vendor offerings as well.

● We understand that the newest version of DRM (v11.1.2.3) DOES include the workflow module, and we will be evaluating this upgrade option in addition to other workflow solutions.

● Before implementing DRM, make sure there is a clearly defined request and approval process for metadata (even if it is manual) and that all affected business users have a clear understanding of the process and its dependent users.

Metadata reconciliation to SAP


Requirement:

● Ensure that metadata in HFM and SAP match.

Issue:

● The DRM implementation was only in scope for the Hyperion applications, with SAP serving as the source for level 0 metadata.

Resolution:

● Created a nightly automated reconciliation process that compares metadata extracts from SAP to the current level 0 members in DRM (a minimal sketch follows this list).

● This is used as a failsafe to identify any new level 0 members that did not follow the manual workflow process, and to prevent data load errors in HFM caused by missing metadata.

● Errors trigger email notification to both the Hyperion Admin team and the Accounting Consolidation team.
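The nightly reconciliation boils down to a set comparison between the SAP extract's members and DRM's level 0 leaves. A minimal sketch, assuming hypothetical one-member-per-line extract files:

```python
def read_members(path: str) -> set[str]:
    # One member name per line (hypothetical extract format).
    with open(path) as f:
        return {line.strip() for line in f if line.strip()}

sap_members = read_members("sap_level0_extract.txt")
drm_members = read_members("drm_level0_extract.txt")

missing_in_drm = sap_members - drm_members   # new in SAP, bypassed the workflow
orphaned_in_drm = drm_members - sap_members  # in DRM but no longer in SAP

if missing_in_drm or orphaned_in_drm:
    # In the real process this triggers email to the Hyperion Admin
    # and Accounting Consolidation teams.
    print("Reconciling items found:")
    print("  In SAP, not in DRM:", sorted(missing_in_drm))
    print("  In DRM, not in SAP:", sorted(orphaned_in_drm))
```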

Ensuring metadata continuity with versioning capability


Issue:

● A manual version archiving process was subject to user errors that affected HFM data integrity: an older version was inadvertently copied to the new working version and used to populate HFM, resulting in data loss.

Resolution:

● Created a process that compares the current HFM metadata to the new DRM extract to be loaded to HFM, identifying any level 0 members that are in HFM but do not exist in the DRM extract (a sketch follows this list).

● Used Star Command Center to execute this comparison prior to every HFM load, and to abort the HFM load if any reconciling items are identified.

● Errors trigger email notification to the Hyperion Admin team.
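The pre-load guard is the same comparison pointed the other way: any level 0 member present in HFM but missing from the DRM extract blocks the load. In this hedged sketch, a nonzero exit code is what would let the orchestrating process abort (filenames hypothetical):

```python
import sys

def read_members(path: str) -> set[str]:
    with open(path) as f:
        return {line.strip() for line in f if line.strip()}

hfm_members = read_members("hfm_current_level0.txt")   # hypothetical extract
drm_extract = read_members("drm_load_file_level0.txt")

dropped = hfm_members - drm_extract  # members that would vanish from HFM
if dropped:
    print("ABORTING HFM load; members missing from DRM extract:", sorted(dropped))
    sys.exit(1)  # nonzero exit tells the orchestrator to abort the load
print("Metadata continuity check passed.")
```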

Using Derived Properties


Requirement:

● Automate as many metadata properties as possible via coded logic, to reduce the volume of manual input and to ensure consistent attribute derivation.

Things to Consider:

● If multiple derived properties are used to arrive at a final required property, it is difficult to tell how many (and which) properties are ultimately used to derive that final property.

● How does one tell which properties are for end use and which are drivers – or, in some cases, whether a single property is used for both purposes?

● Identify which properties are derived but may require manual override; we used a custom Category to group these properties.

● Beware of property proliferation, and clearly label the purpose (or multiple purposes) of every property, particularly if a property already exists and you determine it can also be used to drive additional properties.

SOX Challenges


Requirement:

● As DRM becomes the source for metadata, auditors place more emphasis on ITGC SOX controls versus SOX process controls.

Things to Consider:

● ITGC key controls surrounding the separation of DEV/PRD environments and change control become issues that do not conform to auditors' previous experience with transactional general ledger systems.

● As multiple approvers are given access to DRM for individual hierarchies, SOX access control testing may become more difficult to define and evidence.

● Our inquiries and research to date suggest that there is no uniform experience in SOX controls for DRM – you will be blazing a new trail with your audit team.

Planning Implementation Challenges


● Replicating HFM dimensionality

● Loading GAAP vs Managerial data

● Timing of data updates

Replicating HFM dimensionality


Requirement:

● Ensure a seamless user experience for HFM and Planning users by using the same dimensionality.

Issue:

● Separating the Legal Entity and Cost Center into two dimensions opened the possibility for Planning users to enter forecast data into invalid company code/cost center combinations.

Resolution:

● Created a separate non-consolidating input member in the Entity dimension and used this member in the POV of the Planning forms.

● Created a business rule to cross-reference the cost center to the correct company code and copy the input data to the correct company code/cost center intersection; the rule is set to 'Run on Save' in the Planning form (a sketch of the logic follows this list).

● Planners' write access to the Entity dimension is restricted to the input member.
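The production rule is a Planning business rule, not Python; the sketch below only illustrates its cross-reference logic – look up the company code that owns each cost center and move the input value to that valid intersection. Mappings, member names, and amounts are hypothetical.

```python
# Hypothetical cost center -> company code cross-reference.
cc_to_company = {"CC1234": "US01", "CC5678": "DE01"}

# Forecast input captured against the single non-consolidating input member.
inputs = {                       # (entity, cost_center) -> amount
    ("Entity_Input", "CC1234"): 500.0,
    ("Entity_Input", "CC5678"): 300.0,
}

# 'Run on Save' logic: move each input to the valid company code /
# cost center intersection, so invalid combinations can never be stored.
result = {}
for (_entity, cc), amount in inputs.items():
    company = cc_to_company[cc]
    result[(company, cc)] = result.get((company, cc), 0.0) + amount

print(result)  # {('US01', 'CC1234'): 500.0, ('DE01', 'CC5678'): 300.0}
```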

Entity Forecast input

Loading GAAP vs. Managerial data

Requirement:

● Ensure a seamless user experience for HFM and Planning users by using the same dimensionality.

Issue:

● Loading separate GAAP and Managerial datasets from HFM resulted in large volumes of duplicated data and excruciatingly long aggregation times in the Planning cube.

Resolution:

● Modified the Type dimension to include three level zero members: GAAP Only, Managerial Only, and Common.

● Created an additional SQL view to compare the GAAP and Managerial datasets from the Extended Analytics extracts, identifying all datapoints common to both datasets, with the outliers defined as either 'GAAP only' or 'Managerial only' (the classification rule is sketched below).

● The bulk of the HFM data is loaded to the Common member, reducing redundant data and aggregation times.
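The production comparison is a SQL view over the Extended Analytics tables; this Python sketch shows only the classification rule, using hypothetical datapoint keys:

```python
# Hypothetical datapoints keyed by (entity, account, period).
gaap = {("US01", "Revenue", "Jan"): 1000.0, ("US01", "StockComp", "Jan"): 50.0}
mgmt = {("US01", "Revenue", "Jan"): 1000.0, ("US01", "MgmtAlloc", "Jan"): 20.0}

common, gaap_only, mgmt_only = {}, {}, {}
for key in gaap.keys() | mgmt.keys():
    if gaap.get(key) == mgmt.get(key):
        common[key] = gaap[key]            # identical in both -> "Common", loaded once
    else:
        if key in gaap:
            gaap_only[key] = gaap[key]     # -> Type member "GAAP Only"
        if key in mgmt:
            mgmt_only[key] = mgmt[key]     # -> Type member "Managerial Only"

print(len(common), len(gaap_only), len(mgmt_only))  # 1 1 1
```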

Type dimension in Planning cube

Timing of Data Refreshes


Requirement:

● The Planning cube should reflect up-to-date Actual data, matching HFM at all times.

Resolution:

● Upon completion of each HFM data load/consolidation process, additional steps in the Star Command Center process extract data from HFM via Extended Analytics and load it to the Actual scenario in the Planning cube.

● Updated data within the Actual scenario is also copied to the Forecast scenario for completed periods, to ensure up-to-date full-year forecast values.

● Significant calc script tuning was required to reduce gaps in data availability during aggregations of the Actual and Forecast scenarios, particularly during the bi-hourly refreshes at quarter-end.

● Job frequency and tuning continue to be areas we watch for improvement opportunities.

Q&A


Infrastructure diagram

Custom 1 dimension – Cost Centers

Custom 1 primarily stores cost center level information where applicable. SAP only stores cost center information for 60000-type accounts, not for any other income statement or balance sheet accounts. For balance sheet and revenue accounts the default Custom 1 member is None, but for COGS accounts and 70000-90000 accounts we required additional classifications for reporting purposes. These non-SAP-defined classifications are assigned via a UD1 code on the GL accounts.

Used for 50000 series accounts: Serv_Supp_None, COGS_Prod_None, COGS_Amort_None, COGS_IntelIC_None, COGS_Other_None

Used for 70000/90000 series accounts: GA_None, Opex_None

Additionally, a UD1 code is assigned in the Custom 1 dimension on the GAAP-only cost centers. This is used to determine which cost centers are GAAP-only and entirely excluded from the managerial scenario. Logic in the HFM consolidation rule finds the cost center members with the GAAP UD1 code and zeroes out any value associated with those cost centers in the managerial scenario.

Custom 1 dimension – Cost Centers (allocations)

Custom 1 also includes cost center members for IT and Facilities expense allocations at each of the five primary functional groupings.

The allocation cost centers created are:

● Serv_SuppIT_Alloc / Serv_SuppFac_Alloc

● Prod_COGSIT_Alloc / Prod_COGSFAC_Alloc

● RD_IT_Alloc / RD_Fac_Alloc

● SM_IT_Alloc / SM_Fac_Alloc

● GA_IT_Alloc / GA_Fac_Alloc

These cost centers hold the allocated expense from IT and Facilities. In Custom 1 we set up a parent called TBA (To Be Allocated); under TBA there is a parent for all the Facilities cost centers and a parent for all the IT cost centers. Expense data is summarized at this functional level and then allocated based on the headcount within the five primary functional groups. McAfee enters the headcount via a data form in HFM for Service & Support COGS, Product COGS, RD, SM, and GA. Logic in the HFM consolidation rule calculates each group's headcount percentage of the total, and those percentages are used to record the expense value loaded to each functional grouping in the allocation cost centers listed above. The value is not overstated in HFM because an offsetting entry is recorded in the FacilityAllocOut/ITAllocOut members, netting the Facilities and IT totals under the TBA parent to zero after allocation (the arithmetic is sketched below).
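The allocation itself is coded in the HFM consolidation rule, but the arithmetic is simple enough to show. A Python sketch with made-up headcounts and pool amount, illustrating the headcount-percentage split and the offsetting ITAllocOut entry that nets the TBA parent to zero (the Facilities side works the same way):

```python
# Headcount by functional group, entered via the HFM data form (values made up).
headcount = {
    "Serv_SuppIT_Alloc": 100,   # Service & Support COGS
    "Prod_COGSIT_Alloc": 50,    # Product COGS
    "RD_IT_Alloc": 200,
    "SM_IT_Alloc": 100,
    "GA_IT_Alloc": 50,
}
total_hc = sum(headcount.values())

it_pool = 1_000_000.0           # IT expense summarized under the TBA parent

# Allocate the pool by each group's share of total headcount.
allocations = {cc: it_pool * hc / total_hc for cc, hc in headcount.items()}

# Offsetting entry in ITAllocOut nets the IT total under TBA to zero,
# so the expense is not overstated in HFM.
allocations["ITAllocOut"] = -sum(allocations.values())

assert abs(sum(allocations.values())) < 1e-6
print(allocations)
```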