TRANSCRIPT
Oracle Hyperion Financial Data Quality Management
Considerations for a scaled, expedited and integrated approach on data quality
NCOAUG – Aug 15, 2008, 1:20 – 2:00 pm
Matthias Heilos, Pinnacle Group Worldwide
Introduction – Matthias Heilos
• Consultant at Pinnacle Group Worldwide
• Hyperion Expertise: Financial Data Quality Management, Essbase, Planning, Financial Management
• Prior: European IT & Management Consulting firm
  • Business Intelligence
  • EPM, Reporting, Planning, CRM
Agenda
• Introduction to FDM
• Situation at a Fortune 100 client
• Enhancing FDM to succeed
• Automation / Integration
• Useful features
• Questions
What is FDM?
Oracle’s Hyperion Financial Data Quality Management
• is a transformation tool that feeds source-level data to consolidation, reporting, planning, and analytical applications
• provides an audit trail to the source financial data, helping ensure data integrity and mapping consistency that allows for easy reconciliation
• offers a consistent, end-user-friendly environment that provides a uniform data collection process for all reporting units within the organization
Source: FDQM Quick Start Guide
FDQM Architecture
[Architecture diagram: source data (source files, Oracle E-Business, custom databases and custom systems) enters FDQM, optionally via the Batch Loader. Import Formats produce imported data, mappings produce validated data, and the Export step produces output files that are loaded into targets such as HFM, Essbase, and custom systems. A custom email notification reports on data quality, e.g. unmapped items.]
Situation at a Fortune 100 client
Requirements:
• M&A data integration of 11 locations
• Data volume: 3.5 million records (3 locations > 1 million), time frame: 2 hours
• > 50 attributes
• Complex multi-step mappings
• Automated and integrated process

Problems faced:
• Import and mapping take very long
• Too many attributes
• Problems with DB transaction handling
• Multi-step mappings not supported
• Export fails due to large amount of data

FDM can meet these requirements using Pinnacle’s FDM Enhancer.
Limitation: Too many attributes
• Import process step
• Problem: too many attributes
• Solution: “FDM Extension”
  • Add a row number to each record in the source file
  • Separate dimensional data and attributes; process attributes via the FDM extension (custom attribute table)
  • Merge data during FDM Export based on the row number
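The “FDM Extension” steps above can be sketched as follows. This is a minimal illustration only, assuming a CSV source; the column names (`entity`, `account`, `amount`) and helper functions are hypothetical and not part of FDM itself:

```python
import csv
import io

# Columns FDM actually maps (assumed); everything else is an "attribute".
DIM_COLS = ["entity", "account", "amount"]

def split_source(text):
    """Tag each record with a row number, then separate dimensional
    columns (fed to FDM) from the remaining attribute columns
    (stored in a custom attribute table)."""
    reader = csv.DictReader(io.StringIO(text))
    dims, attrs = [], {}
    for rownum, rec in enumerate(reader, start=1):
        dims.append({"rownum": rownum, **{c: rec[c] for c in DIM_COLS}})
        attrs[rownum] = {k: v for k, v in rec.items() if k not in DIM_COLS}
    return dims, attrs

def merge_on_export(mapped_dims, attrs):
    """During export, re-attach the stored attributes by row number."""
    return [{**row, **attrs[row["rownum"]]} for row in mapped_dims]
```

The row number is the join key that survives FDM's own processing, so the many attribute columns never have to pass through the import and mapping steps.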
Internal Processes – Overview
• Import: Delete (optional) → Import data → Map data
• Validate: Fix mappings (manual / auto-map) → Reapply mappings
• Export: Export data
• Load: Load data to target system → Validate results
Each step can be extended via the API and event scripts.
Expediting the Import process
• Import process step
• Problem: takes very long

Import method      | Description                                                           | Time to process # rows¹ (100K / 500K / 1M)
Import Format      | Parse data file based on Import Format                                | 1:09 / 6:03 / 13:10
Integration Script | Access data directly from source database, add FDM metadata in script | 3:46 / 19:40 / 41:42
FDM Enhancer²      | Pinnacle’s generic script for files and DB as data source             | 0:29 / 2:45 / 5:51

¹ Tests performed in a test environment; results may vary.
² Administration of “FDM Enhancer” available through a user front end (like Import Formats).
Pinnacle’s integration is at least 50% faster than the out-of-the-box features.
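One reason a generic scripted import can beat record-at-a-time processing is transaction batching. The toy sketch below (not Pinnacle's actual FDM Enhancer, and using an assumed `staging` table in SQLite) contrasts the two styles:

```python
import sqlite3

def import_row_by_row(conn, rows):
    """Mimics a record-at-a-time import: one statement and one
    commit per record -- the slow pattern."""
    for r in rows:
        conn.execute("INSERT INTO staging (entity, amount) VALUES (?, ?)", r)
        conn.commit()

def import_bulk(conn, rows):
    """Set-based import: a single batched statement and one commit."""
    conn.executemany("INSERT INTO staging (entity, amount) VALUES (?, ?)", rows)
    conn.commit()
```

On real volumes the per-commit overhead of the first variant dominates, which is consistent with the timing differences in the table above, though the slide does not state the exact mechanism.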
Expediting the Mapping process
• Mapping process step
• Problem: takes very long
• Mapping types besides Explicit and Between:
  • IN: should not contain many values; rather split 1 large mapping into several mappings with only a few values
  • LIKE: convert * * to 1* 1*, 2* 2*, etc. (map-thru)

“FDM Enhancer” can significantly lower processing time:
• Map-thru only on database level via custom script: no mappings for map-thru dimensions
• Mapping process entirely based on SQL script; no costly update statements needed
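A set-based SQL mapping pass can be sketched like this. The schema (`data`, `map` tables) is illustrative, not FDM's real one; the point is that one correlated UPDATE maps every row at once instead of issuing per-record update statements:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE data (src_account TEXT, tgt_account TEXT);
CREATE TABLE map  (src TEXT, tgt TEXT);
INSERT INTO data (src_account) VALUES ('4000'), ('4100'), ('5000');
INSERT INTO map VALUES ('4000','Sales'), ('4100','Sales'), ('5000','COGS');
""")

# One set-based statement maps every row at once -- no per-record updates.
conn.execute("""
UPDATE data
SET tgt_account = (SELECT tgt FROM map WHERE map.src = data.src_account)
""")
conn.commit()
```

The database engine resolves the lookup internally, which is typically far cheaper than round-tripping one UPDATE per source record from script code.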
Enhancing the Mapping process
• Mapping process step
• Problem: FDM does not support complex mappings (looking up data from a database or several transformation steps); only hard-coded mappings based on information in the source data file can be applied
• Solution: create a custom mapping script for complex transformations, applied after FDM’s mapping step
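A post-mapping hook of that kind might look like the sketch below. All table and column names (`staging`, `entity_master`, `custom1`) are hypothetical stand-ins for whatever the custom script actually targets; the shape of the logic, a lookup step followed by a derivation step, is what the slide describes:

```python
import sqlite3

def apply_complex_mappings(conn):
    """Hypothetical post-mapping step: after FDM's own mapping pass,
    apply a multi-step transformation FDM cannot express -- here, a
    lookup against a reference table followed by a derivation rule."""
    for rowid, account, entity in conn.execute(
            "SELECT rowid, account, entity FROM staging").fetchall():
        # Step 1: look up reference data outside the source file.
        ref = conn.execute(
            "SELECT region FROM entity_master WHERE entity = ?",
            (entity,)).fetchone()
        # Step 2: derive the target member from the lookup result.
        custom = f"{ref[0]}_{account}" if ref else "UNMAPPED"
        conn.execute("UPDATE staging SET custom1 = ? WHERE rowid = ?",
                     (custom, rowid))
    conn.commit()
```

In practice such a script would be wired into an FDM event so it fires automatically once the standard mapping step completes.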
Automation / Integration
[Automation flow: a scheduler launches an FDM automation script, which runs the FDM extension (processing attributes and dimensions) and kicks off the FDM process. The script polls the FDM status, checking whether the process is complete, until a timeout is reached, then produces export/load files and sends an email notification.]
* Validation step skipped as it is integrated in the enhanced Import step (including data quality checks).
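The “check until timeout” part of that flow is a simple polling loop. A minimal sketch, with illustrative names and intervals (the real automation script's parameters are not given in the slide):

```python
import time

def wait_for_fdm(check_status, timeout_s=7200, poll_s=30, sleep=time.sleep):
    """Poll an FDM status check until the process reports complete,
    giving up after `timeout_s` seconds. `check_status` is any
    callable returning True once the FDM process has finished."""
    waited = 0
    while waited < timeout_s:
        if check_status():
            return True          # process complete
        sleep(poll_s)
        waited += poll_s
    return False                 # timed out
```

The caller would branch on the return value: continue with the export/load files on success, or send a failure email notification on timeout.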
Data Quality at a glance
Conclusion
• FDM was created to support data quality processes for financial data and to integrate this data into Oracle’s EPM suite (Financial Management, Planning, etc.)
• Supports Oracle’s “Management Excellence”
• Using Pinnacle’s FDM Enhancer, handling large amounts of data is possible. Tool selection should be based primarily on purpose: should the process be controlled by business users or by IT?
• Pinnacle Group Worldwide leads even large FDM data integration projects to success. FDM Enhancer offers a variety of pre-built features and methods to improve, enhance, scale, and expedite FDM’s performance.
Questions
Scalability: Resource usage
• Delete process step
• Problem: rollback segment exceeded in parallel mode; too many transactions per commit cycle
• Solution: paging algorithm to delete subsets of data in smaller transactions prior to the FDM step
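The paging delete can be sketched as below. This is an illustration in SQLite with an assumed `staging` table, not the production (Oracle-side) implementation; the idea is the same: delete a bounded page per transaction so no single commit cycle overruns the rollback segment:

```python
import sqlite3

def delete_in_pages(conn, location, period, page_size=10000):
    """Delete an existing load in small transactions so that each
    commit covers at most `page_size` rows (illustrative schema)."""
    while True:
        cur = conn.execute(
            """DELETE FROM staging WHERE rowid IN (
                   SELECT rowid FROM staging
                   WHERE location = ? AND period = ?
                   LIMIT ?)""",
            (location, period, page_size))
        conn.commit()            # one small transaction per page
        if cur.rowcount == 0:    # nothing left to delete
            break
```

Each iteration commits independently, so a failure partway through loses at most one page of work and never exhausts undo space.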
Scalability: Export process
Problems:
1) ADODB Recordset exceeds the 2 GB memory limit
2) Extract routine is time-consuming (data mart adapter)

Solutions:
1) Paging algorithm to extract the data
2) Create a dynamic SQL script and use a DB tool for extraction into a delimited flat file
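The paged extraction can be sketched like this, assuming a `staging` table and a semicolon-delimited output file (both illustrative). Instead of pulling one huge recordset into memory, as with the 2 GB ADODB limit above, each page is fetched and streamed to the flat file before the next page is read:

```python
import csv
import sqlite3

def export_paged(conn, out_path, page_size=50000):
    """Stream the export in fixed-size pages rather than materializing
    one large recordset in memory."""
    with open(out_path, "w", newline="") as fh:
        writer = csv.writer(fh, delimiter=";")
        offset = 0
        while True:
            rows = conn.execute(
                "SELECT entity, account, amount FROM staging "
                "ORDER BY rowid LIMIT ? OFFSET ?",
                (page_size, offset)).fetchall()
            if not rows:
                break            # all pages written
            writer.writerows(rows)
            offset += len(rows)
```

Memory use is bounded by the page size regardless of total data volume, which is the property the slide's paging algorithm relies on.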
Useful features
• Data quality at a glance, including enhanced management information (see next slide)
• System integrity checks
  • Number of mappings per dimension and location
  • Compare mappings between periods
• Archive existing mappings
• Custom logging, retrievable per day, location, and process step as stored in the database
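An integrity check like “number of mappings per dimension and location” reduces to a grouped count over the mapping table. A minimal sketch, with an assumed `mappings` schema rather than FDM's real tables:

```python
import sqlite3

def mapping_counts(conn):
    """Integrity check: mapping rows per location and dimension
    (table and column names are assumed, not FDM's actual schema)."""
    return conn.execute(
        """SELECT location, dimension, COUNT(*) AS n
           FROM mappings
           GROUP BY location, dimension
           ORDER BY location, dimension""").fetchall()
```

Running the same query against two periods' mapping snapshots and diffing the results gives the “compare mappings between periods” check from the list above.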