Essbase Tips

Essbase Notes

1. No loading or calculation testing for objects can be done on the client machine.

2. Objects can be tested using data on the server, but not on the client.

3. The Application menu is used for viewing application-related information as well as changing application-level and database-level settings.

4. Create only one database per application. That makes reading the application log a lot easier.

5. Number of dimensions:
a. 4 or 5 – few dimensions
b. 6 to 8 – normal
c. 10 or 11 – too high

6. Hierarchies within dimensions reflect both the consolidation levels and the drill-down paths for reporting.

7. Generations count top-to-bottom.
a. Database – Generation zero
b. Each dimension – Generation one

8. Levels count bottom-to-top. The lowest point in the hierarchy is level 0. Level 0 members are also called leaf nodes.

9. Member settings are of the following types:
a. Aliases – maximum 10 alias tables
b. Consolidation operators – also called unary operators; cannot be set for attribute dimensions
c. Data storage
d. Attribute dimension
e. UDAs
f. Time Balance
g. Expense Reporting

10. A shared member assumes all attributes of the main member except the main member’s consolidation operator.

11. When a shared member shares the values of an input member, the shared member may appear above or below the input member in the outline.

12. When a shared member shares the values of a computed member, the shared member must appear below that computed member in the outline.

13. Use formulas in the outline where member calculations are straightforward, or where a member requires Two-Pass calc and other members do not require back-calc.

14. UDAs are tags / flags used for reporting and calculations. They are set up for a specific dimension and can be assigned to individual members. Multiple UDAs can be set up for a dimension.

15. Attribute dimensions add no overhead in terms of database size, but calculation is deferred until required. Built-in functionality provides sum, average, minimum, maximum, and count. There are 4 types of attribute dimensions: Text, Numeric, Boolean, and Date. Attribute dimensions have a one-to-one relationship with base dimensions.

16. Rules for attribute dimensions:
a. The base dimension must be sparse.
b. Attribute dimensions do not have consolidation operators.
c. Although attribute dimensions can have a multi-tiered hierarchy, only level 0 members of attribute dimensions can be associated with base dimension members.
d. Base dimension members associated with attribute dimensions must be at the same level. This can be any level, but it must be the same across the base dimension.
e. Only one set of Boolean values is allowed per database, so whatever is set applies to all Boolean attribute dimensions in that outline model.

17. The Time dimension is normally built in 2 ways:
a. Generic model
b. Fiscal crossover

18. Use built-in Dynamic Time Series calculations for period-to-date calculations such as YTD, QTD, and MTD accumulations. DTS calculations are performed on the fly and require the time-related dimension to be tagged as Time.

19. Scenario dimension: scenarios typically track data sets and may include versions of such sets. Scenarios are used for tracking sequential versions of data sets, for tracking steps in the internal build-up of a data set with contributions from different functional areas of the company, and for tracking what-if analysis. Using the @VAR and @VARPER functions with expense reporting flags on accounts, better / worse significations are easily controlled. In financial applications, data from scenario to scenario typically differs with respect to the form of input and calculations.

20. Accounts dimension: maximize use of consolidation operators for model building in the Accounts dimension. Minimize use of formulas in the outline or in calc scripts. Unary calculations execute faster than formulas in the outline or in calc scripts, and unary calc construction provides drill-down visibility in the accounts hierarchy to see where a number comes from; formulas obscure visibility into calculations. With the Accounts tag, the Accounts dimension gets calculated first in CALC ALL and CALC DIM, and Time Balance reporting and Expense reporting are enabled.

21. Time, Accounts, and Scenario are called data descriptor dimensions. All other dimensions are called business view dimensions.

22. Business view dimensions provide users with a specific cut of the data: the multi-dimensional richness of analysis that extends beyond the simpler information incorporated in the data descriptor dimensions. Channels, Regions, Products, Customers, and Departments are examples of business view dimensions.

23. Creating a label outline provides advantages such as:
a. Provides a single visual planning model.
b. Exposes nomenclature differences between different functional groups.
c. Discloses conflicting hierarchy perceptions.
d. Allows a scenario-driven process for hierarchy and data loads.
e. Highlights sizing, performance, and partitioning considerations.
f. Analyzes alternate roll-up needs.
g. Plans for UDAs.
h. Researches data location.

24. Load rules automate loading and maintenance of dimensions in an outline. Steps for setting up a dimension build rule are as follows:
a. Create a new rule object and associate it with the outline.
b. Select the mode (Dimension Build / Data Load).
c. Open the data file and set the file properties such as delimiter, header, etc.
d. Create a new dimension, set the build method, and set properties for the dimension.
e. Associate fields with dimensions and field types.
f. Validate the rule file and save it.

25. A dimension build can be initiated:
a. Using the Outline Editor.
b. Using the Data Load dialog box.
c. In batch mode using ESSCMD.

26. Generation Loading, Level Loading, and Parent / Child loading are 3 methods for building dimensions in an outline.

27. While loading shared members, the column containing shared members is marked as Duplicate Generation if loaded using generation loading. It is marked as Duplicate Level if using the level loading method. However, in both these cases, the shared member must be at the same generation / level as that of the main member.

28. Parent / Child is the most flexible method for loading and maintenance of shared members. If you deselect “Do Not Share” in the dimension build settings, Parent / Child automatically sets up shared members. The principal requirement is that a matching main member already exists in the outline with a different parent than the shared member currently being loaded. The Parent / Child method enables building, in one pass, asymmetrical hierarchies for alternate roll-up paths with shared members. It enables sharing with members at upper levels.

29. Use rules files to build attribute dimensions dynamically, to add and delete members, and to establish or change attribute associations.

30. When defining numeric ranges for attribute dimensions, make sure that you have selected Top of Ranges or Bottom of Ranges in the outline under Settings / Attribute Member Names.

31. There are 3 ways to load data:
a. Free-form loading without rules.
b. Structured loading with rules.
c. Lock and Send with a spreadsheet.

32. Free Form loading requires data files to be structured in a particular way with precise organization of headers.

33. Loading data with load rules deals with unstructured formats and messy source problems.

34. Lock & Send with a spreadsheet is used for interactive applications like budgeting. The number of records that can be sent using this method is limited to spreadsheet row availability.

35. Setting up a new data load rule is the same as setting up a dimension build rule.

36. In the data load window, clicking “Interactive” in the options enables a data load that awaits user interaction to deal with unrecognized members.

37. For maximum efficiency and to minimize passes on data blocks, structure the source file as follows:

a. For labels identifying data points, set sparse dimensions to the left and dense dimensions to the right.

b. Sort columns left to right.
c. Use simple load rules.

38. The data load procedure can be initiated manually (from the Server Desktop, select Database -> Load Data) or automated by executing ESSCMD batch scripts that include the IMPORT command.

39. If you have select or reject criteria on multiple column fields, by default all conditions must be met before a record is rejected or selected; i.e. the default Boolean operator when several fields have reject or select criteria is AND. This is set using Options / Data Load Settings.

40. Options available on the Global Properties panel of the field properties dialog are as follows:
a. Replace & With (find and replace)
b. Case change
c. Prefix or suffix
d. Ignore field during data load or dimension build
e. Convert spaces to underscores
f. Scaling of values by specifying a multiplier
g. Specify a field as a data field

41. You can add data file values to existing values or subtract the same from existing values or overwrite the existing values. Default setting is Overwrite. Can be changed using Options / Data Load settings.

42. Manipulate columns by moving, splitting, or joining them, or by creating a field using text.

43. To specify header lines or tokens (repeating headers), use Options / Data File Properties.

44. You can specify phantom headers for dimensions not included in data files using Options / Data Load Settings.

45. You can set up record selection or rejection criteria in Record / Select or Reject Record Settings.

46. You can set up a safety net whereby unrecognized members encountered during data load are identified and placed into a specific location in the outline. Options / Dimension Build Settings lets you specify the build method as “Add a child of” and specify the existing member name.

47. To activate the spreadsheet Add-in, select Tools / Add-Ins, select Hyperion Essbase OLAP Server DLL, or click Browse and go to Essbase\Bin\ExcelIn.XLL to install the add-in into Excel.

48. To install the spreadsheet Add-in toolbar:
a. File / Open (browse to Essbase\Client\Sample)
b. Select Esstoolb.xls
c. Enable macros
d. View / Toolbars / Essbase

49. General rules for placing labels in a worksheet are as under:
a. The header section of the worksheet is scanned first, then the row / column section.
b. Labels must match outline members or their aliases.
c. Labels are not case-sensitive unless case sensitivity is set in Outline / Settings.
d. Rows / columns containing labels can be hidden.
e. An Excel report can contain its own headers. The warning message for this can be toggled on and off from the Essbase \ Options \ Global panel.

50. Essbase retrieve options are: Retrieve, Keep Only / Remove Only, Zoom In / Zoom Out, Pivot, Flashback.

51. Server, network, or database design conditions such as poor design, dynamic calculations, transparent partitions, heavy server traffic, and competing operations may negatively impact retrieve performance.

52. Client or user generated conditions such as a large area retrieve may also negatively impact retrieve performance. This happens when a user performs a retrieve operation while selecting a whole worksheet or more than one column, and Analytic Services attempts to retrieve into the large selected area. Press the Esc key to interrupt.

53. Global options are specific to the client machine and are set by the individual user. The option settings apply to all worksheets and workbooks that a user may open. Settings are made from the Essbase \ Options \ Global panel.

54. Display options are specific to individual worksheets. Each sheet may have display settings of its own. Display settings are made from the Essbase \ Options \ Display panel and saved with the workbook.

55. The Suppress #Missing Values & Zeros functionality has no memory, i.e. previously suppressed rows do not reappear even if they subsequently contain values.

56. Style options are specific to individual worksheets. Each sheet may have style settings of its own and are saved with excel workbook. Enabling Styles requires selecting “Use Style” under Cells category from Display panel.

57. Zoom options are also specific to individual worksheets. Each sheet may have zoom settings of its own and these settings are saved with excel workbook. Zoom options govern Zoom In behaviour; Zoom settings do not impact Zoom Out. In all cases, Zoom Out goes up to the next level from the member selected. Zooming out from a shared member zooms back to the shared member’s parent.

58. Navigate without data option helps developing reports using Zoom, Pivot and Keep Only without retrieving from the server. You would normally set this option ON when developing reports and OFF when reports are complete.

59. The member selection window provides access to the database outline for pasting member names. It provides 3 principal methods for selecting members. a) Member Name b) Generation c) Level.

60. Cascade reporting helps you create one standard report complete with precise styles, colour coding, and number formats, and replicate this report format to multiple business view elements. Thus, Cascade creates multiple worksheets in a single workbook, or multiple workbooks, that replicate your standard format, retrieves on each replicated sheet, and indexes the sheet references. It creates a table of contents file in text format in the same directory.

61. Formula behaviour on the spreadsheet is managed in the Formula Preservation group on the Essbase \ Options \ Mode panel. Formula preservation options are available only when Suppress #Missing Values or Suppress Zeros is deselected. There are 4 options:
a. Retain on Retrieval
b. Retain on Keep & Remove Only
c. Retain on Zoom
d. Formula Fill on Zoom (this works with the Zoom options)

62. Dynamic Time Series reporting can be enabled by specifying a period in the Latest Time Period dropdown on the Display panel of Essbase \ Options.

63. To create a linked object for a data element:
a. Select the data element.
b. Essbase \ Linked Object.
c. Click Attach – Cell Note, File, or URL.

64. Attribute dimensions do not show up in a standard retrieve. If you type them explicitly or use the query designer, they get displayed. With attribute dimensions, there is a default attribute calculations dynamic dimension.

65. Recommended option settings for report development in Excel are:
a. Deselect Formula Preservation – Retain on Retrieval.
b. Zoom options set to Next Level; select “Include Selection”.
c. Indentation set to Totals.
d. Deselect Suppress #Missing Values and Suppress Zeros.
e. Set replacement labels for #Missing and #No Access.
f. Deselect Cells – Use Styles.
g. Select Global – Display Unknown Members.
h. Set Mode – Navigate Without Data.

66. After the report is developed in Excel, the following options should be set:
a. Select Formula Preservation – Retain on Retrieval.
b. Zoom options set to Next Level; select “Include Selection”.
c. Indentation set to reporting preference.
d. Select Suppress #Missing Values and Suppress Zeros if appropriate.
e. Select Cells – Use Styles and implement styles.
f. Deselect Global – Display Unknown Members.
g. Deselect Mode – Navigate Without Data.

67. The query designer wizard is the only part of the spreadsheet client add-in that lets you create reports based on data values as opposed to member labels. The wizard defines a query, which can itself be saved for future use. It can generate the final report in 2 ways:
a. Excel workbook
b. Report script

68. The steps to create a report using the query designer:
a. Essbase \ Options \ Query Designer – select “Use Sheet Options with Query Designer”.
b. Essbase \ Query Designer:
i. New Query.
ii. Place headers, rows, and columns.
c. Specify members for headers, rows, and columns (double-click dimensions).
d. Specify filters by right-clicking on selection rules.
e. Save the query (.EQD extension).
f. Apply the query (this populates the report with data values from Essbase).

69. Data filtering helps in creating Top / Bottom N reports using the Query Designer. The dimension being ranked can only be placed in rows, and the base for ranking will be a column dimension.

70. Top / Bottom analysis should generally be done on row members that are defined by a member selection macro using generation level, UDA, or similar references where a complete refreshed pool of members, not a hard-wired list, is the basis for filtering.

71. The data restrictions window in data filtering allows filtering row dimension members on one or more criteria based on the values of one column header (specific values, value ranges, or comparison of values).

72. Sorting based on values can be set up from the Navigation panel under data sorting. Data sorting lets you sort row dimension members in ascending or descending order, based on the values of a column header.

73. To run a report script in Excel:
a. Copy and paste the report script into Excel.
b. Change the mode from Advanced Interpretation to Free Form (Essbase \ Options \ Mode panel).
c. Select Essbase \ Retrieve.

74. Calculating in the outline takes place in 2 ways:
a. Unary or consolidation operators.
b. Member formulas.

75. The six unary operators are +, -, *, /, %, and ~. When siblings have different operators, Analytic Services calculates data in top-down order.

76. Every database needs at least one calc script to roll-up (aggregate) unary operators and execute formulas in outline. Input of data does not execute any calculation process. The default calc script for a new database is a CALC ALL statement.

77. The basic functions of a calc script are as follows:
a. Housekeeping
b. Roll-up
c. Member formulas
d. Back calc / recalculations

78. Calculation Requirements are driven by processes, which typically include multiple interim steps. Hence most databases have multiple calc scripts associated with them. One calc script generally does not meet all calculation requirements for a single database.

79. Analytic Services reads a calc script as text and then performs calculation operations according to the script instructions.

80. The Analytic Services calc script file extension is “.CSC”.

81. Benefits of writing a calc script using Essbase’s own calc script editor are as under:
a. Select members from member lists.
b. Calc script formula commands with a syntax parameter guide from the Formula / Paste Function dialog box.
c. Robust error checking.

82. The 4 methods to execute a calc script are:
a. Database \ Calculate (select the calc script and click OK).
b. From the Server Desktop, select the calc script, click Run, select the database, and click OK.
c. Essbase \ Calculate from the spreadsheet.
d. Batch operation in ESSCMD or MaxL.

83. Functional commands (also called calculation commands) are divided into 2 sub-categories:
a. Basic roll-up commands used to roll up and execute member formulas, e.g. CALC ALL, CALC DIM, etc.
b. Housekeeping commands used to perform specific maintenance activities, such as CLEARBLOCK and CLEARDATA, and SET commands, which specify calculation behaviours.

84. Control flow commands manage the flow of calculations. FIX…ENDFIX helps focus calculations on specific dimensions and / or members within dimensions (see the sketch below). LOOP…ENDLOOP allows looping for solving simultaneous equations.
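
Example (an illustrative sketch only; “Budget”, “Jan”, “Sales”, “Units”, and “Price” are hypothetical member names, not taken from these notes):

FIX ("Budget", "Jan")
   /* this formula is executed only on blocks within the Budget / Jan scope */
   "Sales" = "Units" * "Price";
ENDFIX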

85. Conditional commands like IF, ELSE, etc. also help in focusing calculations to specific dimensions / members.

86. Data declaration commands like ARRAY and VAR help in defining temporary variables and setting their values. These variables store fixed or intermediate calculated values for use in other calculations. Using temporary variables improves calc performance and makes complex calculations with dependencies easier to develop and debug.

87. Operators include the standard mathematical operators plus the Boolean operators AND, OR, and NOT for comparing conditions.

88. Math functions include capabilities for finding minimum and maximum values, rounding, averaging, and summing. Two special functions, @VAR and @VARPER, are used for computing variances for items tagged for expense reporting (see the sketch below).
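
Example (a minimal sketch; “Variance”, “Variance %”, “Actual”, and “Budget” are hypothetical member names used only for illustration):

/* expense-aware variance calculations */
"Variance" = @VAR("Actual", "Budget");
"Variance %" = @VARPER("Actual", "Budget");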

89. Range operators reference values of members across a range of members in a dimension.

90. Boolean operators reference the outline status or hierarchical relationships of a member. They help focus calculations with conditional logic on specific subsets of members defined by outline relationships.

91. Member set commands reference outline membership using generation and level names and numbers and genealogy relationships. Used with FIX statements, they generate a specific list of members to be calculated upon.

92. Relationship functions reference outline membership using generation and level names and numbers and genealogy relationships. These functions reference actual values of other members in the outline relative to the current member being calculated (see the sketch below).
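
Example (an illustrative sketch only; “East”, “Market”, “Sales”, and “Share of Parent” are hypothetical member and dimension names):

/* member set function narrows the scope; relationship function reads the parent's value */
FIX (@DESCENDANTS("East"))
   "Share of Parent" = "Sales" / @PARENTVAL("Market", "Sales");
ENDFIX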

93. Allocation functions are designed to push data values from one area of the model to another.

94. Forecasting functions are designed to analyze historical data and look for trends.

95. Statistical functions greatly enhance mathematical capabilities and include functions to calculate standard deviations, medians, correlations, etc.

96. @TODATE converts date strings to numeric values, which can be more easily used in comparison operations. (It is the only date-time function so far.)

97. @CALCMODE helps in determining calculation behaviour for certain functions.

98. Data is typically dense in a dimension when a high % of members have data out of all possible combinations after the database has been loaded and rolled up.

99. Data is typically sparse in a dimension when a low % of members have data out of all possible combinations after the database has been loaded and rolled up.

100. Each dimension in an Essbase database is set to be either dense or sparse. The data block is the fundamental unit of data storage in Analytic Services. Data block construction, as determined by sparse / dense settings, has a major impact on calculation times, storage requirements, and other variables in Analytic Services performance.

101. Dense / sparse settings for each dimension are typically hard-coded by a direct setting after benchmarking the calculation and storage performance of alternate setting combinations.

102. By default, all new dimensions added to an Analytic Services database are automatically set to sparse.

103. The final criterion for dense / sparse settings should be the combination of settings that produces the lowest overall calculation time within acceptable storage configurations.

104. A data block is the basic unit of storage. Blocks contain the actual data of your database and are stored in page files (.pag files) located on one or more server drives. Data blocks contain cells. A cell is created for every intersection of stored members of the dimensions that are set as dense in the data storage dialog. Each cell takes up 8 bytes of storage space on disk (before compression) and in memory. The size of a data block can be calculated: it is the product of the numbers of stored members of the dense dimensions * 8 bytes (see the worked example below).
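
Worked example (hypothetical stored-member counts, for illustration only): if the dense dimensions are Time with 12 stored members and Accounts with 20 stored members, each block holds 12 * 20 = 240 cells, so the block size is 240 * 8 bytes = 1,920 bytes before compression.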

105. Data blocks are created in their entirety as long as one cell has a data value that has been input or calculated. All data blocks are the same size in bytes (before compression) regardless of the number of cells that have values versus #Missing.

106. All data blocks move from disk to memory (for calculation, restructure, or reporting) and back from memory to disk in their entirety, i.e. all the cells are loaded into memory at once and written back to disk at once.

107. A data block is created for each combination of stored members of sparse dimensions where there is at least one data value in a cell within the block. The information identifying the data block is stored in index (.ind) files on the server drive. Data blocks are created when:
a. Data is loaded to sparse member combinations that did not previously exist.
b. Sparse dimensions are rolled up as specified in the outline using a CALC ALL or CALC DIM statement or other calculation functions.
c. A member formula produces a value (this functionality is enabled when the “Create Blocks on Equations” check box is selected in the calculation group of the database settings dialog).
d. The DATACOPY command is executed in a calc script (see the sketch below).
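
A minimal sketch of cases c and d; “Actual” and “Budget” are hypothetical scenario members:

SET CREATEBLOCKONEQ ON;          /* case c: let formulas on sparse members create blocks */
DATACOPY "Actual" TO "Budget";   /* case d: copy Actual to Budget, creating Budget blocks where needed */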

108. Block parameter information is available from Database \ Information. Important parameters are as follows:
a. Number of existing blocks.
b. Block size in bytes.
c. Block density: the % of cells within blocks that have a value versus the total number of cells in the block. Dense / sparse combinations should maximize block density, a key measure of storage and calculation efficiency.
d. % of maximum blocks existing: the number of existing blocks divided by the potential number of blocks. It is a measure of the sparsity of the database.
e. Compression ratio: denotes the compression efficiency when blocks are stored to disk, normally using the bitmap encoding compression technique. #Missing cells are typically compressed with this method. The compression ratio normally tracks block density. Compression is set from Database \ Settings \ Storage panel.

109. To set dense and sparse settings while in an outline, select Settings / Data Storage to open the data storage dialog and set the dense / sparse setting for each dimension.

110. A typical calc script is divided into five sections (see the skeleton sketch below):
a. Housekeeping
b. Baseline Fix
c. Normalization
d. Main roll-up
e. Back calc
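
An illustrative skeleton only; the member names (“Budget”, “Jan”, “Salaries”, “Headcount”, “Avg Salary”) and dimension names (“Accounts”, “Product”, “Market”) are hypothetical and not taken from these notes:

/* Housekeeping */
SET UPDATECALC OFF;
SET AGGMISSG ON;

/* Baseline Fix */
FIX ("Budget", "Jan")

   /* Normalization: prepare input data before the main roll-up */
   "Salaries" = "Headcount" * 5000;

   /* Main roll-up */
   CALC DIM ("Accounts", "Product", "Market");

ENDFIX

/* Back calc: re-sync an upper-level ratio after the roll-up */
FIX ("Budget", "Jan")
   "Avg Salary" = "Salaries" / "Headcount";
ENDFIX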

111. Housekeeping commands set the stage for the next sequence of calculations to occur. They prepare the Analytic Services calculator to properly process the commands that follow. SET commands and commands such as CLEARBLOCK and DATACOPY form part of the housekeeping section.

112. The baseline FIX is found near the top of most calc scripts and defines the script’s specific focus. The baseline FIX often includes a scenario reference because scenarios typically differ in calculation requirements. It often includes a time qualification (especially for accounting applications where data is calculated only for the current time frame). Calc scripts are generally broken up to reflect specific steps in a process; the baseline FIX statement is usually the indicator for script segregation. Break up calc scripts as might be defined by different baseline FIXes.

113. The normalization section of the calc script focuses on preparing data for the CALC DIM or CALC ALL roll-up. Input data may need to be manipulated or normalized before doing a roll-up. Normalization calculations are normally done in 2 phases:
a. Focused roll-up.
b. The normalization calculations themselves.

114. Focused roll-ups typically include setting bases for later allocations or adjustments. Focused roll-ups are wrapped in FIX statements.

115. The normalization calculation section contains the most lines of code. Organize normalization routines by category (type of calculation) and, if appropriate, wrap each category of normalization calculations together using a FIX statement so as to focus calculations only on the data blocks needed.

116. The main roll-up is generally performed with a CALC DIM on all the dimensions or with a CALC ALL. This is when the majority of data blocks are built. To do the main roll-up:
a. Sum up the dense dimensions.
b. CALC DIM across the sparse dimensions.

117. Back calc: after a main roll-up, upper-level calculations of the accounts dimension need to be recalculated to put them in sync with the lower levels. This is done using a back calc calculation script.

118. During a basic roll-up process, the accounts dimension is calculated first and the result of each account calculation is rolled up across all other dimensions one at a time. This can give wrong results for certain accounts (for example, ratios and percentages) at the upper levels, and these need to be recalculated either using Two-Pass calculation or a back calc script (see the sketch below).
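
A minimal back calc sketch; “Budget”, “Margin %”, “Margin”, and “Sales” are hypothetical member names:

/* recalculate a ratio account at upper levels after the main roll-up */
FIX ("Budget")
   "Margin %" = "Margin" / "Sales" * 100;
ENDFIX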

119. Use member formulas in a calc script when calculation requirements are more complicated (especially when characterized by a more complex order of calculation or dependency relationships). Member formulas are lines in the calc script referencing specific members in the outline that perform a mathematical calculation on the referenced member. Member formulas require special syntax under certain circumstances. Enclose member calculations within parentheses (…) to absolutely control the calculation order of member formulas that follow CALC DIM statements.

120. Usually, formulas in the outline are calculated as part of a CALC DIM statement, where the formulas are executed during the roll-up process.

121. Certain constructions involving member formulas must follow calc member block syntax rules. Such rules require member formula constructions to be:
a. Enclosed in parentheses, and
b. Preceded by a pointer member name (without a semicolon). The member name is generally any member of the dimension being calculated in the member block formulas.
This kind of syntax is required in the following cases (see the sketch below):
i. For IF statements.
ii. For VAR declarations (using temporary variables) that reference member names.
iii. For member formulas where the left side of the equation uses a cross-dimensional operator.
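
A minimal sketch of case iii; “Sales”, “Budget”, and “Actual” are hypothetical member names, and “Sales” acts as the pointer member:

"Sales" (
   /* left side uses the cross-dimensional operator, so a member block is required */
   "Budget"->"Sales" = "Actual"->"Sales" * 1.1;
)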

122. The Analytic Services calculator has a top-down approach to calculation, i.e. unless otherwise restricted by a FIX or IF command, every member formula in a calc script is executed everywhere in the database (on each and every data block). This top-down calculation approach allows accomplishing a large amount of computing with few lines of code.

123. There are 3 principal methods for focusing calculations:
a. Using FIX…ENDFIX, where calculations within the FIX scope are restricted to the FIX argument parameters.
b. Using IF…ENDIF, where calculations within the IF scope are restricted to the IF argument parameters.
c. Using the cross-dimensional operator (->), which allows hard-coding of member relationships within a formula.

124. FIX can be used only in calc scripts, not in member formulas in the outline. FIX may have one or several arguments that restrict the calculation scope. Arguments may be member names or macros and may follow in any order. Arguments by default indicate “and” logic. More complex focusing can be done with arguments using AND / OR operators.

125. FIX statements can be nested within FIX statements without limitation. Calculations specified within a FIX…ENDFIX block cannot have members outside the scope defined by the FIX statements. Preprocessing functions are supported within FIX statements.

126. IF statements can be used in both calc scripts and member formulas in the outline. IF may have one or several arguments that restrict the calculation scope. Arguments may be Booleans with member names or Booleans that reference macros. Arguments may follow in any order and include AND / OR operators. All calculations within IF…ENDIF statements are executed according to the restrictions in the arguments. Additional ELSE or ELSEIF statements may be included. IF statements are incorporated into member blocks and follow the syntax rules of member blocks.

127. FIX vs. IF considerations (see the sketch below):
a. FIX is index driven. This means its arguments are evaluated without bringing all data blocks into memory; only the data blocks specified in its arguments are brought in, and each FIX triggers a separate pass on those blocks. Hence, use FIX when the arguments are members of sparse dimensions.
b. In normalization, use FIX for focused roll-ups. For allocations, aggregations, or pushdowns, either use macros for calculating subset descendancies or use FIX along with macros.
c. IF is not index driven. IF statements are interpreted formulas; with IF statements, all blocks are brought into memory when the IF logic is applied. With such conditional logic, blocks are brought into memory only once even though multiple conditions may be applied.
d. Each IF statement triggers a pass on all data blocks unless otherwise restricted by a previous FIX statement.
e. IF is efficient for focusing calculations with conditional logic on members in the dense dimensions.
f. There is an important restriction in the use of IF: IF statements must be executed within a calc member block. Within a calc member block only member formulas can be executed, so roll-up functions such as CALC DIM, CALC ALL, and AGG can’t be used with IF.
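
An illustrative sketch of the FIX-on-sparse / IF-on-dense pattern; the member names (“Budget”, “East”, “Sales”, “Commission”) and the assumption about which dimensions are sparse or dense are hypothetical:

/* FIX on sparse members limits which blocks are brought into memory */
FIX ("Budget", @DESCENDANTS("East"))

   /* IF inside a member block applies conditional logic on dense members */
   "Commission" (
      IF ("Sales" > 10000)
         "Commission" = "Sales" * 0.05;
      ELSE
         "Commission" = "Sales" * 0.03;
      ENDIF;
   )

ENDFIX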

128. The Analytic Services calculator incorporates a broad range of functions that reference the relationships between members within a hierarchy or generation / level references. These fall into 3 categories: member set commands (macros), Booleans, and relationship functions.

129. Member set commands create a list of members (a set), which is acted upon by another function that incorporates the member set reference in its own syntax. The member set name describes the hierarchical relationship of the member list. Member sets are used in formulas and FIX commands where a subset of members is to be calculated upon, or as stand-alone member formulas. Member set commands are also used in the Analytic Services security system for setting up filters, which specify a user’s access to subsets of the database outline, and in partition area definitions.

130. Booleans create a list, which is acted upon. Booleans operate only in the context of IF, ELSE, or ELSEIF statements, defining the IF condition and returning true or false for calculation on a data block or member cells. Booleans incorporate a broad range of hierarchy relationships, level and generation references, and other member characteristics such as account type and UDAs.

131. Booleans and macros are used to control the flow of calculations. Relationship functions, by contrast, reference values of other members in the outline in relation to the member currently being calculated. The referenced value is then used in a member formula on the right side of the equation. Relationship functions return values of members used in calculating formulas. The reference can also be based on a shared membership relationship.

132. Analytic Services supports 2 categories of variables:
a. Substitution variables: set server-wide or specific to applications and databases, for calculating and reporting on values that reference specific members.

b. Temporary variables: These are used within CALC scripts to capture temporary values used for intermediate calculations.

133. Substitution variables are typically used for references where the member assigned to the variable’s value rolls over each month, such as the definition of the current month for financial reporting. Substitution variables are commonly used in situations where a value (current month) references a member name (Feb) across multiple calc script and spreadsheet reporting references. (From the Application Manager desktop, select Server / Substitution Variables.) The benefit of using substitution variables is that the variable is maintained in one place at one time rather than across multiple calc scripts and spreadsheets. Substitution variables can have server-specific, application-specific, or database-specific values. Substitution variables can be set and updated through ESSCMD, the Analytic Services batching facility.

134. Within a calc script, type a substitution variable name prefaced with an ampersand (&) anywhere a member formula or reference using a regular member name would be used (see the sketch below). Also set up the scope for the variables using Options / Set Substitution Variables Scope.
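
A minimal sketch; the substitution variable name CurrMonth and the members “Budget” and “Accounts” are hypothetical:

/* &CurrMonth is replaced at run time with the member currently assigned
   to the CurrMonth substitution variable (for example "Feb") */
FIX ("Budget", &CurrMonth)
   CALC DIM ("Accounts");
ENDFIX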

135. Type the substitution variable name on the spreadsheet preceded by an ampersand (&) in any page header or row / column header where otherwise a member name would be used. After retrieving, the substitution variables are replaced by the actual member names, thus rendering the spreadsheet unusable for future variable-based retrievals.

136. Temporary variables use the VAR data declaration command. Temporary variables are script specific and therefore cannot cross over in their use between databases. Temporary variables within calc scripts are typically used to capture temporary values that are referenced in member formula calculations but not stored in an actual member for reporting. The benefits are:
a. Complex formulas need to be written and debugged only once, then used on multiple occasions within the calc script.
b. Complex formulas are calculated only once, rather than on each occasion requiring the reference, thereby reducing calculation time.

137. Setting up temporary variables involves 3 steps (see the sketch below):
a. Declare the variable in the housekeeping section of the script using VAR.
b. Define the variable using a member formula construction. If the variable references member names in the outline, then the syntax must follow the rules of calc member blocks.
c. Use the variable as a reference in subsequent member formulas.
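
An illustrative sketch of the 3 steps; “Expense Ratio”, “COGS”, “Rent”, and “Sales” are hypothetical member names:

/* step a: declare the temporary variable in the housekeeping section */
VAR BaseExpense;

/* steps b and c: assign and then reference it inside a calc member block,
   because the assignment references member names */
"Expense Ratio" (
   BaseExpense = "COGS" + "Rent";
   "Expense Ratio" = BaseExpense / "Sales";
)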

138. The calc script development process is broken into 2 phases:
a. The prototype phase, where scripts are developed and tested for baseline accuracy.
b. The pilot phase, where scripts are tested for performance and capture of execution conditions.

139. The recommended process for creating a prototype script has 3 steps:
a. Create test input data.
b. Create audit sheets.
c. Implement draft calc scripts and test the results.

140. Execute CLEARBLOCK ALL before executing any calc script for testing purposes. This clears all previous input data and calculations. It is important that each round of testing is done on a clean, empty database with no possibility of values being carried over from a previous input and calculation cycle.

141. The OLAP Server documentation contains a segment titled “Error Messages”, which shows all error codes.

142. Make 2 types of adjustments to the prototype script during the pilot phase:
a. Performance:
i. Focusing calculations on specific data blocks.
ii. Revisiting sparse / dense settings.
iii. Revisiting dynamic calc.
b. Exception trapping:
i. Actual input data at different levels requiring adjustments to allocation algorithms.
ii. Extra logic loops to handle exception conditions.
iii. Data copy commands to create data blocks.

143. Three tips to minimize the calculation time of a calc script are:
a. Block visualization.
b. Pass tracking – the number of times a data block is accessed.
c. Block minimizing.

144. Aggregate Missing Values is a special calculation function in Analytic Services that speeds up calculations by eliminating redundant aggregations in the roll-up process. The default behavior of Aggregate Missing Values is set at the database level (Database – Settings – General). The default behavior can be overridden by inserting SET AGGMISSG ON/OFF in a calc script (see the sketch below).
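
A minimal sketch; the dimension names (“Accounts”, “Product”, “Market”) are hypothetical:

/* override the database-level default for this script only */
SET AGGMISSG ON;
CALC DIM ("Accounts", "Product", "Market");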

145. With Aggregate Missing Values selected or set ON in a calc script, calculation performance is significantly improved during the roll-up process:
a. Within a data block, aggregations on cells that can be performed in 2 ways are summed only once.
b. Between data blocks (on sparse dimensions), totals that can be calculated by 2 pathways aggregating from other data block combinations are calculated only once.
c. The calculator also has an algorithm that attempts to compute via the shortest path.

146. Set Aggregate Missing Values OFF when values are loaded at a parent level and need to be protected during the roll-up process.

147. Loading to upper levels that are not leaf nodes across business view dimensions is acceptable when loading values in the Accounts dimension. However, the values must then be allocated or pushed down to lower levels across another business view dimension.

148. Calculating dense dimensions first and sparse dimensions second delivers the shortest path in terms of the number of passes (reads/writes to and from disk) on data blocks.

149. In a CALC ALL statement, Analytic Services naturally calculates the dense dimensions first, then the sparse dimensions. In a CALC DIM statement too, Analytic Services calculates dense dimensions first, whatever the order of dimensions stated in the statement.

150. If parallel calculation is enabled, Analytic Services analyzes all the tasks in a calculation pass and breaks them into sub-tasks. The sub-tasks that can run independently of each other are scheduled to run simultaneously, thus optimizing database performance.

151. To use parallel calculation per calculation pass, enable it at the server level, application level, or database level using either of the following methods:
a. Add or edit the appropriate configuration settings in the essbase.cfg file.
b. Add the appropriate calculation commands to a calculation script.

152. Set the value for the CALCPARALLEL parameter in the configuration file. A value of 0 means parallel calculation is not enabled; values between 1 and 4 mean parallel calculation is enabled, and the value specifies the number of threads that can be executed simultaneously. Set the value to be one less than the number of processors available for calculation. The extra processor can then be used by the operating system or by the Analytic Services process responsible for writing out dirty blocks from the cache.

153. Use the SET CALCPARALLEL command in a calc script to enable or disable parallel calculation (see the sketch below).
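
A minimal sketch; the thread count of 3 is an arbitrary illustration:

/* request up to 3 threads for this script's calculation passes */
SET CALCPARALLEL 3;
CALC ALL;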

154. Analytic Services structures are often driven and distinguished by scenario members. Such structures include:
a. Partitioned database architecture.
b. User access / security.
c. Complexity and detail of the Accounts dimension.
d. Type and level of data inputs.
e. Calc script functionality.

155. To understand the impact of scenarios and to document input and calculation requirements, develop a normalization table. Elements of a normalization table include:
a. Accounts listed on the row axis. List accounts that are input members.
b. Scenarios as major sections on the column axis.
c. For each account / scenario intersection, define the following:
i. Data type or sourcing: input, formula, CALC DIM roll-up.
ii. Input level: generation or level numbers.
iii. Push-to level: to what generation or level the input data needs to be copied or allocated.
iv. Methodology: method to be used for allocation or copying.
d. Thus, for each input account, identify the input level, push-to level, and method for each business view dimension.

156. The Dynamic Calc feature reduces batch calculation time and hard drive storage requirements. Dynamic calc options allow members in the outline to be calculated on the fly when requested by users rather than during the batch calculation process. There are 2 types of dynamic calc settings:
a. Dynamic Calc (Non-Store): values are calculated on the fly and not retained in the database.
b. Dynamic Calc and Store: values are calculated on the fly and retained in the database. Any subsequent retrieval reflects such stored values.

157. Analytic Services lets you tag dynamic calc members with special formatting, so as to differentiate them on spreadsheet reports:
a. Essbase \ Options, select Cells – Use Styles in the Display tab.
b. In the Style tab, select Dynamic Calc from the members section and specify the format.

158. The normal order of calculation in a batch process is as follows:
a. Dimension tagged Accounts, if it is dense.
b. Dense dimensions in outline order or CALC DIM statement order.
c. Dimension tagged Accounts, if it is sparse.
d. Sparse dimensions in outline order or CALC DIM statement order.
e. Two-Pass calculations on members in the Accounts-tagged dimension.

159. Upon retrieval, the calculation order for dynamic calc members (Store and Non-Store) is as follows:
a. Dimension tagged Accounts (if sparse).
b. Dimension tagged Time (if sparse).
c. Sparse dimensions in outline order.
d. Dimension tagged Accounts (if dense).
e. Dimension tagged Time (if dense).
f. Dense dimensions in outline order.
g. Members tagged as dynamic and tagged with Two-Pass calculation.
h. The reason for having sparse dimensions before dense is that blocks must be virtually created before they are filled up.

160. Use the Two-Pass Calc tag when a member needs to get calculated last.

161. The following are baseline characteristics of members with a dynamic calc tag:

a. Level 0 members without outline formulas can’t be tagged as dynamic calc.
b. The value of a dynamic calc member is calculated according to the consolidation operators of its children or its own outline formula. However, the value is discarded after retrieval.
c. During a batch roll-up process, dynamic calc members are not calculated. Thus, dynamic calc members may not appear on the left-hand side of a member formula. However, they may get calculated to derive dependent values of stored members. Thus, dynamic calc members may appear on the right-hand side of a member formula.
d. Tagging members of a dense dimension as dynamic calc reduces block size, thereby potentially improving batch calculation efficiency.

e. Tagging members of a sparse dimension as dynamic calc reduces both the potential and actual number of blocks stored.

f. Dynamic calc members are skipped during data load. No error message is generated during data load, if there is an attempt to load data to a dynamic calc member.

162. Dense dimension guidelines for dynamic calc (Non-Store):
a. When a user retrieves information to create a report, an entire data block with the relevant information is brought into memory. Once in memory, the calculation of dynamic members is relatively efficient because:
i. The values of stored members whose cells are used to calculate the dynamic members are usually all within the single block that is brought into memory for dynamic calculation.
ii. No additional read / write time on additional data blocks is necessary for each incremental dynamic calc member that needs calculating, because all dynamic members are usually associated with the same data block and dependent on the same stored members.
b. Assigning members dynamic calc (Non-Store) within a dense dimension reduces data block size. A smaller block size potentially improves performance because:
i. Within a range, smaller blocks move into and out of memory faster than bigger blocks.
ii. A smaller initial block size may allow defining an additional dimension as dense that would otherwise have been sparse, thus potentially reducing the overall number of blocks that need to be moved in and out of memory for a given batch calculation.

163. Sparse dimension guidelines for dynamic calc (Non-Store): assigning upper-level members in a sparse dimension as dynamic calc eliminates the creation of potentially many data blocks, thus reducing initial roll-up calculation time and any subsequent passes on data blocks. However, dynamic calculations on sparse members require bringing multiple blocks into memory. The dynamic calculation penalty on retrieval across sparse dimensions is impacted by 3 principles:
a. Fan-out: setting members with a large number of children as dynamic calc (Non-Store) may result in unacceptable retrieval time.
b. Stack-up: a dynamic member with many descendants that are also dynamic calc may result in a stack-up of sequential dynamic calculations that could significantly increase retrieval or calculation time.
c. Sandwich: avoid sandwich situations where members of different storage types sit between each other in the hierarchy. Stored members sandwiched between dynamic members may result in incorrect calculations. Dynamic members sandwiched between stored members may cause data integrity issues.

164. Dynamic Calc & Store members: the value of the member is calculated according to the consolidation operators or member formula in the outline. It is calculated only on retrieval. After retrieval, the value is stored within the data block. Subsequent retrievals are as fast as for regular stored members.
a. Tagging dense dimension members as dynamic calc & store does not reduce block size and only marginally reduces batch calculation time.
b. Tagging sparse dimension members as dynamic calc & store causes the database size to increase over time as users retrieve on such members.
c. During a batch calculation, dynamic calc & store members are bypassed. However, if the calculator discovers that the children of such members are recalculated, it marks the data block of the parent member as “requiring calculation”. The recalculation, however, occurs upon the next retrieval request. At that time, the block is marked as calculated and the results are stored.

d. Two commands may be used in batch calc scripts to cleanse previously calculated dynamic calc & store members (see the sketch below):
i. CLEARBLOCK DYNAMIC: removes data blocks that are dynamic calc & store.
ii. CLEARDATA: marks dynamic calc & store members as non-calculated, forcing recalculation upon the next retrieval.
e. Dynamic calc & store members are skipped during data load. No error message is generated during the data load.
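
A minimal housekeeping sketch of the first of these commands; the FIX scope and the member “Budget” are hypothetical:

/* remove previously materialized Dynamic Calc & Store blocks for Budget */
FIX ("Budget")
   CLEARBLOCK DYNAMIC;
ENDFIX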

165. Design considerations for dynamic calc (Store & Non-Store):
a. Consider using dynamic calc (Non-Store) before dynamic calc & store.
b. Consider dynamic calc & store for members in sparse dimensions with complex formulas or calculations.
c. Do not use dynamic calc & store for upper-level members of dense dimensions.

166. Intelligent calc is a function of the Analytic Services calculator that only grabs and calculates those blocks that need to be calculated. Intelligent calc works based on data block marking: as calculations are performed, blocks are marked as dirty or clean. When active, intelligent calc looks only for dirty blocks to calculate. By default, intelligent calc is turned ON for all databases at the server level.

167. To set intelligent calc ON or OFF at the server level, set UPDATECALC TRUE or FALSE in the essbase.cfg file. To override the server setting, use SET UPDATECALC ON/OFF within a calc script (see the sketch below).
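
A minimal sketch of overriding the server setting for one script:

/* turn intelligent calc off so all blocks, clean or dirty, are calculated */
SET UPDATECALC OFF;
CALC ALL;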

168. There are 3 definable circumstances under which intelligent calc operates, thereby marking blocks clean:
a. CALC DIM:
i. When a calc script contains a CALC ALL statement or a CALC DIM statement whose arguments include all dimensions:
1. Any input blocks are calculated and marked clean.
2. All newly created blocks resulting from the CALC DIM roll-up process are calculated and marked clean.
3. All previously existing blocks marked dirty are now calculated and marked clean.
ii. If a calc script does not include all dimensions in a CALC DIM statement, even though intelligent calc is ON:
1. All data blocks, not just dirty ones, are calculated.
2. All data blocks are left in their previously marked clean or dirty status.
b. SET CLEARUPDATESTATUS AFTER: if you execute any calc script which includes SET CLEARUPDATESTATUS AFTER, data blocks marked clean are not calculated. All blocks previously marked dirty are calculated and then marked clean.
c. SET CLEARUPDATESTATUS ONLY: SET CLEARUPDATESTATUS ONLY is a calc script command that does not calculate. When included in a calc script, this command only marks dirty blocks clean. Use this command in a calc script immediately following a script where intelligent calc has been turned OFF.

169. There are 4 definable circumstances under which data blocks are marked dirty, thereby making them eligible for calculation on the next round using intelligent calc:
a. Input data.
b. Modified data.
c. Ancestors of an input or modified data block.
d. Restructure of the database: dense dimension changes cause all blocks to be marked dirty; sparse dimension changes cause only the affected blocks and their ancestors to be marked dirty.

170. The intelligent calc function is used in interactive or iterative situations where small incremental changes are made to a database and it is not necessary to redo the entire roll-up calculation.

171. Intelligent calc provides no benefit where upper-level data blocks are being created for the first time from the input blocks (during an initial main roll-up).