

Best Practice
Data Migration for Banking Services Solutions – Customer Experience

Best_Practice_Data_Migration_V21.doc – 03.11.2008
SAP AG, Dietmar-Hopp-Allee 16, D-69190 Walldorf

CS STATUS: customer published
DATE: Nov-03, 2008
VERSION: 2.1
SOLUTION MANAGEMENT PHASE: Setup Operations
SAP SOLUTION: Transaction Banking 3.0 & 4.0 and Banking Services 5.0 & 6.0
TOPIC AREA: Business Process Operations
SOLUTION MANAGER AREA: Transactional Consistency & Data Integrity


Table of Contents

1 Management Summary
  1.1 Goal of Using This Service
  1.2 Alternative Practices
  1.3 Staff and Skills Requirements
  1.4 System Requirements
  1.5 Duration, Timing and Migration Strategy
    1.5.1 Big Bang
    1.5.2 Parallel Processing
    1.5.3 In 'Slices'
2 Best Practice
  2.1 Preliminary Migration Tasks
  2.2 Migration Activities During the Business Blueprint Phase
    2.2.1 Organizational Aspects
    2.2.2 Migration Objects
    2.2.3 Sequence of Migration Objects
      2.2.3.1 Migration Strategy
      2.2.3.2 Data Volume
      2.2.3.3 Source System(s)
      2.2.3.4 Dependencies Between Migration Objects
      2.2.3.5 Verification of Migrated Data
    2.2.4 Analysis of the Source System(s) Data
    2.2.5 ETL: Strategy and Tools
    2.2.6 Data Reconciliation
    2.2.7 Dependencies Between Data Loading, Data Reconciliation, and Go-Live
    2.2.8 Definition of Acceptance Criteria
    2.2.9 External Impacts
    2.2.10 Technical Aspects
      2.2.10.1 Program Definition
      2.2.10.2 Transfer of Files
    2.2.11 Test Concept
    2.2.12 Realization
    2.2.13 Testing and Final Preparation
    2.2.14 Go-Live
3 Further Information
  3.1 Tasks After Migration


1 Management Summary

This Run SAP document sets out the Data Migration approach and strategy for Banking Services. It is a collection of customer experiences from other banking migration projects.

A data migration in the banking environment is not only a technical challenge; it also challenges project management, because of the SAP implementation projects running in parallel. For this reason, the document briefly explains how to set up a migration team and which migration strategies can be chosen.

The project scope of the required SAP migration objects and their migration sequence usually determines the complexity and duration of the migration project. For this reason, the main part of the document provides insight into how these elements interact.

Dependencies between data loading and data reconciliation are important aspects of Data Migration. Data reconciliation, one of the critical success factors of Data Migration, is therefore covered in the second part of this Run SAP document.

Given the fact that an SAP migration project is usually structured like the SAP implementation projects running in parallel (e.g. according to the ASAP methodology), the Run SAP document closes by highlighting the main migration tasks related to the project phases Testing, Final Preparation and Go-Live.

1.1 Goal of Using This Service

The document aims to provide a global overview of the migration process from the legacy system to the SAP Banking platform. It is not a complete "cookbook" and does not cover all necessary steps in detail.

Rather, the purpose of the documentation is to explain the complexity of a migration project, the necessary steps, and their relation to the running implementation project. It is a collection of experiences from earlier migration projects, providing information on migration approach, strategy, sequence of migration objects, ETL methodology, testing, quality assurance, preparation, development and execution of the migration.

The Staff and Skills Requirements section describes the resource requirements, roles and responsibilities as they relate to the migration.

1.2 Alternative Practices

SAP Banking consultants should support you during the design and setup of the data migration project. Additional advice and experience for Data Migration can be delivered as part of an SAP premium support engagement, that is: SAP Enterprise Support, SAP MaxAttention, or SAP Safeguarding.

1.3 Staff and Skills Requirements

Before running a migration project, it is necessary to ensure that the migration team has the following knowledge and skills:
- Detailed knowledge of banking processes and the SAP solution
- Knowledge of the migration objects used in the project
- An overview of the project itself


It is necessary to define the data migration strategy as early as possible, as the effort is generally underestimated. All project members have to be involved from the beginning of the project. All documents from the Blueprint phase should have a chapter that describes the impact on data migration.

Persons with deep knowledge of the source systems should be available to:
- Define a migration scenario
- Define the involved systems
- Decide which part of the development (extraction, transformation) can be done directly in the source systems
- Define transformation rules
- Give the order to cleanse wrong or incomplete data in the source system
- Define fields that are necessary for the target system or for systems that receive data from the target systems (e.g. BW, user front ends)

Additional resources:
- Developers for writing extract programs in the source system(s)
- Developers for writing programs for transformation, analysis, and cleansing of the source data
- Developers/customizers with good knowledge of the standard SAP techniques for data migration (External Data Transfer, Direct Input) for loading objects into Banking Services 6.0 and comparing the loaded data with the provided source data
- Access to all project members responsible for customizing and development, to verify whether something is relevant for data migration or not
- Access to employees from the auditing department for:
  - Deciding which data have to be controlled, on which basis, during or after data migration (automated or manually; via numbers, sums, or lists; all data or just spot checks, depending on regular reporting rules and internal instructions)
  - Defining the sign-off process for verifying that the data migration has taken place according to the predefined rules and that everything is complete and correct
- Access to a person with detailed knowledge of the bank's job processing tool, for:
  - Deciding whether to use it or not (alternatives?)
  - Describing the JCL (job control language) for implementing data migration programs
  - Testing

1.4 System Requirements

Valid from TRBK 3.0 to Banking Services 6.0

1.5 Duration, Timing and Migration Strategy

The duration and timing of a migration project depend on the chosen migration strategy (based on experience, a minimum of 3–6 months). In general, the setup of a migration project should be aligned with the ASAP project methodology.

There is definitely no "one size fits all" strategy for Data Migration. The next chapters give an overview of three possible migration strategies: big bang, parallel processing, and migration in "slices".


1.5.1 Big Bang

This type of Data Migration is often chosen for a new software implementation. There is no parallel processing of data in the old and the new system. Depending on data volumes, you can choose between two scenarios:

Scenario 1:
- The source system is stopped completely (preferably on a long weekend)
- Data are extracted, transformed and loaded
- After a successful system verification, the target system is set productive

Scenario 2:
- The biggest part of the data is extracted, transformed, loaded and checked while the source system is still productive
- On a predefined weekend, the source system is stopped
- Only the delta has to be migrated and controlled
- The target system is set productive

Advantages:
- Short period of data migration
- Non-recurring effort in human and system resources
- Communication (internal, external) is only needed once
- No additional, temporary interfaces and programs
- No migration into an already productive system
- Fallback is easier on the migration weekend

Disadvantages:
- No pilot migration possible
- No risk diversification
- Testing of migration and follow-up processes is even more important
- 100% must be migrated correctly
- Downtime of up to a few days

1.5.2 Parallel Processing

The source system is productive during Data Migration and for a certain period after Data Migration. The aim is to prove that the new software shows the same results as the old system. So every process has to be executed twice, and certain reconciliation points have to be defined to compare the results.

Advantages:
- Proof that the new system landscape is working properly
- Switch to the new system only when everything is working fine

Disadvantages:
- All data exist twice
- Extensive maintenance of all dependent systems
- Parallel delivery of interfaces
- Additional programs and human resources for verification of results
- Necessity to temporarily adapt processes to get the same results
- Problems with new or changing processes

1.5.3 In ‘Slices’

This strategy allows migrating a certain number of accounts or cards in slices, depending on what the bank defines. You can begin with a pilot tranche to reduce risk or validate processes before loading mass data. You can apply this strategy only if at least some accounts or cards do not depend on other accounts of the same system. As a precondition for this migration strategy, the cut into independent slices must be possible.


Example: A special internal account 123 is needed to post an item in a foreign currency on a customer account in home currency. The delivering system only knows account 123. If such an account is needed in the old and in the new system at the same time, it is not possible to migrate this account (unless you rebuild the whole process before starting the data migration).

Advantages:
- Risk mitigation through migration of a pilot tranche ("uncritical" data)
- Small amount of data to validate the migration process in a productive system
- Safety through automation of the migration process
- No fixed date for Data Migration necessary
- Short or no downtime

Disadvantages:
- Additional tools and programs required to localize accounts (dispatcher) and their status during migration
- Parallel delivery of changes
- Automation is a must
- High effort in testing
- Long period of migration, complicated processes
- Many personnel resources needed
- No fallback after migration of the first tranche
- Migration into a productive system

As mentioned above, it is vital to come to a decision at an early stage of the project. The chosen strategy has an impact not only on the migration team itself, but also on the other teams of the project and maybe even on surrounding systems that are not obviously involved.

Another important question is: What kind of history is needed? As the biggest problem of Data Migration is the execution time, act according to the following principle: as much as needed – as little as possible.

Example 1:

Loading the payment items of the past 10 years: depending on the product, this can result in a huge number of payment items. A bank normally has an archive system where the user can find every detail of past payments. The data from Banking Services 6.0 will also have to be transferred to the archive system sooner or later.

Example 2:

If there is a need to load payment items and account settlements for the current and the previous year, you also need the limits and interest rates of this period. So you have to ensure that the customizing of the condition groups and reference interest rates is set up in the right way. If the lead system for overdraft limits is not the source system of the accounts, you need an additional extract program from the leading system. On the other hand, you don't have to rebuild the whole lifecycle of an account when you just need the correct historical interest rates.


2 Best Practice

2.1 Preliminary Migration Tasks

The following list gives you an overview of nice-to-have documents and tools which speed up your migration project:

For preparation:
- This Data Migration document
- Documentation about transactions and fields of the source and the target system(s)

For the first workshops:
- Pros and cons of possible Data Migration strategies
- Slides explaining the Data Migration strategies, if possible adjusted to the language and targets of the project (examples!)
- List of Data Migration objects
- Project plan for software implementation and Data Migration
- Lists of contact persons for each subject (most projects have responsible persons for business partner, products (accounts, cards), payments, correspondence, settlement, …) with mail address, phone number, deputy
- Evaluation of a tool for data transformation or documentation (Business Objects, Access DB, Excel, …)

For further work:
- Lists of attributes (source and target system) for every object, with the definition of transformation rules
- Blueprint documents, especially the relevant chapters from documents other than Data Migration

Please note:

Depending on the Data Migration strategy and the implementation strategy in general, some of the ideas turnout to be very good and future-oriented.

When the first customers went productive with TRBK 2.0, SAP assumed that every bank would switch its software from the old legacy software to TRBK and set its migrated accounts and cards productive on the same day. It was not possible to migrate accounts into a productive system with already productive accounts. Also, most of the EOD programs could not distinguish between productive and non-productive accounts. That was a major problem if you were still in the process of loading data.

All these problems were discussed, and finally SAP created the so-called migration group, which disconnects the posting date of the bank posting area from objects (accounts, cards) that are in the process of Data Migration.


2.2 Migration Activities During the Business Blueprint Phase

A migration project is set up according to the ASAP methodology. In the following chapters, the phases Business Blueprint, Realization, Test and Go-Live are described in more detail.

During the Business Blueprint phase, the evaluated migration strategy has to be validated in detail.

2.2.1 Organizational Aspects

A core banking project is usually divided into a number of teams. Every team has to deliver a business blueprint document for its specific subject. A subject, in most cases, is not a business process but a function or a group of functions in the core banking system, such as:
- Business partner
- Account lifecycle
- Card lifecycle
- End-of-day processing
- Product change
- Data migration
- Postings
- Correspondence

If not explicitly demanded, most team members just describe the new processes but do not define:
- The transition from the old to the new system and from the old to the new processes – for example, if you define new fields on an existing product, do you need a change of the product version for accounts already using this product?
- Aspects of data migration – for example: you are not allowed to plan a data migration at a month end because the GL analysts are busy with the end-of-month processes.

Possible solution:
- Insert a chapter describing the aspects relevant for Data Migration into the business blueprint template
- Ask for descriptions of interfaces, e.g. for user interfaces (UIs). At least the described fields have to be delivered in the Data Migration process, too. Of special interest are the custom-developed fields which are completely new.
- Ask also for historical aspects: What is needed to feed a certain turnover class correctly (e.g. transaction types 123456 and 234567 over a period of two settlements)? – so that the customer does not have a limit of zero because of an unfilled turnover class. Or: Are the tax documents at the end of the year correct, even if the first account settlements of the current year were done in the old system and the remaining settlements were calculated in the new solution?

A complete migration process does not only include extraction, transformation, and loading of data from the old legacy system into the new one, but also:
- Delivery of changed or new data to other dependent systems, such as front-end systems, archive systems, or BI, so that the employees of the bank are able to work according to the new processes after Data Migration
- Delivery of changed or new data to leading systems, if data are changed by the migration process itself

Example:

The Customer Information Management system (CIM) has the lead over products used by a customer. Normally, all processes that change these customer-product relationships have to do so via CIM, and CIM is the initiator of such changes.


Imagine the bank has five current account products in its legacy system. In the new system, the customer wants to diversify more, so there will be six additional products. The person responsible for Data Migration has to transform the existing accounts from five old to eleven new products via transformation rules. After go-live, the Banking Services system has to inform the other system, CIM, about the new customer-product relationships so that CIM can take over the lead again.

Note: Generally it is not a good idea to combine a technically oriented Data Migration with application-oriented changes.

Reasons:
- The validation of data is more complicated.
- The effort of testing is higher.
- The bank's customers and employees must be informed about changes in the functions of their accounts, and about the Data Migration itself if the system is stopped for a weekend or so.

Depending on the migration strategy, the following has to be defined:
- Timeframes for the whole migration process or a "slice"
- Flowchart of the migration process
- Definition of processes to be stopped for contracts in migration
- Definition of processes to be changed for contracts in migration
- Steps of data validation

2.2.2 Migration Objects

According to the IMG documentation from SAP, a migration object is a "quantity of data that has similar business purposes and that can be transferred during migration from the legacy system to Account Management (FS-AM) (sometimes separated into master and flow data). This also includes objects that are not business objects, but can be viewed as individual objects."

The Business Process Platform from SAP (release 6.0) manages the following migration objects and channels (for even more details, see the documentation in the IMG under Technical Documentation for Account Management (FS-AM) → Concepts and Guidelines → Migration → Migration Objects, and Technical Documentation for Master Contract Management (FS-MCM) → Concepts and Guidelines → Migration):

- Direct Debit Order (master data): Direct Input, BAPI, Dialog
- Billing and Open Items (master data: billing; flow data: open items): master data via the account contract; flow data via RFC-enabled function modules to the bill creation system: BCA_API_BL_OPEN_ITEMS_MIG (transfer of open items to the bill creation system) and BCA_API_BL_REV_OPEN_ITEMS_MIG (reverse open items in the bill creation system)
- Settlement: account settlements as such cannot be migrated; however, there is a special logic with regard to the settlement calculation start and the settlement calculation without posting. This means that settlements already executed in the legacy system are also calculated in Account Management. These settlements are not, however, allowed to generate any postings.
- Disbursement (master data): RFC-enabled function modules BCA_RFC_OR_DISB_CREATE_ACTIV (Create and Activate Disbursement), BCA_RFC_OR_DISB_CHANGE_ACTIV (Change and Activate Disbursement), and BCA_RFC_OR_DISB_CHANGE_DEACTIV (Change and Deactivate Disbursement). You need to transfer the value BDDISB as the entry origin or editing origin for the external data transfer.
- Notice on Amount: Direct Input
- Standing Order (master data): Direct Input, BAPI, Dialog
- Time Deposit (master data): Direct Input
- Capitalization (master data and indirectly flow data): you need to transfer the value BDCAPT as the entry origin or processing origin for the external data transfer
- Linked Accounts: no information available yet
- Card Pool Cancellation (master data): Direct Input
- Card Pool Contract (master data): Direct Input (create, change), BAPI (change)
- Card Cancellation (master data): Direct Input
- Card Contract (master data): Direct Input (create, change), BAPI (change)
- Account Closure (order): Direct Input
- Bank Statement (master data, flow data): Direct Input, BAPI; also via the account contract
- Account Holder Change: Direct Input
- Account-Card Relationship (master data): Direct Input (create, change), BAPI (change)
- Account Contract (master data): Direct Input, BAPI
- Correspondence-Recipient Management: Direct Input, BAPI, Dialog
- PLM Document (flow data): Direct Input
- Product Change (Card): Direct Input
- Product Change (Card Pool): Direct Input
- Product Change (Account): Direct Input
- Balance Confirmation (master and flow data): Direct Input, BAPI
- Starting Balance: Direct Input
- Deferral: Direct Input, BAPI
- Forward Order: Direct Input
- Waiver: RFC-enabled function modules BCA_RFC_OR_WAIV_CREATE_ACTVT (Create and Activate Waiver), BCA_RFC_OR_WAIV_CHANGE_ACTVT (Change and Activate Waiver), and BCA_RFC_OR_WAIV_CHANGE_DCTVTE (Change and Deactivate Waiver)
- Early Payoff: RFC-enabled function modules BCA_RFC_OR_PAYF_CREATE_ACTVT (Create and Activate Loan Payoff), BCA_RFC_OR_PAYF_CHANGE_ACTVT (Change and Activate Loan Payoff), and BCA_RFC_OR_PAYF_CHANGE_DCTVTE (Change and Deactivate Loan Payoff)
- Counter: Direct Input, BAPI (after go-live), Dialog (after go-live)
- Extension (master and indirectly flow data): you need to transfer the value BDEXTN as the entry origin or processing origin for the external data transfer
- Skip: you need to transfer the value BDSKIP as the entry origin or processing origin for the external data transfer
- Payment Item, Info Item: Direct Input
- Payment Agreement (master data, flow data): master data via Direct Input, flow data via the account contract. Debit position and installment payments: data is migrated using info items. Counters for unpaid installments: data is migrated using counters (BAPI_BCA_COUNTER_CHANGE). Installment status and dates (next debit position, next monitoring): data is migrated using a BAPI from payment monitoring (BAPI_BCA_PAYMON_CREATE_FR_DATA). Dates for the due date display (correspondence at end of payment phase, correspondence at end of payment agreement, settlement at end of payment agreement, action at end of payment agreement): data is migrated using the master data (with synchronization).
- Payment Form: Direct Input
- Payment Distribution Item (flow data): as for the payment agreement – debit position and installment payments via info items; counters for unpaid installments via counters (BAPI_BCA_COUNTER_CHANGE); installment status and dates via a BAPI from payment monitoring (BAPI_BCA_PAYMON_CREATE_FR_DATA); dates for the due date display via the master data (with synchronization)
- Turnover Class: filled automatically with the migration of payment items
- Prenote: Direct Input
- Effective Cash Pooling (flow data): RFC-enabled function modules /FSECP/API_DUE_DATA_MIGR (migration of ECP due dates) and /FSECP/API_LAST_ECP_DATA_MIGR (migration of ECP data: last run date)
- Facility (master data, flow data): migration with the main contract, participant main contracts, and main contract hierarchies. To transfer the flow data from the account-managing system, use the report Activate Facilities (/FSFAC/AL_PP_MIG_FAC_ACTIVATE).
- Master Contract Termination (order): Direct Input; you must transfer the value BDTTAP as the entry origin or processing origin for the external data transfer
- Master Contract Hierarchy (master data): Direct Input
- Combined Settlement: RFC-enabled function module /FSBPR/IN_RFC_SETTLE_EV_CR_CH


2.2.3 Sequence of Migration Objects

As you might have seen, the documentation does not contain much information about the sequence of migration objects. Maybe this is because there is no "one and only" truth – as usual, there is more than one possible solution.

When defining the sequence, consider the aspects described in the following sections.

2.2.3.1 Migration Strategy

When defining the migration strategy, you must take into account that some migration objects cannot be seen independently. So a given sequence has an influence on the overall strategy, but the strategy itself in turn affects the sequence of migration objects.

Examples: It does not matter whether you want to migrate accounts, cards or a master contract; you always need a business partner with the correct role in your target system.

This seems obvious at first sight, but in some countries it is not allowed to have business partner and account data in the same system. In any case, you have to migrate a business partner with at least a few basic (non-identifying) data. To assure the consistency of these data during daily business and the migration period, a program must be written to compare the leading and the dependent systems (see the sketch below).
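The following is a minimal sketch of such a comparison program, in Python and outside any SAP system, assuming both the leading and the dependent system can deliver flat CSV extracts keyed by a partner ID; the file layout and the compared fields are hypothetical:

import csv

# Basic, non-identifying attributes that must match between the leading
# and the dependent system (hypothetical selection).
COMPARED_FIELDS = ["name", "legal_form", "country", "role"]

def load_partners(path: str) -> dict:
    """Read a partner extract into a dict keyed by partner ID."""
    with open(path, newline="", encoding="utf-8") as f:
        return {row["partner_id"]: row for row in csv.DictReader(f)}

def compare(leading: dict, dependent: dict):
    """Yield one finding per missing partner or differing field value."""
    for pid, lead_row in leading.items():
        dep_row = dependent.get(pid)
        if dep_row is None:
            yield (pid, "missing in dependent system", None, None)
            continue
        for field in COMPARED_FIELDS:
            if lead_row[field] != dep_row[field]:
                yield (pid, field, lead_row[field], dep_row[field])
    for pid in dependent.keys() - leading.keys():
        yield (pid, "missing in leading system", None, None)

if __name__ == "__main__":
    findings = list(compare(load_partners("leading_bp.csv"),
                            load_partners("dependent_bp.csv")))
    for finding in findings:
        print(finding)
    print(f"{len(findings)} inconsistencies found")

Such a check would typically run daily during the migration period, so that divergence between the systems is detected before it reaches the reconciliation at go-live.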

The effort rises enormously if you have to migrate while the relevant systems keep running and/or your (planned) go-live date keeps moving. What you have migrated today may no longer be valid tomorrow. So you have to deliver master data changes and changes of flow data as well.

Note: Whenever possible, try to migrate volatile data (e.g. prenotes, PLM documents, forward orders) directly before the effective go-live. Changes should not be allowed in the source system for at least a short period before go-live.

Technically, it is possible to migrate a high number of objects and their history for every account, card, and so on. In fact, the complexity and run time of a Data Migration are reduced considerably if only active contracts are migrated.

2.2.3.2 Data Volume

If the volume of data for a particular product is very large, this product and the dependent data might be migrated separately as a tranche. This can also be the case if the migration objects needed are totally or partially different from those of other products (e.g. accounts vs. cards, loan accounts vs. term deposits).

2.2.3.3 Source System(s)

Sometimes it is not possible to migrate all data in one step. So you have to decide what will be migrated first and what later.

Examples:
- If you get data from an external card provider, you have to agree on possible delivery dates. Business partners for cards (card holder, card pool holder) should be migrated separately from the roles needed for the account creation because of the time difference – even if the accounts are migrated before the cards.
- Another implementation project of the bank might have the lead on parts of your migration data, e.g. a special product. They might ask you to migrate "their" accounts on a certain date. Here you have to find out the preconditions, e.g. are there reference accounts that have to be migrated first?
- If you have different source systems, e.g. for current accounts and for savings accounts, it is not easy to decide what is easier and faster: to merge the data after extraction and load them in one program, or to load them in two separate programs, which might increase the effort of data reconciliation. Consider differences between products and their dependent migration objects.

If there are big differences between bank products and their migration structures, it might be better to migrate the relevant objects separately.

Example: Comparing the product customizing of a term deposit account with that of a loan account, an estimated 25% of the used fields match. If you build the migration structure for the migration object "account", you can either build one huge structure with all data needed for both products, or two smaller structures – one for each product. In terms of performance and required memory space, the two-structure strategy turns out to be better.

The structure of payment items, on the contrary, should be more or less the same no matter which product is the basis.

2.2.3.4 Dependencies Between Migration Objects

Accounts

Accounts have to be migrated prior to debit cards or savings cards. Credit cards can be migrated independently of accounts because their reference accounts can be in another system or bank. Accounts do not have to be productive before migrating cards; they can be migrated in the same migration group.

Historical contract master data

Historical contract master data can be migrated in terms of account changes or product changes. The account creation should always refer to the oldest state wanted.

Note: Product changes can only be migrated with a correct historical valid-on date if the migration group present date starts in the past and is set forward step by step.

Note: If you have to migrate the contract history, find out whether the complete lifecycle with all account changes and product changes is required. Most customers only need historical interest conditions for:
- A recalculation of settlements for the past because of payment items with a value date in the past
- Comparing results of settlements in the source and target systems

Historical changes of condition types and groups can be done with a current valid-on date of the migration group present date if the product customizing allows changes (see figure: green traffic light, open lock):

[Figure: product customizing of condition groups – green traffic light, open lock]


If this is possible, you can migrate into the newest product. Older master data should be found in an archive – an exact historization of master data is often not possible anyway.

Flow data

Before migrating any flow data, it is recommended to first create all master data including history. At the end, you should have the same product status you want to go live with. Reasons:
- Most of the source systems do not allow complicated product changes. Very often, you can migrate historical flow data (payment items, info items, settlements, account closure) independently of the product.
- All data you need for the future life of an account or card must be migrated against the current status of the master data. Diverse checks take place during migration, and if you try to migrate data that is not allowed in the product, you have to find out whether the product is not appropriately customized or a transformation rule is wrong.

Payment items

Payment items have to be migrated before settlements. Otherwise the result of the settlement will not be correct, and the system does not allow migrating a payment item with a posting date older than the last (migrated) settlement.

To deal with the historical settlement calculation, you need at least the following:
- An active account
- Correct historical standard interest conditions, individual interest conditions, and limits
- A calculated starting balance, payment items and info items
- Execution of the report RBCA_CN_BKK92_INSERT_PP

During migration, this report writes the data required for the calculation of settlement results to the table BCA92. The following data is required as a starting point for the first settlement run after Data Migration:
- Settlement period number
- Start of settlement calculation: The report RBCA_CN_BKK92_INSERT_PP creates an initial settlement for each settlement track. In previous releases, the start date and end date of these settlements were based on the event 013 – account interest calculation start. As of Banking Services 5.0, the start and end dates of those settlements are based on the start date of interest calculation of each track.

In the case of condition type offsetting (e.g. for earnings credit), the surplus amount of the subtrahend can be carried forward to future periods. The surplus amount from the legacy system has to be migrated; the API BCA_API_CF_INSERT is used for this. The report RBCA_CN_BKK92_INSERT_PP stores the carry-forward amounts in the new table BCA_CARRY_FORW.

For combined settlements, the compensation start date has to be filled.


Closed accounts

During the migration you will come to the point where the following question is raised: Why migrate an already closed account? Possible reasons are:
- Year-end tax documents can only be produced in the new system
- It is possible to reactivate an account until a month or so (customizing) after closure

If it is necessary to migrate closed accounts, which objects are needed? First you again need to know why you are migrating those accounts. If you have to produce tax documents, you of course need posting and settlement information. If you want to reactivate an account, it is more complicated. But the reactivation process is just an exception, so some things can be done manually.

Note: The recommendation is to migrate the data that is necessary for a correct settlement and the closure itself. All data that represent the future life of an account (see above) should not be migrated, for performance reasons. Often it is not even possible.

You can migrate PLM documents, but only as long as the account is still active, and they must have the status "closed" before the account itself is closed. This might be an extra program in the migration process but can also be included in the migration program for account closure. In any case, it is a customer-owned development. Be aware of the fact that reactivating an account does not necessarily lead to a reactivation of every dependent object.

Reference accounts

If you want to migrate an account referring to another account as a reference, e.g. for settlement payments, make sure that the reference account is either still productive in the target system or part of the same migration group but migrated before the referring account.

For an overview of a possible sequence, have a look at the following figures:

[Figures: possible sequence of migration objects]


Migration of GL-relevant data

After the go-live of a migration tranche, the balances of the accounts in the respective migration group must be posted to the correct accounts of the GL system. To do so, start the program BCA_INVA_RUN_PP – Inventory Preparation for Legacy Data Transfer to GL (ideally before the next EOD processing after go-live). The report carries out a preparation run of the legacy data transfer for the general ledger update.

When the legacy data is transferred, the balances posted in Account Management (FS-AM) are written to the legacy data transfer tables. If it is not a simulation run, the balance transfer to the general ledger is also prepared. If the system determines during preparation of the legacy data transfer that balances for a contract already exist in the general ledger, a warning message is output and the contract is ignored.

For each go-live date, the legacy data transfer selects all those accounts that have not yet had a legacy data transfer, according to the selection parameters. The system recognizes accounts as relevant for the legacy data transfer if the date of the contract start is after the go-live date for the account.

With the next GL transfer, the GL accounts are updated with the balances of the migrated accounts. The offsetting account should be an account to which you only post migration-relevant balances.

Note: The balance formerly posted via the old legacy system must be posted with the opposite posting record to reduce the amount of money on the "old" asset or liability accounts. Again, the offsetting account must be the special migration account (customer-owned development).

After every tranche, the balance of the offsetting account must be zero (see the sketch below).
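As an illustration only (not an SAP program), this zero-balance control can be sketched in Python, assuming the GL postings of a tranche can be exported as a CSV file with account number, debit/credit flag and amount; the account number and file layout are hypothetical:

import csv
from decimal import Decimal

MIGRATION_OFFSET_ACCOUNT = "9999990001"  # hypothetical special migration account

def offset_balance(postings_file: str) -> Decimal:
    """Sum all postings on the offsetting account: debits positive, credits negative."""
    balance = Decimal("0")
    with open(postings_file, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            if row["account"] == MIGRATION_OFFSET_ACCOUNT:
                amount = Decimal(row["amount"])
                balance += amount if row["dc_flag"] == "D" else -amount
    return balance

if __name__ == "__main__":
    balance = offset_balance("gl_postings_tranche1.csv")
    # After every tranche the offsetting account must net to zero.
    assert balance == 0, f"offsetting account not balanced: {balance}"
    print("offsetting account balanced")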

2.2.3.5 Verification of Migrated Data

Often the same functions are realized with different techniques in the source and the target system. There is no benefit in an automatic reconciliation of data consistency if the same transformation rules are used for loading and for reconciliation. Another challenge is to migrate and reconcile data of the same object coming from different leading source systems.

Example: In one customer project, the source for individual debit interest and limits was stored in a different system than the source for individual credit interest. When those interest rates were loaded, they affected each other by creating new entries, or by terminating an entry and creating a new one of the other condition type. At the beginning, the reconciliation only worked if just one condition type was loaded.

2.2.4 Analysis of the Source System(s) Data

It is necessary for the migration project to have at least one contact person for each system involved, to:
- Inform you about standard processes
- Explain fields and possible field values
- Inform you about major incidents, data inconsistencies and data corrections that happened during the relevant period of time
- Write reports to evaluate the content of the source system
- Explain evaluation results and discrepancies from expected results
- Tell you how to get data if they are no longer in the source legacy system at the time of migration, e.g. because of archiving or reorganization


During the Blueprint phase you have to verify your migration strategy by defining migration steps, relevant objects and transformation rules. The first step might be an interview with other project members. The product customizer, for example, has to find out what kinds of products a bank has and what processes can be executed with each product. Step by step, he or she evaluates the number of new products to be created and the necessary fields. So there must be at least an idea of how to transform accounts of an "old" product into the new one.

One fact that you have to be aware of is that the other project members do not want to rebuild the legacy systems but to create new, future-oriented products and processes. So the old processes and products should only be used as an orientation. Maybe you will not find an equivalent for old data in the new system. You have to ascertain whether this is correct or whether it was forgotten during the definition.

Example (interest and fee conditions):
- Evaluate what kinds of standard conditions exist in the legacy system
- Figure out whether there is an equivalent for all of them, or whether someone wants a transformation into another standard condition
- If you have to migrate historical standard conditions, make sure that you have an equivalent for them in the target system. If not, ask the conditions team to create one.
- Historical fee conditions must be correct from the beginning of the first period calculated and posted in the Banking Services 6.0 system. Normally it doesn't matter whether they are correct for periods calculated in the legacy system, because those are not affected by postings with a value date in an already settled period.

Note: If you have a separate settlement track for period-based fees, make sure that the first period productively calculated in Banking Services 6.0 starts at the same date as the others (if all have the same period). It is not recommended to recalculate historical fees, for performance reasons, but you have to find the correct starting point for them anyway.

- Evaluate what kinds of individual conditions exist in the legacy system.
- Especially for fee conditions, you will probably need to cleanse incorrectly created individual conditions. To do so, you should set up a "mini project".
- One target of the migration might be to reduce the effort of maintaining too many individual conditions, so evaluate whether you can convert former individual conditions into standard conditions in Banking Services 6.0. It is a good idea to ask the conditions team, because they too have to analyze the existing conditions.

2.2.5 ETL: Strategy and Tools

After the migration strategy is defined and you know what to migrate, you have to define how to migrate. ETL stands for Extraction, Transformation, Loading. Consider the following questions:

Who is going to write the extract programs?

If the person is a very experienced developer who knows the source system(s) very well, the transformation of most of the data can already be done in the source system.

Advantages:
- Only the relevant data for extraction and transformation is selected – no data overhead, better performance
- Files (if chosen for the data transfer) are smaller; file transfers are, in general, faster
- Changes to transformation rules, which happen quite often during realization and testing, can be implemented faster, especially if additional fields, for example from an external system, are needed
- Inconsistent data are detected faster; data cleansing or a workaround solution is defined more rapidly and with fewer mistakes

Where to transform data?

If the above-mentioned solution is not possible, you have to decide whether to buy a transformation tool (e.g. Business Objects), to develop a tool with bank-owned methods, or to use the SAP standard EDT tool.

External Data Transfer – all transactions required for processing the ETL are available in the IMG:

[Figure: IMG transactions for External Data Transfer]

Note: When generating a structure, please pay attention to the following:
- If you are importing sender structures, never use packed fields. You will get problems with the translation between different character sets (ASCII, EBCDIC) – see the sketch after this note.
- Do not sort the fields of a structure alphabetically. The structure should be the same as the one in the data dictionary.
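The packed-field problem can be demonstrated with a few lines of Python: the packed (BCD) bytes of a number are not characters, so translating a whole record between code pages corrupts them, whereas splitting the record by layout and unpacking the digits explicitly is safe. The record layout and values below are made up:

# A hypothetical 12-byte sender record: 8 EBCDIC characters (account
# number) followed by a 4-byte packed decimal amount (1234567, sign nibble C).
record = "4711ABCD".encode("cp037") + bytes.fromhex("1234567C")

# Wrong: a byte-wise code-page translation of the whole record also
# "translates" the packed bytes, which are binary data, not text.
garbled = record.decode("cp037")
print(repr(garbled))  # the last four bytes are now arbitrary characters

# Right: translate only the character part, unpack the packed part explicitly.
account = record[:8].decode("cp037")              # "4711ABCD"
digits = "".join(f"{b:02X}" for b in record[8:])  # "1234567C"
sign = -1 if digits[-1] in ("B", "D") else 1      # last nibble carries the sign
amount = sign * int(digits[:-1])
print(account, amount)                            # 4711ABCD 1234567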

How to document transformation rules?

The first step is to document the whole structure of a migration object, for instance in an Excel sheet, including the customer-developed fields. Workshops with the other project members who are responsible for accounts, cards, payment items and so on will help you to define which fields or field groups are not used by the bank. Now you can shorten the structure significantly. As a consequence, the transfer files will be much smaller and more transparent.

Note: Be careful not to delete too many fields. It is not easy to understand the technical coherences between fields. Sometimes you will need a field in the structure even if you do not want to fill it. So it is better to delete only field groups that you really don't need. For example, if you migrate only current accounts, you don't need fields with loan or time deposit content. The Product Configurator is a good help for identifying connected fields for accounts, cards, card pools, or master contracts.

During realization and testing it often turns out that the bank wants to use more fields or functions than defined before. This can have a direct impact on extraction, transformation rules and data load structures.

The second step is to define the transformation rules for every field or field group (a sketch follows this list). You will find different degrees of complexity:
- The source system may not deliver anything. The field values can be taken from the product customizing.
- No (or nearly no) transformation needed: You can use the same field value as in the source system. You just have to describe whether leading zeros have to be added at the beginning, or whether the value has to be delivered left-aligned or right-aligned (e.g. account number, account currency in most cases).
- A (more or less) 1:1 transformation: The value "A" in the source system becomes value "6" in the target system (e.g. interest calculation method).
- Transformation into a field group in Banking Services 6.0: The logic in the source system can be completely different from that in Banking Services 6.0 (e.g. periodicity of account settlement, bank statement agreements).
- You don't find equivalent data in the source system. This can happen if a new functionality is used in Banking Services 6.0. Either you can take a fixed value or the value from the product customizing, or you have to define a transformation rule derived, for example, from CIM data.
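These rule types can be written down uniformly, for example as one small function per target field. The following Python sketch assumes a dict-per-record representation; all field names, code mappings and default values are invented for illustration:

# One transformation rule per target field; each rule maps a source
# record (a dict) to one target value.
INTEREST_METHOD_MAP = {"A": "6", "B": "7"}  # hypothetical 1:1 code mapping

RULES = {
    # Source delivers nothing: fixed value from the product customizing.
    "statement_cycle": lambda src: "MONTHLY",
    # No real transformation: only pad the account number with leading zeros.
    "account_number": lambda src: src["acct_no"].rjust(10, "0"),
    # 1:1 transformation of a code value, e.g. interest calculation method.
    "interest_method": lambda src: INTEREST_METHOD_MAP[src["int_meth"]],
    # Rule derived from another system's data (here: a CIM attribute).
    "product_id": lambda src: "CURR_PREMIUM" if src["cim_segment"] == "P"
                              else "CURR_BASIC",
}

def transform(source_record: dict) -> dict:
    """Apply every rule and return the target record."""
    return {field: rule(source_record) for field, rule in RULES.items()}

print(transform({"acct_no": "4711", "int_meth": "A", "cim_segment": "P"}))
# {'statement_cycle': 'MONTHLY', 'account_number': '0000004711',
#  'interest_method': '6', 'product_id': 'CURR_PREMIUM'}

Documenting the rules in such an executable form can help ensure that the Excel documentation and the implemented transformation do not drift apart.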

Example 1: Account master data
- Copy the contract structure BCA_STR_CONTRACT_ALL_DI into an Excel sheet.
- Delete all fields that do not belong to an account contract.
- Mark all fields that are not needed in the migration project of your current customer (structure from TRBK 3.0, only the first fields):

[Figure: excerpt of the structure BCA_STR_CONTRACT_ALL_DI with unused fields marked]


- Create a new sheet with the reduced structure (example; has to be changed according to customer requirements):

[Figure: reduced migration structure in the Excel sheet]

- Define the transformation rules. As the responsible person, enter the person responsible for the respective function (account statement, settlement, …).


Example 2: Account settlement

The migration program, for example, calls the BAPI BAPI_BCA_ACC_SETTLE_STRTSINGLE with the import parameters of the above structure. Ensure that a commit is called after every single settlement, before the next one is calculated. Otherwise the results will not be correct, because the starting date of the calculation would be wrong.
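The call pattern can be sketched from outside the system with the open-source pyrfc connector. The BAPI name is taken from the text above, but its interface is not reproduced in this document, so the parameter names below are assumptions, not the documented signature; in an ABAP migration program the equivalent is a COMMIT WORK (or BAPI_TRANSACTION_COMMIT) after every call:

from pyrfc import Connection

# Connection parameters are placeholders.
conn = Connection(ashost="bankhost", sysnr="00", client="100",
                  user="MIGRATION", passwd="secret")

# One entry per settlement to calculate, oldest period first; the field
# names are assumptions, not the documented BAPI interface.
settlements = [
    {"CONTRACT": "0000004711", "PERIOD_END": "20071231"},
    {"CONTRACT": "0000004711", "PERIOD_END": "20080331"},
]

for settlement in settlements:
    conn.call("BAPI_BCA_ACC_SETTLE_STRTSINGLE", **settlement)
    # Commit after EVERY single settlement before calculating the next one;
    # otherwise the starting date of the next calculation is wrong.
    conn.call("BAPI_TRANSACTION_COMMIT", WAIT="X")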

How to load the data?

Of course you can choose the External Data Transfer (EDT) tool for loading data. It doesn't matter whether the transformation takes place outside the SAP system or not. However, the tool does not cover all the demands of a bank using Banking Services 6.0:
- Performance aspects: Without a customer development, the tool is not able to load data in parallel processes, so it is not possible to load huge amounts of data in a short period. For parallel processing, you can use the Framework for Parallel Processing (FPP), which is part of each SAP NetWeaver installation. A developer guide is available in the Run SAP Roadmap for Banking.
- It is not possible to use EDT for all migration objects (e.g. account settlement), or if your data load file has substructures that must be used to load individual conditions. You will have to develop your own program for such objects.
- There are no control mechanisms for data consistency in place before loading the data. The customer should deliver a program to verify, for example, that the number of data records received equals the number the sender wanted to deliver (this number can be part of a file trailer; see the sketch after this list).
- After an abnormal end (abend) of a load program, a tool is required to analyze which records were loaded before the abend and which were not. In addition, you have to reconstruct a file with the unloaded data to start loading again. Such a tool or functionality is not available from SAP for EDT.


2.2.6 Data Reconciliation

When all necessary data are migrated to the target system, the project has to define criteria for reconciling the results of the Data Migration and verify them against the instructions of the auditing persons.

There is no general rule for what has to be validated and how. The rather formal aspects can be handled in the ETL phase, for instance (see the sketch below):
- Number of objects and datasets extracted = number of objects and datasets loaded
- Number and amount of debit payment items extracted = number and amount of debit payment items loaded
- Number and amount of credit payment items extracted = number and amount of credit payment items loaded
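These formal checks translate directly into a comparison of control totals per migration object. A Python sketch, assuming that both the extract step and a query on the loaded data can produce a count and the debit/credit totals; the numbers are invented:

from decimal import Decimal
from typing import NamedTuple

class Totals(NamedTuple):
    count: int       # number of datasets
    debit: Decimal   # total amount of debit payment items
    credit: Decimal  # total amount of credit payment items

def reconcile(extracted: dict, loaded: dict) -> list:
    """Compare control totals per migration object; return the differences."""
    return [(obj, extracted.get(obj), loaded.get(obj))
            for obj in extracted.keys() | loaded.keys()
            if extracted.get(obj) != loaded.get(obj)]

extracted = {"payment_item": Totals(120000, Decimal("91000000.00"),
                                    Decimal("90500000.00"))}
loaded = {"payment_item": Totals(119998, Decimal("90999812.50"),
                                 Decimal("90500000.00"))}

# The list must be empty before go-live.
for obj, e, l in reconcile(extracted, loaded):
    print(f"{obj}: extracted={e} loaded={l}")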

The reconciliation of content-oriented aspects must be defined separately. An automatic reconciliation with a good protocol is preferred, but very likely not all field values can be validated automatically. So, additionally, random samples of every product, migration object and so on must be defined and checked.

An automatic validation is primarily of interest for data that can be changed during the migration process because of a delta load or parallel delivery to the old and the new systems via one or more different channels – such as account status, number and amount of payment items, PLM documents and their equivalents in the source system. The first reconciliation can be done directly after the initial load, the last one directly before the planned go-live. During reconciliation, no changes should be allowed. The verification of historical settlements is important, too, to validate that the conditions are correct and that a posting with a value date after the last settlement will lead to a correct recalculation of the settlement period.

The migration team has to define the tools and programs for reconciliation. It is recommended to extract data from the source legacy system and transfer the result files to Banking Services 6.0. A customer-written program should compare the loaded data with the content of the file. Contracts that are not OK cannot go live.

2.2.7 Dependencies Between Data Loading, Data Reconciliation, and Go-Live

SAP uses the so-called migration group to separate contracts and their dependent objects that are in migration from the already productive contracts. The idea is to load every contract of a tranche into a migration group.

All SAP standard programs, dialog programs as well as mass runs, know that they are not allowed to process contracts with a non-productive migration group status. The same logic must be implemented in the customer-developed programs. With this technique it is possible to load data independently of the end-of-day processing, and no mass run can destroy any half-migrated contract.


The Planned Go-Live Date enables you to load data over a period of more than one day, and the go-live caneven be postponed.

As mentioned before, the Migration Group Present date allows you to reconstruct the lifecycle of a contract. If you use this function, you normally have many "time slices", because a lot of contracts will be available. In this case, you need a program that derives the different migration groups and calls the program RBCA_SET_PRESENCE_MIG_GRP to set the date in Banking Services (see the sketch below).
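The derivation step can be sketched in Python: contracts that need the same present date (the start of their reconstructed history) form one time slice, and one migration group is created per slice; RBCA_SET_PRESENCE_MIG_GRP is then executed per group inside the system, and the date is rolled forward step by step. The grouping key and the contract data are assumptions:

from collections import defaultdict

# (contract number, oldest date needed for its reconstructed history)
contracts = [
    ("0000004711", "2006-01-01"),
    ("0000004712", "2006-01-01"),
    ("0000004713", "2007-07-01"),
]

# One migration group per distinct present date ("time slice").
groups: dict[str, list[str]] = defaultdict(list)
for contract, oldest_date in contracts:
    groups[oldest_date].append(contract)

for i, (present_date, members) in enumerate(sorted(groups.items()), start=1):
    # Create the migration group, load its contracts, then set the present
    # date via RBCA_SET_PRESENCE_MIG_GRP and move it forward step by step.
    print(f"migration group {i}: start present date {present_date}, "
          f"{len(members)} contracts")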

Note: Possible problems – After loading, it very often turns out that some of the data is incorrect and a correction is impossible. In that case, while the first migration group (the one with the correct contracts) gets the status "productive" after the go-live decision, the other (with the damaged contracts) receives the status "migration failed". These contracts and all related table entries can be deleted and, after a program or data correction, the same contracts can be reloaded.

Note: Not all standard programs can handle the migration group status 40 ("migration failed") correctly. To ensure that the Banking Services system works correctly, the program RBCA_UNDO_MIG_GRP, which deletes the key information of a contract, has to be executed. If possible, execute it before the first mass run after the go-live. Otherwise, error messages are created in the mass runs, although nothing will be destroyed.

Depending on your migration strategy, you have to make sure that the source legacy system(s) also knows the status of a contract. Depending on the status, the source legacy system decides, for example, whether to (see the sketch after this list):
- Send an error message because the contract is in the migration process
- Send the status "in migration", so that the surrounding systems deliver changes to the old legacy system, to Banking Services, or to both systems, as defined
- Process or not process a contract in the end-of-day processing, possibly in a different manner during migration
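A minimal sketch of such a routing decision; the status names and routing targets are purely illustrative assumptions.

    # Hedged sketch of the decision logic a source legacy system could apply,
    # based on the migration status of a contract. Status names and routing
    # targets are illustrative assumptions.
    def route_change(contract_status, change):
        if contract_status == "IN_MIGRATION_BLOCKED":
            # reject the change with an error message
            raise ValueError("contract is in migration - change rejected")
        if contract_status == "IN_MIGRATION_PARALLEL":
            return ["legacy", "banking_services"]   # deliver to both, as defined
        if contract_status == "MIGRATED":
            return ["banking_services"]
        return ["legacy"]                           # not (yet) in migration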

Following are some examples of possible communication steps between the source legacy system, Banking Services, and the surrounding systems.


Example 1: Big bang – no communication

[Flowchart: start migration → stop every system → create one migration group → extract, transform, and load all data → validate data → data correct? If no: correction possible? If yes, correct the data and validate again; if no, execute the fallback/workaround. If yes: go live → start only the new system → end of migration.]

- Before starting ETL, every system of the bank that can change data in the legacy systems must be stopped. No master data changes or postings are allowed until all contracts are migrated and the new system landscape is set active.
- The old legacy system(s) do not need to know anything about the migration in terms of a status or different processes, because they will never be switched on again.
- After loading the data into Banking Services, a reconciliation process is started. Whenever possible, incorrect data should be corrected. If this is not possible, a decision has to be made, based on the number of incorrect contracts or the error types, whether a manual workaround can be executed or the migration itself has to be stopped.
- When everything is ok, the migration group status "migration complete" is set in Banking Services. After some tests, the new system landscape with all new processes can be started.


Example 2: Migration of a tranche

The following picture is a simplified illustration of a migration process in tranches. The basic idea is to make sure that the migration of a certain number of contracts does not affect the functionality or the technical availability of the remaining contracts. Neither the bank's customers nor the employees (except the IT staff) should notice that something is happening. The illustration is just one example out of many possible procedures.

[Flowchart: start migration of tranche 1 → create two migration groups → set the status 'in migration' for the first tranche in the source system → ETL (initial load) of the first tranche into migration group 1 in BPP → validate the data of the first tranche → data correct? If no: transfer the incorrect contracts to migration group 2 and delete their 'in migration' status in the source system. Then: ETL (delta load) of the remaining contracts into migration group 1 → validate the data → data correct? If no: again transfer the incorrect contracts to migration group 2 and delete their 'in migration' status in the source system. Then: go live with migration group 1 → set the status 'unproductive' in the source system → set the status of migration group 2 to 'migration failed' → end of migration of tranche 1.]


- No system, or at least no vital system, is stopped. All processes are executed normally.
- It is assumed in this case that the bank wants to make an initial load and a delta load a few days later.
- During the ETL phase itself, changes of master data are stopped, but only for the relevant accounts, to simplify data reconciliation. A migration status is needed so that the surrounding systems know whether they can deliver or not. In this case, master data should not be changed, but you can allow postings if you decide so.

- The first data reconciliation after the initial load decides (manually or automatically) whether, up to that point, the migration is correct for every migrated contract. If not, the easiest way to handle those contracts is to transfer them to a migration group that will finally receive the status 'migration failed'. In this case, the source legacy system needs to know that. You can clear all migration information as if nothing had happened; the contract will then be handled like the other contracts that are not in migration.

- The delta load treats only those accounts that are still ok. Again there is a reconciliation, and again there might be some contracts that are not ok, and so on.

- Finally, migration group 1 goes live (status 30: migration completed), independently of the number of contracts that 'survived', and migration group 2 gets the status 40 (migration failed).

Note: If you extract data more than once for the same contract, you have to make sure that you can identify changes since the last extraction.
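One conceivable way to identify such changes is to keep a hash per record from the previous extraction and compare against it; the snapshot handling and the record layout in this sketch are illustrative assumptions.

    # Hedged sketch: identify contracts changed since the last extraction by
    # comparing a hash per record against a snapshot saved during the
    # previous extraction. Records are assumed to be JSON-serializable dicts.
    import hashlib
    import json

    def record_hash(record):
        return hashlib.sha256(
            json.dumps(record, sort_keys=True).encode()).hexdigest()

    def changed_since_last_extraction(records, snapshot):
        """records: {contract_id: record dict}; snapshot: {contract_id: hash}.
        Returns the ids for the delta load and the new snapshot."""
        delta = [cid for cid, rec in records.items()
                 if snapshot.get(cid) != record_hash(rec)]
        new_snapshot = {cid: record_hash(rec) for cid, rec in records.items()}
        return delta, new_snapshot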

If you have online processes delivering data to both the old and the new system, you should reconcile all data against the source system before go-live, even if you did not load anything in the last (delta) migration process. You have to make sure that all data records are consistent, even if an inconsistency is not the fault of the Data Migration team.

2.2.8 Definition of Acceptance Criteria

One task of the migration team is to define acceptance criteria for every step of the Data Migration. You need the assistance of people from the user departments and an internal auditor to cover all demands of the bank and of legislation.

Here are some important questions:
- Which data must be validated?
- Do you have to use special tools?
- What has to be done automatically, and what can be done manually?
- What kind of lists must be produced? Layout? Is a printed version required?
- How much time is needed for the diverse verification tasks?
- Who is allowed to do the verifications and to sign the results?
- What are the criteria for a go/no-go decision?
- What kind of document is needed to record the migration process itself?
- What kind of documentation is needed for manual or automatic corrections during or after the Data Migration process?

2.2.9 External Impacts

A Data Migration is not only a technical task. There is a lot of communication with other project members, bank employees, and external systems. There are decisions to make and restrictions to observe that the bank cannot influence at all, or at least not easily.


Here are some examples:
- In Germany, it is possible to close a bank for one day if one weekend is not enough to migrate all data. You have to apply for this using a special form.
- The migration of cards must be planned in tight cooperation with the card provider:
  - The card provider normally has very narrow time windows for testing.
  - Mass runs of the card provider must be considered (e.g. renewal runs).
  - Ask the provider to stop master data changes, if necessary.
  - The delivery of master data from the provider must be organized, including reconciliation.
- You have to inform other banks or authorities about changing bank account numbers. Customers must be informed within a given time frame if important functionality will change after the Data Migration, or if access to the accounts is limited or not possible during the migration process.

The definition of the migration process might result in some tasks for other project teams or external systems, for example:
- Parallel delivery of changes depending on the migration status of a contract (you must produce a complete list of transactions and fields). If the bank runs more than one legacy system, there must be a 'dispatcher' to deliver the data to the correct system.
- Do not allow transactions (= error message) for contracts in migration.
- The Valid From date of a product must be at least as old as the oldest creation date (in the source legacy system, not the migration date!) of all contracts.
- Standard conditions must start with the earliest period to be calculated during the migration process and must correspond to the historical conditions in the source system.
- If there is no equivalent to the historical conditions, ask to create special condition groups for migration. During the migration process (directly before or after go-live), you must change the artificial condition group into a valid one.


You need the following Customizing entries: SAP Customizing Implementation Guide → Financial Services → Account Management → Item Management → Basic Functions → Assign Medium/Payment Method to Posting Processes.

Technically, it is possible to migrate starting balances as payment items. If you do so, you don't need the balances entry.

Note: If you migrate a starting balance of 0 for every account that has no postings in the legacy system, make sure that the respective transaction type will not be selected as a payment item on the bank statement.


SAP Customizing Implementation Guide → Financial Services → Account Management → Item Management → Basic Functions → Maintain Posting Types

SAP Customizing Implementation Guide → Financial Services → Account Management → Item Management → Basic Functions → Transaction Types and Transaction Type Groups → Maintain and Assign Transaction Types for Payment Items

You need both transaction types, or an equivalent if the bank creates its own transaction types.


SAP Customizing Implementation Guide → Financial Services → Account Management → Item Management → Basic Functions → Assign Posting Types to Transaction Types

SAP Customizing Implementation Guide → Financial Services → Account Management → Item Management → Basic Functions → Maintain Non-Balance-Changing Transaction Types


SAP Customizing Implementation Guide → Financial Services → Account Management → Item Management → Basic Functions → Maintain Non-Balance-Changing Posting Categories

SAP Customizing Implementation Guide → Financial Services → Account Management → Item Management → Basic Functions → Assign Non-Bal-Changing Posting Categories to Non-Bal-Changing Trans. Types


SAP Customizing Implementation Guide → Financial Services → Account Management → Item Management → General Ledger Transfer → Maintain GL Operation

SAP Customizing Implementation Guide → Financial Services → Account Management → Item Management → General Ledger Transfer → Assign Non-Balance-Changing Transaction Type to GL Operation


SAP Customizing Implementation Guide → Financial Services → Account Management → Item Management → General Ledger Transfer → GL Account Assignment Non-Balance-Changing Processes

It is recommended to create a new offsetting account for Data Migration in the GL for reconciliation reasons.

SAP Customizing Implementation Guide → Financial Services → Account Management → Tools → Parallel Processing → Maintain Job Distribution

The number of tasks must be defined together with SAP Basis. Relevant criteria are:
- Hardware
- Available batch processes
- CPU utilization (account settlements need much CPU, payment items do not)
- Other batch programs running at the same time

Note: SAP delivers most of these entries, but sometimes customers create their own posting types or delete too many of the entries because they think they don't need them. So just check whether they are still there.


If there are external systems receiving data from the legacy systems, such as a front-end system or a business information warehouse, the following has to be decided and defined:
- Must there be an extraordinary data delivery after the go-live and before people start to work with the new environment? (Possible reasons: completely new data in the new system, or new processes where these data are needed.)
- If not, how can the user update data from the day before? (Assumption: the data are only delivered once, in the end-of-day processing.) This is a general task, not only for Data Migration!

Delivery of data for SAP BW

If you are migrating into a productive system with data already delivered to SAP BW, you must pay attention to the following:
- The BW extraction works with time stamps. The already productive data are extracted every day in the end-of-day processing. The time stamp of the last extraction is saved in a table in Banking Services. The next extraction will select all changes made after this time stamp.
- The standard BW extraction selects only contracts in migration if there is a migration group in progress. Already productive data are not selected. After the selection, the new time stamp is saved. The problem can be solved by executing the following steps:
  - While migrating the master data of a contract, the BW link table BCA_BCT_CN_OBJV is automatically filled. The migration group is saved in the field XTR_STATUS.
  - Program RBCA_PP_SIF_TBBW_CHG_MIGR_OBJV changes the value for the migration group into the standard value for already productive contracts, 'PEEEEEEEEE'. Now there is no difference between productive contracts and contracts in migration.
  - Additionally, you have to change the value of the field Status BW-Relevant to "20".

In the next end-of-day processing, all master data and all related flow data that have been created after the last extraction time stamp will be selected.
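Conceptually, the time-stamp-based delta selection works like the following sketch; the record layout and the use of timezone-aware UTC datetimes are illustrative assumptions.

    # Hedged sketch of the time-stamp mechanism described above: select all
    # records changed after the saved extraction time stamp, then let the
    # caller persist the new time stamp. All datetimes are assumed to be
    # timezone-aware (UTC); the record layout is an illustrative assumption.
    from datetime import datetime, timezone

    def extract_delta(records, last_extraction_ts):
        """records: iterable of dicts with a 'changed_at' datetime."""
        now = datetime.now(timezone.utc)
        delta = [r for r in records
                 if last_extraction_ts < r["changed_at"] <= now]
        return delta, now   # caller saves 'now' as the new time stamp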

Note: Program RBCA_PP_SIF_TBBW_CHG_MIGR_OBJV only selects relevant entries if the migration group has the status 20 (in migration). If the migration group goes productive without the program having been executed, the migrated master data are not extracted. If you then extract payment items and other flow data for those contracts, they cannot be assigned to a contract in SAP BW.


2.2.10 Technical Aspects

2.2.10.1 Program Definition

Data Migration programs should be developed according to the customer's development guidelines for other mass run programs. Load programs should write a protocol in which you can find:
- The total number of data sets
- Data sets loaded
- Data sets failed
- The reason for failing (message number and message text)
- A statistic with the number of data sets per message

Load programs should generate a new file with the failed data sets to facilitate correction and reload. Data sets for loading that belong together must be treated together: if one of the data sets has an error, the whole group of data sets must be written into the error file and no data is saved (complete roll-back).

Example: To load an account, you need more than one data set, especially if you have more than one condition group. Also, the periodicity of an account settlement or statement needs more than one data set per periodicity, depending on the migration object. You need to define whether you have to pay attention to a special data load order or not.
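A minimal sketch of the all-or-nothing rule and the protocol counters described above; the group layout, the load_group callable, and the error file format are illustrative assumptions.

    # Hedged sketch: data sets that belong together are loaded as one group.
    # If any data set of a group fails, the whole group goes to the error
    # file and nothing is saved (complete roll-back inside load_group).
    def process_groups(groups, load_group, error_file):
        """groups: {group_key: [data sets]}; load_group: callable that loads
        one complete group or raises on error (rolling back its changes)."""
        protocol = {"total": 0, "loaded": 0, "failed": 0, "messages": {}}
        for key, data_sets in groups.items():
            protocol["total"] += len(data_sets)
            try:
                load_group(data_sets)
                protocol["loaded"] += len(data_sets)
            except Exception as exc:
                protocol["failed"] += len(data_sets)
                # statistic: number of failed data sets per message
                protocol["messages"][str(exc)] = (
                    protocol["messages"].get(str(exc), 0) + len(data_sets))
                for ds in data_sets:            # whole group to the error file
                    error_file.write("%s\t%s\t%s\n" % (key, ds, exc))
        return protocol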

Example 1: Payment items

If one payment item out of 1000 for one account has an invalid transaction type, you load all the other 999 payment items. After correction, the one payment item can be reloaded.

Note: This is a general rule, but pay attention to how you generate the time stamps of payment items: they can impact the first statement created in Banking Services, the correct data selection for SAP BI, and the data validation itself if you load payment items more than once and use the time stamp to separate formerly loaded items from newly loaded ones.

Example 2: Account settlements

You have to load four settlements for one account. The second leads to an error. You cannot load the third and fourth before loading the second, because it is not possible to reload the second settlement. Data sets 2, 3, and 4 have to be written into the error file.
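A sketch of this order-dependent handling; the load_one callable and the error file format are illustrative assumptions.

    # Hedged sketch: settlements of one account must be loaded in order.
    # Once one fails, it and all later settlements of that account are
    # written to the error file, since a failed settlement cannot simply
    # be skipped and reloaded later.
    def load_settlements(settlements, load_one, error_file):
        """settlements: list ordered by period; load_one raises on error."""
        for i, s in enumerate(settlements):
            try:
                load_one(s)
            except Exception as exc:
                for skipped in settlements[i:]:   # e.g. data sets 2, 3, and 4
                    error_file.write("%s\t%s\n" % (skipped, exc))
                break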

Reconciliation programs should generate lists according to the definition mentioned above:
- One entry per migration object (correct and incorrect objects)
- A marker showing what is not correct
- Statistics: how many objects in total, how many correct, how many incorrect
Like standard SAP mass runs, you should create a protocol that can be found in transaction SLG1 (application logs), with information about the variant started, the content of the selection screen, error messages, and the return code. If you use an external job control tool to set variables in a variant, the incorrect delivery of a variable is a frequent mistake, especially at the beginning of testing with a job control tool.
A reorganization program should be able to create a new input file out of the original error and success messages after a dump (see the sketch below). Consider that, in a program with high data volumes, you cannot reorganize those data manually if you don't know what is loaded and what is not.


If you use EDT for loading, you can use transaction KCLP to monitor the load process. If you write your own parallelizing programs, use program RBANK_PP_MONITOR or the ST13 Mass Activity Monitor to view the percentage of data loaded. For detailed error messages, you have to write your own protocol.

2.2.10.2 Transfer of Files

Most Data Migrations are done with files: it is easier to guarantee the consistency of the data and to find the reasons for an error. If you use files, you have to make sure that no file is loaded twice. Again, it depends on the migration object whether loading a file a second time causes trouble: loading an account with an external account number twice will lead to an error message; payment items, on the contrary, can be loaded many times. Therefore, it is recommended to use a unique name for each file and, of course, to check the name against the already processed files in the load program.

Example: Account master data file <date>_account<run number>
- Name of the first file of a day: 20081005_account01
- Name of the second file of a day: 20081005_account02
- Name of the first file of the next day: 20081006_account01

It is also possible to use the creation time stamp in the header of a file for control.
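A sketch of the duplicate-file guard with the naming scheme above; keeping the registry only in memory is an illustrative simplification, as a real load program would persist the processed file names.

    # Hedged sketch: reject any file whose name was already processed.
    # The registry would normally be persisted; an in-memory set is an
    # illustrative simplification.
    processed_files = set()

    def accept_file(file_name):
        if file_name in processed_files:
            raise ValueError("file %s was already loaded" % file_name)
        processed_files.add(file_name)

    # accept_file("20081005_account01")   # ok
    # accept_file("20081005_account01")   # raises: loaded twice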

If the transfer is not done manually, the techniques for a file transfer are normally defined by the SAP Basis department: FTP or SFTP, and the writing of transfer scripts. The folder structure must be defined by the project, because often there are more file transfers in use, e.g. by the end-of-day processing.

2.2.11 Test Concept

Depending on the migration strategy, the migration process can be very complicated, with many impacts on surrounding systems and processes. On the other hand, the possibility to test all these processes is often very limited because of a lack of time, personnel, or technical resources.

The following list might be helpful for a good test result:
- A test environment for testing the migration process itself (ETL, reconciliation)
- Unit tests
- Realistic data for reproducible tests
- Historical test data
- A test environment for testing the processes during and after Data Migration
- System or integration tests
- Consistent data in all systems
- Personnel resources to write and execute the test cases

2.2.12 Realization

Besides the usual tasks of developing programs or creating EDT structures, you have to carry out some more migration-related tasks:
- Complete the defined transformation rules (often only possible during the Realization phase because of dependencies on other project teams; especially customer-developed fields are very often not defined completely during the Specification phase)


- Write program descriptions for the job control tool:
  - Sequence of the migration programs, integration into the end-of-day processing
  - Program description
  - JCLs for all migration programs
  - Reaction if a program is canceled

- Create variants for Banking Services programs
- Write an implementation or migration story book (depending again on the migration strategy) with every detail of the migration process:
  - Date and time of every Data Migration step
  - Description of every task
  - Responsible person
  - Person to be notified in case of completion of a task or in case of an error
  - Go/no-go decisions
  - Fallback strategy

2.2.13 Testing and Final Preparation

Tests are vital for the verification of the Data Migration process. An essential aspect is to verify that all planned processes really work on the contracts that are migrated. Especially the end-of-year processing may require data that was created in the old legacy system but was simply forgotten during the Definition phase.

Another important aspect is to carry out performance tests to:
- Define the size of tranches
- Validate the planned time frame of the Data Migration
- Define the number of batch processes and the hardware equipment needed
- Find out whether programs are too slow and need performance optimization in terms of:
  - Coding
  - File size
  - Variants: number of objects per package or package size
- Avoid collisions and locks with other mass runs, or optimize the number of batch processes if you cannot avoid these collisions
- Find out how much disk space is required for loading
- Define how much space is needed for generating, transferring, or archiving files
- Find out when to reorganize databases during and after the migration process
- See if enough of the right people are involved to verify data and to do all the manual tasks
- Verify the sequence of all tasks
- Verify all system authorizations needed by the persons involved

When it comes to the restricted time slots during the productive Data Migration, you should already know the types of errors that can occur and how to correct them. If you test with a near-productive data set, you will detect a lot of errors that you can fix or for which you can find a workaround.

Note: To simplify unit testing, you can create a job net within Banking Services to repeat loading more easily. If you do this, you do not have to start every load program or reconciliation program manually.

Precondition: The programs must be enabled to be used in a job net. To do this you can use the standard.
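Outside the SAP job net itself, the underlying idea can be sketched as a simple step chain that repeats a complete test load with one call; the step names and callables are illustrative assumptions.

    # Hedged sketch of the idea behind a job net for unit testing: chain the
    # load and reconciliation steps so a complete test load can be repeated
    # with one call instead of starting every program manually.
    def run_job_net(steps):
        """steps: ordered list of (name, callable); stops at the first error."""
        for name, step in steps:
            print("starting %s" % name)
            step()
            print("finished %s" % name)

    # run_job_net([("load accounts", load_accounts),
    #              ("load payment items", load_items),
    #              ("reconcile", reconcile_all)])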


2.2.14 Go-Live

After proper testing, no major problem should occur during the real migration. The story book gives every participating person the confidence of knowing what to do.

Of course, testing cannot give you a 100% guarantee, because the productive data have changed since the last tested copy, but you should try to get as close to it as possible. The test environment also differs from the productive one, which can cause some issues.

In the Preparation phase you should:
- Check whether you have the right to access the building at night or at the weekend
- Check whether you have valid users and passwords for all systems and tools you must work with
- Verify whether you have to use special monitoring workstations for the productive systems

Make sure that you get as much support as possible from the following specialists and persons (in addition to the persons mentioned in the story book):
- Database specialists
- SAP Basis specialists
- Authorization specialists
- Someone who can give you access to the building at the weekend
- Someone who gives the ok for data corrections in the productive system


3 Further Information

3.1 Tasks After Migration

It is important that the migration team and the project team, or an equivalent group of people, support the first productive runs of the end-of-day, end-of-month, and end-of-year processing.

In particular, the first account statements sent to the customers can result in a lot of incidents: bank customers complain about missing information or documents, wrong periodicities, miscalculated interest or fees, and so on, or users have problems with new processes and layouts. There will not always be a need for a program or data correction, but even so, the effort to analyze all incidents should not be underestimated.

During the first end-of-day processing, it may turn out that the performance of a mass run is too poor for the number of accounts that are now in the system. This can happen because the test system has the same master data as the productive system but does not receive as many master data changes or flow data records. Then somebody has to find a solution, often during the same night.

When you have finished the Data Migration, you have to clean up:
- Archive the migration protocols, task lists, and event logs
- Deactivate or delete job definitions in the job control tool
- Remove modifications especially developed for the migration process
- Switch off all systems that are no longer used
