
Oracle Data Integrator 10.1.3 Lessons

Rev 2.2

03/03/2008

Authors FX Nicolas Christophe Dupupet Craig Stewart Main Contributors/Reviewers Nick Malfroy Julien Testut Matt Dahlman Richard Soule Bryan Wise Oracle Corporation World Headquarters 500 Oracle Parkway Redwood Shores, CA 94065 USA Worldwide inquiries: Phone: +1 650 506 7000 Fax: +1 650 506 7200 www.oracle.com Oracle is the information company Oracle is a registered trademark of Oracle Corporation. Various product and service names referenced herein may be trademarks of Oracle Corporation. All other product and service names mentioned may be trademarks of their respective owners. Copyright 2008 Oracle Corporation All rights reserved.


Oracle Data Integrator Workshop

1

Introduction

2

1

Objectives: After completing this training, you should: Have a clear understanding of the ODI architecture. Have a clear understanding of the ODI differentiators. Have some experience in developing with ODI. Be ready for your first live projects.

3

Before We Start Please copy and unzip the VM ware image on your machine

4

2

LessonsGeneral Information Overview of the product, sales tactics, positioning Architecture

ODI: The Extensibility Framework Knowledge Modules CDC

Packaging and enhancing the ETL processes Workflow management Metadata Navigator Web Services User Functions, Variables and advanced mappings ODI Procedures, Advanced Workflow

A day in the life of an ETL Developer Designer Simple Transformations Designer Transformations for heterogeneous sources (databases, flat files) Designer Introduction to Metadata and XML Designer Data Integrity Control

Administrative tasks in ODI Installation Agents configuration

Understanding the Metadata and the Databases Connectivity Metadata Topology

Additional Features Data Profiling Data Quality Versioning

5

Methodology: Install the GUI. Create the repositories. Define users and profiles. Define the IS architecture (physical and logical view). Reverse-engineer the metadata: table, view and synonym definitions; constraints. Define the elementary transformations: Which are the targets? Which are the sources for each target? Define transformation rules and control rules. Define the transfer rules. Unitary tests: understand the outcome, debugging, optimize strategies (Knowledge Modules). Define the sequencing: order the interfaces. Integration tests: scenario generation. Define the scheduling: agents configuration, execution frequency. Packaging / delivery: freeze the version, deliver the scenarios. Operations.

Install Security Topology Designer Designer Model Def. Project/ Interface Operator Project/KM Project/Pkg Agents/Scen

Project/Scen.

6

Operator

3

Oracle Data Integrator Overview

11-1

Objectives: After completing this lesson, you should be able to describe: The scope of Data Integration for batch and near real time integration; The difference between ODI ELT and other ETL tools on the market for batch approaches; A general overview of the ODI architecture, and how it combines ELT and SOA in the same product architecture.

1-2

1

Why Data Integration? NEED: Information how and where you want it: Business Intelligence, Corporate Performance Management, Business Process Management, Business Activity Monitoring.

Data Integration: Migration, Data Warehousing, Master Data Management, Data Synchronization, Federation, Real Time Messaging.

HAVE: Data in disparate sources:

Legacy

ERP

CRM

Best-of-breed Applications

1-3

Challenges & Emerging Solutions in Data Integration:

1. Challenge: Increasing data volumes; decreasing batch windows. Emerging solution: Shift from E-T-L to E-LT.

2. Challenge: Non-integrated integration. Emerging solution: Convergence of integration solutions.

3. Challenge: Complexity, manual effort of conventional ETL design. Emerging solution: Shift from custom coding to declarative design.

4. Challenge: Lack of knowledge capture. Emerging solution: Shift to pattern-driven development.

1-4

2

Oracle Data Integrator Architecture Overview

1-5

Oracle Data Integrator Architecture

Service Interfaces and Developer APIs Design-TimeUser InterfacesDesigner Operator Thin Client

Java design-time environment Runs on any platform Thin client for browsing MetadataAgent Data Flow Conductor

Runtime

Data Flow Generator Knowledge Module Interpreter

Data Flow Generator Runtime Session Interpreter

Java runtime environment Runs on any platform Orchestrates the execution of data flows

Knowledge Modules

Data Flow

Metadata repository Pluggable on many RDBMS Ready for deployment Modular and extensible metadata

Metadata ManagementMaster Repository Work Repositories Runtime Repositories

1-6

3

ODI Detailed ArchitectureDevelopmentODI Design-Time Environment User Interfaces Topology/Security Administrators Design-time Metadata/Rules Repositories DesignersESB Files / XML

Development Servers and Applications Execution Agent Data Flow Conductor Return CodesCRM Data Warehouse

Code Execution Log

Legacy ERP

Production

Scenarios and Projects Releases

ODI Runtime Environment User Interfaces Topology/Security Administrators Execution Log Operators Runtime Repository Code Execution Log Agent Data Flow Conductor Execution Return Codes

Production Servers and Applications

CRM

Data Warehouse

Legacy

Thin Client Metadata Lineage Data Stewards Metadata Navigator ESB

ERP Files / XML

1-7

Oracle Data Integrator Data Movement and Transformation from Multiple Sources to Heterogeneous Targets

KEY DIFFERENTIATED FEATURES AND BENEFITS:

1. Performance: Heterogeneous E-LT
2. Flexibility: Active Integration Platform
3. Productivity: Declarative Design
4. Hot-Pluggable: Knowledge Modules

1-8

4

1 1

Differentiator: E-LT ArchitectureHigh PerformanceConventional ETL Architecture

Transform in Separate ETL Server: Proprietary Engine, Poor Performance, High Costs (IBM's and Informatica's approach)

Extract

Transform

Load

Transform in Existing RDBMS Leverage Resources Efficient High Performance

Next Generation Architecture

E-LTTransform Transform Extract Load

BenefitsOptimal Performance & Scalability Easier to Manage & Lower Cost

1-9

2 2

Differentiator: Active IntegrationBatch, Event-based, and Service-oriented IntegrationOracle Data Integrator

Evolve from Batch to Near Realtime Warehousing on Common Platform Unify the Silos of Data Integration Data Integrity on the Fly Services Plug into Oracle SOA Suite

Event Conductor Event-oriented Integration

Service Conductor Service-oriented Integration

Metadata Declarative Design

Data-oriented Integration Data Conductor

Benefits Enables real-time data warehousing and operational data hubs Services plug into Oracle SOA Suite for comprehensive integration

1-10

5

3 3

Differentiator: Declarative DesignDeveloper ProductivityConventional ETL Design

Specify ETL Data Flow Graph Developer must define every step of Complex ETL Flow Logic Traditional approach requires specialized ETL skills And significant development and maintenance efforts

Declarative Set-based Design: Reduces the number of steps and automatically generates the data flow, whatever the sources and target DB.

ODI Declarative Design: 1. Define What You Want; 2. Define How: Built-in Templates; then automatically generate the dataflow.

Benefits: Significantly reduced learning curve; shorter implementation times; streamlined access for non-IT professionals.

1-11

4 4

Differentiator: Knowledge ModulesHot-Pluggable: Modular, Flexible, ExtensibleReverse Engineer Metadata Journalize Read from CDC Source Load From Sources to Staging Check Constraints before Load Integrate Transform and Move to Targets Service Expose Data and Transformation ServicesWS WS WS

Pluggable Knowledge Modules Architecture

Reverse

Staging Tables

Load CDCSources Journalize

Integrate CheckError Tables Target Tables

Services

Sample out-of-the-box Knowledge ModulesSAP/R3 Siebel Log Miner SQL Server Triggers Oracle DBLink JMS Queues Check MS Excel TPump/ Multiload Oracle Merge Siebel EIM Schema Oracle Web Services

DB2 Journals

DB2 Exp/Imp

Oracle SQL*Loader

Check Sybase

Type II SCD

DB2 Web Services

Benefits Tailor to existing best practices Ease administration work Reduce cost of ownership

1-12

6

Oracle Data Integrator General Overview

1-13

Overview: 6 steps to Production1. 2. 3. Retrieve/Enrich metadata Design transformations Orchestrate data flowsDevelopmentDevelopment Servers and Applications

4. 5. 6.

Generate/Deploy data flows Monitor executions Analyze impact / data lineageProductionProduction Servers and Applications

CRM

Data Warehouse

CRM

Data Warehouse

Legacy

Legacy

ERP ESB Files / XML ESB Files / XML

ERP

ODI Design-Time EnvironmentUser Interfaces Administrators Designers Design-time Design-time Repositories Repositories Agent Data Flow Conductor Runtime Repository

ODI Runtime EnvironmentAgent Data Flow Conductor User Interfaces Operator Metadata Navigator

1-14

7

Extended Capabilities

1-15

Extended Capabilities Master Data Management enabled Common Format Designer Automated generation of canonical format and transformations Built-in Data Integrity

Real-time enabled Changed Data Capture Message Oriented Integration (JMS)

SOA enabled Generation of Data Services Generation of Transformation Services

Extensibility Knowledge Modules Framework Scripting Languages Open Tools

1-16

8

Use Cases

1-17

E-LT for Data WarehouseCreate Data Warehouse for Business Intelligence Populate Warehouse with High Performance ODIHeterogeneous sources and targets Incremental load Slowly changing dimensions Data integrity and consistency Changed data capture Data lineage

Load Transform Capture Changes

Incremental Update Data Integrity

Aggregate Export

Cube

Operational

Analytics

-------------

Data Warehouse

Cube

Cube

Metadata

1-18

9

SOA InitiativeEstablish Messaging Architecture for Integration Incorporate Efficient Bulk Data Processing with ODI

Generate Data Services Expose Transformation Services

Deploy and reuse Services

Services

Business Processes

Data Access -------------

Transformation

Invoke external services for data integration Deploy data services Deploy transformation services Integrate data and transformation services in your SOA infrastructure

Operational

Others

Metadata

1-19

Master Data ManagementCreate Single View of the Truth Synchronize Data with ODIUse in conjunction with packaged MDM solution Use as infrastructure for designing your own hub Create declarative data flows Capture changes (CDC) Reconcile and cleanse the data Publish and share master data Extend metadata definitions

Change Data Capture Master Data Load

Canonical Format Design Cleansing and Reconciliation

Master Data Publishing

CDC CDC -------------

Master Data

CDC -------------

Metadata

1-20

10

MigrationUpgrade Applications or Migrate to New Schema Move Bulk Data Once and Keep in Sync with ODI

Initial bulk load CDC for synchronization

Transformation to new application format

CDC for loopback synchronization

CDC

-------------

CDC

Bulk-load historical data to new application Transform source format to target Synchronize new and old applications during overlap time Capture changes in a bidirectional way (CDC)

Old ApplicationsOther Sources

New Application

Metadata

1-21

ODI Enhances Oracle BIPopulate Warehouse with High Performance ODIOracle BI Suite EEAnswers Interactive Dashboards Publisher Delivers

Oracle Business Intelligence Suite EE:Simplified Business Model View Advanced Calculation & Integration Engine Intelligent Request Generation Optimized Data Access

Oracle BI Presentation Server Oracle BI Server

Oracle BI Enterprise Data Warehouse

Bulk E-LT Oracle Data IntegratorE-LT Agent E-LT Metadata

Oracle Data Integrator:Populate Enterprise Data Warehouse Optimized Performance for Load and Transform Extensible Pre-packaged E-LT ContentSiebel CRM

SAP/R3

PeopleSoft

Oracle EBS

1-22

11

ODI Enhances Oracle SOA SuiteAdd Bulk Data Transformation to BPEL ProcessOracle SOA SuiteBPEL Process ManagerBusiness Activity Monitoring Web Services Manager Descriptive Rules Engine Enterprise Service Bus

Oracle SOA Suite:BPEL Process Manager for Business Process Orchestration

Oracle Data IntegratorE-LT Agent E-LT Metadata

Oracle Data Integrator:Efficient Bulk Data Processing as Part of Business Process Interact via Data Services and Transformation Services

Bulk Data Processing

1-23

ODI Enhances Oracle SOA SuitePopulate BAM Active Data Cache EfficientlyOracle SOA SuiteBusiness Activity MonitoringEvent Monitoring Web Applications Event Engine Report Cache Descriptive Rules Engine BPEL Process Manager Web Services Manager

Oracle SOA Suite:Business Activity Monitoring for Real-time Insight

Active Data Cache

Enterprise Service Bus

Oracle Data Integrator:Oracle Data IntegratorBulk and Real-Time Data Processing Agent Metadata

High Performance Loading of BAMs Active Data Cache Pre-built and Integrated

Data Warehouse SAP/R3

CDCPeopleSoft

Message Queues

1-24

12

Links and References IAS (Internal):http://ias.us.oracle.com/portal/page?_pageid=33,1704614&_dad=portal&_schema=PORTAL

OTN (external):http://otn.oracle.com/goto/odi

Product Management Support:[email protected]

Field support:[email protected]

Forum:http://forums.oracle.com/forums/forum.jspa?forumID=374&start=0

KMs:http://odi.fr.oracle.com

Product Management Wiki:http://aseng-wiki.us.oracle.com/asengwiki/display/ASPMODI/Oracle+Data+Integrator+Product+Management

1-25

Lesson summary

Data Integration Challenges; Market Positioning of ODI

Key Differentiators

1-26

13

1-27

14

Oracle Data Integrator Architecture

22-1

Objectives: After completing this lesson, you should: Know the different components of the ODI architecture; Understand the structure of the Repositories.

2-2

1

Components

2-3

Graphical Modules

Designer Reverse-Engineer Develop Projects Release Scenarios Java - Any Platform Any ISO-92 RDBMS

Operator Operate production Monitor sessions

Topology Manager Define the infrastructure of the IS

Security Manager Manage user privileges

Repository

2-4

2

Run-Time ComponentsDesigner Reverse-Engineer Develop Projects Release Scenarios Java - Any Platform Operator Operate production Monitor sessionsMonitor sessions View Reports

Submit Jobs

Repository

Any ISO-92 RDBMS Scheduler Agent Handles schedules Orchestrate sessions Java - Any Platform

Read sessions Write reports

Lightweight Distributed Architecture

Return Code

Execute Jobs

Information System

2-5

Metadata NavigatorAny Web Browser Browse metadata lineage Operate production

Repository

Any ISO-92 RDBMS Scheduler Agent Handles schedules Orchestrate sessions Java - Any Platform Metadata Navigator Web access to the repository J2EE Application Server

Submit Executions

Return Code

Execute Jobs

Information System

2-6

3

SOADesigner Generate and deploy Web Services

Repository

Any ISO-92 RDBMS Scheduler Agent Handles schedules Orchestrate sessions Java - Any Platform Tomcat / OC4J Web Services presentation J2EE Application Server

Exposes Scenarios for Executions

Return Code

Execute Jobs

Information System

Exposes Data and Changed Data

2-7

Components: a Global ViewDesigner Reverse-Engineer Develop Projects Release Scenarios Java - Any Platform Operator Operate production Monitor sessions Topology Manager Define the IS infrastructure Security Manager Manage user privileges Any Web Browser Browse metadata lineage Operate production

Repository

Any ISO-92 RDBMS Scheduler Agent Handles schedules Orchestrate sessions Java - Any Platform Information System Repository Access HTTP Connection Execution Query Metadata Navigator Web access to the repository J2EE Application Server

2-8

4

ODI Repositories

2-9

Master and Work RepositoriesSecurity Topology Versioning Master Repository

Models Projects Execution Work Repository (Development) Execution Execution Repository (Production)

Two types of repositories: Master and Work. Work Repositories are always attached to a Master Repository.

2 - 10

5

Example of a Repository Set-UpSecurity Topology VersioningCreate and archive versions of models, projects and scenarios Import released and tested versions of scenarios for production

Master Repository

Import released versions of models, projects and scenarios for testing

Models Projects Execution Work Repository (Development) Models Projects Execution Work Repository (Test & QA)

Execution Execution Repository (Production)

Development Test Production Cycle 2 - 11

Lesson summary

Structure of the Repository

Components of the Architecture

2 - 12

6

2 - 13

7

Oracle Data Integrator First Project Simple Transformations: One source, one target

33-1

Objectives: After completing this lesson, you will know how to: Create a first, basic interface; Create a filter; Select a Knowledge Module and set the options; Understand the generated code in the Operator interface.

3-2

1

Anatomy of ODI Transformations

3-3

Quick Overview of Designer

Toolbar

Workspace Object Tree

Selection Panel

Metadata

Project

3-4

2

Terminology: ETL/ELT projects are designed in the Designer tool. Transformations in ODI are defined in objects called Interfaces. Interfaces are stored in Projects. Interfaces are sequenced in a Package that is ultimately compiled into a Scenario for production execution.

3-5

Interface An Interface will define Where the data are sent to (the Target) Where the data are coming from (the Sources) How the data are transformed from the Source format to the target format (the Mappings) How the data are physically transferred from the sources to the target (the data Flow)

Source and target are defined using Metadata imported from the databases and other systems Mappings are expressed in SQL Flows are defined in Templates called Knowledge Modules (KMs)

3-6

3

Creating, Naming a New Interface

Interfaces are created in Projects To create any object in ODI, right-click on the parent node and select Insert xyz This is true for interfaces as well: On the projects Interfaces entry, select Right-Click/Insert Interface.

3-7

Interfaces: The Diagram

3-8

4

Selection of Sources and TargetDrag and drop the Metadata from the tree into the interface to make these sources or targetsSource Tables Target Table (single target)

Metadata

3-9

Automatic Mappings

Automatic Mapping creates mappings by matching column names automatically. ODI will prompt you before doing so: you have the option to disable this feature.

3-10

5

Mappings in the InterfaceTarget Columns (click here to open the mapping field)

Mapping expressions (read only)

Type or edit your mapping expressions here Expression Editor button

3-11

Using the Expression Editor1. Click the expression editor button ( ) in the mapping window 2. Build your SQL expressions from the SQL help at the bottom, and from the Columns at the left

3-12

6

Note: An interface only populates a single target datastore. To populate several targets, you need several interfaces.

3-13

Valid Mapping Types. The following types of clauses may be used in the mappings (see the examples after this list):

Value: String values should be enclosed in single quotes (e.g. 'SQL', '5'); numeric values such as 10.3 are not quoted.

Source Column: Drag and drop the column or use the expression editor. It is prefixed by the datastore's alias, e.g. SRC_SALES.PROD_ID.

DBMS Function: Use the expression editor for the list of supported functions and operators.

DBMS Aggregate: MAX(), MIN(), etc. ODI automatically generates the GROUP BY clause.

Combination: Any combination of clauses is allowed, e.g. SRC_SALES_PERSON.FIRST_NAME || ' ' || UCASE(SRC_SALES_PERSON.LAST_NAME)
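As an illustration only (the SRC_ORDER_LINES and SRC_CUSTOMER names below are hypothetical, not part of the workshop schema), a mapping expression typed in a target column's mapping field might look like:

-- DBMS aggregate: ODI generates the GROUP BY on the non-aggregated columns
SUM(SRC_ORDER_LINES.AMOUNT * SRC_ORDER_LINES.QTY)

-- Combination of a source column and DBMS functions
UCASE(SRC_CUSTOMER.LAST_NAME) || ', ' || SRC_CUSTOMER.FIRST_NAME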

3-14

7

Filtering Data: Drag and drop a column on the background area, then type the filter expression.

Check expression. SQL filter expression Execution location Expression editor Save expression
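The filter expression itself is plain SQL over the source columns. Reusing the ORDERS filter from the data integration scenario used later in the course (the status value is just the one used in that example):

-- Filter typed on the ORDERS source datastore
ORDERS.STATUS = 'CLOSED'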

3-15

Saving the Interface

Click the Apply button to save the interface You can press the OK button to save and close the interface. The Cancel button closes the interface without saving it. Interfaces are saved in the Work Repository.

3-16

8

Notey ce ma nt er f a An i han more t hav e ur ce. one so esso n, r t h i s l y u se Fo l o nl we wil ce. ur one so

3-17

Interfaces: The Flow

3-18

9

Graphical Representation of the Flow Source and target systems are graphically represented in the Flow tab This is where KM are chosen, and KM options are set

3-19

KM and KM OptionsClick on the caption to display Loading KM choices and options Click on the caption to display the Integration KM choices and options

Select the appropriate KM

Set the option values as needed

3-20

10

Important Noteant ! Import r e t ha t ake su M o pr i at e e ap p r th

dge K now l e hav e s M o dul e por t ed

m been i project ! e i nt o t h

3-21

Interfaces: Execution

3-22

11

Requirements To run an interface, you need at least the following: A target table An Integration Knowledge Module (selected in the Flow tab) A Loading Knowledge Module if there is a remote source.

If you have all the prerequisites, you are ready to execute the interface.

3-23

Running an Interface Simply click the Execute button

3-24

12

Follow-up of the Execution: Logs and Generated Code

3-25

Code Generation When we ask ODI to Execute the transformations, ODI will generate the necessary code for the execution (usually SQL code) The code is stored in the repository The execution details are available in the Operator Interface: Statistics about the jobs (duration, number of records processed, inserted, updated, deleted) Actual code that was generated and executed by the database Error codes and error messages returned by the databases if any

3-26

13

The Operator Interface Start the operator interface from the Windows menu or from the ODI toolbar

3-27

Refresh the Logs Display By default, ODI will not refresh the logs. There are two ways to refresh the logs: Manual refresh: click on this icon in the toolbar: Automatic refresh: Set the refresh rate (in seconds) in the toolbar and click on this icon in the toolbar:

3-28

14

Multiple Levels of DetailsJob level details

Specific step in the job

Actual code sent to the systems (SQL or other)

3-29

Errors Reporting

The red icon in the tree indicates the steps that failed Error Codes and Error Messages are reported at all levels

3-30

15

Information Available for each LevelTime Information Statistical Information Generated Code

3-31

Understanding the Operator Icons: Running, Success, Failure, Warning, Waiting to be executed, Queued by the agent

3-32

16

Course Summary: Create Interfaces and define transformations (mappings). Understand data flows, select KMs and set KM options.

Execute an Interface.

Understand how to follow up on the execution.

3-33

3-34

17

Oracle Data Integrator Transformations: Adding More Complexity

44-1

Objectives: After completing this lesson, you will: Understand how to design an interface with multiple sources. Know how to define relations between the sources using joins. Better understand an interface's flow. Be able to customize the default flow of an interface. Be able to appropriately choose a Staging Area.

4-2

1

Adding More than One Source

4-3

Multiple Sources

You can add more than one source datastore to an interface. These datastores must be linked using joins. Two ways to create joins: References in the models automatically become joins in the diagram. Joins must be manually defined in the diagram for isolated datastores.

4-4

2

Note: Important! All datastores must be joined, directly or indirectly.

4-5

Manually Creating a Join

1.

Drag and drop a column from one datastore onto a column in another datastore. A join linking the two datastores appears in the diagram. In the join code box, an expression joining the two columns also appears.

2.

Modify the join expression to create the required relation.You can use the expression editor.

3. 4.

Check the expressions syntax if possible. Test the join if possible.

4-6

3

Setting up a JoinJoins can be defined across technologies (here a database table and a flat file) The number of joins per interface is not limited

SQL join expression (technology-dependent). Execution location

Validate expression

Expression editor Save expression Join order (ISO-92 Syntax) Use ISO-92 syntax Automatically calculate order

Join type Inner/Outer, Left/Right.

4-7

Types of Joins. The following types of joins exist (see the example after this list):

Cross Join: Cartesian product. Every combination of any Customer with any Order, without restriction.

Inner Join: Only records where a customer and an order are linked.

Left Outer Join: All the customers combined with any linked orders, or blanks if none.

Right Outer Join: All the orders combined with any linked customer, or blanks if none.

Full Outer Join: All customers and all orders.
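The join itself is typed as a SQL expression relating columns of the two source datastores. Reusing the ORDERS/LINES join from the data integration scenario later in this lesson:

-- Join expression between the two source datastores
ORDERS.ORDER_ID = LINES.ORDER_ID
-- Selecting Left Outer Join on the ORDERS side would also keep orders that have no lines.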

4-8

4

Advanced Considerations on Filters, Joins, Mappings

4-9

Options for Filters, Joins and Mappings

Active Mapping When unchecked, the filter, join or mapping is disabled for this interface

Enable mapping for update and/or insert Allows mappings to only apply to updates or inserts. By default, both insert and update are enabled

Choose the update key by selecting the Key checkbox Change the execution location of the filter, join or mapping.

4-10

5

Setting Options for Filters, Joins and MappingsActivate/Deactivate For mappings, filters or joins Execution Location For mappings, filters or joins Insert/Update For mappings Part of the Update Key For target columns (mappings)

Active Mapping When unchecked, the filter, join or mapping is disabled for this interface Enable mapping for update and/or insert Allows mappings to only apply to updates or inserts. By default, both insert and update are enabled Choose the update key by selecting the Key checkbox Change the execution location of the filter, join or mapping.

4-11

Note: Update Keys for Flow Control. To perform updates or use Flow Control, you must define an update key for the interface.

4-12

6

What is an Update Key?

An update key: is a set of columns capable of uniquely identifying one row in the target datastore; is used for performing updates and flow control; can be one of the primary/unique keys defined for the datastore, or a key defined specially for the interface (see the sketch below).
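A minimal sketch of why the update key matters, assuming a hypothetical TARGET_CUSTOMER table with update key CUST_ID and an I$_CUSTOMER integration table (the real statement is produced by the IKM you select):

-- Rows are matched on the update key before being updated
UPDATE TARGET_CUSTOMER T
SET    (T.LAST_NAME, T.CITY) = (SELECT I.LAST_NAME, I.CITY
                                FROM   I$_CUSTOMER I
                                WHERE  I.CUST_ID = T.CUST_ID)
WHERE  EXISTS (SELECT 1 FROM I$_CUSTOMER I WHERE I.CUST_ID = T.CUST_ID)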

4-13

How to Define the Update Key
1. Go to the Diagram tab of the interface.
2. Select the Target Datastore.
3. Select the Update Key in the properties panel.

To define a new key in the interface only:
1. Choose <Undefined> for the update key.
2. Select one target column to make part of the update key.
3. Check the Key checkbox in the properties panel.
4. Repeat for each column in the update key.

To define a new key for the table that could be used in other interfaces:
1. Go back to the Model.
2. Expand the table.
3. Right-click on Constraints and add a new key (more on this in a later chapter).

4-14

7

How to Change the Execution Location. For mappings, filters and joins, you can choose where the operation will take place: the source database, the staging area or the target database (target execution applies to mappings only, and only to mappings that use literals and database functions).
1. Go to the interface's Diagram tab.
2. Select the filter, join or mapping to edit.
3. Select an execution location from the properties panel. Not every execution location is always possible. The object must be set to Active first.

4-15

Why Change the Execution Location?

You may need to change the execution location if: The technology at the current location does not have the features required Files, JMS, etc do not support transformations A required function is not available

The current location is not available for processing: the machine can't handle any more demand

ODI does not allow this location It is not possible to execute transformations on the target.

4-16

8

Note Moving the Staging Area

Take care when changing the execution locations or moving the staging area. You should double-check the transformation syntax.

4-17

Data Flow Definition

4-18

9

What is the Flow?

Flow The path taken by data from the sources to the target in an ODI interface. The flow determines where and how data will be extracted, transformed, then integrated into the target.

4-19

Note

Understanding the flow will avoid many problems at run time. Mastering this concept will help you to improve performance.

4-20

10

What Defines the Flow?

Three factors: Where the staging area is located On the target, on a source or on a third server

How mappings, filters and joins are set up Execution location: Source, target or staging area Whether transformations are active

Choice of Knowledge Modules LKM: Loading Knowledge Module IKM: Integration Knowledge Module

4-21

A Data Integration Scenario. Filter: ORDERS.STATUS = 'CLOSED'

Source SybaseORDERS

Target Oracle

Mapping - SALES = SUM(LINES.AMOUNT) + CORRECTION.VALUE. - SALES_REP = ORDERS.SALES_REP_ID

LINES

SALES

CORRECTIONS File Join - ORDERS.ORDER_ID = LINES.ORDER_ID

4-22

11

The Basic ProcessSequence of operations with or without an integration toolSource: SybaseTransform & Integrate ORDERS

Target: OracleSALES

11LINES Extract/Join/Transform

C$_0

55 33I$_SALES

Join/Transform

CORRECTIONS File

22Extract/Transform

C$_1

4-23

What Is the Staging Area?

Staging Area A separate, dedicated area in an RDBMS where ODI creates its temporary objects and executes some of your transformation rules. By default, ODI sets the staging area on the target data server.

4-24

12

Case Study: Placing the Staging Area

The Staging Area may be located: On the target database (default). On a third RDBMS database or the Sunopsis Memory Engine. On the source database.

The Staging Area cannot be placed on non relational systems (Flat files, ESBs, etc.)

4-25

Note: Staging Area Must Be an RDBMS. Only schemas located on RDBMS technologies can act as the staging area. Files, MOM, LDAP and OLAP bases cannot. When the technology of the interface's target is not an RDBMS, the staging area must be moved to another schema.

4-26

13

How to change the Staging Area

1. Go to the Definition tab of your interface.
2. To choose the Staging Area, check the Staging Area Different From Target option, then select the logical schema that will be used as the Staging Area. To leave the Staging Area on the target, uncheck the Staging Area Different From Target option.
3. Go to the Flow tab.
4. You can now see the new flow.

4-27

Case #1:Staging Area on Target

Target (Oracle) Source (Sybase)ORDERS

Staging AreaTransform & Integrate

11LINES Extract/Join/Transform

C$_0

55 33I$_SALES

SALES

Join/Transform

CORRECTIONS File

22Extract/Transform

C$_1

4-28

14

Case #1 in ODIStaging Area in the Target

Staging Area + Target

Source Sets

4-29

Case #2: Staging on Middle Tier

DB2 UDB, Sunopsis Engine, etc. Source (Sybase) Staging AreaORDERS Transform & Integrate

Target (Oracle)SALES

11LINES Extract/Join/Transform

C$_0

55 33I$_SALES

Join/Transform

CORRECTIONS File

22Extract/Transform

C$_1

4-30

15

Case #2 in ODIStaging Area is the Sunopsis Memory Engine

Target

Source Sets

Staging Area

4-31

Case #3: Staging on Source

Source (Sybase)ORDERS

Staging Area 11C$_0

Transform & Integrate

Target (Oracle)SALES

55 33I$_SALES

LINES

Extract/Join/Transform Join/Transform

C$_1

22CORRECTIONS File Extract/Transform

4-32

16

Case #3 in ODIStaging Area in the Source

Target

Source Sets

Staging Area

4-33

Note Staging Area Syntax

The choice of the staging area determines the syntax used by all mappings, filters and joins executed there.

4-34

17

Which KMs for What Flow?

When processing happens between two data servers, a data transfer KM is required. Before integration (Source Staging Area) Requires an LKM, which is always multi-technology At integration (Staging Area Target) Requires a multi-technology IKM

When processing happens within a data server, it is entirely performed by the server. A single-technology IKM is required. No data transfer is performed

4-35

Which KMs for What Flow? Four possible arrangements:Loading phase Multi-tech LKM Multi-tech LKM (No LKM needed)Staging area on source

Source

Staging area

Integration phase Multi-tech IKM Single-tech IKMStaging area on target

Target

Multi-tech IKM Single-tech IKM

(No LKM needed)

Source, staging area and target in same location

4-36

18

More on KMs

KMs can skip certain operations Unnecessary temporary tables will not be created

Some KMs lack certain features Multi-technology IKMs can not perform Flow control IKMs to File, JMS, etc do not support Static control

All KMs have configurable Options

4-37

Case #1Using the Target as the Staging AreaTarget (Oracle) Source (Sybase)ORDERS

Staging Area

LKM_1 LKM_1LINES LKM SQL to Oracle

C$_0

IKM_1 IKM_1 IKM_1 IKM_1I$_SALES

SALES

IKM Oracle Incremental Update

CORRECTIONS File

LKM_2 LKM_2LKM File to Oracle (SQLLDR)

C$_1

IKM Oracle Incremental Update

4-38

19

Case #2Using a third server as the Staging AreaSunopsis Memory Engine Source (Sybase) Staging AreaORDERS IKM SQL to SQL Append

IKM_1 IKM_1 LKM_1 LKM_1C$_0

Target (Oracle)SALES

LINES LKM SQL to SQL

IKM_1 IKM_1C$_1

I$_SALES

CORRECTIONS File

LKM_2 LKM_2LKM File to SQL

IKM SQL to SQL Append

4-39

Case #3Using the Source as the Staging Area

Source (Sybase)ORDERS

IKM SQL to SQL Append

Target (Oracle)SALES

Staging Area IKM_1 IKM_1C$_0 IKM SQL to SQL Append I$_SALES

IKM_1 IKM_1

LINES

IKM SQL to SQL Append

IKM_1 IKM_1C$_1

LKM_1 LKM_1CORRECTIONS File LKM File to SQL

4-40

20

How to Specify an LKM

1. Go to the interface's Flow tab.
2. Select the Source Set from which data will be extracted. The KM property panel opens.
3. Change the Name of the Source Set (optional).
4. Select an LKM.
5. Modify the LKM's options.

4-41

Note Default KMs

ODI chooses a default KM wherever possible. A flag appears in the flow if a default KM is used or if no KM is set.

4-42

21

How to Specify an IKM

1. Go to the interface's Flow tab.
2. Select the Target. The KM property panel opens.
3. Check/uncheck Distinct Rows.
4. Select an IKM.
5. Set the IKM's options.

4-43

Common KM Options. The following options appear in most KMs:

INSERT / UPDATE: Should data be inserted/updated in the target?
COMMIT: Should the interface commit the inserts/updates? If no, a transaction can span several interfaces.
FLOW CONTROL: Should data in the flow be checked?
STATIC CONTROL: Should data in the target be checked after the interface?
TRUNCATE / DELETE ALL: Should the target data be truncated or deleted before integration?
DELETE TEMPORARY OBJECTS: Should temporary tables and views be deleted or kept for debugging purposes?

4-44

22

Note: The Staging Area Trade-off. The KM and Staging Area should be chosen to reduce the quantity of data transferred, yet provide the required transformation and data checking capabilities.

4-45

Lesson Summary: Using multiple, heterogeneous source datastores.

Locating the Staging Area.

Understanding the Flow.

Creating joins.

Choosing Knowledge Modules.

4-46

23

4-47

24

Oracle Data Integrator Quick Introduction to Metadata Management

55-1

Objectives: After completing this lesson, you should: Have a generic understanding of the metadata in ODI; Be ready for a more exploratory hands-on exercise tying together metadata and advanced transformations.

5-2

1

Metadata in ODIMetadata in ODI are available in the Model tab. Each Model will contain the tables from database schema. A model can contain all tables from a schema, or only a subset of the tables of the schema Models can contain sub models for an easier organization of the tables from a schema

5-3

A Special Case: XML. ODI comes with its own JDBC driver for XML files. The XML file is viewed as a database schema where: elements become tables, and attributes of the elements become columns of the tables.

To maintain the hierarchical view of the XML file, the driver will automatically create primary keys and foreign keys. To retain the order in which the records appear in the XML file, the driver will add an Order column.
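As a sketch only, with a hypothetical XML fragment and illustrative column names (not the driver's exact naming convention):

-- <customer id="1"><purchase ref="A12"/></customer> might be exposed as:
--   CUSTOMER(CUSTOMERPK, ID, CUSTOMERORDER)
--   PURCHASE(PURCHASEPK, CUSTOMERFK, REF, PURCHASEORDER)
SELECT c.ID, p.REF
FROM   CUSTOMER c, PURCHASE p
WHERE  p.CUSTOMERFK = c.CUSTOMERPK
ORDER BY p.PURCHASEORDER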

5-4

2

Lesson summary: Introduction to Models

5-5

5-6

3

Oracle Data Integrator Data Quality (Integrity Control)

66-1

Objectives: After completing this lesson, you will: Know the different types of data quality business rules ODI manages; Be able to enforce data quality with ODI; Understand how to create constraints on datastores.

6-2

1

When to Enforce Data Quality? The IS can be broken into 3 sub-systems Source application(s) Data integration process(es) Target application(s)

Data Quality should be managed in all three sub-systems ODI provides the solution for enforcing quality in all three.

6-3

Data Quality Business Rules Defined by designers and business analysts Stored in the Metadata repository May be applied to application data Defined in two ways: Automatically retrieved with other metadata Rules defined in the databases Obtained by reverse-engineering Manually entered by designers User-defined rules

6-4

2

From Business Rules to Constraints De-duplication rules Primary Keys Alternate Keys Unique Indexes

Reference rules Simple: column A = column B Complex: column A = function(column B, column C)

Validation rules Mandatory Columns Conditions

6-5

Overview of the Data Quality System

SourceORDERS Errors Integration Process LINES

Target

SALES

Static Control is started - Automatically (scheduled) - manually

Errors

Flow Control is started - by Interfaces during executionCORRECTIONS File

Static Control is started - by Interfaces after integration - by Packages - manually

Error Recycling is performed - by Interfaces

6-6

3

Static/flow Control Differences Static Control (static data check) Checks whether data contained in a datastore respects its constraints. Requires a primary key on the datastore.

Flow Control (dynamic data check) Enforces target datastore constraints on data in the flow. Requires an update key defined in the interface. You can recycle erroneous data back into the flow.

6-7

Properties of Data Quality ControlStatic and flow checks can be triggered: by an interface (FLOW and/or STATIC) by a package (STATIC) manually (STATIC)

require a Check Knowledge Module (CKM) are monitored through Operator copy invalid rows into the Error table Flow control then deletes them from flow. Static control leaves them in data stores. Error table can be viewed from Designer or any SQL tool.

6-8

4

Constraints in ODI Mandatory Columns Keys Primary Keys Alternate Keys Indexes

References Simple: column A = column B Complex: column A = function(column B)

Conditions

6-9

Mandatory Columns

1. Double-click the column in the Models view. 2. Select the Control tab. 3. Check the Mandatory option. 4. Select when the constraint should be checked (Flow/Static).

6-10

5

Keys

1. 2. 3. 4. 5. 6.

Select the Constraints node under the datastore. Right-click, select Insert Key. Fill in the Name. Select the Key or Index Type Go to the Columns tab Add/remove columns from the key.

6-11

Checking Existing Data with a New Key

1. 2.

3. 4.

Go to the Control tab. Select whether the key is Defined in the Database, and is Active Select when the constraint must be checked (Flow/Static). Click the Check button to perform a synchronous check of the key.

Number of duplicate rows

6-12

6

Note: Synchronous Check Limitations. Important! Synchronous checks only work on SQL-based systems. The result of the check is not saved.

6-13

Creating a Reference

1. Select the Constraints node under the datastore.
2. Right-click, select Insert Reference.
3. Fill in the Name.
4. Select the reference type: User Reference or Complex Reference.
5. Select a Parent Model and Table, or set the model and table to <Undefined> to manually enter the catalog, schema and table name.

6-14

7

Creating a User Reference

1. 2. 3. 4.

5.

Go to the Columns tab Click the Add button Select the column from the Foreign Key table. Select the corresponding column from the Primary Key table. Repeat for all column pairs in the reference.

6-15

Creating a Complex Reference

1. Go to the Expression tab.
2. Set the Alias for the Primary Key table.
3. Code the Expression, prefixing columns with the tables' aliases. Use the Expression Editor (see the example below).
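A hedged example of a complex reference expression (aliases and column names are hypothetical): the child's code is compared to a value computed from two parent columns, i.e. column A = function(column B, column C):

ORD.CUST_CODE = CUS.COUNTRY_CODE || '-' || CUS.CUST_ID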

6-16

8

Checking Existing Data with a New Reference

1. 2. 3.

Go to the Control tab. Choose when the constraint should be checked (Flow/Static). Click the Check button to immediately check the reference. Not possible for heterogeneous references.

6-17

Creating a Condition

1. Right-click the Constraints node, select Insert Condition.
2. Fill in the Name.
3. Select the ODI Condition type.
4. Edit the condition clause. Use the Expression Editor (see the example below).
5. Type in the error message for the condition.
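A hedged example of a condition clause and its error message (datastore and column names are hypothetical):

-- Condition clause checked on each row of SRC_CUSTOMER
SRC_CUSTOMER.AGE BETWEEN 0 AND 120
-- Error message: 'Customer age must be between 0 and 120'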

6-18

9

Checking Existing Data with a New Condition

1. 2.

3.

Go to the Control tab Select when the constraint must be checked (Flow/Static). Click the Check button to perform a synchronous check of the condition.

6-19

Data Quality in the Interfaces

6-20

10

How to Enforce Data Quality in an InterfaceThe general process: 1. Enable Static/Flow Control 2. Set the options 3. Select the Constraints to enforce Table constraints Not null columns

4.

Review the erroneous records

6-21

How to Enable Static/Flow Control
1. Go to the interface's Flow tab.
2. Select the target datastore. The IKM properties panel appears.
3. Set the FLOW_CONTROL and/or STATIC_CONTROL IKM options to Yes.
4. Set RECYCLE_ERRORS to Yes if you want to recycle errors from previous runs.

6-22

11

How to Set the Options
1. Select the interface's Controls tab.
2. Select a CKM.
3. Set up the CKM options.
4. Set the Maximum Number of Errors Allowed. Leave blank to allow an unlimited number of errors. To specify a percentage of the total number of integrated records, check the % option.

6-23

How to Select Which Constraints to Enforce (for flow control)

For most constraints:
1. Select the interface's Controls tab.
2. For each constraint you wish to enforce, select Yes.

For Not Null constraints:
1. Select the interface's Diagram tab.
2. Select the target datastore column that you wish to check for nulls.
3. In the column properties panel, select Check Not Null.

6-24

12

Differences Between Control TypesStatic control Launched via CKM Defined on Options defined on Constraints defined on Invalid rows deleted (Default KM behavior) ModelModel

Flow control InterfaceInterface

InterfaceInterface

Model

Interface

Interface

Model

Model

Interface

Possible

Never

Always

6-25

How to Review Erroneous RecordsFirst, execute your interface. To see the number of records: 1. Select the Execution tab. 2. Find the most recent execution. The No. of Errors encountered by the interface is displayed.

To see which records were rejected: 1. Select the target datastore in the Models view. 2. Right-click > Control > Errors 3. Review the erroneous rows.

6-26

13

Lesson summary: Enabling Quality Control.

Manually creating constraints.

Data quality business rules.

How to enforce data quality.

Setting Options.

6-27

6-28

14

Oracle Data Integrator Metadata Management

77-1

Objectives: After completing this lesson, you should understand: Why metadata are important in ODI; Where to find your database metadata in ODI; How to import metadata from your databases; How to use ODI to generate your models.

7-2

1

Why Metadata? ODI is strongly based on the relational paradigm. In ODI, data are handled through tabular structures defined as datastores. Datastores are used for all type of real data structures: database tables, flat files, XML files, JMS messages, LDAP trees, The definition of these datastores (the metadata) will be used in the tool to design the data integration processes. Defining the datastores is the starting point of any data integration project

7-3

Models

7-4

2

Model Description Models are the objects that will store the metadata in ODI. They contain a description of a relational data model. It is a group of datastores stored in a given schema on a given technology. A model typically contains metadata reverse-engineered from the real data model (Database, flat file, XML file, Cobol Copybook, LDAP structure) Database models can be designed in ODI. The appropriate DDLs can then be generated by ODI for all necessary environments (development, QA, production)

7-5

Terminology. All the components of relational models are described in the ODI metadata (Relational Model / Description in ODI):

Table; Column / Datastore; Column
Not Null; Default value / Not Null (Mandatory); Default value
Primary keys; Alternate Keys / Primary keys; Alternate keys
Indexes; Unique Indexes / Not unique indexes; Alternate keys
Foreign Key / Reference
Check constraint / Condition

7-6

3

Additional Metadata Filters Apply when data is loaded from a datastore.

Heterogeneous references Link datastores from different models/technologies

Additional technical/functional metadata OLAP type on datastores Slowly changing dimension behavior on columns Read-only data types/columns User-defined metadata (FlexFields)

7-7

Importing Metadata: The Reverse Engineering Process

7-8

4

Two Methods for Reverse Engineering Standard reverse-engineering Uses JDBC connectivity features to retrieve metadata, then writes it to the ODI repository. Requires a suitable driver

Customized reverse-engineering: Reads metadata from the application/database system repository, then writes this metadata to the ODI repository. Uses a technology-specific strategy, implemented in a Reverse-engineering Knowledge Module (RKM).
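A minimal sketch of the kind of dictionary query an RKM for Oracle might run to collect column metadata before writing it to the repository (the schema name is hypothetical; real RKMs query several dictionary views):

SELECT table_name, column_name, data_type, data_length, nullable
FROM   all_tab_columns
WHERE  owner = 'SALES_DWH'
ORDER BY table_name, column_id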

7-9

Standard vs. Customized ReverseEngineeringFile-specific reverse-engineeringFixed formatCOBOL copybooks

ODI RepositoryModel (Metadata)

Oracle Data Integrator

Delimited format

MS SQL Server

JDBC DriverStandard Reverse-engineering

Data Model

System tables

Customized Reverse-engineering

7-10

5

Other Methods for Reverse-Engineering

Delimited format reverse-engineering File parsing built into ODI.

Fixed format reverse-engineering Graphical wizard, or through COBOL copybook for Mainframe files.

XML file reverse-engineering (Standard) Uses Sunopsis JDBC driver for XML.

LDAP directory reverse-engineering (Standard) Uses Sunopsis JDBC driver for LDAP.

7-11

Note: Reverse-engineering is incremental. New metadata is added, but old metadata is not removed.

7-12

6

Reverse Engineering In Action

7-13

Create and Name the New Model

1. 2. 3. 4. 5. 6.

Go to the Models view. Select Insert Model. Fill in the Name (and Code). Select the model Technology. Select the Logical Schema where the model is found. Fill in the Description (optional).

7-14

7

Note: A model is always defined in a given technology. If you change a technology, you must check every object related to that model.

7-15

How to Define a Reverse-Engineering Strategy1. 2. 3. 4. 5. Go to the Reverse tab. Select the Reverse-engineering type. Select the Context for reverseengineering. Select the Object Type (optional). Type in the object name Mask and Characters to Remove for the Table Alias (optional). If customized: Select the RKM. Select the Logical Agent.

6.

7-16

8

Optional: Selective Reverse-Engineering

1. 2. 3.

4. 5. 6.

Go to the Selective Reverse tab (Standard reverse only). Check the Selective Reverse option. Select New Datastores or/and Existing Datastores. Click Objects to Reverse. Select the datastores to reverseengineer. Click the Reverse button.

7-17

How to Start the Process If using customized reverseengineering:1. 2. 3. Click the Reverse button. Choose a log level, then click OK. Use Operator to see the results.

If using standard reverse-engineering:1. 2. 3. Optionally, set up Selective Reverse. Click the Reverse button. Follow the progress in the status bar.

7-18

9

Generating Metadata: The Common Format Designer

7-19

Add Elements Missing From Models Some metadata cannot be reverse-engineered JDBC driver limitations

Some metadata cannot exist in the data servers No constraints or keys on files or JMS messages Heterogeneous joins OLAP, SCD, etc.. User-defined metadata

Some business rules are not implemented in the data servers. Models implemented with no constraints Certain constraints are implemented only at the application level.

7-20

10

Fleshing Out Models ODI enables you to add, remove or edit any model element manually. You do this in Designer.

The model Diagram is a graphical tool to edit models. Requires the Common Format Designer component. You can update the database with your changes.

7-21

Lesson summary

Relational models

Reverse-engineering

Fleshing out models: why and how

7-22

11

7-23

12

Oracle Data Integrator Topology: Connecting to the World

88-1

Objectives: After completing this course, you will: Understand the basic concepts behind the Topology interface; Understand logical and physical architecture; Know how to plan a Topology; Have learnt current best practices for setting up a Topology.

8-2

1

What is the Topology? Topology The representation of the information system in ODI:Technologies: Oracle, DB2, File, etc. Datatypes for the given technology Data Servers for each technologies Physical Schemas under each data server ODI Agents (run-time modules) Definition of Languages and Actions

8-3

The Physical Architecture

8-4

2

Properties of Physical Schemas An ODI physical schema always consists of 2 data server schemas: The Data Schema, which contains the datastores The Work Schema, which stores temporary objects

A data server schema is technology-dependant. Catalog Name and/or Schema Name Example: Database and Owner, Schema

A data server has: One or more physical schemas One default physical schema for server-level temporary objects

8-5

Concepts in Reality

Technology / Data server / Schema:

Oracle / Instance / Schema
Microsoft SQL Server / Server / Database, Owner
Sybase ASE / Server / Database, Owner
DB2/400 / Server / Library
Teradata / Server / Schema
Microsoft Access / Database / (N/A)
JMS Topic / Router / Topic
File / File Server / Directory

8-6

3

Important Notes: It is strongly recommended that, for each data server, you create a dedicated area for temporary objects and use it as ODI's Work Schema. Under each data server, define a physical schema for each sub-division of the server that will be used.

8-7

Example InfrastructureProduction site: Boston WindowsMS SQL Server

LinuxOracle 9iACCOUNTING

db_dwh db_purchase

Oracle 10gSALES

Production site: Tokyo WindowsMS SQL Server A

WindowsMS SQL Server B

LinuxOracle

db_dwh db_purchase

ACCT SAL

8-8

4

The Physical Architecture in ODIMSSQL-Bostondb_dwh db_purchase

Oracle-Boston9ACCOUNTING

Oracle-Boston10SALES

LegendData serverPhysical schema

MSSQL-TokyoAdwh

MSSQL-TokyoB

Oracle-TokyoACCT

purchase

SAL

8-9

Prerequisites to Connect to a Server Drivers (JDBC, JMS) Drivers must be installed in /oracledi/drivers This should be done on all machines connecting to the data server.

Connection settings (server dependant) Machine name (IP Address), port User/Password Instance/Database Name,

8-10

5

Important Note: The user name is used to access all underlying schemas, databases or libraries in the data server. Make sure this user account has sufficient privileges.

8-11

Creating a Data Server1. 2. 3. 4.

Right-click the technology of your data server Select Insert Data Server Fill in the Name Fill in the connection settings:Data Server User and Password

(Optional) JNDI Connection

8-12

6

Creating a Data Server - JDBCSelect URL Select driver

1. 2. 3. 4. 5.

Select the JDBC tab Fill in the JDBC driver Fill in the JDBC URL Test the connection Click OK

8-13

The JDBC URL The JDBC driver uses a URL to connect to a database system. The URL describes how to connect to the database system. The URL may also contain driver-specific parameters

Use the select button to choose the driver class name and URL template.
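For example (the host name, port and SID below are hypothetical; only the driver class and URL pattern come from the driver documentation, also listed in the table at the end of this lesson):

Driver: oracle.jdbc.driver.OracleDriver
URL:    jdbc:oracle:thin:@dbserver01:1521:ORCL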

8-14

7

Testing a Data Server connection1. 2.

Click the Test button Select the Agent to test this ConnectionLocal (No Agent) performs the test with the Topology Manager GUI.

3.

Click Test The driver must be installed

8-15

Note: Always test the connection to check that the data server is correctly configured.

8-16

8

Creating a Physical Schema1. 2.

Right-click the data server and select Insert Physical Schema Select or fill in:Data Schema Work Schema

3. 4.

Select whether this is the Default schema Click OKA warning appears

8-17

The Logical Architecture

8-18

9

What is a Logical Schema? Developers should not have to worry about the actual location of the data servers, or about changes to user names, IP addresses, passwords, etc. To isolate them from the actual physical layer, the administrator creates a Logical Schema that is simply an alias for the physical layer.

8-19

Alias vs. Physical ConnectionDatawarehouse(Logical Schema)

Logical Architecture: the Alias

Physical Architecture: the Physical ConnectionWindowsMS SQL Server db_dwh

User: Srv_dev Password: 12456 IP:10.1.3.195 Database: db_dwh

Development site: New York, NY

8-20

10

Modifications of the Physical ConnectionDatawarehouse(Logical Schema)

Logical Architecture: the AliasChanges in the actual physical information have no impact on the developers who always refers to the same logical alias

Physical Architecture: the Physical ConnectionWindowsMS SQL Server db_dwh

User: Srv_prod Password: 654321 IP:10.1.2.221 Database: db_dwh

Production Server: Houston, TX

8-21

Mapping Logical and Physical ResourcesDatawarehouse(Logical Schema)

Logical ArchitectureBut changing the connectivity from one server to the other can become painful

Physical ArchitectureWindowsMS SQL Server db_dwh

WindowsMS SQL Server A dwh

WindowsMS SQL Server db_dwh db_purchase

Development site: New York, NY

QA: New York

Production site: Houston, TX

8-22

11

Mapping Logical and Physical ResourcesDatawarehouse(Logical Schema)

Logical Architecture. Contexts: Development, QA, Production.

For that purpose, the definition of Contexts allows you to attach more than one physical definition to a Logical Schema.

Physical ArchitectureWindowsMS SQL Server db_dwh


WindowsMS SQL Server A dwh

WindowsMS SQL Server db_dwh db_purchase

Development site: New York

Production site: Tokyo

Production site: Boston

8-23

Mapping Logical and Physical ResourcesCRM(Logical Schema)

Datawarehouse(Logical Schema)

Purchase(Logical Schema)

Logical Architecture

Contexts Physical Architecture

Production

Production

Production

Unix

WindowsMS SQL Server db_dwh db_purchase

Of course, a given context will map all physical connections

MS SQL Server

CRM

Production site: Boston

8-24

12

Note: Design-Time vs. Run-Time. In ODI, the design of data integration processes is done with logical resources. At run-time, execution is started in a particular context, and ODI will select the physical resources associated with that context.

8-25

Notes: Logical resources may remain unmapped to any physical resource in a given context. Unmapped resources cannot be used in that context. A single physical resource may be mapped in several contexts. In a given context, a logical resource is mapped to at most one physical resource.

8-26

13

Logical Architecture/Context views

Technology

Logical Schema Context Logical Agent

The same technologies are displayed in Physical and Logical Architecture views. You can reduce the number of technologies displayedWindows > Hide Unused Technologies

8-27

Linking Logical and Physical Architecture

1. 2. 3. 4. 5. 6.

Double-click the context Go to the Agents tab For each logical agent, select the corresponding physical agent in the context. Go to the Schemas tab For each logical schema, select the corresponding physical schema in the context. Click OK.

8-28

14

Planning Ahead for Topology

8-29

Planning the Topology

1. Identify the physical architecture: all data servers, all physical schemas, required physical agents.
2. Identify the contexts.
3. Define the logical architecture: name the logical schemas, name the logical agents.
4. On paper, write out a matrix of logical/physical mappings. This matrix helps you plan your topology.

8-30

15

Matrix of Logical/Physical Mappings

Logical Schemas: Accounting, Sales
Contexts: Development, Tokyo

Accounting / Development: ACCOUNTING in Oracle on Windows
Accounting / Tokyo: ACCT in Oracle on Linux
Sales / Development: SALES in Oracle on Windows

8-31

Advanced Topology: More on JDBC

8-32

16

Creating a Data Server: JNDI (Extra properties)

1. Select the JNDI tab.
2. Set the JNDI parameters: Authentication, User/Password, Protocol, Driver, URL, Resource.
3. Run the connection Test.
4. Click OK.

8-33

JDBC Driver

A JDBC driver is a Java driver that provides access to a type of database.
Type 4: Direct access via TCP/IP
Type 3: Three-tier architecture
Type 2: Requires the database client layer
Type 1: Generic driver to connect to ODBC data sources

Drivers are identified by a Java class name. The class must be present on the classpath.

Drivers are distributed as .jar or .zip files and should be copied to the /oracledi/drivers directory.

8-34

17

Some Examples of Drivers and URLs

Technology: Driver / URL

Oracle: oracle.jdbc.driver.OracleDriver / jdbc:oracle:thin:@<host>:<port>:<sid>
Microsoft SQL Server: com.inet.tds.TdsDriver / jdbc:inetdae7:<host>:<port>
Sybase (ASE, ASA, IQ): com.sybase.jdbc2.jdbc.SybDriver / jdbc:sybase:Tds:<host>:<port>/[<database>]
DB2/UDB (type 2): COM.ibm.db2.jdbc.app.DB2Driver / jdbc:db2:<database>
DB2/400: com.ibm.as400.access.AS400JDBCDriver / jdbc:as400://<host>[;libraries=<library>]
Teradata: com.ncr.teradata.TeraDriver / jdbc:teradata://<host>:<port>/<database>
Microsoft Access (type 1): sun.jdbc.odbc.JdbcOdbcDriver / jdbc:odbc:<dsn>
File (Sunopsis driver): com.sunopsis.jdbc.driver.file.FileDriver / jdbc:snps:dbfile
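As an illustration only (the server name, port and SID below are placeholders, not values used in this course), an Oracle data server definition would combine the driver class and URL from the table above as follows:

Driver: oracle.jdbc.driver.OracleDriver
URL:    jdbc:oracle:thin:@dbserver01:1521:ORCL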

8-35

Lesson summary

Defining your topology
Physical and logical agents
Overview of topology
Data servers & physical schemas
Logical schemas & contexts

8-36

18

8-37

19

Oracle Data Integrator Knowledge Modules

9-1

Objectives

After completing this lesson, you will:
Understand the structure and behavior of Knowledge Modules
Be able to modify Knowledge Modules and create your own behavior

9-2

1

Definition

Knowledge Modules are templates of code that define integration patterns and their implementation. They are usually written to follow data integration best practices, but can be adapted and modified for project-specific requirements.

Example: when loading data from a heterogeneous environment, first create a staging table, then load the data into the staging table. To load the data, use SQL*Loader. SQL*Loader needs a CTL file, so create the CTL file for SQL*Loader. When finished with the integration, remove the CTL file and the staging table.

9-3

Which KMs for What Flow?

When processing happens between two data servers, a data transfer KM is required.
Before integration (Source to Staging Area): requires an LKM, which is always multi-technology.
At integration (Staging Area to Target): requires a multi-technology IKM.

When processing happens within a data server, it is entirely performed by the server. A single-technology IKM is required. No data transfer is performed

9-4

2

More on KMs

KMs can skip certain operations Unnecessary temporary tables will not be created

Some KMs lack certain features:
Multi-technology IKMs cannot perform Flow control.
IKMs to File, JMS, etc. do not support Static control.

All KMs have configurable Options

9-5

Case #1: Using the Target as the Staging Area

Sources (Sybase): ORDERS, LINES; source file: CORRECTIONS. Target (Oracle): SALES. The staging area is on the target.

LKM_1 (LKM SQL to Oracle) loads the ORDERS and LINES source data, and LKM_2 (LKM File to Oracle (SQLLDR)) loads the CORRECTIONS file, into the C$_0 and C$_1 loading tables on the target.
IKM_1 (IKM Oracle Incremental Update) then integrates the flow through the I$_SALES table into the SALES target table.

9-6

3

Case #2: Using a Third Server as the Staging Area

Sources (Sybase): ORDERS, LINES; source file: CORRECTIONS. Target (Oracle): SALES. The staging area is on a third server (Sunopsis Memory Engine).

LKM_1 (LKM SQL to SQL) loads the ORDERS and LINES source data, and LKM_2 (LKM File to SQL) loads the CORRECTIONS file, into the C$_0 and C$_1 loading tables in the staging area.
IKM_1 (IKM SQL to SQL Append) then integrates the flow through I$_SALES in the staging area and writes the result to the SALES target table.

9-7

Case #3: Using the Source as the Staging Area

Sources (Sybase): ORDERS, LINES; source file: CORRECTIONS. Target (Oracle): SALES. The staging area is on the source.

LKM_1 (LKM File to SQL) loads the CORRECTIONS file into a loading table on the source; ORDERS and LINES need no LKM since they already reside in the staging area.
IKM_1 (IKM SQL to SQL Append) builds the flow (C$_0, C$_1, I$_SALES) in the source staging area and writes the result to the SALES target table.

9-8

4

KM Types

There are six different types of Knowledge Modules:

LKM (Loading): Assembles data from source datastores to the staging area.
IKM (Integration): Uses a given strategy to populate the target datastore from the staging area.
CKM (Check): Checks data in a datastore or during an integration process.
RKM (Reverse-engineering): Retrieves the structure of a data model from a database. Only needed for customized reverse-engineering.
JKM (Journalizing): Sets up a system for Changed Data Capture to reduce the amount of data that needs to be processed.
SKM (Web Services): Defines the code that will be generated to create Data Web Services (exposing data as a web service).

9-9

Which KMs for What Flow? Four possible arrangements:

Staging area on a separate server: Loading phase = multi-tech LKM; Integration phase = multi-tech IKM
Staging area on target: Loading phase = multi-tech LKM; Integration phase = single-tech IKM
Staging area on source: Loading phase = no LKM needed; Integration phase = multi-tech IKM
Source, staging area and target in the same location: no LKM needed; single-tech IKM

9-10

5

Importing a New Knowledge Module

1. Right-click the project.
2. Select Import > Import Knowledge Module.
3. Choose the import directory (browse for the directory). ODI KMs are found in the impexp subdirectory.
4. Select one or more Knowledge Modules. Hold CTRL/SHIFT for multiple selection.
5. Click OK.

9-11

Description A Knowledge Module is made of steps. Each step has a name and a template for the code to be generated. These steps are listed in the Details tab. The code that will be generated by ODI will list the same step names

9-12

6

Details of the Steps Details of the steps are generic: the source and target tables are not known, only the technologies are known Substitution Methods are used as placeholders for the table names and column names Parameters of the substitution methods let you select which tables or columns are used in the KM

9-13

Options

KMs have options that will:
Allow users to turn options on or off
Let users specify or modify values used by the KM

Options are defined in the Projects tree, under the KM. Options are used in the KM code with the substitution method (see the sketch below). On/Off options are defined in the Options tab of each step of the KM.
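The slide does not name the substitution method. As a minimal sketch, assuming the getOption method and a made-up option name, a value option would be read in step code with a tag such as:

<%=odiRef.getOption("MY_OPTION_NAME")%>

which ODI replaces with the option value set on the interface when the code is generated.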

9-14

7

Most Common Methods

getInfo: Returns general information on the current task.

getColList: Returns a list of columns and expressions. The result depends on the current phase (loading, integration, control).

getTargetTable: Returns general information on the current target table.

getTable: Returns the full name of the temporary or permanent tables handled by ODI.

getObjectName: Returns the full name of a physical object, including its catalog and schema.

9-15

getInfo Method: Syntax in a KM or Procedure

Extract of the possible values for pPropertyName:
SRC_CATALOG: Name of the data catalog in the source environment
DEST_USER_NAME: User name of the destination connection
CT_ERR_TYPE: Error type (F: Flow, S: Static)

Example: "The current source connection is: ... on server: ..."
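A minimal sketch of the tag syntax, using only property names listed above; the surrounding comment text is illustrative and the exact tags shown on the original slide are not reproduced here:

/* Destination user: <%=odiRef.getInfo("DEST_USER_NAME")%>
   Source catalog:   <%=odiRef.getInfo("SRC_CATALOG")%> */

At code generation time, ODI replaces each <%=odiRef.getInfo("...")%> tag with the corresponding value for the current task.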

9-16

8

getColList Method

Values are returned according to the phase:
Loading (in an LKM): to build loading tables, to feed loading tables
Integration (in an IKM): to build the integration table, to feed the integration table
Control (in a CKM): to build the integration table and feed it, to control the constraints

9-17

getColList Method Syntax

Where:
pStart is the string to insert before the pattern.
pPattern is the string used to identify the returned values. Ex: [COL_NAME] returns a list of column names. Several patterns can be declared.
pSeparator is the character to insert between the returned patterns.
pEnd is the string to insert at the end of the list.
pSelector is a string that defines a Boolean expression used to filter the elements of the initial list.

9-18

9

getColList Examples

Retrieve a column list and their data types (loading phase). Returns, for instance:

(CITY_ID numeric(10) null, CITY_NAME varchar(50) null, POPULATION numeric(10) null)

Retrieve the list of columns of the target to create the loading tables:
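A minimal sketch of such a call, assuming the COLL_NAME table property and the [DEST_WRI_DT] data type pattern (both are assumptions here; only [COL_NAME] is documented above):

create table <%=odiRef.getTable("L", "COLL_NAME", "A")%>
(
  <%=odiRef.getColList("", "[COL_NAME] [DEST_WRI_DT] null", ", ", "", "")%>
)

With the sample table above, the getColList call would expand to something like: CITY_ID numeric(10) null, CITY_NAME varchar(50) null, POPULATION numeric(10) null.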

9-19

getColList Examples Retrieve the list of columns to be updated in the target (integration phase):
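A sketch of one possible form, assuming a hypothetical UPD selector flag for updatable columns and T/S as the target and flow table aliases (none of these names come from the slide):

set <%=odiRef.getColList("", "T.[COL_NAME] = S.[COL_NAME]", ", ", "", "UPD")%>

where T would be the target table alias and S the integration (flow) table alias in the surrounding UPDATE statement.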

9-20

10

Modifying a KM

Very few KMs are ever created from scratch; they are usually extensions or modifications of existing KMs. To speed up development, duplicate existing steps and modify them: this will prevent typos in the syntax of the odiRef methods.

If you modify a KM that is being used, all interfaces using that KM will inherit the new behavior. Remember to make a copy of the KM if you do not want to alter existing interfaces, then modify the copy, not the original. Modifying a KM that is already in use is a very efficient way to implement modifications in the data flow and affect all existing developments.

9-21

Lesson summary

Understand KMs

Modify / Create KMs

9-22

11

9-23

12

Oracle Data Integrator Changed Data Capture

10-1

Objectives

After completing this lesson, you will:
Understand why CDC can be needed
Understand the CDC infrastructure in ODI
Know what types of CDC implementations are possible with ODI
Know how to set up CDC

10-2

1

Introduction

The purpose of Changed Data Capture is to allow applications to process changed data only:
Loads will only process changes since the last load
The volume of data to be processed is dramatically reduced
CDC is extremely useful for near real-time implementations, synchronization, and Master Data Management

10-3

CDC Techniques in General

Multiple techniques are available for CDC:
Trigger-based: ODI will create and maintain triggers to keep track of the changes.
Log-based: for some technologies (Oracle, AS/400), ODI can retrieve changes from the database logs.
Timestamp-based: if the data is timestamped, processes written with ODI can filter the data by comparing the timestamp value with the last load time (see the sketch below). This approach is limited as it cannot process deletes, and the data model must have been designed properly.
Sequence number: if the records are numbered in sequence, ODI can filter the data based on the last value loaded. This approach is limited as it cannot process updates and deletes, and the data model must have been designed properly.
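A sketch of the timestamp-based filter only; ORDERS, LAST_UPDATE and the #LAST_LOAD_DATE project variable are hypothetical names, not objects from this course:

select *
from   ORDERS
where  LAST_UPDATE > '#LAST_LOAD_DATE'

Each run would then store its own start time back into the variable so that the next execution only picks up newer rows.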

10-4

2

CDC in ODI

CDC in ODI is implemented through a family of KMs: the Journalizing KMs (JKMs). These KMs are chosen and set in the model. Once the journals are in place, the developer can choose from the interface whether to use the full data set or only the changed data.

10-5

CDC Infrastructure in ODI

CDC in ODI relies on a journal table:
This table is created by the KM and loaded by specific steps implemented by the KM.
This table has a very simple structure:
Primary key of the table being checked for changes
Timestamp to keep the change date
A flag to allow for a logical lock of the records

A series of views is created to join this table with the actual data. When other KMs need to select data, they will know to use the views instead of the tables.

10-6
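An illustrative sketch of such a journal table for an ORDERS source, matching the three elements listed above; the J$ prefix and the column names are made up here, and the real object is generated by the JKM:

create table J$ORDERS
(
  JRN_FLAG   varchar(1),   -- flag used for the logical lock of the records
  JRN_DATE   timestamp,    -- change date
  ORDER_ID   numeric(10)   -- primary key of the journalized table
)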

3

CDC Strategies and Infrastructure

Triggers will directly update the journal table with the changes.
Log-based CDC will load the journal table when the changed data are loaded to the target system:
Update the journal table
Use the views to extract from the data tables
Proceed as usual

10-7

Simple CDC Limitations

One issue with CDC is that, as changed data gets processed, more changes occur in the source environment. As such, data transferred to the target environment may be missing references.

Example: process changes for orders and order lines.
Load all the new orders in the target (11,000 to 25,000).
While we load these, 2 new orders come in: 25,001 and 25,002. These two orders are not processed as part of this load; they will be processed with the next load.
Then load the order lines: by default, all order lines are loaded, including the order lines for orders 25,001 and 25,002.
The order lines for 25,001 and 25,002 are rejected by the target database (invalid foreign keys).

10-8

4

Consistent CDC

The mechanisms put in place by Consistent CDC solve the issues faced with simple CDC. The difference here is to lock child records before processing the parent records. As new parent and child records come in, both parent and child records are ignored.

10-9

Consistent CDC: Infrastructure

Processing Consistent Set CDC consists of the following 4 phases:
Extend Window: compute the consistent parent/child sets and assign a sequence number to these sets.
Lock Subscriber: for the application processing the changes, record the boundaries of the records to be processed (between sequence number xxx and sequence number yyy). Note that changes keep happening in the source environment; other subscribers can be extending the window while we are processing the data.
Unlock Subscriber: after processing the changes, unlock the subscriber (i.e. record the value of the last sequence number processed).
Purge Journal: remove from the journal all the records that have been processed by all subscribers.

Note: all these steps can either be implemented in the Knowledge Modules or done separately, as part of the Workflow management.

10-10

5

Using CDC

Set a JKM in your model. For all the following steps, right-click on a table to process just that table, or right-click on the model to process all tables of the model:
1. Add the table to the CDC infrastructure: right-click and select Changed Data Capture / Add to CDC.
2. For consistent CDC, arrange the datastores in the appropriate order (parent/child relationship): in the model definition, select the Journalized Tables tab and click the Reorganize button.
3. Add the subscriber (the default subscriber is SUNOPSIS): right-click and select Changed Data Capture / Add Subscribers.
4. Start the journals: right-click and select Changed Data Capture / Start Journal.

10-11

View Data / Changed Data

Data and changed data can be viewed from the model and from the interfaces.
In the model, right-click on the table name and select Data to view the data, or Changed Data Capture / Journal Data to view the changes.
From the interface, click on the caption of the journalized source table and select or unselect Journalized data only to view only the changes or all the data.

10-12

6

Using Journalized Tables

Keep in mind that only one journalized table can be used per interface. If you were to use two journalized tables, there is a very high likelihood that the data sets would be disjoint; no data would be loaded as a result.

10-13

Lesson summary

Why CDC?
Types of CDC implementations
CDC Infrastructure
Implement CDC

10-14

7

10-15

8

Oracle Data Integrator Workflow Management: The Packages

11-1

Objectives

In this lesson, you will:
Learn how ODI Packages are used to create a complete workflow
See how to create several different kinds of package steps
Learn how to execute a package

11-2

1

What Is a Package?

Package: An organized sequence of steps that makes up a workflow. Each step performs a small task, and they are combined together to make the package.

11-3

How to Create a Package

1. Create and name a blank package.
2. Create the steps that make up the package:
   Drag interfaces from the Projects view onto the Diagram tab
   Insert ODI tools from the toolbox
3. Arrange the steps in order:
   Define the first step
   Define the success path
   Set up error handling

11-4

2

The Package Diagram

Toolbar

Diagram

ODI tool step Toolbox for ODI tools

Interface step (selected)

Properties of selected step

11-5

Package Diagram Toolbar

Execute package, Execute selected step, Edit selected step, Hide/show toolbox, Hide/show properties, Hide/show success links, Hide/show failure links, Print package, Page setup

Select, Next step on success, Next step on failure, Duplicate selection, Delete selection, Rearrange selection

Error button: shows errors in the diagram

11-6

3

How to Create an Interface Step

1. Expand the project and folder containing the interface. Expand the Interfaces node.
2. Drag the interface to the package diagram. The new step appears.
3. Optionally, change the Step Name in the Properties panel.

11-7

Note: Interfaces Are Reusable

Interfaces can be reused many times in the same package or in different packages.

11-8

4

Note: Interfaces Are Reusable

The interface is not duplicated, but referenced by the packages. Changes made in the interface will affect the execution of all packages using it. Interfaces can be reused many times in the same package or in different packages.

11-9

What Is an ODI Tool?

ODI tools are macros that provide useful functions to handle files, send emails, use web services, etc. Tools can be used as steps in packages.

11-10

5

How to Create an ODI Tool Step

1. In the Toolbox, expand the group containing the tool you want to add.
2. Click the tool.
3. Click the diagram. A step named after the tool appears.
4. Change the Step Name in the Properties panel.
5. Set the tool's Properties.
6. Click Apply to save.

11-11

Note: Tool Steps Are Not Reusable

Tool steps cannot be reused, but they can be duplicated. To create a reusable sequence of tool commands, you must create a Procedure.

11-12

6

Note: Other Step Types

Other step types are available, such as variables, procedures, scenarios, or metadata. We will only be using tools and interfaces in this section of the training.

11-13

A Simple Package

First step / Step on success / Step on failure

The first step must be defined: right-click > First Step.

After each step, the flow splits in two directions:
Success: ok (return code 0)
Failure: ko (return code not 0)

This package executes two interfaces then archives some files. If one of the three steps fails, an email is sent to the administrator.

11-14

7

Note: Error Button

Packages that are incorrectly sequenced appear with the Error button highlighted in the toolbar. Click it to see the details.

11-15

Executing a Package

1. Click the Execute button in the package window.
2. Open Operator. The package is executed as a session:
   Each package step is a step
   Tool steps appear with a single task
   Interface steps show each command as a separate task

11-16

8

Note: Atomic Testing

Test steps individually first! It is possible to execute a single step from the diagram.

11-17

Lesson Summary

Creating a package
Creating interface steps
Creating tool steps
Sequencing steps with error handling
Executing a package and viewing the log

11-18

9

11-19

10

Oracle Data Integrator Metadata Navigator

12-1

Objectives

After completing this lesson, you will:
Understand Metadata Navigator
Know how to use Metadata Navigator
Be able to explain the features in Metadata Navigator

12-2

1

Purpose

Metadata Navigator gives access to the metadata repository from a Web interface.
It is a read-only interface (see Lightweight Designer for an interactive interface).
It can build graphical flow maps and data lineage based on the metadata.

12-3

Login to Metadata Navigator

The same user names and passwords can be used to log in to Metadata Navigator, as long as the user has enough privileges. These privileges are set in the Security interface.

12-4

2

Overview

By default, Metadata Navigator will show the projects available in the repository. Each user's menu is customized based on his or her privileges.

12-5

Repository Objects All objects in the repositories can be viewed. Hyperlinks let you jump from one object to the other.

12-6

3

Data Lineage For data lineage, MN will list the source datastores and target datastores for any element. You can click on any icon in the graph to get further lineage

12-7

Details on the Data Lineage

The option Show Interfaces in the Lineage will show all the interfaces where the datastores are used as sources or targets.

12-8

4

Details of an Interface in the Lineage If you click on an interface, you can see the detailed mappings

12-9

Flow Maps Flow maps will show the dependencies between models (or datastores) and projects (or interfaces) You can choose the level of details that you want

12-10

5

Flow Map Details This flow map shows that all TRG_* tables are used as targets in the Oracle Target project. It also shows that TRG_CITY and TRG_CUSTOMER are also used as sources in that same project.

12-11

Execution of a Scenario

Select the values from the drop-down menus Set the value for the parameters Execute!

12-12

6

Lesson summary

What is Metadata Navigator
How to use Metadata Navigator
How to describe Metadata Navigator

12-13

12-14

7

Oracle Data Integrator Web Services

13-1

Objectives

After completing this lesson, you will:
Understand why Web Services are used
Understand the different types of Web Services
Know how to set up these Web Services

13-2

1

Environment

In this presentation, Apache Tomcat 5.5 or Oracle Containers for J2EE (OC4J) is used as the application server, with Apache Axis2 as the Web Services container. Examples may need to be adapted if using other Web Services containers.

13-3

Types of Web Services

The Oracle Data Integrator Public Web Services are web services that enable users to leverage Oracle Data Integrator features in a service-oriented architecture (SOA). They provide operations such as starting a scenario.

Data Services are specialized Web Services that provide access to data in datastores, and to captured changes in these datastores. These Web Services are automatically generated by Oracle Data Integrator and deployed to a Web Services container, normally a Java application server.

13-4

2

Public Web Services

To install the Oracle Data Integrator Public Web Services on Axis2:
1. In Axis2, go to the Administration page.
2. Select the Upload Service link.
3. Browse for the Oracle Data Integrator Web Services .aar file. It is located in the /tools/web_services/ sub-directory of the Oracle Data Integrator installation directory.
4. Click the Upload button. Axis2 uploads the Oracle Data Integrator Web Services.
You can now see Data Integrator Public Web Services in the Axis2 services list.

13-5

Usage for Public Web Services: Add Bulk Data Transformation to a BPEL Process

Oracle SOA Suite: BPEL Process Manager for business process orchestration, with Business Activity Monitoring, Web Services Manager, Rules Engine and Enterprise Service Bus.

Oracle Data Integrator (E-LT Agent, E-LT Metadata): efficient bulk data processing as part of the business process; interaction via Data Services and Transformation Services.

13-6

3

Data Services: Environment Setup

ODI lets you generate and deploy Web Services directly from the Designer interface. Carefully set up your environment to enable this feature:
Topology must be properly set up (definition of the iAxis server)
META-INF/context.xml and WEB-INF/web.xml must be updated in the iAxis directories (see the next slides)
The database drivers must be installed in the appropriate directory: use /common/lib for Tomcat, or ORACLE_HOME/j2ee/home/applib for OC4J

Restart your server to take these changes into account.

13-7

Context.xml

Add the following entry to the file. The resource name will be re-used in the web.xml file and in the Model in Designer. driverClassName, url, username and password explicitly point to the data source.
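A sketch of what such an entry could look like for an Oracle source, assuming Tomcat DBCP attributes; the resource name matches the res-ref-name used later in web.xml, while the url, username and password values are placeholders:

<Resource name="jdbc/Oracle/Win"
          auth="Container"
          type="javax.sql.DataSource"
          driverClassName="oracle.jdbc.driver.OracleDriver"
          url="jdbc:oracle:thin:@dbserver01:1521:ORCL"
          username="odi_user"
          password="odi_password"
          maxActive="10"/>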

13-8

4

OC4J Update the file \j2ee\home\config\data-sources.xml with the appropriate connection information:

13-9

Tomcat

Add the following entry to the context.xml file. The resource name will be re-used in the web.xml file and in the Model in Designer. driverClassName, url, username and password explicitly point to the data source.

Update the web.xml file with the resource name declared in context.xml (here, the res-ref-name):

<resource-ref>
  <description>Data Integrator Data Services on Oracle_SRV1</description>
  <res-ref-name>jdbc/Oracle/Win</res-ref-name>
  <res-type>javax.sql.DataSource</res-type>
  <res-auth>Container</res-auth>
</resource-ref>

13-10

5

Topology for Data Services

One entry per Web container. Make sure that you define a logical schema for the server as well. Note: this entry defines the access to the container, not to the data!

13-11

Data Services: Model Setup

Enter the appropriate information in the Service tab of the model:
Select the logical schema name for the web service container.
Set the values for the service name. Note that the name of the data source must be consistent with the entries in data-sources.xml for OC4J, or context.xml and web.xml for Tomcat (prefixed with java:/comp/env/ for Tomcat).
Select the appropriate SKM for the operation.

13-12

6

Generate and Deploy

1. Select the datastores to be deployed.
2. Click Generate and Deploy.
3. Select the actions you want to perform (all by default).
Your web services are ready to be used.

13-13

Checking for Web Services

List the services on Axis2: http://myserver:8080/axis2/services should list your service.
Right-click on the tables that you have exposed as web services and select Test Web Services. A list of ports will be available: one po