
  • 7/28/2019 Comparing Data Integration Platforms_090910


Comparative costs and uses of Data Integration Platforms: research and survey results

An original research survey paper by Bloor Research
Author: Philip Howard
Publish date: September 2010



"Our results show dramatic, and sometimes surprising, differences between vendors and products in both overall TCO and in cost per project and cost per source and target system (but) cost is, and should be, only one of several determining factors when selecting data integration solutions."
Philip Howard

© 2010 Bloor Research. A Bloor Survey Results Paper.


    Introduction

Data integration has sometimes been portrayed as a necessary evil, a cost centre, a time sink, or in other unflattering terms. Conversely, some vendors, consultants, pundits and analysts have positioned various integration models and technologies aggressively as a miracle cure-all. However, there is a surprising lack of primary research to support any kind of claim with regards to the cost of, or cost-effectiveness of, data integration solutions. Even basic details on how data integration platforms are being used in organisations today are thinly supported by today's research. Opinions are everywhere: facts are few and far between.

Bloor Research has set out to provide the marketplace with some of the most comprehensive information available as to the types of projects that data integration platforms are being used for, on what scale, and whether this differs by integration product. Our research was designed to capture detailed information on the costs involved with using different products, including both direct acquisition and recurring costs, ultimately arriving at a comparison by vendor of total cost of ownership (TCO) over a three year period. In addition, we wanted to go beyond traditional TCO formulations and estimate relative cost effectiveness based on costs per integration project and costs per source and target connection, which Bloor Research believes provide more useful and robust metrics for decision-making.

In order to gather the appropriate information to reach our conclusions Bloor Research conducted a survey, which produced 341 responses from companies using data integration tools (or, in some cases, using hand coding) for a variety of purposes. There is more useful content in these responses than we could analyse and include within this particular report (we hope to address that in subsequent reports) and, inevitably, there were also some responses that failed to provide complete or meaningful data. We decided to exclude the latter responses from our analysis and consequently the results of this survey are based on 163 completed responses.

One unfortunate consequence of this decision has been that we no longer have enough responses from users of either open source tools or hand coding for us to fully rely on the responses received from these users. While we have included these results (in summary form only) in the details that follow, readers should be aware that the results from these two categories are not necessarily reliable and should be treated with caution.

This report does not attempt to distinguish between products on the basis of their functionality and, in any case, that type of information is readily available elsewhere (not least from Bloor Research). One product may be more suitable for supporting complex transformations than another, for example. Similarly, another product may offer better performance or have more real-time capabilities or have features that are not present elsewhere. Cost is, and should be, only one of several determining factors when selecting data integration solutions.

Our results certainly show dramatic, and sometimes surprising, differences between vendors and products in both overall TCO and in cost per project and cost per source and target system, as well as variations within the range of common integration scenarios or use cases. As we noted earlier, this type of primary research and the level of detail in this report are not generally attempted in the data integration field, but Bloor Research believes that this information will contribute to a better understanding of the options available to practitioners that are responsible for designing, planning and implementing data integration projects and strategies.


    Using data integration platforms

In this section we review the data we collected about how the various products were used, the frequency with which they were used for common scenarios and our respondents' views on the suitability of these products for each scenario.

    Integration scenarios

We started by outlining six common scenarios for which data integration tools might be used. In our experience, these scenarios or use cases represent the vast majority of project types for which integration products are used. The scenarios are:

    1. Data conversion and migration projects

    2. ETL, CDI and MDM solutions

    3. Synchronisation of data between in-houseapplications (such as CRM/ERP)

    4. Synchronisation of data with SaaSapplications

5. B2B data exchange for, say, customer/supplier data

    6. Implementation of SOA initiatives

Respondents were asked to identify the one, or more, integration products that were used in their organisation for these scenarios and to identify the single product with which they had the most experience. For that selected product, respondents recorded their view of how suitable they thought their tool was for each of the above scenarios. It was not a requirement that the chosen product or vendor was actually being used for each scenario, simply that respondents believed that the products were suitable.

In this and following charts we have distinguished between single-product vendors (Pervasive Software) and those supplying multiple products, because there is clearly an advantage in being able to tackle multiple project types with a single tool even though, ultimately, it is unrealistic to expect that any one product can address all these scenarios equally successfully. For this reason, most vendors have developed or acquired multiple products (which often have significant variations in architecture, metadata and required skill levels) to address these various requirements. In that context, it is notable that the majority of responses using Informatica were from companies using a single product (PowerCenter) whereas users of IBM, Microsoft and Oracle products were typically using multiple products from that vendor. As a result, readers may wish to treat Informatica's results as if they derived from a single tool.

    Figure 1: Average number of scenarios for which products/vendors are considered suitable

[Bar chart: average number of suitable scenarios, on a scale of 0 to 4, for IBM, Informatica, Microsoft, Oracle, Pervasive, hand coding and open source]

The conclusion from Figure 1 would appear to be that the product sets from Informatica and Oracle are considered by their users to be more reusable than their competitors, though the differences are not large. Open source and hand-coded solutions lag behind proprietary solutions in terms of perceived reusability.



    Figure 4: Average resources (person-weeks) spent per project

             Conv/migr  ETL/MDM   B2B   SOA   SaaS  ERP/CRM  Average
IBM             10.7      12.5   15.2  17.2  12.4   11.8      13.3
Informatica     11.6      12.7   11.9  14.2  16.5   13.4      13.4
Microsoft        7.8       8.2    7.1  12.1  10.0    6.2       8.5
Oracle          18.9      11.9   12.0  16.0  16.3   10.4      14.3
Pervasive        4.0       4.2    3.3   7.0   7.1    4.4       5.0
Hand coding                                                   16.0
Open Source                                                    7.8

    Project timelines and resources

For any one integration scenario, the projects themselves can vary in scale and complexity. In this section, we look in more detail at the amount of effort (in person-weeks) that was actually devoted to projects within each scenario. We did this by asking about the percentage of projects that absorbed 0-2 person-weeks, 2-6 person-weeks, 6-12 person-weeks and so on. We then calculated the weighted average as described in Appendix C.

One of the interesting findings here is that, for each vendor, SOA and SaaS projects take significantly longer, on average, than other types of projects. More significantly, there are surprising differences between the products in terms of overall productivity. It seems at first sight that Pervasive is clearly the most productive environment, such that you can complete projects in less time than with any of the other products. At the same time, it is worth noting that average project effort is not solely a reflection of the efficiency (or inefficiency) of the integration platform being deployed. Another key factor that might contribute to these results is a consistent difference in the scale and complexity of projects being addressed by users of IBM, Informatica and Oracle versus Pervasive or Microsoft, a factor that we cover in the next section. Readers will need to form their own opinion of which of the outlined possibilities is most likely.


    Scale and complexity

One of the challenges that we faced in collecting and interpreting the data relates to project scale and complexity. While it is relatively easy to capture information on the number of person months required to complete a project, this resource allocation does not necessarily reflect project complexity. What we have identified and measured as one useful indicator of both scale and complexity is the average number of sources and targets used in each project. While this does not tell us about the types of transformations and validations required (which must impact overall project cost in some way) it does give us a metric for the complexity of the overall environment and the likely scale and complexity of each project. It certainly provides another dimension for both cost comparisons and project planning purposes.

    Figure 5: Average number of end points (sources and targets) per project

[Bar chart: average number of end points per project, on a scale of 0 to 18, by product/vendor]


    Ramp up time and effort

We asked a dual question about ramp up time: how long it took to learn to use the product well enough to build the first solution, and what resources (either internal or external) were actually used to build that solution. Averaged answers were as follows:

             Time to learn   Resources required to build first solution
             (weeks)         Internal staff    External consultants  Total
                             (person-weeks)    (person-weeks)        (person-weeks)
IBM              7.3             10.0               6.8                  16.8
Informatica      4.2              7.2               5.1                  12.3
Microsoft        4.3              7.4               3.0                  10.4
Oracle           6.5             11.9               5.1                  17.0
Pervasive        3.0              5.6               2.5                   8.1
Hand coding      4.6                                                      5.6
Open Source      6.5                                                      9.8

While we do not know the relative size of first projects it is not unreasonable to assume that, on average, they are comparable across vendors, particularly considering the results on the number of end points in the prior section. On both counts, Pervasive is significantly easier to learn and requires less resource than solutions from the other suppliers for the first project. Of course, it must be remembered that many of the users of the tools from other vendors have actually purchased multiple products and one would therefore expect a longer learning curve.

    Cloud Integration

As an ancillary question we wanted to find out about companies' usage and plans for cloud and SaaS based solutions for data integration. The results are illustrated in the following diagram:

While some users of all the vendors plan to implement cloud or SaaS based data integration solutions in the future, only Informatica (1), Microsoft (9) and Pervasive (7) had such implementations today. Each of these had a single customer whose solution was wholly based on this platform. IBM's customers were far more likely to have no plans for cloud or SaaS, with 60% having no such plans.

[Bar chart: percentage of respondents (0 to 40%) by category: no plans; already implemented in conjunction with in-house capabilities; already implemented as sole source of data integration; plan within 6 months; plan within 6-12 months; plan within 12-24 months; don't know]


    Cost elements

The primary goal of our research was to explore the costs involved in acquiring and using (over a three year period) the respective products/vendors. Product acquisition decisions are based on a variety of factors, but costs in relation to budgets are invariably a key driver. All too often, however, the cost elements are hard to gauge or not well understood, so in this section we pull all our data points together and provide a comprehensive analysis of total costs over time, covering traditional TCO as well as costs per project and per end point (that is, source or target systems).

Initial costs: software and hardware

Licensing structures and pricing points for software can vary enormously between vendors and, with the growing adoption of SaaS/cloud applications and infrastructure, the range of options is getting broader. As a first step, we asked how users licensed their software and found that 18% had subscription-based pricing vs 82% for the more conventional perpetual licensing model. All of the IBM users fell into the latter category. We then combined these results with users' expenses for new hardware (where needed), any additional software required to support the integration products, and training and implementation costs for a first year total. The table below shows these first year costs based on a conventional licensing model. There were also a relatively small number of users (although none in the case of IBM) who had opted for a subscription-based licensing model and, while there were not enough of these to make them worth presenting in detail, they do affect the total average costs, which are presented in the following list.

IBM          $834,531
Informatica  $226,367
Microsoft    $185,113
Oracle       $307,061
Pervasive    $98,368
Hand coding  $87,000
Open Source  $108,333

These figures speak for themselves: in particular that Pervasive offers by far the lowest cost for initial acquisition and deployment, even under-cutting the costs of open source options (probably because of the need for enterprise, non-free, versions of these products).


Initial costs

             License    Additional  Additional  Implementation  Total first
             costs      hardware    software                    year costs
IBM          $175,781   $270,156    $51,094     $337,500        $834,531
Informatica  $41,089    $92,450     $32,507     $88,679         $254,726
Microsoft    $36,828    $37,591     $67,349     $62,599         $204,367
Oracle       $85,472    $104,018    $20,738     $132,578        $342,806
Pervasive    $29,390    $18,455     $21,873     $12,531         $82,250
Open Source*                                                    $105,000
Hand coding*                                                    $87,000

Annual/ongoing costs

             Maintenance  Hardware  Admin     Internal    External     Total
             fees                             tech staff  consultants  annual costs
IBM          $75,750      $45,875   $74,438   $107,375    $75,063      $378,501
Informatica  $46,895      $31,368   $57,053   $86,000     $53,105      $274,421
Microsoft    $16,343      $21,969   $33,438   $48,156     $28,938      $148,844
Oracle       $30,500      $26,778   $35,222   $46,111     $42,778      $181,389
Pervasive    $23,194      $12,278   $18,944   $50,278     $11,056      $115,750
Open source*                                                           $32,677
Hand coding*                                                           $127,300

* Additional information based on a limited number of responses for first year and ongoing costs



IBM, on the other hand, is by some margin the most expensive, for reasons we have previously discussed. As an example (not statistically significant), it turns out that for respondents only using DataStage or InfoSphere the average costs are $491,786.

In the long run, the recurring costs of owning or operating any product will outweigh the initial acquisition costs. The data we collected was designed to let us calculate annual (recurring) costs, including administrative costs, maintenance fees, and consulting as well as any costs associated with technical personnel.

With the exception that Oracle's ongoing costs are somewhat lower than we might otherwise have expected, these figures are very much in line with those that have gone before.

Total cost of ownership over 3 years

In this section we have combined the results of all the cost-related survey questions other than the initial time to learn and time required for needs analysis: they tend to correlate with the total costs so they will merely exacerbate the differences between the least and most expensive solutions listed here. We have assumed for this exercise that data integration products remain in service for at least three years (although the average is probably significantly longer). We have therefore taken the sum of the first year costs and two further years of recurring costs to derive a total three-year total cost of ownership (TCO) figure, as shown in Figure 6.
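The three-year TCO calculation described above can be sketched as follows, using the conventional-licensing figures from the two cost tables earlier in this section (the variable names and rounding are our own; the report's published chart may reflect additional adjustments, such as subscription-based respondents):

```python
# Three-year TCO = first-year costs + 2 x annual recurring costs,
# using the conventional-licensing figures from the cost tables.
first_year = {
    "IBM": 834_531, "Informatica": 254_726, "Microsoft": 204_367,
    "Oracle": 342_806, "Pervasive": 82_250,
}
annual = {
    "IBM": 378_501, "Informatica": 274_421, "Microsoft": 148_844,
    "Oracle": 181_389, "Pervasive": 115_750,
}

tco_3yr = {vendor: first_year[vendor] + 2 * annual[vendor] for vendor in first_year}

# List vendors from cheapest to most expensive over three years.
for vendor, cost in sorted(tco_3yr.items(), key=lambda kv: kv[1]):
    print(f"{vendor:12s} ${cost:,}")
```

On these inputs Pervasive comes out lowest and IBM highest, consistent with the rankings discussed in the text.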

    Figure 6: Three-year Total Cost of Ownership (TCO)

[Bar chart: three-year TCO, on a scale of 0 to 2,000,000, for IBM, Informatica, Microsoft, Oracle, Pervasive, hand coding and open source]

Once again, the most economical products come from Pervasive and Microsoft, but these numbers do not tell the whole story. In the next section, we extend the analysis by looking at the number of projects for which each product is used.

    Cost per project

Various studies have been published in the past, both for data integration platforms and other software products, which have attempted to identify the costs associated with particular products. However, costs alone are not enough, at least in the case of data integration, because the amount of use you get out of the product is also relevant. If you only run a single three week data integration project once a year, then you are not getting as much value out of the software as you would if you were running a dozen such projects. So we need to divide total costs by the number of projects completed. This results in the average cost per project, as shown in Figure 7.
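The cost-per-project calculation is simply the three-year TCO divided by the number of projects completed in that period. A minimal sketch, where the TCO and project-count inputs below are purely illustrative (the survey's actual per-vendor project counts are not reproduced in this section):

```python
# Cost per project = three-year TCO / number of projects completed
# over the same three years. Input figures below are illustrative.
def cost_per_project(tco_3yr: float, projects_3yr: int) -> float:
    """Average cost of each integration project over the TCO period."""
    return tco_3yr / projects_3yr

# e.g. a hypothetical vendor with a $1,591,533 three-year TCO and
# 35 projects completed in that period:
print(round(cost_per_project(1_591_533, 35)))  # 45472
```

A product with a high TCO but heavy project throughput can therefore rank better on this metric than a cheaper but lightly used one, which is why the Figure 7 ordering differs from Figure 6.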

It is interesting to see how these results are different from those in the previous section. Pervasive, followed by Microsoft, continue to lead in providing a significant overall value advantage. However, as multiple projects are factored into the comparison, Oracle has slipped well behind Informatica. Also of interest is the relatively high cost of both open source and hand coding, although the numbers are not statistically reliable.



[Bar chart data for Figure 8: Open Source $6,111; Hand Coding $2,226; Pervasive $945; Oracle $3,037; Microsoft $1,585; Informatica $2,080; IBM $6,784]

    Project cost per end point

While cost per project is a very useful measure when comparing products, there is a further dimension that can help to define or confirm value and that can provide a basis for planning integration initiatives. By collecting data on the number of sources and targets (end points) involved with these projects, we can allow for project scale and complexity in the comparison of product costs. In addition, since reusability of integration artefacts (such as source/target schema and logic) is known to be a key

    Figure 8: Average costs (3-year TCO) per project per end point (sources and targets)

    Figure 7: Average cost per project

[Bar chart for Figure 7: average cost per project, on a scale of 0 to 50,000, by product/vendor]

advantage in containing overall integration costs, including the number of end points in our calculations should reflect any differences between the products' productivity advantages.

We believe that the score for IBM has been affected by the factors previously discussed and should therefore be treated with caution. The overall results, however, do continue to align with our other cost-based rankings.



    Conclusions

Several major themes emerge from the data that we collected for this report. Firstly, organisations adopt integration products for a disparate range of integration scenarios: there is clearly no shortage of data integration challenges. Secondly, our results show dramatic, and sometimes surprising, differences between vendors and products in both overall TCO and in cost per project and cost per source and target system.

Reusability across multiple scenarios is potentially very valuable in terms of both ROI and, we can assume, organisations' agility in the face of change. Our numbers appear to show that Informatica and Oracle are considered by their users to be somewhat more reusable than their competitors, although they are all at a similar level. In terms of actual usage, the average number of projects per product or vendor was mostly in the 30 to 40 range over a three year period, but Oracle (and open source) had significantly lower numbers, thus confirming the flexibility offered by other product sets, headed by Informatica.

In terms of project timelines and effort, Pervasive appears to be the easiest product to learn and requires the fewest resources for development of a first integration solution. With respect to initial development, Informatica and Microsoft hold the middle ground behind Pervasive and are followed by IBM and Oracle.

The rankings between vendors on costs appear consistent, whether measuring initial (first year) costs, ongoing costs or more detailed breakdowns including projects and number of end points. In our view, the figures speak for themselves: in particular that Pervasive offers by far the lowest cost both for initial acquisition and subsequent deployment, even under-cutting the costs of open source options. Microsoft was the next closest in terms of total costs, and these comparisons hold up (or are accentuated) when we calculated costs per project, although Oracle slipped behind Informatica and IBM in this comparison.

One area of surprise to many will be the poor showing for open source integration tools. This may be explained by the limited number of responses received. Leaving this aside, although the open source products fared well against products at the top-end of the price curve (such as IBM and Oracle), they did not compare particularly well against the more cost-effective products; in particular, Pervasive's integration product bested the open source tools in nearly every category. Although the open source results did not meet our statistical significance thresholds, the results are potentially indicative and are sufficiently meaningful to warrant publication, albeit with a disclaimer.

While the same caveat also applies to hand coded solutions, it is noticeable that Pervasive, Microsoft and Informatica all work out as more cost-effective than hand coded solutions once you take into account how reusable these products are across multiple projects. Of course, this is the story that these vendors have been emphasising for some years, but it is good to have evidence that this is, indeed, the case.

We are inclined to make an additional observation, based on all of these results, that there may be a certain stratification of the market, with Informatica, Microsoft and Pervasive (and open source products) being used successfully to deliver short and medium length projects, while IBM, Informatica and Oracle (and hand coding) compete for all lengths of projects. This is, nonetheless, an observation and not a firm conclusion that can be asserted from the survey numbers. Indeed, if anything, the results on the number of sources/targets per project (end points) do not support this view, though numbers of end points do not tell the whole story with respect to project complexity.

We should add one final note on product functionality for specific tasks. How frequently a vendor or product is used in different scenarios is certainly a reflection of the functionality of the product, but we must make the caveat that we have not attempted to distinguish between products on the basis of their functionality: so one product may be more suitable for supporting complex transformations than another, for example. Similarly, another product may offer better performance or have more real-time capabilities or have features that are not present elsewhere. Thus cost is, and should be, only one of several determining factors when selecting data integration solutions.

    Further Information

Further information is available from http://www.BloorResearch.com/update/2042


    Appendix A: Methodology

In order to gather the appropriate information to reach our conclusions Bloor Research conducted a survey which produced 341 responses from companies using data integration tools (or, in some cases, using hand coding) for a variety of purposes. However, some of the respondents provided no data on costs and we were therefore forced to exclude them from our analysis. In addition, there were some sets of answers where it appeared that the respondents had simply checked the same box multiple times rather than providing any sort of thoughtful answer to the question. Interestingly, the vast majority of such answers were provided by consultants! In any case, we felt obliged to ignore these answers, and consequently the results of this survey are based on 163 completed responses.

The survey was conducted within Bloor Research's own subscriber community and through contacts provided by Pervasive Software, which sponsored this research. This has resulted in a significant preponderance of Pervasive customers within the survey, along with that company's most commonly encountered competitors, notably Microsoft. No relevance should be attached to this fact.

We received a significant number of responses from companies using Informatica, Oracle and IBM products. While we also had replies from companies using SAP Business Objects, DataFlux, Ab Initio, Talend and other open source products, as well as various others, none of these were in sufficient numbers to make an analysis of these products statistically significant. Similarly, we could not include hand coding results because a disproportionate number of those responses (more than two thirds) failed to include information on costs. Nonetheless, we have included footnotes with data points on open source and hand coded solutions because these approaches are important in presenting an overall context for this survey.

One final point should be clarified. We did not simply ask users to check the name of the vendor whose products they used but to indicate specifically which product or products they used. Thus for IBM, for example, we had responses for IBM: multiple products, IBM: DataStage and IBM: InfoSphere. In the figures that follow we have added all these results together (but we have not included Cast Iron, which was recently acquired by IBM) and do not present them separately. Similar points apply to Informatica, Microsoft and Oracle though we did not have to do this for Pervasive because of their strategy of delivering a single unified product for multiple integration scenarios. This has a number of consequences that readers should bear in mind when considering the results that follow:

1. A single product should (there is no guarantee) require less time to learn and use than multiple products.

2. One would expect a suite of products to offer more flexibility than a single product.

3. The average cost for multiple products is under-stated in our cost figures, because of the inclusion of single product users.

4. The average cost of single products from vendors with multiple products is over-estimated, because of the inclusion of multiple product users.

Readers should know that the majority of IBM, Microsoft and Oracle users responding to this survey indicated that they were using multiple products. This was not the case with Informatica, with the majority simply being PowerCenter users: in this respect and with due caution, Informatica results can be treated more like a single product response.

A more detailed listing of the vendors whose products were in use is provided in Appendix B.


Appendix B: Vendors/products identified by survey respondents

The following were the total numbers of respondents by vendor, in descending order:

Microsoft                                75
Pervasive                                60
Informatica                              40
Oracle                                   35
IBM                                      31
Hand coding                              29
SAP (including Business Objects)          9
Open source (including Talend 2)          6
Cast Iron (not included in IBM figures)   6
Software AG (WebMethods)                  5
Boomi                                     4
DataFlux                                  3
Ab Initio                                 2

There were a large number of other products in use by just one respondent.


    Appendix C: Methodology for bucket questions

Many of the questions asked in our survey involved multi-part answers with bucket style answers. For example, in the question asking about the lengths of projects we asked respondents to tell us what percentage of their projects fell into the less than 2 weeks, 2-6 weeks, 6-12 weeks, 12-24 weeks or more than 24 weeks categories. In order to calculate averages, we took the median (that is, the half-way point) within each range and applied that as the multiplier for the relevant figure or percentage. For the top range, which was unlimited, we had to take a reasonable figure and apply this.
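The weighted-average method described above can be sketched as follows; the bucket midpoints follow directly from the ranges, but the 30-week value assumed for the open-ended top bucket is our own illustrative stand-in for the "reasonable figure" the report mentions:

```python
# Weighted average from bucketed percentage answers: weight the
# midpoint of each range by the percentage of projects reported in
# that bucket. The 30-week value for the unbounded ">24 weeks"
# bucket is an assumed "reasonable figure".
bucket_midpoints = {
    "<2 weeks": 1.0,
    "2-6 weeks": 4.0,
    "6-12 weeks": 9.0,
    "12-24 weeks": 18.0,
    ">24 weeks": 30.0,  # assumed figure for the open-ended range
}

def weighted_average(percentages: dict) -> float:
    """percentages maps bucket label -> share of projects (summing to 100)."""
    return sum(bucket_midpoints[b] * pct for b, pct in percentages.items()) / 100.0

# A respondent reporting 40% / 30% / 20% / 5% / 5% across the buckets:
example = {"<2 weeks": 40, "2-6 weeks": 30, "6-12 weeks": 20,
           "12-24 weeks": 5, ">24 weeks": 5}
print(weighted_average(example))  # 5.8 weeks on average
```

The choice made for the top bucket matters most for respondents with many very long projects; for the distribution above it shifts the answer only modestly.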


    Bloor Research overview

    Bloor Research is one of Europe's leading IT research, analysis and consultancy organisations. We explain how to bring greater Agility to corporate IT systems through the effective governance, management and leverage of Information. We have built a reputation for "telling the right story" with independent, intelligent, well-articulated communications content and publications on all aspects of the ICT industry. We believe the objective of telling the right story is to:

    Describe the technology in context to its business value and the other systems and processes it interacts with.

    Understand how new and innovative technologies fit in with existing ICT investments.

    Look at the whole market and explain all the solutions available and how they can be more effectively evaluated.

    Filter "noise" and make it easier to find the additional information or news that supports both investment and implementation.

    Ensure all our content is available through the most appropriate channel.

    Founded in 1989, we have spent over two decades distributing research and analysis to IT user and vendor organisations throughout the world via online subscriptions, tailored research services, events and consultancy projects. We are committed to turning our knowledge into business value for you.

    About the author

    Philip Howard
    Research Director - Data

    Philip started in the computer industry way back in 1973 and has variously worked as a systems analyst, programmer and salesperson, as well as in marketing and product management, for a variety of companies including GEC Marconi, GPT, Philips Data Systems, Raytheon and NCR.

    After a quarter of a century of not being his own boss, Philip set up what is now P3ST (Wordsmiths) Ltd in 1992 and his first client was Bloor Research (then Butler Bloor), with Philip working for the company as an associate analyst. His relationship with Bloor Research has continued since that time and he is now Research Director. His practice area encompasses anything to do with data and content, and he has five further analysts working with him in this area. While maintaining an overview of the whole space, Philip himself specialises in databases, data management, data integration, data quality, data federation, master data management, data governance and data warehousing. He also has an interest in event stream/complex event processing.

    In addition to the numerous reports Philip has written on behalf of Bloor Research, Philip also contributes regularly to www.IT-Director.com and www.IT-Analysis.com and was previously the editor of both Application Development News and Operating System News on behalf of Cambridge Market Intelligence (CMI). He has also contributed to various magazines and authored a number of reports published by companies such as CMI and The Financial Times.

    Away from work, Philip's primary leisure activities are canal boats, skiing, playing Bridge (at which he is a Life Master) and walking the dog.


    Copyright & disclaimer

    This document is copyright © 2010 Bloor Research. No part of this publication may be reproduced by any method whatsoever without the prior consent of Bloor Research.

    Due to the nature of this material, numerous hardware and software products have been mentioned by name. In the majority, if not all, of the cases, these product names are claimed as trademarks by the companies that manufacture the products. It is not Bloor Research's intent to claim these names or trademarks as our own. Likewise, company logos, graphics or screen shots have been reproduced with the consent of the owner and are subject to that owner's copyright.

    Whilst every care has been taken in the preparation of this document to ensure that the information is correct, the publishers cannot accept responsibility for any errors or omissions.

  • 7/28/2019 Comparing Data Integration Platforms_090910

    18/18

    2nd Floor, 145-157 St John Street
    London, EC1V 4PY, United Kingdom

    Tel: +44 (0)207 043 9750
    Fax: +44 (0)207 043 9748
    Web: www.BloorResearch.com
    email: [email protected]
