HPE Operations Orchestration Software Version: 10.60
Windows and Linux Operating Systems
Benchmark Performance Guide
Document Release Date: May 2016
Software Release Date: May 2016
Benchmark Performance Guide
HPE Operations Orchestration (10.60) Page 2 of 19
Legal Notices

Warranty

The only warranties for Hewlett Packard Enterprise products and services are set forth in the express warranty statements accompanying such products and services. Nothing herein should be construed as constituting an additional warranty. Hewlett Packard Enterprise shall not be liable for technical or editorial errors or omissions contained herein.
The information contained herein is subject to change without notice.
Restricted Rights Legend

Confidential computer software. Valid license from Hewlett Packard Enterprise required for possession, use or copying. Consistent with FAR 12.211 and 12.212, Commercial Computer Software, Computer Software Documentation, and Technical Data for Commercial Items are licensed to the U.S. Government under vendor's standard commercial license.
Copyright Notice

© Copyright 2016 Hewlett Packard Enterprise Development LP
Trademark Notices

Adobe™ is a trademark of Adobe Systems Incorporated.
Microsoft® and Windows® are U.S. registered trademarks of Microsoft Corporation.
UNIX® is a registered trademark of The Open Group.
This product includes an interface of the 'zlib' general purpose compression library, which is Copyright © 1995-2002 Jean-loup Gailly and Mark Adler.
Documentation Updates

The title page of this document contains the following identifying information:
• Software Version number, which indicates the software version.
• Document Release Date, which changes each time the document is updated.
• Software Release Date, which indicates the release date of this version of the software.
To check for recent updates or to verify that you are using the most recent edition of a document, go to: https://softwaresupport.hpe.com/.
This site requires that you register for an HP Passport ID and sign in. To register for an HP Passport ID, click Register on the HP Software Support site or click Create an Account on the HP Passport login page.
You will also receive updated or new editions if you subscribe to the appropriate product support service. Contact your HPE sales representative for details.
Support

Visit the HP Software Support site at: https://softwaresupport.hpe.com.
This website provides contact information and details about the products, services, and support that HP Software offers.
HP Software online support provides customer self-solve capabilities. It provides a fast and efficient way to access interactive technical support tools needed to manage your business. As a valued support customer, you can benefit by using the support website to:
• Search for knowledge documents of interest
• Submit and track support cases and enhancement requests
• Download software patches
• Manage support contracts
• Look up HP support contacts
• Review information about available services
• Enter into discussions with other software customers
• Research and register for software training
Most of the support areas require that you register as an HP Passport user and sign in. Many also require a support contract. To register for an HP Passport ID, click Register on the HP Support site or click Create an Account on the HP Passport login page.
To find more information about access levels, go to: https://softwaresupport.hpe.com/web/softwaresupport/access-levels.

HPE Software Solutions Now accesses the HPE SW Solution and Integration Portal website. This site enables you to explore HP Product Solutions to meet your business needs, and includes a full list of integrations between HP Products as well as a listing of ITIL Processes. The URL for this website is:
https://softwaresupport.hpe.com/web/softwaresupport/document/-/facetsearch/document/KM01702731
Table of Contents

Introduction ................................................................ 5
Objectives .................................................................. 5
Setup ....................................................................... 5
    Environment ............................................................. 6
    Tools ................................................................... 6
    Throughput Flows ........................................................ 6
        Large Context Flow .................................................. 7
        Short Flow .......................................................... 7
        Medium Flow ......................................................... 8
        Long Flows .......................................................... 8
        Multi-Instance Flow ................................................. 9
        Subflow ............................................................. 9
        Parallel Flow ...................................................... 10
    Single Flow Performance Flows .......................................... 12
        Large MI Flow ...................................................... 12
        Parallel Flow ...................................................... 12
        Subflows Level10 ................................................... 13
        Large Sequential Flow .............................................. 13
        Large Context Flow ................................................. 14
    Scenario ............................................................... 14
Comparison ................................................................. 15
    Throughput ............................................................. 15
    Single Flow Performance ................................................ 16
Analysis of Results ........................................................ 17
    Recommendations for Environment Tuning ................................. 17
    OO 10.60 ............................................................... 17
Appendix: Comparison with HPE OO 9.07.0003 ................................. 18
    Throughput ............................................................. 18
    Single Flow Performance ................................................ 19
    Analysis of Results .................................................... 19
Introduction

This document provides an overview of HPE Operations Orchestration (HPE OO) version 10.60 performance.
The following results are described in more detail throughout this document:
• HPE OO 10.60 overall execution throughput in comparison with HPE OO 10.50.
• HPE OO 10.60 single flow performance in comparison with HPE OO 10.50.
HPE Operations Orchestration version 10.60 performance shows similar results to HPE OO 10.50.
Objectives

This document details the performance tests performed on HPE Operations Orchestration 10.60, measuring flow/step execution throughput (steps/time). This includes:
• HPE OO throughput in several environments:
o Low-cost FOSS (Free and Open-Source Software) operating systems
o High-cost non-FOSS environments
o Oracle-based environments
o Clustered and stand-alone environments
• Single flow performance results of various scenarios comparing HPE OO 10.60 to HPE OO
10.50.
Basic tuning was applied to the environments described in this document. These configurations are
described in Recommendations for Environment Tuning.
Setup

This section describes the different benchmark tests in this document, including:
• Environment-related details
• Tools that were used
• Flows that were triggered and the flow distribution
• Results that were achieved, which showcase:
o Throughput
o Single flow performance
Environment

The following table describes the hardware and software components used for the benchmark tests:

            Model               Processors         Memory  Storage  Network  Notes
Server      ProLiant BL460c G7  12 core, 2667 MHz  16 GB   Local    1 Gb     Windows 2012/RHEL 6.3
Database    ProLiant DL380 G7   12 core, 2933 MHz  32 GB   DAS      1 Gb     RHEL 6.3 - Oracle
Tools

The following tools were used to produce this benchmark:
• HPE LoadRunner 11.52
• HPE SiteScope 11.20
Throughput Flows

This section describes the flows that were used during the benchmark throughput tests.
These flows were designed to emphasize the different functionality aspects of HPE OO and to load
test the different resources of the system (CPU, memory, and so on). By running a combination of
all of these flows, we tried to simulate a heterogeneous customer environment.
Note: The purpose of these flows was to load HPE OO as a platform, and not to perform any
actual work, as the goal of the benchmark is to verify the performance of HPE OO as a platform
and not to verify the performance of the HPE OO content.
Large Context Flow
This flow receives a 4 MB context and has 103 steps.
Short Flow

This flow uses the "Generate Data" operation and has 2 steps.
Medium Flow

This flow uses the "Generate Data" operation and has 102 steps.
Long Flows

This flow uses the "Generate Data" operation and has 10,002 steps.
Multi-Instance Flow
This flow contains a multi-instance implementation of the UUID generator and runs with 300 lanes
per flow.
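The multi-instance pattern above can be sketched outside of HPE OO as a set of independent lanes, each running a UUID-generator step. This is purely an illustrative analogy, not OO content; the thread-pool size is an arbitrary assumption:

```python
import uuid
from concurrent.futures import ThreadPoolExecutor

LANES = 300  # lanes per flow, as in the benchmark

def uuid_generator_step(lane):
    # Each lane independently executes the UUID generator step.
    return str(uuid.uuid4())

# A bounded worker pool fans the lanes out, analogous to the worker's
# execution threads processing multi-instance branches.
with ThreadPoolExecutor(max_workers=50) as pool:
    results = list(pool.map(uuid_generator_step, range(LANES)))

# All lanes complete and produce distinct values.
assert len(set(results)) == LANES
```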
Subflow
This flow runs an instance of a medium-sized flow as a subflow.
Parallel Flow
This flow runs 55 lanes of parallel split (only part of the flow can be seen in the following image).
Advanced Flow

This flow was designed to mimic consecutive calls of multiple short and medium-sized subflows from within a parent flow, while passing data from the parent flow to the subflow and vice versa.
Advanced Context Flow

This flow was designed to mimic consecutive calls of multiple short and medium-sized subflows from within a parent flow, while passing a large amount of data from the parent flow to the subflow and vice versa. 4,500 ASCII characters are used and passed from inputs to outputs, primary results, and raw results. This flow uses scriptlets and filters.
Single Flow Performance Flows

In order to achieve more precise results, we chose to create flows with long execution times. This section describes the flows that were used during the benchmark for single flow performance tests.
Large MI Flow

This flow loops 25 times and uses two multi-instance steps, each of which has 50 instance steps.
Parallel Flow

This flow loops 50 lanes of a parallel split step 20 times (only part of the flow can be seen in the
following image).
Subflows Level10

This flow loops 25 times, and each iteration contains 10 levels of subflows.
Large Sequential Flow

This flow loops 10,000 times and uses the "Do Nothing" operation.
Large Context Flow

This flow loops 150 times and uses the "List Appender" operation, which creates a large context variable.
Scenario

This section describes the scenario used for the benchmark:
• The workload was generated using HPE LoadRunner.
• The flow triggering was done using HPE OO REST API calls.
• The number of flows run in each benchmark is 5,000, which amounts to 1,140,280 steps.
• The distribution of the flows was as follows:

  Flow                                                        Number Per Flow Type
  Advanced Flow                                               1800
  Medium Flow                                                 1000
  Parallel Flow, Short Flow, Sub Flow, Multi-Instance Flow     480
  Advanced Context Flow                                        200
  Large Context Flow, Long Flow                                 40
• We used the HPE SiteScope integration with LoadRunner to monitor the different parts of the system during the tests, including JMX monitors for the JVM (memory, garbage collection).
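The flow-triggering step above can be sketched against the HPE OO 10.x Central REST API ("create execution" call). This is a minimal sketch: the Central URL and flow UUID are placeholders, and authentication (basic auth and, depending on Central's security configuration, a CSRF token) is omitted for brevity:

```python
import json
from urllib import request

# Hypothetical values -- replace with your own Central URL and flow UUID.
CENTRAL_URL = "https://oo-central.example.com:8443"

def build_trigger_request(flow_uuid, run_name=None, inputs=None):
    """Build the URL and JSON body for the OO 10.x 'create execution' call."""
    url = CENTRAL_URL + "/oo/rest/v2/executions"
    body = {"flowUuid": flow_uuid}
    if run_name is not None:
        body["runName"] = run_name
    if inputs is not None:
        body["inputs"] = inputs
    return url, json.dumps(body)

def trigger_flow(flow_uuid, **kwargs):
    """POST the execution request to Central; returns the raw response body."""
    url, body = build_trigger_request(flow_uuid, **kwargs)
    req = request.Request(url, data=body.encode("utf-8"), method="POST")
    req.add_header("Content-Type", "application/json")
    with request.urlopen(req) as resp:
        return resp.read().decode("utf-8")
```

In the benchmark, LoadRunner played the role of the HTTP client, issuing these calls at the rates needed to sustain the flow distribution above.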
Comparison

Throughput

The following chart shows the throughput of 10.60 and 10.50 in either single or cluster mode configurations with STANDARD persistence level.
Notes:
• A higher result shows better performance.
• HPE Operations Orchestration was scaled out (in versions 10.60, and 10.50) by adding
additional Central servers.
• For more information about the throughput comparison of 10.60 vs 9.07.0003, see Appendix: Comparison with HPE OO 9.07.0003.
• For more information about STANDARD persistence level, see the HPE OO 10.50
benchmark, version-related changes, page 7.
Throughput on different HPE OO deployments in STANDARD mode (steps/sec):

Deployment              10.60   10.50
Single/Windows/Oracle     995     981
Single/Linux/Oracle      1130    1126
Cluster/Windows/Oracle   1470    1447
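As a rough illustration, the 10.60 throughput figures reported above can be converted into approximate wall-clock times for the 5,000-flow scenario (1,140,280 steps). These times are derived, not measured, and assume a steady step rate:

```python
# Illustrative only: derive approximate benchmark duration from the
# reported 10.60 throughput figures (steps/sec).
TOTAL_STEPS = 1_140_280  # 5,000 flows, from the Scenario section

throughput_10_60 = {
    "Single/Windows/Oracle": 995,
    "Single/Linux/Oracle": 1130,
    "Cluster/Windows/Oracle": 1470,
}

def estimated_minutes(steps_per_sec, total_steps=TOTAL_STEPS):
    """Approximate wall-clock minutes, assuming a constant step rate."""
    return total_steps / steps_per_sec / 60.0

for deployment, rate in throughput_10_60.items():
    print(f"{deployment}: ~{estimated_minutes(rate):.0f} minutes")
```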
Single Flow Performance

The following chart compares the single flow performance of HPE Operations Orchestration 10.60 to 10.50, configured with STANDARD persistence level.
Notes:
• A lower result shows better performance.
• For more information about the single flow performance comparison of 10.60 vs 9.07.0003, see Appendix: Comparison with HPE OO 9.07.0003.
• For more information about STANDARD persistence level, see the HPE OO 10.50
benchmark, version-related changes, page 7.
[Chart] Single Flow Performance: 10.60 vs 10.50 in STANDARD mode (sec)
Flows compared: Large MI Flow, Parallel Flow, Subflows Level10, Large Sequential Flow, Large Context Flow, Advanced Flow, Long Flow, Advanced Context Flow
10.60 series: 72, 27, 215, 206, 0.37, 10, 1.27
10.50 series: 73, 27, 217, 226, 0.38, 10, 1.3
Analysis of Results

HPE Operations Orchestration 10.60 does not include significant changes in the engine. Therefore, the results of single flow performance and throughput are similar to the 10.50 results, and no degradation was found during the benchmark.
Recommendations for Environment Tuning

The following configurations were made during the benchmark tests.
OO 10.60

• The heap size was increased to an initial 1 GB and a maximum of 4 GB.
  This can be configured in <OO Installation>\oo\central\conf\central-wrapper.conf.
# Initial Java Heap Size (in MB)
wrapper.java.initmemory=1024
# Maximum Java Heap Size (in MB)
wrapper.java.maxmemory=4096
• The number of execution threads was increased to 300, and the inBuffer capacity was increased to 500.
  Both can be configured in <OO Installation>\oo\central\conf\central-wrapper.conf, starting from HPE OO 10.50.
wrapper.java.additional.25=-Dcloudslang.worker.numberOfExecutionThreads=300
wrapper.java.additional.26=-Dcloudslang.worker.inBufferCapacity=500
• The number of database connections was increased to a minimum of 20 and a maximum of 100.
  This can be configured in <OO Installation>\oo\central\conf\database.properties.
db.pool.maxPoolSize=100
db.pool.minPoolSize=20
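The pool settings above are plain key=value properties, so they can also be applied programmatically. The following is a minimal sketch under the assumption of a simple key=value properties file; back up database.properties before editing it for real:

```python
# Illustrative sketch: patch key=value pairs in a properties-style file
# such as database.properties.

def set_property(text, key, value):
    """Replace every 'key=...' line; append the pair if the key is absent."""
    out, found = [], False
    for line in text.splitlines():
        if "=" in line and line.split("=", 1)[0].strip() == key:
            out.append(f"{key}={value}")
            found = True
        else:
            out.append(line)
    if not found:
        out.append(f"{key}={value}")
    return "\n".join(out)

# Example: raise hypothetical default pool limits to the benchmark values.
conf = "db.pool.maxPoolSize=50\ndb.pool.minPoolSize=5"
conf = set_property(conf, "db.pool.maxPoolSize", 100)
conf = set_property(conf, "db.pool.minPoolSize", 20)
```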
Appendix: Comparison with HPE OO 9.07.0003

This appendix contains a comparison between HPE OO 10.60 and HPE OO 9.07.0003, with an Oracle database (Linux) and Central on a Windows machine.
Throughput

The following chart shows the throughput of versions 10.60, 10.50, and 9.07.0003 in either single or clustered mode, configured with EXTENDED persistence level and with the old flow distribution.
Notes:
• A higher result shows better performance.
• HPE Operations Orchestration was scaled out by adding additional Central servers.
• For more information about EXTENDED persistence level, see the HPE OO 10.50
benchmark, Version-related changes, page 7.
• For more information about the old flow distribution, see the HPE OO 10.50 benchmark,
Flow Distribution Changes, page 6.
Throughput on different HPE OO deployments in EXTENDED mode (steps/sec):

Deployment  10.60   10.50   9.07.0003
Single       1425    1432     698
Cluster      1956    1933    1426
Single Flow Performance

The following chart compares the single flow performance of HPE Operations Orchestration 10.60 with HPE OO 9.07.0003, configured with EXTENDED persistence level.
Notes:
• A lower result shows better performance.
• For more information about EXTENDED persistence level, see the HPE OO 10.50
benchmark, version-related changes, page 7.
Single Flow Performance: 10.60 in EXTENDED mode vs 9.07.0003 (sec)

Flow            10.60   9.07.0003
Large Context    0.49    0.92
Long Flow        9.5    76
Medium Flow      0.28    0.31
MI Flow          2.9     8.39
Parallel Flow    1.82    0.71
Short Flow       0.22    0.037
Sub-Flow         0.34    0.82

Analysis of Results

HPE Operations Orchestration 10.60 includes significant changes in the engine in comparison with HPE OO 9.07.0003. We can see that the results of single flow performance and throughput are better than the 9.07.0003 results and similar to the 10.50 results.