Net-Centric Enterprise
Net-Centric Diplomacy

Approved for public release 04 October 05

Profiling and Testing Procedures for a Net-Centric Data Provider
Derik Pack
Special Communications Project Support
SPAWAR Systems Center Charleston
Net-Centricity

• Definition
  – A global web-enabled environment that promotes information sharing, sense making, and decision making
• Pillars of Net-Centricity
  – Physical Infrastructure
  – Software Concepts and Infrastructure
  – Business Logic and Policy
Outline

• Approaching Net-Centricity: Services
  – Advantages
  – Barriers to Acceptance
• An SOA Example: Net-Centric Diplomacy
  – Specifications
  – Architecture
  – Testing Metrics, Procedures, and Results
  – Operational Dashboard
• Lessons Learned
An Intro to Service-Oriented Architecture

• Operating system and programming language independent
• Exposes business processes
• Loosely coupled

(Diagram: the Service Provider publishes its description to a Service Registry (1. Publish); a Service Consumer finds the service in the registry (2. Find) and then binds directly to the Service Provider (3. Bind).)
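The publish/find/bind triangle on this slide can be sketched in code. The following is an illustrative Python sketch of the pattern only, not any real registry API; all names (`ServiceRegistry`, `person_search`, and so on) are invented for the example.

```python
class ServiceRegistry:
    """Stands in for a UDDI-style registry of service endpoints."""
    def __init__(self):
        self._services = {}

    def publish(self, name, endpoint):      # 1. Publish
        self._services[name] = endpoint

    def find(self, name):                   # 2. Find
        return self._services.get(name)


class ServiceConsumer:
    def __init__(self, registry):
        self.registry = registry

    def bind_and_call(self, name, *args):   # 3. Bind, then invoke
        endpoint = self.registry.find(name)
        if endpoint is None:
            raise LookupError(f"no provider registered for {name!r}")
        return endpoint(*args)


# A provider exposes a business process as a callable endpoint.
registry = ServiceRegistry()
registry.publish("person_search", lambda q: [f"result for {q}"])

consumer = ServiceConsumer(registry)
print(consumer.bind_and_call("person_search", "Pack"))
```

The consumer never hard-codes the provider, which is the loose coupling the slide refers to: swapping providers only requires a new `publish` call.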
Web Services

• Transport over HTTP or HTTPS
• Specifications
  – XML, SOAP, SAML, UDDI, WSDL
• Competing Organizations
  – WS-I
  – W3C
  – Vendors
Advantages of SOA

• Lower cost of development
• Higher component reuse
• Process streamlining
• Smoother integration paths
SOA Barriers to Acceptance

• Standards
  – Misunderstanding of standards
    • Standards can be complex and documentation may be sparse
    • A certain level of knowledge is needed to understand the interaction between standards
  – Policy issues
    • An implemented standard may impose requirements contrary to the accepted policy of an organization
  – Interoperability
    • Vague or poorly documented areas in a standard may lead to interoperability issues
SOA Barriers to Acceptance

• Technical
  – Security
    • XML is plain text
    • No explicit security model with SOAP
  – Performance
    • Processing SOAP is CPU intensive
    • Security information can further decrease performance
  – Quality of Service
    • Web services are implemented over transfer mechanisms that do not ensure quality of service
  – Transaction Support
    • No implicit support for ACID (Atomicity, Consistency, Isolation, and Durability) transactions
SOA Example: Net-Centric Diplomacy

• Department of State program
• Electronic publishing of post information
  – Biographic reports
  – DoS telegrams
• Initiative of the Horizontal Fusion Portfolio
• Uses DISA's Net-Centric Enterprise Services
Horizontal Fusion

• Department of Defense portfolio
• Provides an example application layer of the Global Information Grid (GIG)
• Uses DISA's Net-Centric Enterprise Services (NCES)
• More information can be found at http://horizontalfusion.dtic.mil/
NCD Data Provider Implementation

• NCES interaction
  – Security Services
  – Discovery Services
• Intelligent Federated Index Search (IFIS) WSDL
  – Web Service Interface
  – Query Syntax
    • Person Search
    • Keyword Search
• ncd_search_1_2
  – search
  – cancelSearch
  – getMoreResults
NCD Architecture
Problems of Measuring Web Service Performance

• Few exhaustive web service performance tools exist
• Web services are not websites
  – The same metrics may not apply
  – Services may call other servers/services
  – Service(s) may encompass business logic to be tested
  – Semantic use of the service is not clearly defined
Solutions for Web Service Performance Testing

• Define web-service-specific performance metrics and tests
• Monitor the dependent environment during performance testing
• Create a dashboard application for production environments for quick diagnostics of all dependencies
Web Service Performance Metrics

• Round Trip Time (RTT)
  – The time required for a request to be sent from a client, processed by the server, and returned
• Error
  – Incorrect results or error messages received from the web service
• Connections per Second (CPS)
  – The number of connections being sent to the web application each second
• (IFIS-specific) Queries per Second
  – The number of queries (search + getMoreResults calls) until a client has received all possible results
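As a rough illustration, the metrics above could be computed from timing samples collected by a load generator. The sample format (start time, end time, success flag) and all field names are assumptions made for this sketch.

```python
def summarize(samples, window_seconds):
    """samples: list of (start_s, end_s, ok) tuples from one test window."""
    rtts = [(end - start) * 1000.0 for start, end, ok in samples]  # RTT in ms
    errors = sum(1 for _, _, ok in samples if not ok)
    return {
        "avg_rtt_ms": sum(rtts) / len(rtts),
        "max_rtt_ms": max(rtts),
        "min_rtt_ms": min(rtts),
        "error_rate": errors / len(samples),       # share of failed requests
        "cps": len(samples) / window_seconds,      # connections per second
    }

# Three requests observed in a 2-second window; one returned an error.
stats = summarize([(0.0, 0.2, True), (0.5, 0.9, True), (1.0, 1.1, False)], 2.0)
print(stats["cps"])  # 1.5
```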
Test Types

• Continuous test
  – Set a constant connection rate and test duration
• Ramped test
  – Set a start and end connection rate and a number of steps to increment the rate between the start and end of the test
• Burst test
  – Set a one-time burst of connections
• Adaptive test
  – Search adaptively for the steady-state connection rate of the service
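The four test types above can be viewed as schedules of connection rates that a load generator would consume. This Python sketch is illustrative only; in particular, the slides do not specify the actual adaptive algorithm, so the doubling-then-bisection search shown here is one plausible strategy, not the tool's.

```python
def continuous(rate, duration_s):
    """Constant connection rate for the whole test."""
    return [rate] * duration_s

def ramped(start, end, steps):
    """Rate incremented in equal steps from start to end."""
    step = (end - start) / (steps - 1)
    return [start + i * step for i in range(steps)]

def burst(rate):
    """One-time burst of connections."""
    return [rate]

def adaptive(initial, holds_up, rounds=10):
    """Hunt for the steady-state rate: double until the service fails,
    then bisect. holds_up(rate) -> True when the service keeps up."""
    lo, hi = 0.0, initial
    while holds_up(hi):          # grow until the service falls over
        lo, hi = hi, hi * 2
    for _ in range(rounds):      # bisect toward the failure boundary
        mid = (lo + hi) / 2
        if holds_up(mid):
            lo = mid
        else:
            hi = mid
    return lo

# Toy service that keeps up at 3.06 connections/second or less.
print(round(adaptive(1.0, lambda r: r <= 3.06), 2))  # 3.06
```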
Dependency Testing

• Required when using web service metrics
  – To map low performance to a given component
  – To determine which components can provide the greatest speedup to the service
• Testing includes
  – Unit testing
  – Application profiling (CPU and memory)
  – Correct software configuration for the given hardware
Testing Procedures

• Burst tests and profiling for memory problems
• Continuous tests and error logging for functional testing
• Ramped tests to determine the server's point of failure
• Adaptive tests, based on the point of failure, to find the server's steady-state connection rate
Ramped Test

(Chart: average, maximum, and minimum RTT in milliseconds versus connection rate, 0–5 connections per second; the RTT axis spans 0–200,000 ms.)
Adaptive Test over 48 Hours

(Chart: trend of connection rate over time, roughly 1.5–3.5 connections per second across 0–3000 minutes.)
Histogram of Connection Rate

(Chart: frequency histogram of measured connection rates in 0.05 connections-per-second bins from 2.5 to 3.3; the bulk of the samples falls between roughly 2.9 and 3.3 connections per second.)
Adaptive Test Results

• Spikes at 26 and 48 hours
  – Not consistently reproduced in other tests
  – Can be attributed to environmental factors when testing at a nominally stable service load
• Mean connection rate of 3.06 connections/second with a 99% confidence interval of ±0.01
• The test covers one likely query method for the service, not all query methods
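For illustration, a mean and 99% confidence half-width like those reported could be computed from per-interval rate samples as below. The sample values are invented, chosen only so the result lands near the reported 3.06 ± 0.01; the slides do not disclose the raw data.

```python
import math

def mean_ci(samples, z=2.576):   # z is approximately 2.576 for a 99% normal CI
    n = len(samples)
    mean = sum(samples) / n
    var = sum((x - mean) ** 2 for x in samples) / (n - 1)  # sample variance
    half_width = z * math.sqrt(var / n)
    return mean, half_width

# Invented per-interval connection-rate samples for the sketch.
mean, hw = mean_ci([3.05, 3.07, 3.06, 3.05, 3.08, 3.06])
print(f"{mean:.2f} +/- {hw:.2f} connections/second")  # 3.06 +/- 0.01 connections/second
```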
Dashboard

• Web-based client that monitors
  – Department of State web services
  – Required external web services
  – Database
  – Current application configuration
• Decreases diagnostic time in development and operations
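A minimal sketch of the dashboard's dependency polling, assuming each dependency exposes a cheap health probe; the probe names and failure mode here are hypothetical, not the actual NCD implementation.

```python
def check_dependencies(checks):
    """checks: mapping of dependency name -> zero-argument probe that
    returns True when healthy (or raises on failure)."""
    status = {}
    for name, probe in checks.items():
        try:
            status[name] = "UP" if probe() else "DOWN"
        except Exception:
            status[name] = "DOWN"   # any probe failure marks the dependency down
    return status

def broken_database():
    raise ConnectionError("connection refused")

status = check_dependencies({
    "DoS web services": lambda: True,
    "external web services": lambda: True,
    "database": broken_database,
})
print(status["database"])  # DOWN
```

Collecting all statuses in one place is what shortens diagnosis: a single view shows which dependency, if any, is the culprit.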
Conclusions

• Web services can make testing more iterative and time consuming
• There is a constant race to best characterize the operational environment, because the web service interface makes it easy to change
• The best test plan covers many possible uses of the web service interfaces
Questions?
Author Contact Information

Derik Pack
SPAWAR Systems Center Charleston
843-515-5015
[email protected]