Get Better I/O Performance in VMware vSphere 5.1 Environments with Emulex 16Gb Fibre Channel HBAs Joey Dieckhans, VMware Yea-Cheng Wang, VMware Alex Amaya, Emulex

DESCRIPTION

This webinar covers the improvements in storage I/O throughput and CPU efficiency that VMware vSphere gains when using an Emulex 16Gb Fibre Channel Host Bus Adapter (HBA) versus the previous generation HBA. Applications virtualized on VMware vSphere 5.1 that generate storage I/O of various block sizes can take full advantage of 16Gb Fibre Channel wire speed for better sequential and random I/O performance.

TRANSCRIPT

Page 1: Get Better I/O Performance in VMware vSphere 5.1 Environments with Emulex 16Gb Fibre Channel HBAs

Get Better I/O Performance in VMware vSphere 5.1 Environments with Emulex 16Gb Fibre Channel HBAs

Joey Dieckhans, VMware
Yea-Cheng Wang, VMware
Alex Amaya, Emulex

Page 2: Get Better I/O Performance in VMware vSphere 5.1 Environments with Emulex 16Gb Fibre Channel HBAs


Agenda

Introduction

What’s New With VMware vSphere 5.1 for Storage

Performance Study

Emulex LPe16000 16Gb Fibre Channel (16GFC) PCIe 3.0 HBAs

Strategic Management

Conclusion

Q&A

Page 3: Get Better I/O Performance in VMware vSphere 5.1 Environments with Emulex 16Gb Fibre Channel HBAs

What’s New With VMware vSphere 5.1 for Storage

Page 4: Get Better I/O Performance in VMware vSphere 5.1 Environments with Emulex 16Gb Fibre Channel HBAs


Space Efficient Sparse Virtual Disks (Joseph Dieckhans)

A new Space Efficient Sparse Virtual Disk which:

1. Reclaims wasted / stranded space inside a Guest OS

2. Uses a variable block size to better suit applications / use cases

[Diagram: a traditional VMDK with wasted blocks vs. a Space Efficient Sparse VMDK with no wasted blocks]

Page 5: Get Better I/O Performance in VMware vSphere 5.1 Environments with Emulex 16Gb Fibre Channel HBAs


Increasing VMFS File Sharing Limits (Joseph Dieckhans)

vSphere 5.1 supports sharing a file on a VMFS Datastore with up to 32 concurrent ESXi hosts. (previous limit was 8)


Page 6: Get Better I/O Performance in VMware vSphere 5.1 Environments with Emulex 16Gb Fibre Channel HBAs


Storage DRS & vCloud Director (Joseph Dieckhans)

vCloud Director Interoperability/Support for Linked Clones

• vCloud Director will use Storage DRS for the initial placement of linked clones during Fast Provisioning.

• vCloud Director will use Storage DRS for managing space utilization and I/O load balancing.

Page 7: Get Better I/O Performance in VMware vSphere 5.1 Environments with Emulex 16Gb Fibre Channel HBAs


Storage vMotion – Parallel Migration Enhancement (Joseph Dieckhans)

In vSphere 5.1, Storage vMotion performs up to 4 parallel disk migrations per Storage vMotion operation.
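Not from the webinar itself: a minimal sketch, assuming an existing vCenter connection through the pyVmomi Python SDK, of what a multi-disk Storage vMotion looks like from the API side. A single RelocateSpec moves several VMDKs in one operation, and vSphere 5.1 then copies up to four of those disks in parallel. The `vm` object and the `datastore_by_disk_key` mapping are placeholders supplied by the caller.

```python
# Sketch (assumed names): one Storage vMotion call that relocates several VMDKs
# at once using pyVmomi. `vm` and `datastore_by_disk_key` come from an existing
# vCenter session (e.g. via pyVim.connect.SmartConnect).
from pyVmomi import vim

def storage_vmotion_all_disks(vm, datastore_by_disk_key):
    spec = vim.vm.RelocateSpec()
    spec.disk = []
    for dev in vm.config.hardware.device:
        if not isinstance(dev, vim.vm.device.VirtualDisk):
            continue
        target_ds = datastore_by_disk_key.get(dev.key)
        if target_ds is None:
            continue  # leave this disk where it is
        locator = vim.vm.RelocateSpec.DiskLocator()
        locator.diskId = dev.key          # the device key identifies the VMDK
        locator.datastore = target_ds     # a vim.Datastore managed object
        spec.disk.append(locator)
    # One Storage vMotion operation; in vSphere 5.1 up to 4 of these disk
    # copies proceed in parallel.
    return vm.RelocateVM_Task(spec=spec)
```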

Page 8: Get Better I/O Performance in VMware vSphere 5.1 Environments with Emulex 16Gb Fibre Channel HBAs

16GFC Performance Study by VMware

Page 9: Get Better I/O Performance in VMware vSphere 5.1 Environments with Emulex 16Gb Fibre Channel HBAs

New 16GFC Support in vSphere 5.1

• New support for 16GFC in vSphere 5.1 for better storage I/O performance

• Performance results
– The newly added 16GFC driver has twice the throughput of the 8GFC HBA, at a better CPIO (CPU cost per I/O)
– Reached 16GFC wire speed for random I/Os at an 8KB block size (see the arithmetic below)

• Whitepaper: Storage I/O Performance on VMware vSphere 5.1 over 16 Gigabit Fibre Channel
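For context on the "wire speed at 8KB random I/O" point above, here is a back-of-the-envelope sketch; the ~1,600 MB/s payload figure for 16GFC and ~800 MB/s for 8GFC are rough approximations, not numbers taken from the whitepaper.

```python
# Back-of-the-envelope only: approximate single-port payload rates, not
# measured results from the whitepaper.
WIRE_SPEED_MBPS = {"8GFC": 800, "16GFC": 1600}
BLOCK_KB = 8

for link, mbps in WIRE_SPEED_MBPS.items():
    iops = mbps * 1024 // BLOCK_KB   # MB/s -> KB/s -> 8KB I/Os per second
    print(f"{link}: ~{mbps} MB/s at {BLOCK_KB}KB -> ~{iops:,} IOPS ceiling")

# 16GFC works out to roughly 204,800 IOPS at 8KB, which is why reaching wire
# speed at that block size also implies a large IOPS advantage over 8GFC.
```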

Page 10: Get Better I/O Performance in VMware vSphere 5.1 Environments with Emulex 16Gb Fibre Channel HBAs

Comparison of Throughput and CPU Efficiency

The 16GFC driver delivers double the throughput at better CPU efficiency per I/O.

Sequential read I/Os over a 16GFC or an 8GFC port (single Iometer worker in a single VM)

Throughput and CPU cost per I/O comparison between the two adapters (see the note on server configuration)

[Chart: sequential read throughput (MBps) by block size, 1KB–256KB, for the 8Gb and 16Gb adapters]

[Chart: CPU cost per I/O (lower is better) by block size, 1KB–256KB, for the 8Gb and 16Gb adapters]
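The CPU cost per I/O plotted above is, in essence, host CPU consumption normalized by the I/O rate. A minimal sketch of that calculation follows, using made-up sample numbers rather than the study's measurements.

```python
# CPIO sketch: CPU cycles per I/O = (utilization * cores * clock) / IOPS.
# The sample numbers below are illustrative, not the whitepaper's data.
def cpu_cost_per_io(cpu_util, cores, clock_hz, iops):
    return (cpu_util * cores * clock_hz) / iops

# e.g. 40% utilization of 8 cores at 3.2 GHz while sustaining 150,000 IOPS
print(f"{cpu_cost_per_io(0.40, 8, 3.2e9, 150_000):,.0f} cycles per I/O")  # ~68,267
```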

Page 11: Get Better I/O Performance in VMware vSphere 5.1 Environments with Emulex 16Gb Fibre Channel HBAs

More Bandwidth and Better IOPS

The 16GFC adapter can attain much higher IOPS than the 8Gbps wire speed limit of an 8GFC port allows.

Random read I/Os from 1 VM to 8 VMs over a 16GFC port (single Iometer worker per VM)

[Chart: random read throughput (MBps) by block size, 1KB–16KB, for 1, 2, 4, 6 and 8 VMs, with the 8Gbps wire speed limit of an 8Gb FC HBA marked for reference]

[Chart: random read I/Os per second (IOPS) by block size, 1KB–16KB, for 1, 2, 4, 6 and 8 VMs]

Page 12: Get Better I/O Performance in VMware vSphere 5.1 Environments with Emulex 16Gb Fibre Channel HBAs

Server and Workload Configuration

ESX Host

HP ProLiant DL370, dual quad-core Intel Xeon W5580 processors

Emulex LPe16002 16GFC HBA initiator

Emulex LPe12000 8GFC HBA initiator

EMC VNX7500 Storage Array

8GFC target ports connected to a 16GFC SAN switch for the LPe16002 initiator

8GFC target ports connected to an 8GFC SAN switch for the LPe12000 initiator

32 SSD-cached LUNs of 256MB each, with mirrored write cache enabled at the VNX array

Virtual Machine and Workload

Windows 2008 R2 64-bit guest OS; single vCPU and a single PVSCSI virtual controller

Single Iometer worker and 4 target LUNs in each VM, at 32 outstanding I/Os (OIOs) per target LUN (see the sketch below)
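To see why this modest per-VM configuration can drive the link hard, here is a quick Little's Law sketch; the 0.6 ms service time is an assumed round number for illustration, not a measurement from the study.

```python
# Little's Law sketch: achievable IOPS ~= outstanding I/Os / average latency.
# The latency value below is an assumption, not taken from the study.
luns_per_vm = 4
oios_per_lun = 32
outstanding = luns_per_vm * oios_per_lun       # 128 outstanding I/Os per VM

assumed_latency_s = 0.0006                     # 0.6 ms per I/O (assumption)
iops_per_vm = outstanding / assumed_latency_s  # ~213,333 IOPS per VM

throughput_mbps_8k = iops_per_vm * 8 / 1024    # ~1,667 MB/s at 8KB blocks
print(f"{outstanding} OIOs -> ~{iops_per_vm:,.0f} IOPS, ~{throughput_mbps_8k:,.0f} MB/s at 8KB")
```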

Page 13: Get Better I/O Performance in VMware vSphere 5.1 Environments with Emulex 16Gb Fibre Channel HBAs

Emulex 16GFC PCIe 3.0 HBAs

Page 14: Get Better I/O Performance in VMware vSphere 5.1 Environments with Emulex 16Gb Fibre Channel HBAs


Single Port Max IOPS

[Chart: single port maximum IOPS, LPe16002 vs. LPe12002]

Page 15: Get Better I/O Performance in VMware vSphere 5.1 Environments with Emulex 16Gb Fibre Channel HBAs


Single Port Max MB/s

[Chart: single port maximum MB/s, LPe16002 vs. LPe12002]

Page 16: Get Better I/O Performance in VMware vSphere 5.1 Environments with Emulex 16Gb Fibre Channel HBAs


Half the I/O Response Time

Average I/O response during a single SSD LUN read I/O

[Chart: average I/O response time, LPe16002 vs. LPe12002]

Page 17: Get Better I/O Performance in VMware vSphere 5.1 Environments with Emulex 16Gb Fibre Channel HBAs


Best Practices for 16GFC HBAs

Stay up to date with firmware and drivers tested and supported on the VMware HCL

Update the firmware preferably during planned downtime

OEM adapters – visit partner website for latest firmware and drivers

Update inbox drivers

Always check with the storage vendor for the recommended queue depth settings

Always check with the storage vendor for the recommended multipathing policy (example below)
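As a concrete starting point for the queue depth and multipathing checks above, the sketch below wraps a few standard esxcli calls in Python (they can equally be run directly from the ESXi shell). The lpfc module name and the lpfc_lun_queue_depth parameter apply to Emulex FC drivers of this era but vary by driver release, and the Round Robin policy shown is only an example; the values to use come from the storage vendor, not from this sketch.

```python
# Sketch for an ESXi host: inspect HBAs, the Emulex driver's queue depth
# parameter, and a LUN's multipathing policy. Module/parameter names vary by
# driver release; treat the values shown as examples, not recommendations.
import subprocess

def esxcli(*args):
    return subprocess.check_output(["esxcli"] + list(args)).decode()

# 1. List FC adapters and their drivers (confirm what the host actually runs).
print(esxcli("storage", "core", "adapter", "list"))

# 2. Show current lpfc module parameters, including lpfc_lun_queue_depth.
print(esxcli("system", "module", "parameters", "list", "-m", "lpfc"))

# 3. Example only: set a per-LUN queue depth (takes effect after a reboot).
#    esxcli system module parameters set -m lpfc -p lpfc_lun_queue_depth=64

# 4. Example only: set Round Robin on one device if the array vendor recommends it.
#    esxcli storage nmp device set --device <naa_id> --psp VMW_PSP_RR
```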

Page 18: Get Better I/O Performance in VMware vSphere 5.1 Environments with Emulex 16Gb Fibre Channel HBAs

HBA Management in Virtual Environments

Page 19: Get Better I/O Performance in VMware vSphere 5.1 Environments with Emulex 16Gb Fibre Channel HBAs


OneCommand Manager for VMware vCenter Server

OneCommand Manager software plug-in for the VMware vCenter Server console
– Real-time lifecycle management for Emulex adapters from vCenter Server
– Builds on Emulex CIM providers and OCM features – no new agents
– Extends the vCenter Server console with an Emulex OneCommand tab

Display / manage adapters with multiple views and filters:
– View per VMware host; per VMware cluster; per network fabric
– Firmware version, hardware type and many other display filters

Batch update adapter firmware across VMware clusters
– Deploy firmware across hosts in a cluster

Page 20: Get Better I/O Performance in VMware vSphere 5.1 Environments with Emulex 16Gb Fibre Channel HBAs


OneCommand Manager for VMware vCenter Server – Cluster View (Hosts in a VMware Cluster)

[Screenshot: the Emulex OneCommand tab in vCenter Server showing VMware hosts, VMs and clusters, OCM cluster-based management tasks, and a data window for selected items]

Page 21: Get Better I/O Performance in VMware vSphere 5.1 Environments with Emulex 16Gb Fibre Channel HBAs

Resources

Page 22: Get Better I/O Performance in VMware vSphere 5.1 Environments with Emulex 16Gb Fibre Channel HBAs


Implementers Lab

One-stop site for IT administrators and system architects (implementers)

Technically accurate and straightforward resources

Fibre Channel, OEM Ethernet and ESXi 5.0 deployments

How-to Guides for Solutions from:
– HP
– IBM
– Dell

Please wander around our website – Implementerslab.com

Page 24: Get Better I/O Performance in VMware vSphere 5.1 Environments with Emulex 16Gb Fibre Channel HBAs


Final Thoughts…

Virtualization adoption is spreading
– More virtualization spreading to cloud, VDI, and mission-critical applications

Virtualization density is increasing
– Enabled by bigger servers, more memory, faster networks and vSphere

Fibre Channel is the most popular network for SANs
– Networking is the #2 factor (after memory) for bigger VM deployments

16GFC from Emulex is here:
– Lower latency, better throughput and more IOPS for bigger VM deployments
– Best management for vSphere

Page 25: Get Better I/O Performance in VMware vSphere 5.1 Environments with Emulex 16Gb Fibre Channel HBAs


Q & A
