
Page 1: Indiana University Data Center

Presented by:

Bill Ash, AIA, LEED AP
Associate, SmithGroup

Stacy Kukelhan, PE
Project Manager, EYP Mission Critical Facilities

Indiana University Data Center Case Study

Session 2, Tuesday, September 16, 2008, San Jose, California

Labs21 | DataCenters21

Page 2: Indiana University Data Center

Learning Objectives

Through a case study of the Indiana University Data Center, participants will gain knowledge of sustainability and energy-efficiency strategies as they apply to the design of data centers.

Page 3: Indiana University Data Center

IU Information Technology

• 1998 IT Strategic Plan, UITS
• Desktop computing, networking, email
• 2006: acquired IBM e1350 BladeCenter cluster (Big Red)

Research domains: Life Sciences, Astronomy, Informatics, Computational Physics, Humanities

Funding agencies: National Science Foundation, National Institutes of Health, National Endowment for the Humanities

Page 4: Indiana University Data Center

Location: IU Bloomington

Historic Campus Core

Greenways, Quads, Jordan River

Planned Technology Precinct: old high school buildings, recreational fields

Page 5: Indiana University Data Center

Location: Technology Precinct

Technology Precinct Master Plan

Data Center site; Wrubel Computing Center (existing home of Information Technology)

• Build on disturbed land
• Increase density
• Working landscape
• Pedestrian circulation

Page 6: Indiana University Data Center

Planning: Right-Sizing

Fitting the building on its site

[Site plan: Wrubel Computing Center (existing home of Information Technology) and the Data Center site]

• Build only what you need
• Site constraints

[Site diagram: existing chiller building and planned data center growth, keyed new vs. existing]

Page 7: Indiana University Data Center

Planning: Right-Sizing

Fitting the building on its site


Phased master plan: the data center site is shoehorned between two existing buildings and an existing chiller plant, with the added logistics of that chiller plant continuing to serve the existing data center. The plan must still allow for growth: a delicate balance of tier-level mix, area demands, and budget.


• Build only what you need
• Site constraints
• Other variables:

Whether equipment is located indoors or outdoors
Fault tolerance (Tier 4)
Concurrent maintainability (Tier 3)
Sophistication of monitoring and security systems
Final UPS design topology
Final cooling topology
Availability of long-lead equipment items
Changing equipment and materials costs (e.g., cost of copper)
Other spaces, such as administrative, storage, NOC, etc.
Site work
Resistance to natural disasters

Page 8: Indiana University Data Center

Building Organization: A House for Machines

Less than 4% of the building is normally occupied by people.

Page 9: Indiana University Data Center

Berms | Landscaping
• Berms on all sides and trees protect and shade the building
• Native and drought-resistant grasses
• Recycled site waste

Building Components

Page 10: Indiana University Data Center

Cast-in-place Concrete Shell
• 9,000 cubic yards of concrete cast on site
• Concrete as a sustainable material:

Longevity/Durability
Thermal Mass
Resource Availability
Recycled Content
Regional Production
Minimal Waste

Building Components

Page 11: Indiana University Data Center

Precast Concrete Cladding
• More economical than stone
• Regional production

Building Components

[Images: Wells Library limestone; loose-laid limestone site walls]

Page 12: Indiana University Data Center

Phasable Green Roof
Day 1: protected membrane roof | Future phase: extensive green roof system

• Cost: $9–$15/sf
• Benefits:

City Incentives
Reduced Runoff
Remaining runoff delayed, cleaned, and cooled
Energy Savings
Extended Roof Life
Aesthetics
Acoustics
Improved Air Quality
Reduced Heat Island Effect
Habitat Preservation | Biodiversity

Building Components

Page 13: Indiana University Data Center

Interior Environment
• Flexibility of space
• Bamboo wall cladding
• Exposed concrete and CMU; low-VOC paints and stains

Building Components

Page 14: Indiana University Data Center

Approaches to Electrical Systems: Organization & Efficiencies

• Equipment proximity to load:
  • Locate PDUs as close to the RPPs as possible to minimize 208 V feeder lengths
  • Locate medium-voltage transformers in the UPS room
• Consider 575 V critical power distribution in lieu of 480 V:
  • Calculated energy savings of $75,000 per year for a 1.5 MW critical power system (at 6 cents per kWh); a back-of-envelope check appears below
• Evaluate UPS technologies:
  • Varying costs, efficiencies, and backup capabilities
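As a back-of-envelope check on that savings figure: $75,000 per year at $0.06/kWh corresponds to about 1.25 GWh of avoided losses per year, or roughly 143 kW of average loss reduction. A minimal sketch of the arithmetic, using only the rate, savings, and load stated above (the I²R scaling line is standard physics, not a slide figure):

```python
# Back-of-envelope check on the 575 V vs. 480 V savings claim.
# Known from the slide: $75,000/yr savings, 1.5 MW critical load, $0.06/kWh.
rate_usd_per_kwh = 0.06
annual_savings_usd = 75_000
critical_load_kw = 1_500

avoided_kwh_per_year = annual_savings_usd / rate_usd_per_kwh   # 1,250,000 kWh
avg_loss_reduction_kw = avoided_kwh_per_year / 8760            # ~143 kW

# For the same conductors and power, current scales with 1/V, so I^2*R
# conductor losses scale with (V_old / V_new)^2.
loss_ratio = (480 / 575) ** 2                                  # ~0.70

print(f"Avoided energy: {avoided_kwh_per_year:,.0f} kWh/yr")
print(f"Average loss reduction: {avg_loss_reduction_kw:.0f} kW "
      f"({avg_loss_reduction_kw / critical_load_kw:.1%} of the critical load)")
print(f"575 V feeder losses are ~{loss_ratio:.0%} of their 480 V value")
```

The (480/575)² factor is why the higher distribution voltage helps: for the same conductors and delivered power, conductor losses drop by roughly 30%.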

Page 15: Indiana University Data Center

UPS Technologies

Technology / Manufacturer | Ride-Through Time | Efficiency | Cost Relative to Static On-Line Unit
Static On-Line UPS with Battery / Powerware, Liebert, MGE | As required, usually 10–15 minutes | 92–94% | 100%
Flywheel UPS / Active Power, Pentadyne, Piller | 12–30 seconds | 94–97% | 70%
Static Off-Line UPS with Battery / S&C | 60 seconds | 98–99% | 60%
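To put the efficiency column in dollar terms, here is a rough comparison of annual UPS losses at the tabulated efficiency ranges. The 1.5 MW load and $0.06/kWh rate reuse figures from the electrical slide; the midpoint efficiencies and constant-full-load operation are my simplifying assumptions:

```python
# Rough annual cost of UPS losses at a steady load, using midpoint
# efficiencies from the table above. The 1.5 MW load and $0.06/kWh rate
# reuse earlier slides; constant full load is a simplifying assumption.
LOAD_KW = 1_500
RATE_USD_PER_KWH = 0.06
HOURS_PER_YEAR = 8760

technologies = {
    "Static on-line UPS (92-94%)":  0.93,
    "Flywheel UPS (94-97%)":        0.955,
    "Static off-line UPS (98-99%)": 0.985,
}

for name, efficiency in technologies.items():
    loss_kw = LOAD_KW / efficiency - LOAD_KW   # input power minus delivered power
    annual_cost = loss_kw * HOURS_PER_YEAR * RATE_USD_PER_KWH
    print(f"{name}: ~{loss_kw:.0f} kW of loss, ~${annual_cost:,.0f}/yr")
```

On these assumptions, a few points of UPS efficiency are worth tens of thousands of dollars a year, which is why the efficiency column matters as much as the cost column.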

Page 16: Indiana University Data Center

UPS Technologies

Rotary Units: Advantages
• Supports a Tier 3 enterprise data center
• Low install base
• Less than 2% distortion
• Higher operating efficiency
• No major battery maintenance
• Critical load ride-through time: ~20 seconds, enough for generator startup
• Reduced ventilation requirements

Online Static UPS Units: Disadvantages
• Higher level of distortion
• Requirement for battery space
• Requirement for battery maintenance
• Performance affected by fluctuating loads
• Lower operating efficiency

Page 17: Indiana University Data Center

UPS Technologies: Rotary UPS

Page 18: Indiana University Data Center

Approaches to Mechanical Systems: Efficiencies & Reliability

Energy efficiency is governed primarily by system selection (a sketch of what these ratios mean in annual terms follows this list):
• Chilled water systems (water-cooled): 0.5–0.7 kW (annualized) per kW of UPS load
• Chilled water systems (air-cooled): 1.0–1.4 kW per kW of UPS load
• Direct expansion systems (air-cooled): 1.5–2.0 kW per kW of UPS load
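A minimal sketch of what those kW-per-kW ratios imply annually, assuming a hypothetical 1.5 MW UPS load (reused from the electrical slides), midpoint ratios, and constant load:

```python
# Annualized cooling energy implied by the kW-per-kW-of-UPS ratios above.
# The 1.5 MW UPS load reuses the electrical slides; midpoint ratios and a
# constant load are simplifying assumptions.
UPS_LOAD_KW = 1_500
HOURS_PER_YEAR = 8760
RATE_USD_PER_KWH = 0.06

systems = {
    "Chilled water, water-cooled (0.5-0.7)":  0.6,
    "Chilled water, air-cooled (1.0-1.4)":    1.2,
    "Direct expansion, air-cooled (1.5-2.0)": 1.75,
}

for name, ratio in systems.items():
    cooling_kw = UPS_LOAD_KW * ratio
    annual_kwh = cooling_kw * HOURS_PER_YEAR
    print(f"{name}: {cooling_kw:,.0f} kW of cooling power, "
          f"~${annual_kwh * RATE_USD_PER_KWH:,.0f}/yr")
```

The spread is stark: on these assumptions, moving from air-cooled DX to water-cooled chilled water roughly cuts the annual cooling energy bill by two thirds.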

Energy Efficiency – Other Design Considerations
• Water-side economizer
• Air-side economizer
• VFDs on mechanical equipment
• Hot water heating

Reliability is governed primarily by system architecture:
• N+1, N+2, 2N, etc.
• Single- or dual-path piping systems
• Requirement for make-up water
• Is there an uninterruptible cooling requirement to support high-density cooling?

Page 19: Indiana University Data Center

Water Side Economizer

• Uses cold condenser water to generate chilled water
• Minimizes or eliminates the requirement to operate the chillers when ambient conditions permit
• The cooling tower fans and pumps generate the cold condenser water
• Energy savings depend on wet-bulb temperature (the moisture content of the air), so regions with lower wet-bulb temperatures benefit most
• Usable economizer hours in Bloomington, IN: approximately 1,700 hours a year
• A higher design chilled water temperature (above the conventional 45°F) extends use of the economizer

A simple enablement check based on these conditions is sketched below.
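A minimal sketch of the enablement logic these bullets describe: compare the chilled water temperature the towers can produce (wet-bulb plus tower and heat-exchanger approaches) against the chilled water setpoint. The approach values and the 50°F setpoint below are illustrative assumptions, not IU's actual control parameters:

```python
# Hypothetical water-side economizer enablement check. The principle
# (condenser water temperature tracks ambient wet-bulb) is from the
# slide; the approach temperatures and setpoint are illustrative only.
TOWER_APPROACH_F = 7.0   # condenser water leaves ~7 F above ambient wet-bulb
HX_APPROACH_F = 3.0      # heat exchanger adds ~3 F between loops
CHW_SETPOINT_F = 50.0    # raised above the conventional 45 F to extend hours

def economizer_available(wet_bulb_f: float) -> bool:
    """True when the towers alone can meet the chilled water setpoint,
    so the chillers can stay off."""
    achievable_chw_f = wet_bulb_f + TOWER_APPROACH_F + HX_APPROACH_F
    return achievable_chw_f <= CHW_SETPOINT_F

for wb in (35.0, 45.0, 65.0):
    state = "ON" if economizer_available(wb) else "OFF"
    print(f"wet-bulb {wb:.0f} F -> economizer {state}")
```

Raising the chilled water setpoint widens the band of wet-bulb temperatures for which this check passes, which is how a higher design temperature extends the roughly 1,700 usable hours a year in Bloomington.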

Page 20: Indiana University Data Center

Water Side Economizer

[Diagram: economizer piping, showing the chiller, pump, heat exchanger, control valve, and chilled water supply and return]

Page 21: Indiana University Data Center

Alternate Plant Description | Annual Plant Energy Cost | Differential | Percentage of Base
No economizer, 2,200 ton load | $760,000 | Base | Base
Parallel economizer, 2,200 ton load | $639,000 | $121,000 | 84%
Series economizer, 2,200 ton load | $518,000 | $242,000 | 68%

Note: data based on $0.06/kWh.

Water Side Economizer

Indiana University design: parallel economizer
Indiana University payback period: ~5 years
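Those two lines pin down the implied capital cost: a roughly 5-year simple payback on the $121,000 annual differential puts the parallel economizer's first cost near $600,000. A sketch of that arithmetic (the first cost is inferred from the stated payback, not given on the slide):

```python
# Simple-payback arithmetic for the parallel economizer. The $121,000
# annual differential comes from the table above; the first cost is
# inferred from the stated ~5-year payback, not given on the slide.
annual_savings_usd = 121_000
stated_payback_years = 5

implied_first_cost = annual_savings_usd * stated_payback_years
print(f"Implied economizer first cost: ~${implied_first_cost:,.0f}")

def simple_payback_years(first_cost: float, savings_per_year: float) -> float:
    """Years to recover a first cost from constant annual savings."""
    return first_cost / savings_per_year

print(f"Payback on a $600,000 first cost: "
      f"{simple_payback_years(600_000, annual_savings_usd):.1f} years")
```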

Page 22: Indiana University Data Center

Air Side Economizer

• Uses outside air to directly cool support equipment rooms
• Energy savings depend on both dry-bulb (thermometer) temperature and wet-bulb temperature (moisture content of the air): enthalpy control
• Need to control supply air temperature to produce the optimal dry-bulb temperature and moisture content; the air must not be too dry
• The equipment requirement for the data center to operate in the 40 to 55% RH range significantly reduces the available hours

A simple hour-qualification check along these lines is sketched below.
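A minimal sketch of the hour-qualification idea behind that last bullet: count the hours whose outdoor conditions fall inside the acceptance window. The 40–55% RH band comes from the slide; the dry-bulb window is an illustrative assumption, and a real design would use enthalpy control rather than this simple box check:

```python
# Hypothetical air-side economizer hour qualification. The 40-55% RH
# band is from the slide; the dry-bulb window is an illustrative
# assumption (real designs use enthalpy control, not a simple box check).
RH_MIN, RH_MAX = 40.0, 55.0        # % RH, from the slide
DB_MIN_F, DB_MAX_F = 50.0, 75.0    # dry-bulb window, assumed

def hour_qualifies(dry_bulb_f: float, rel_humidity_pct: float) -> bool:
    """True when outside air can be used directly for free cooling."""
    return (DB_MIN_F <= dry_bulb_f <= DB_MAX_F
            and RH_MIN <= rel_humidity_pct <= RH_MAX)

# Toy weather samples: (dry-bulb F, % RH) for a handful of hours.
weather = [(68, 45), (90, 30), (55, 60), (62, 50), (30, 40)]
usable = sum(hour_qualifies(db, rh) for db, rh in weather)
print(f"{usable} of {len(weather)} sample hours qualify for free cooling")
```

Running this check over a full year of weather data is how the narrow RH band shows up as a sharp reduction in usable free-cooling hours.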

Pages 23–26: Indiana University Data Center [image-only slides; no transcript text]

Questions?