Gary Audin | October 29, 2012

Cool Technology

Cooling is an ongoing cost, and one that can be controlled. Design and operate to produce the lowest cost possible.

There are many articles about the data center and how to make it more efficient. There are fewer articles about the network closets and their efficiency. Many of the efficiency recommendations for the data center are valid for the network closet.

A major technology cost is electrical power: power to run the technology and power to cool it. Cooling IT technology can account for as much as 45% of the IT power bill. The cooler you keep the equipment, the higher the power bill. Rising electrical rates make it even more important to cool efficiently.
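To make that 45% figure concrete, here is a minimal back-of-the-envelope sketch in Python. The closet load, the electricity rate, and the assumption that cooling makes up 45% of the combined power bill are illustrative only, not figures measured for any particular closet:

it_load_kw = 5.0        # assumed continuous IT load in the closet (kW)
rate_per_kwh = 0.12     # assumed electricity rate ($/kWh)
hours_per_year = 24 * 365
cooling_share = 0.45    # cooling assumed to account for 45% of the combined bill

it_cost = it_load_kw * hours_per_year * rate_per_kwh   # cost of powering the gear itself
total_bill = it_cost / (1 - cooling_share)             # gross up so cooling is 45% of the total
cooling_cost = total_bill * cooling_share

print(f"Powering the equipment: ${it_cost:,.0f}/year")
print(f"Cooling it:             ${cooling_cost:,.0f}/year")

With these assumed numbers, a 5 kW closet costs roughly $5,300 a year to power and another $4,300 a year to cool.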

Paying Attention to Cooling
Network closet space is usually small, so designers tend to pack equipment densely. This saves space but creates cooling problems. Put too much in a rack and the equipment runs hot--too hot for too long. Data centers can be redesigned for water cooling to handle increased equipment density and the heat that comes with it, but that is an unlikely and expensive solution for the network closet.

The addition of Power over Ethernet (PoE) adds considerable heat, because the PoE power supply in the LAN switch must deliver power to IP phones. Another factor is the chip design in IT equipment: the faster the chip and the denser the components on the board, the hotter the equipment runs. So newer, smaller equipment can produce more heat per square inch than older, less densely designed, slower equipment.

Prior to 2004, the recommended operating temperature for data centers was 72 degrees Fahrenheit. By 2008, equipment improvements had allowed this limit to rise to 81 degrees Fahrenheit, according to ASHRAE Technical Committee 9.9 (American Society of Heating, Refrigerating and Air-Conditioning Engineers). However, older equipment still requires a lower operating temperature.

Hot technology also means less reliable technology. According to the UpTime Institute, once the equipment operates at 88 degrees, reliability drops by 50%--another reason to manage cooling properly.

Recommendations for Cooling Efficiency
Each of these recommendations can be implemented separately. Some may already be in place; others will require some investment and probably facility changes. Look at the ROI for these investments. They should pay off once you factor in the long-term electrical rate increases you are likely to experience, as the sketch below illustrates.
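As a rough illustration of that payback logic, the sketch below assumes a one-time efficiency investment, first-year savings at today's rates, and a 5% yearly rate increase. All of these numbers are hypothetical; substitute your own quotes and utility history:

investment = 8000.0        # assumed one-time cost of an efficiency upgrade
annual_savings = 1500.0    # assumed first-year cooling savings at today's rates
rate_increase = 0.05       # assumed yearly electrical rate increase

cumulative = 0.0
payback_year = None
for year in range(1, 11):
    cumulative += annual_savings * (1 + rate_increase) ** (year - 1)
    if payback_year is None and cumulative >= investment:
        payback_year = year
    print(f"Year {year:2d}: cumulative savings ${cumulative:,.0f}")

print(f"Payback in year {payback_year}" if payback_year else "No payback within 10 years")

Under these assumptions the upgrade pays for itself in year five, and every rate increase after that shortens the payback on the next project.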

* Verify the highest operating temperature the equipment vendor recommends. Operate at that temperature but no higher; exceeding it may void your warranties and maintenance contracts. Running at a higher temperature that is still within the allowable range, however, can reduce cooling costs.

* Physically inspect the rack density. Equipment may have been installed at the installer's convenience without considering cooling requirements. This can produce hot spots that affect reliability and increase the cooling bill. Move equipment around to distribute heat production more evenly.

* If you are located where the outside temperature is often below what is needed for cooling, consider economizers. Using outside air reduces the cooling bill.

* If your cooling equipment is old, consider investing in more efficient systems that can monitor temperature better. The UpTime Institute has found that the average data center has 2.6 times more cooling capacity than needed. This applies to the closets as well.

* Get rid of old room humidifiers. Evaporative and atomizing humidifiers use far less energy.

* When building or rebuilding the closet, make the configuration as flexible as possible. Flexibility will allow low-cost changes in the future that maintain cooling efficiency.

* Consider calling in energy design consultants who can recommend changes for cooling efficiency. This is a one-time cost that can save money for years.

* Monitor the entire space. This will help balance the cooling and reduce overcooling to compensate for hotspots. The monitoring should be in real time so that alerts can be issued when there are problems.
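As a sketch of what such real-time monitoring could look like, the loop below polls closet temperature readings once a minute and raises an alert above a threshold. The read_closet_temperatures() function, the sensor names, and the 81-degree threshold are assumptions for illustration; a real deployment would pull readings from your sensor or building-management system and send alerts by email, SNMP trap, or ticket:

import time

ALERT_THRESHOLD_F = 81.0   # assumed upper limit; check your equipment vendors' specs
POLL_SECONDS = 60

def read_closet_temperatures():
    """Hypothetical stand-in for a sensor or DCIM API call.
    Returns sensor name -> temperature in degrees Fahrenheit."""
    return {"rack-1-top": 78.5, "rack-1-bottom": 74.2, "rack-2-top": 83.1}

def monitor():
    while True:
        for sensor, temp_f in read_closet_temperatures().items():
            if temp_f > ALERT_THRESHOLD_F:
                # Replace print with a real alerting mechanism in production.
                print(f"ALERT: {sensor} is at {temp_f:.1f} F, above {ALERT_THRESHOLD_F} F")
        time.sleep(POLL_SECONDS)

if __name__ == "__main__":
    monitor()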

Cooling is an ongoing cost, and one that can be controlled. Cooling prevents problems but does not in itself produce useful work. It is like a tax. You have to pay for it, so design and operate to produce the lowest tax bill possible.




