EcoCooling joins the Node Pole Alliance

EcoCooling, the leader in direct-air evaporative cooling, has joined the Node Pole Alliance, an active international network of more than 80 world-leading knowledge partners working together to build the data centres of the future.
The Node Pole region encompasses three municipalities in the far north of Sweden, just by the Arctic Circle, and has the potential to become a global hub for data traffic. This is mostly due to its reliable power infrastructure, its ample supply of low-cost renewable hydroelectric energy and low air temperatures that are ideal for natural cooling.
The Alliance members are companies from the technology and construction sectors who combine their knowledge and experience to build world-class data centres.
“We are very proud to have been able to join the Node Pole Alliance”, said Alan Beresford, MD at EcoCooling. “The direct-air evaporative cooling systems we have developed are ideal for the climate in the Node Pole region and make the most of the resources available.”
Air temperatures so close to the Arctic Circle are not only cool enough to make refrigeration in data centres redundant – they can even be too cold for the IT equipment. Some systems shut down if the temperature drops below 14 degrees Celsius. EcoCooling has designed patented control systems and attemperation processes to keep the cooling air within a tightly controlled temperature band – typically 18 to 21 degrees Celsius.
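The general principle of attemperation is to blend a proportion of warm exhaust air back into the cold incoming air until the supply temperature sits inside the target band. A minimal sketch of that mixing calculation in Python (an illustration of the general idea only, not EcoCooling's patented control logic):

    # Illustrative attemperation: blend warm exhaust air into cold outside air
    # to hold the supply temperature in the 18-21C band mentioned above.
    # This is the general mixing principle, not EcoCooling's patented algorithm.
    def recirculation_fraction(t_outside: float, t_exhaust: float, t_target: float = 19.5) -> float:
        """Fraction of exhaust air to blend so the mixed supply air hits t_target."""
        if t_outside >= t_target:
            return 0.0   # outside air is already warm enough on its own
        if t_exhaust <= t_target:
            return 1.0   # even full recirculation cannot overshoot the target
        return (t_target - t_outside) / (t_exhaust - t_outside)

    # Example: -20C Arctic air and 35C server exhaust -> blend about 72% exhaust air
    print(f"{recirculation_fraction(-20.0, 35.0):.0%}")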
By joining the Node Pole Alliance, EcoCooling will work alongside some of the industry's most innovative companies, including Hydro66, Vattenfall, Facebook, KnC Miner and ABB.

-ends-

About EcoCooling:
Established in 2002, EcoCooling is a UK-based manufacturer of direct-air evaporative coolers.
http://www.ecocooling.co.uk/

As 200th Installation Announced, Direct-Air Evaporative Cooling Becomes Mainstream

EcoCooling, the leader in direct-air evaporative cooling, today revealed it has completed its 200th data centre cooling installation using the energy-saving technology.


“Using CRECs (computer room evaporative coolers) instead of conventional CRAC units (computer room air conditioning units) can save over 90 per cent of the energy needed to cool a data centre,” said EcoCooling managing and technical director Alan Beresford. “We are very pleased to announce Serve The World as the 200th data centre to adopt this solution, at its 600kW Oslo facility in Norway.”


Data centre engineers are by nature very cautious, and it has taken a number of years for CREC cooling to be accepted as a safe and reliable alternative to expensive refrigeration-based CRAC cooling. Serve The World now joins a list of highly respected data centre operators able to run at a PUE (power usage effectiveness) of 1.2 or less regardless of the level of occupancy in the data centre.


Other data centre operators that have adopted the power- and cost-saving EcoCooling CREC technology include insurance company Unum, UK telecoms companies BT and TalkTalk, public sector organisations Humberside Police and Warwickshire County Council, colocation specialist Capgemini, Cambridge University and the RNLI (Royal National Lifeboat Institution).


Within the 200 installations there are data centres with power consumptions from 10kW to 1MW. For a 1MW installation, the EcoCooling CREC solution would require only around 40kW of power, compared with as much as 1,000kW for conventional CRAC cooling – saving the cost and infrastructure for 960kW of power.
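As a rough illustration of that arithmetic, here is a simplified sketch in Python (it treats cooling as the only overhead; real PUE figures also include UPS losses, lighting and other facility loads):

    # Simplified sketch of the savings and PUE arithmetic for a 1MW installation.
    # Assumes cooling is the only overhead on top of the IT load.
    it_load_kw = 1000           # 1MW of IT equipment
    crac_cooling_kw = 1000      # conventional refrigeration figure from the release
    crec_cooling_kw = 40        # EcoCooling CREC figure from the release

    saving_kw = crac_cooling_kw - crec_cooling_kw            # 960kW saved
    pue_crac = (it_load_kw + crac_cooling_kw) / it_load_kw   # 2.00
    pue_crec = (it_load_kw + crec_cooling_kw) / it_load_kw   # 1.04
    print(f"Saving {saving_kw}kW; PUE {pue_crac:.2f} -> {pue_crec:.2f}")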


Aberdeen University's data centre – cooled by EcoCooling CRECs – has been awarded Data Centre Project of the Year in the BCS & Computing UK IT Industry Awards, which cover the UK's entire IT industry. Aberdeen beat off competition from Tesco, Capital One and the NHS. A number of best practices, including the deployment of EcoCooling CRECs, have led to a PUE of less than 1.1.


EcoCooling’s direct-air cooled data centre projects are spread far and wide beyond the UK, with installations in New Zealand, Germany and Ireland as well as the latest, Serve The World in Norway.


Explaining how the CREC technology works, Beresford said, “In temperate climates there are up to 365 days every year when so-called ‘free cooling’ can be employed. On a fair proportion of these days it is enough simply to pass air from outside through the data centre servers and other active equipment at a suitable rate – no cooling of that external air is needed at all. On the remaining days, it is sufficient to use a very simple technique of water evaporation, which takes heat out of the incoming air and cools it enough to cool an entire data centre.”
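The physics of that evaporation step is commonly summarised using the wet-bulb depression: a direct evaporative cooler drives the supply air towards the wet-bulb temperature of the outside air. A minimal sketch (the 85 per cent effectiveness figure is an assumption for illustration, not an EcoCooling specification):

    # Direct evaporative cooling: supply air approaches the outside wet-bulb temperature.
    #   t_supply = t_dry - effectiveness * (t_dry - t_wet)
    def supply_temp_c(t_dry: float, t_wet: float, effectiveness: float = 0.85) -> float:
        """Estimate the supply air temperature of a direct evaporative cooler."""
        return t_dry - effectiveness * (t_dry - t_wet)

    # Example: a warm day with 30C dry-bulb and 20C wet-bulb outside air
    print(f"{supply_temp_c(30.0, 20.0):.1f}C")  # 21.5C supply air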


“Concerns among data centre engineers about the use of fresh air in data centres have not materialised. With over five years of operational experience, and research data now available from these 200 installations, the EcoCooling CREC design principles and process controls have proven to provide a resilient and efficient cooling system. I think the list of major players that have fully researched the topic and then implemented EcoCooling technology demonstrates that data centre engineers can now consider this power-saving technology to be fully ‘of age’,” Beresford concluded.


New Cooling Solution Brings Major Savings To Telecoms And Server Rooms

For several years now, data centres have been cutting the cost of cooling by 80 to 90 per cent thanks to direct-air evaporative cooling.


Market leader EcoCooling has now developed a smaller unit, ideal for telecoms rooms and small server rooms – historically some of the most expensive locations to cool, due to the highly inefficient and often unsuitable refrigeration units deployed in them.


Launching the new 15kW evaporative cooler, EcoCooling’s managing and technical director Alan Beresford explained, “Small office-type air conditioners have been used to cool areas such as telecoms rooms and small server rooms, but these are not really suited to cooling IT equipment and can be very inefficient. Refrigeration coolers naturally use a lot of energy and, in fact, office-type coolers simply aren’t designed to deal with the high levels of concentrated heat produced by modern servers, routers and switches.”


Often, removing 15kW of heat from a server room with refrigeration coolers requires a further 15kW of electricity to run them.


The new EcoCooling evaporative cooler requires a mere 400 watts to remove 15kW of heat. This can save over £10,000 per year in cooling costs.
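To see where a saving of that order comes from, here is a rough annual calculation (the electricity tariff is an assumed figure for illustration, not one quoted by EcoCooling; actual tariffs vary):

    # Rough annual saving from replacing 15kW of refrigeration load with a 400W cooler.
    refrigeration_kw = 15.0      # electricity to run refrigeration, per the release
    ecocooling_kw = 0.4          # the new unit's 400 watts
    hours_per_year = 24 * 365
    tariff_gbp_per_kwh = 0.08    # assumed electricity price in GBP

    saving = (refrigeration_kw - ecocooling_kw) * hours_per_year * tariff_gbp_per_kwh
    print(f"~£{saving:,.0f} per year")  # ~£10,232 at this tariff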


With one leading university already looking to deploy 60 of these units – and save over 500kW of power – the new product is set to be extremely popular.

Also, while the efficiency of refrigeration-based coolers gets far worse when they are partially loaded, the new EcoCooling units remain highly efficient at low loads: 5kW of cooling requires less than 50W of electricity.


Unlike conventional air conditioning units, the new 15kW cooler from EcoCooling requires no external condenser: it is a compact, self-contained unit measuring just 1.4m x 0.9m x 1.9m. The units are also designed for ease and speed of maintenance, all of which is carried out inside the building.


The very simple installation method also makes the 15kW EcoCooling unit significantly less expensive than conventional external units, bringing these massive energy savings to small server and telecoms rooms.

‘Rain Umbrella’ For Computer Servers Wins US Patent – And Sees Orders Triple Post-Superstorm Sandy

[Image: Turtle Shell server shield – an umbrella to keep water and debris off delicate equipment when leaks and storms occur above]

Hundreds of thousands of data centers and server rooms are in multi-tenanted buildings across the US and elsewhere. Within them, critical computer servers and telecommunications equipment are regularly damaged by water dripping – and sometimes cascading – through ceilings from leaking or burst pipes on the floors above.

And worse: in storm-prone areas it is all too frequent for roof damage to let water come flooding in (even when the roof is several floors up), causing widespread destruction of server equipment and disruption to businesses.

An invention called the Turtle Shell server shield – effectively a massive umbrella for data center servers and telecoms racks – is already protecting thousands of servers around the US and as far afield as Norway and Pakistan.

Completely innovative, the Turtle Shell server shield has just been granted US patent number 8,413,385 in recognition of its uniqueness.

Glenn Mahoney, president at Turtle Shell Industries, and his team have been developing the product for four years – with considerable success.

Said Mahoney, “We’ve been called to many disaster sites where storms and pipe bursts have sent water cascading through the ceiling and right through millions of dollars’ worth of server and telecoms equipment – not only interrupting vital business operations but in most cases damaging the equipment beyond repair. It’s a highly distressing sight.”

“In one such situation – a major cable operator’s network center in New York – thousands of customers were offline because of the water damage. While the center was being rebuilt, the operator asked us to fit Turtle Shells as one of several new disaster precautions. Less than two years later, torrential storms hit again and even the newly reinforced roof gave way, with water cascading through once more. This time, however, the unique Turtle Shell ‘umbrellas’ kept the water out of the electronics and the equipment kept on working – millions of dollars’ worth of equipment saved to carry on earning revenue.”

You can see remarkable video of this NYC data center being struck by both storms, in 2008 and 2010, in ‘Turtle Shell in action’ here: http://turt.co/dcme14.

Turtle Shells are made from a very strong polycarbonate and shaped like a sideways “(”, extending over the full length of each suite of racks.

They can be installed over, under and around all manner of cables, conduits, support rods and brackets. Once installed, Turtle Shells are totally watertight. They can also be fitted with flexible curtains, operated manually or automatically, to ensure that water doesn’t splash into the front and rear of racks.

“We’ve seen a 300 percent rise in sales since October,” said Mahoney. “As people on the East Coast recover from Superstorm Sandy, they are thinking seriously about how to build in extra disaster protection. And Turtle Shells are proving to be the ideal solution not just for data centers, but for telecoms and cable operators, hospitals, schools, universities and government sites too.”

For further information on Turtle Shells, and advice on how to protect your sensitive equipment from damage by falling water and debris, visit turtleshellproducts.com.

(Note: Turtle Shell Products is a client of Turtle Consulting Group, but the companies are not related. Great name, though!)


Data Centres Could Experience 30 Per Cent More Failures as Temperatures Increase

[Image: Alan Beresford, EcoCooling managing director]

Many data centre operators have been increasing the operating temperature in their data centres to reduce the massive costs of cooling. But, warns Alan Beresford, technical director and MD of EcoCooling, they run the risk of significantly more failures.

ASHRAE (the American Society of Heating, Refrigerating and Air-Conditioning Engineers) is generally considered to set the global standards for data centre cooling. A few years ago it relaxed its recommended operating range for data servers from 20-25C (Celsius) to 18-27C.

“For decades,” said Beresford, “data centres have operated at a temperature of 20-21C. With the relaxation in the ASHRAE 2011 recommendation, plus the pressure to cut costs, data centres have begun to significantly increase the ‘cold aisle’ temperature to 24-25C and in some cases right up to 27C.

“But many of them have not taken into account the study of server reliability detailed in the ASHRAE 2011 Thermal Guidelines for Data Processing Environments, which predicts that if the cold aisle temperature is increased from 20C to 25C, the level of failures increases by a very significant 24 per cent. Upping the temperature to 27.5C increases the failure rate by a massive 34 per cent.”

Warns Beresford: “And if the air temperature going into the front of the servers is 27C, it’s going to be very hot (34-37C) coming out of the rear. For blade servers it can be a blistering 42C at the rear!”

“It’s not just the servers that can fail,” states Beresford. “At the rear of the servers are electrical supply cables, power distribution units and communication cables. Most of these are simply not designed to work at such elevated temperatures and are liable to early mortality.”

Interestingly, again according to ASHRAE’s published figures, if the temperature is reduced to 17C, server reliability improves by 13 per cent compared with conventional 20C operation.
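Putting the figures quoted in this release side by side, as relative failure rates against the 20C baseline:

    # Relative server failure rates vs a 20C cold-aisle baseline,
    # using the ASHRAE-derived percentages quoted in this release.
    relative_failure_rate = {
        17.0: 0.87,   # 13 per cent fewer failures than at 20C
        20.0: 1.00,   # baseline
        25.0: 1.24,   # 24 per cent more failures
        27.5: 1.34,   # 34 per cent more failures
    }
    for temp_c, factor in sorted(relative_failure_rate.items()):
        print(f"{temp_c:4.1f}C cold aisle -> {factor:.2f}x failures")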

“To cool the air to 17C would be completely uneconomic with conventional refrigeration cooling,” said Beresford. “Our modelling shows it would require over 500kW of electricity for every megawatt of IT equipment.

“However, with our evaporative direct-air cooling CRECs (computer room evaporative coolers), this 17C operation would require less than 40kW – a saving of over 450kW compared with conventional refrigeration, and a reduction in PUE (power usage effectiveness) of 0.45.”
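Taking cooling as the only overhead, the arithmetic behind that figure is: PUE with refrigeration = (1,000kW IT + 500kW cooling) / 1,000kW = 1.5, versus PUE with CRECs = (1,000kW + 40kW) / 1,000kW = 1.04 – a reduction of roughly 0.45.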

Given the choice between cooling a data centre with refrigeration at 27C and evaporative cooling at 17C – with less than 10 per cent of the energy use, 40 per cent fewer temperature-related server failures and a more stable environment for other components – it is clear why over 150 UK data centre operators have adopted this approach.

Alan Beresford has prepared an informative webinar explaining how evaporative cooling works and why it uses so little energy compared to conventional refrigeration. To watch it visit http://turt.co/dcme12

Beresford adds a final point: “When engineers and technicians are working on servers, it’s usually at the rear, where all the connections are. If they are going to have to work regularly in temperatures of 34C to 42C, there might be health issues to consider too. Keeping their working environment under 30C is a far more acceptable solution.”

To find out more about EcoCooling CRECs visit www.ecocooling.org