EcoCooling joins the Node Pole Alliance

EcoCooling, a leader in direct-air evaporative cooling, has joined the Node Pole Alliance, an active international network of over 80 world-leading knowledge partners coming together to build the data centres of the future.
The Node Pole region encompasses three municipalities in the very north of Sweden, just by the Arctic Circle, and has the potential to become a global hub for data traffic. This is mostly due to its reliable power infrastructure, the ample supply of low-cost renewable hydroelectric energy and low air temperatures ideal for natural cooling.
The Alliance members are companies from the technology and construction sectors who combine their knowledge and experience to build world-class data centres.
“We are very proud to have been able to join the Node Pole Alliance”, said Alan Beresford, MD at EcoCooling. “The direct-air evaporative cooling systems we have developed are ideal for the climate in the Node Pole region and make the most of the resources available.”
Air temperatures so close to the Arctic Circle are not only cool enough to make refrigeration in data centres redundant – they can even be too cold for the IT equipment. Some systems shut down if the temperature drops below 14 degrees Celsius. EcoCooling has designed patented control systems and attemperation processes to keep the cooling air within a tightly controlled temperature band – typically 18 to 21 degrees Celsius.
By joining the Node Pole Alliance, EcoCooling will work alongside some of the industry’s most innovative companies, including Hydro66, Vattenfall, Facebook, KnC Miner and ABB.

-ends-

About EcoCooling:
Established in 2002, EcoCooling is a UK-based manufacturer of direct-air evaporative coolers.
http://www.ecocooling.co.uk/

As 200th Installation Announced, Direct-Air Evaporative Cooling Becomes Mainstream

EcoCooling, the leaders in direct-air evaporative cooling, today revealed that they have completed their 200th data centre cooling installation using the energy-saving technology.

 

“Using CRECs (computer room evaporative coolers) instead of conventional CRAC units (computer room air conditioning units) can save over 90 per cent of the energy needed to cool a data centre,” said EcoCooling managing and technical director Alan Beresford. “We are very pleased to announce Serve The World as the 200th data centre to adopt this solution at its 600kW Oslo facility in Norway.”

 

Data centre engineers are by nature very cautious, and it has taken a number of years for CREC cooling to be accepted as a safe and reliable alternative to expensive refrigeration-based CRAC cooling. Serve The World now joins a list of highly respected data centre operators able to operate with PUEs (power usage effectiveness) of 1.2 or less regardless of the level of occupancy in the data centre.

 

Other data centres which have grasped the power and cost savings of EcoCooling CREC technology include insurance company Unum, UK telecoms companies BT and TalkTalk, public sector organisations Humberside Police and Warwickshire County Council, colocation specialist Capgemini, Cambridge University and the RNLI (Royal National Lifeboat Institution).

 

Within the 200 installations there are data centres with power consumptions from 10kW to 1MW. For a 1MW installation the EcoCooling CREC solution would require only around 40kW of power compared to as much as 1000kW with conventional CRAC cooling. This saves the cost and infrastructure for 960 kW of power.
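
To put those figures in context, here is a minimal sketch in Python, using only the numbers quoted above, of the power saving and the cooling-only PUE for a 1MW installation. It is illustrative rather than EcoCooling’s published model, and it assumes cooling is the only non-IT overhead.

    def pue(it_kw: float, cooling_kw: float) -> float:
        """PUE counting only the cooling overhead (other overheads ignored)."""
        return (it_kw + cooling_kw) / it_kw

    IT_LOAD_KW = 1000        # 1 MW of IT equipment
    CRAC_COOLING_KW = 1000   # conventional refrigeration figure quoted above
    CREC_COOLING_KW = 40     # EcoCooling CREC figure quoted above

    print(f"Power saved: {CRAC_COOLING_KW - CREC_COOLING_KW} kW")            # 960 kW
    print(f"PUE with CRAC cooling: {pue(IT_LOAD_KW, CRAC_COOLING_KW):.2f}")  # 2.00
    print(f"PUE with CREC cooling: {pue(IT_LOAD_KW, CREC_COOLING_KW):.2f}")  # 1.04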

 

Aberdeen University’s data centre, cooled by EcoCooling CRECs, has been awarded Data Centre Project of the Year in the BCS & Computing UK IT Industry Awards, which cover the UK’s entire IT industry. Aberdeen beat off competition from Tesco, Capital One and the NHS. A number of best practices, including the deployment of EcoCooling CRECs, have led to a PUE of less than 1.1.

 

EcoCooling’s direct-air cooled data centre projects are spread far and wide beyond the UK, with installations in New Zealand, Germany and Ireland, as well as the latest, Norway-based Serve the World.

 

Explaining how the CREC technology works, Beresford said, “In temperate climates there are up to 365 days every year when so-called ‘free cooling’ can be employed. On a fair proportion of these days it is simply enough to pass air from outside through the data centre servers and other active equipment at a suitable rate, and no cooling of that external air is needed at all. On the remaining days, it is sufficient to use a very simple technique of water evaporation which takes heat out of the incoming air and cools it sufficiently to cool an entire data centre.”
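
The cooling effect Beresford describes can be approximated with the standard saturation-effectiveness model for direct evaporative coolers. The short sketch below is a simplified illustration rather than EcoCooling’s patented control process, and the 0.9 effectiveness value is an assumed figure typical of evaporative pads.

    def evaporative_supply_temp(dry_bulb_c: float, wet_bulb_c: float,
                                effectiveness: float = 0.9) -> float:
        """Estimate the supply air temperature from a direct evaporative cooler."""
        return dry_bulb_c - effectiveness * (dry_bulb_c - wet_bulb_c)

    # A warm, fairly dry summer day: 30C dry bulb, 20C wet bulb.
    print(f"{evaporative_supply_temp(30.0, 20.0):.1f} C supply air")  # about 21.0 C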

 

“Concerns of data centre engineers about the use of fresh air in data centres have not materialised.  With over five years operational experience and research data now available from these 200 installations the EcoCooling CREC design principles and process controls have proven to provide a resilient and efficient cooling system. I think the list of major players that have fully researched the topic and have then implemented EcoCooling technology demonstrates that data centre engineers can now consider this power saving technology as being fully ‘of age’,” Beresford concluded.

 

New Cooling Solution Brings Major Savings To Telecoms And Server Rooms

For several years now, data centres have been cutting the cost of cooling by 80 to 90 per cent thanks to direct-air evaporative cooling.

 

Market leader EcoCooling has now developed a smaller unit ideal for telecoms rooms and small server rooms, which have historically been some of the most expensive locations to cool due to the highly inefficient and often unsuitable refrigeration cooling units deployed in them.

 

Launching the new 15kW evaporative cooler, EcoCooling’s managing and technical director Alan Beresford explained, “Small office-type air conditioners have been used to cool areas such as telecoms rooms and small server rooms, but these are not really suited to cooling IT equipment and can be very inefficient. Refrigeration coolers naturally use a lot of energy, and in fact office-type coolers simply aren’t designed to deal with the high levels of concentrated heat produced by modern servers, routers and switches.”

 

Often, to remove 15kW of heat from a server room the energy requirement to run the refrigeration coolers would amount to a further 15kW of electricity.

 

The new EcoCooling evaporative cooler requires a mere 400 watts to remove 15kW of heat. This can save over £10,000 per year in cooling costs.
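
That annual figure can be sanity-checked with a quick calculation. The sketch below uses the power figures quoted above; the £0.08 per kWh electricity tariff is an assumption made for illustration, not an EcoCooling number.

    HOURS_PER_YEAR = 24 * 365
    TARIFF_GBP_PER_KWH = 0.08   # assumed tariff, not an EcoCooling figure

    refrigeration_kw = 15.0     # power to remove 15kW of heat with refrigeration
    evaporative_kw = 0.4        # power drawn by the new EcoCooling unit

    saving_kwh = (refrigeration_kw - evaporative_kw) * HOURS_PER_YEAR
    print(f"Energy saved: {saving_kwh:,.0f} kWh per year")                  # ~127,900 kWh
    print(f"Cost saved: £{saving_kwh * TARIFF_GBP_PER_KWH:,.0f} per year")  # just over £10,000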

 

With one leading university already looking to deploy 60 of these units and save over 500kW of power, the new product is set to be extremely popular!

Also, while the efficiency of refrigeration-based coolers gets far worse when they are partially loaded, the new EcoCooling units remain highly efficient at low loads: 5kW of cooling will require less than 50W of electricity.

 

Unlike conventional air conditioning units, the new 15kW cooler from EcoCooling requires no external condenser and is a self-contained, compact unit at just 1.4m x 0.9m x 1.9m. The units are also designed for ease and speed of maintenance, and all maintenance is carried out inside the building.

 

The very simple installation method means the 15kW EcoCooling unit is significantly less expensive than conventional external units, making the massive energy savings available to small server and telecoms rooms.

Data Centres Could Experience 30 Per Cent More Failures as Temperatures Increase

Many data centre operators have been increasing the operating temperature in their data centres to reduce the massive costs of cooling. But, warns Alan Beresford, managing and technical director at EcoCooling, they run the risk of significantly more failures.

ASHRAE (the American Society of Heating, Refrigerating and Air-Conditioning Engineers) is generally considered to set the standards globally for data centre cooling. A few years ago it relaxed its recommended operating range for data servers from 20-25C (Celsius) to 18-27C.

“For decades,” said Beresford, “data centres have operated at 20-21C. With the relaxation in the ASHRAE 2011 recommendation plus the pressure to cut costs, data centres have begun to significantly increase the ‘cold aisle’ temperature to 24-25C and in some cases right up to 27C.

“But many of them have not taken into account the study of server reliabilities detailed in the ASHRAE 2011 Thermal Guidelines for Data Processing Environments, which predicts that if the cold aisle temperature is increased from 20C to 25C, the level of failures increases by a very significant 24 per cent. Upping the temperature to 27.5C increases the failure rate by a massive 34 per cent.”

Warns Beresford: “And if the air temperature going into the front of the servers is 27C, it’s going to be very hot (34-37C) coming out of the rear. For blade servers it can be a blistering 42C at the rear!”

“It’s not just the servers that can fail,” states Beresford. “At the rear of the servers are electrical supply cables, power distribution units and communication cables. Most of these are simply not designed to work at such elevated temperatures and are liable to early mortality.”

Interestingly, again according to ASHRAE’s published figures, if the temperature is reduced to 17C, server reliability is improved by 13 per cent compared with conventional 20C operation.
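
Taken together, the figures quoted above give a small set of relative failure-rate data points, with 20C as the 1.00 baseline. The sketch below simply tabulates them; the linear interpolation between the quoted points is an assumption added for illustration and is not part of the ASHRAE guideline.

    # Relative failure-rate factors quoted in this article (20C = 1.00 baseline).
    QUOTED_FAILURE_FACTORS = {17.0: 0.87, 20.0: 1.00, 25.0: 1.24, 27.5: 1.34}

    def relative_failure_rate(cold_aisle_c: float) -> float:
        """Interpolate a relative failure rate between the quoted data points."""
        temps = sorted(QUOTED_FAILURE_FACTORS)
        if cold_aisle_c <= temps[0]:
            return QUOTED_FAILURE_FACTORS[temps[0]]
        if cold_aisle_c >= temps[-1]:
            return QUOTED_FAILURE_FACTORS[temps[-1]]
        for lo, hi in zip(temps, temps[1:]):
            if lo <= cold_aisle_c <= hi:
                frac = (cold_aisle_c - lo) / (hi - lo)
                return QUOTED_FAILURE_FACTORS[lo] + frac * (
                    QUOTED_FAILURE_FACTORS[hi] - QUOTED_FAILURE_FACTORS[lo])

    print(relative_failure_rate(25.0))   # 1.24: 24 per cent more failures than at 20C
    print(relative_failure_rate(17.0))   # 0.87: 13 per cent fewer failures than at 20C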

“To cool the air to 17C would be completely uneconomic with conventional refrigeration cooling,” said Beresford. “Our modelling shows it would require over 500 kilowatts of electricity for every megawatt of IT equipment.

“However, with our evaporative direct air cooling CRECs (Computer Room Evaporative Coolers), this 17C operation would require less than 40 kilowatts – a saving of over 450kW compared to conventional refrigeration and a reduction in PUE (Power Usage Effectiveness) of 0.45.”

Given the choice between refrigeration cooling at 27C and evaporative cooling at 17C, with less than 10 per cent of the energy use, 40 per cent fewer temperature-related server failures and a more stable environment for other components, it is clear why over 150 UK data centre operators have adopted this approach.

Alan Beresford has prepared an informative webinar explaining how evaporative cooling works and why it uses so little energy compared to conventional refrigeration. To watch it visit http://turt.co/dcme12

Beresford adds a final point: “When engineers and technicians are working on servers, it’s usually at the rear where all the connections are. If they are going to have to work regularly in temperatures of 34C to 42C, there might be health issues to consider too. Keeping their working environment under 30C is a far more acceptable solution.”

To find out more about EcoCooling CRECs visit www.ecocooling.org

 

Safehosts London City Data Centre Opens With A 1.06 PUE

What’s very light, hardly eats, sits outside and loves the heat? Phil Turtle went to Safehosts’ new 5MW co-location data centre in the City of London to find out.

To be honest, this was my third visit to this highly impressive new facility disguised as a 1970s office block in the vicinity of London’s Borough tube station. The first – a midnight tour already infamous in the data centre industry – cannot be described in these pages, save to say that a second visit during daylight hours was necessary to unscramble recollections of bright LED lighting, temperatures that were approaching arctic and Safehosts’ technical wizard Lee Gibbins telling the group that they’d got 100kW of IT load running and the cooling system was only drawing 400 watts. It was so cold we envisaged polar bears moving in sometime soon.

This third visit, however, was to meet Safehosts’ CTO Lee Gibbins and Alan Beresford, MD of EcoCooling, whose equipment Safehosts had installed – not only because it hardly ‘eats’ any energy, but for a host of other practical and business reasons. It ‘sits outside’, koala-style, half-way up the wall as it turns out, and doesn’t mind very hot days. Beresford said, “The cost of energy and the capital cost of cooling equipment are by far the biggest non-productive costs of any data centre. For a new co-location site in central London, space is also at a premium, and any white space which has to be given over to CRAC (computer room air conditioning) units or even to in-row coolers uses up many racks’ worth of space which could otherwise be used for operational racks to generate revenue.”

Safehosts’ Gibbins explained that their building used to be a five-storey commercial office block. “I initially selected it because I foresaw the opportunity to use the existing windows as air ducts without extensive building works, and hence to have a very fast project turnaround with low development costs. It also meant that we could deliver the project with very little impact on the visual appearance of the building and, most unusually, with no discernible noise.”

However, there were limitations with the building that meant normal cooling would not have been possible. The top storey, being a lightweight upward extension, meant that the roof was essentially non-load-bearing and unsuitable for heavy refrigeration plant. The small yard at the rear was large enough only for the initial two Himoinsa 500kVA generator sets, the fuel store, the substation for one of its dual 5MW main feeds and parking for four vehicles. So an innovative approach to cooling equipment space requirements was very high on Gibbins’ agenda.

Having used EcoCooling’s fresh air CREC (Computer Room Evaporative Cooling) system for over three years at Safehosts’ Cheltenham data centre, Gibbins had no qualms over using the same technology to achieve the PUE target of 1.06 that the business had set itself.

“And that’s 1.06 PUE from day one with an initially sparsely populated co-location facility, not a hopeful full-capacity prediction,” Gibbins said.

“Few data centres spend much of their life at, or even near, full capacity,” explained Beresford. “If we take one floor at Safehosts as an example, at around 1MW capacity, then using the conventional approach you would need to install three big, heavy 500kW coolers to provide n+1 redundancy – possibly deploying two initially.

“But whilst monsters such as these are maybe 3:1 coefficient of performance (CoP) at full load – i.e. 100kW of electricity for 300kW of cooling output – at part load it quickly falls to at worst 1:1. So for 150kW of cooling it will still be consuming 150kW! This is why some partly populated data centres routinely have a PUE of 2.5 or worse.”

Direct fresh air evaporative cooling, on the other hand, requires very little power, and the EcoCooling units come in much smaller ‘chunks’ – 30kW, 60kW or 90kW. So, as Gibbins explained, “we could have started by installing just two or three units initially, though in fact, as the CapEx is so much lower, we decided to start with six.”

Compared to the 50-100kW consumption that conventional cooling requires for up to 100kW of IT load, this solution draws a maximum of 4kW per 100kW. “That’s not only a massive energy saving,” explained Gibbins, “it also means I’ve got an extra 1.3MW of my 7MW incoming supply available for revenue-generating IT load.

“I imagine everyone knows just how easy it is to max-out the utility power feeds these days – long before a data centre is full. So having an extra 1.3 MW for production kit is a major bonus.”
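
The part-load arithmetic Beresford and Gibbins describe can be pulled together in a short sketch using the figures quoted above: a worst-case 1:1 CoP for refrigeration at 150kW of cooling, versus roughly 4kW per 100kW for the CRECs. It counts only the cooling overhead, which is why the refrigeration result comes out at 2.0 rather than the 2.5-plus seen once other losses are included; it is an illustration, not Safehosts’ measured data.

    def cooling_only_pue(it_kw: float, cooling_kw: float) -> float:
        """PUE counting only the cooling overhead (other losses ignored)."""
        return (it_kw + cooling_kw) / it_kw

    it_load_kw = 150.0                    # a lightly populated 1MW floor
    crac_cooling_kw = 150.0               # worst-case 1:1 CoP at part load, as quoted
    crec_cooling_kw = 0.04 * it_load_kw   # roughly 4kW per 100kW of IT load, as quoted

    print(f"Refrigeration at part load: PUE ~ {cooling_only_pue(it_load_kw, crac_cooling_kw):.2f}")  # 2.00
    print(f"CREC cooling at part load:  PUE ~ {cooling_only_pue(it_load_kw, crec_cooling_kw):.2f}")  # 1.04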

Returning to the lack of space for cooling equipment at the Safehosts City of London site, the question had to be asked: How did Beresford’s team at EcoCooling solve the space problem? Sky hooks? Not far off as it transpired. “We like to throw away the ‘it’s always done like this’ approach – which frankly is all too prevalent in data centre design,” said Beresford, explaining that by applying a little lateral thinking, the matrix of window openings on the rear wall of this old office block was ideal to enable exterior wall-mounting of the small and lightweight EcoCoolers. “Each one only weighs around 90kg, well within the wall’s load bearing strength.”
This writer had noted, with some confusion, that the air-flow routing within the data hall was far from conventional. In the cold aisle, cold air fell down through filter panels in the ceiling rather than coming up through floor tiles. “Hot air rises, cold air falls,” explained Beresford with a wry smile. “Conventional setups push the cool air under floors and upwards through floor grills, working against natural convection. We work with convection – since it’s free – and not against it.”

That answered one question, but why were there servo-operated louvered flaps between the hot aisles and the cold air inlet from the external cooling units? Strangely, it turns out that whilst conventional data centre cooling goes to great lengths to keep the expensive cold air from being contaminated by hot air leakage, in the evaporative cooling scenario the incoming air is frequently so cold that hot air needs to be re-circulated and mixed into it in order to keep the servers warm enough. “Many servers, if they get down to 10°C, will actually close themselves down,” explained Beresford, “and we don’t want outages because the servers are seeing air which is too cold!”

Of course three of the big questions around direct air evaporative cooling are atmospheric contamination, relative humidity and ‘very hot’ days.

On the contamination front, the coolers are wrapped in a filter ‘blanket’ giving an EU4/G4-grade standard as a first line of defence.

Further filters to G4 standard are fitted instead of ceiling tiles to the cold aisles in the Safehosts installation – but these, it turns out, are a defence against particulates from both the re-circulated hot air and the incoming cold air. This gives the same filtration standard as a conventional CRAC installation.

“And using 600mm square ceiling filters saved me the cost of ceiling tiles,” quipped Safehosts’ Gibbins.

“One other misconception that needs to be explained,” said Beresford, “is that direct evaporative cooling cannot meet the relative humidity (RH) standards required in a data centre. The unique patented EcoCooling control system manages both temperature and RH. The temperature is stable and the RH never goes over the allowable limits – so contrary to rumour, the incoming air is not over-laden with moisture.”
And ‘very hot’ days? Well of course, explained Beresford, in a temperate climate such as the UK, there aren’t actually that many – which is not so good for us humans – but great for data centres.

He went on to paint a very interesting picture. “Very hot days are actually quite short-term events. We can always be sure, in the UK for example, that come night-time the temperature of the external air will fall below 20°C. So there is only a limited time when it is technically ‘very hot’.”

Refrigeration units become much less efficient as the external ambient temperature rises. Because the condenser units are out in the sun they get extremely hot – far hotter than ambient temperature. They also suffer from their hot air exhaust being sucked back into the inlet, raising internal temperatures even higher and causing failures.

As readers will know, on very hot days conventional cooling often can’t cope and it’s quite common to see the data centre doors wide open and massive portable fans deployed to move lots more external air through the datacentre to try to keep things cool. And, to be honest, just getting more air through like that usually works.

Evaporative direct air cooling actually has two very significant advantages over refrigeration cooling on ‘very hot’ days, Beresford claims. Firstly, airflow is not restricted, because EcoCoolers have masses of airflow available. So as the ambient temperature increases, the system controller ramps up the fans, delivering far more cool air to the server rows than CRACs or DX (direct exchange) systems are able to, without having to open the data centre doors to the outside air.

“What’s more, the higher the outside temperature, the better the cooling: in the UK, the hotter the day, the lower the humidity, so the level of cooling actually increases. On a hot day in the UK an EcoCooler can reduce the temperature by 12 degrees or more – the air coming off the cooler will never be above 22C, whatever the outside temperature.”

Although direct air evaporative cooling seems to have many advantages, Beresford is a realistic engineer. “Direct air evaporative cooling isn’t for everyone or everywhere. But in many countries and many operations it offers massive energy savings and significant data hall space saving – allowing better, revenue-earning, use of scarce power and sellable space – as Safehosts have demonstrated here.”
From Safehosts’ perspective Gibbins concludes, “Using evaporative direct air cooling with its zero rack space internal footprint, lightweight wall-mounted coolers and 0.04 effect on PUE, has allowed us to turn an unlikely building into a state of the art colocation centre right in the City of London and enables us to start with a 1.06 PUE from day one. I’m very happy with that.”