What’s very light, hardly eats, sits outside and loves the heat? Phil Turtle went to Safehosts’ new 5MW co-location data centre in the City of London to find out.
To be honest, this was my third visit to this highly impressive new facility, disguised as a 1970s office block in the vicinity of London’s Borough tube station. The first – a midnight tour already infamous in the data centre industry – cannot be described in these pages, save to say that a second visit during daylight hours was necessary to unscramble recollections of bright LED lighting, near-arctic temperatures, and Safehosts’ technical wizard Lee Gibbins telling the group that they had 100kW of IT load running while the cooling system was drawing only 400 watts. It was so cold we envisaged polar bears moving in sometime soon.
This third visit, however, was to meet Safehosts’ CTO Lee Gibbins and Alan Beresford, MD of EcoCooling, whose equipment Safehosts had installed – not only because it hardly ‘eats’ any energy, but for a host of other practical and business reasons: it ‘sits outside’, koala-style, half-way up the wall as it turns out, and doesn’t mind very hot days. Alan Beresford said, “The cost of energy and the capital cost of cooling equipment are by far the biggest non-productive costs of any data centre. For a new co-location site in central London, space is also at a premium, and any white space which has to be given over to CRAC (computer room air conditioning) units, or even to in-row coolers, uses up many racks’ worth of space which could otherwise be used for operational racks to generate revenue.”
Safehosts’ Gibbins explained that their building used to be a five-storey commercial office block. “I initially selected it because I foresaw the opportunity to use the existing windows as air ducts without extensive building works, and hence to have a very fast project turnaround with low development costs. It also meant that we could deliver the project with very little impact on the visual appearance of the building and – most unusually – no discernible noise.”
However, there were limitations with the building that meant conventional cooling would not have been possible. The top storey, being a lightweight upward extension, meant that the roof was essentially non-load-bearing and unsuitable for heavy refrigeration plant. The small yard at the rear was large enough only for the initial two Himoinsa 500kVA generator sets, the fuel store, the substation for one of its dual 5MW main feeds and parking for four vehicles. So an innovative approach to cooling equipment space requirements was very high on Gibbins’ agenda.
Having used EcoCooling’s fresh air CREC (Computer Room Evaporative Cooling) system for over three years at Safehosts’ Cheltenham data centre, Gibbins had no qualms about using the same technology to achieve the PUE target of 1.06 that the business had set itself.
“And that’s 1.06 PUE from day one with an initially sparsely populated co-location facility, not a hopeful full-capacity prediction,” Gibbins said.
“Few data centres spend much of their life at, or even near, full capacity,” explained Beresford. “If we take one floor at Safehosts as an example, at around 1MW capacity, the conventional approach would mean installing three big, heavy 500kW coolers to provide N+1 redundancy – possibly deploying two initially.
“But whilst monsters such as these may achieve a 3:1 coefficient of performance (CoP) at full load – i.e. 100kW of electricity for 300kW of cooling output – at part load it quickly falls to, at worst, 1:1. So for 150kW of cooling it will still be consuming 150kW! This is why some partly populated data centres routinely have a PUE of 2.5 or worse.”
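To spell out that arithmetic, here is a minimal Python sketch of the part-load sums Beresford describes; the 0.5× allowance for UPS, lighting and distribution losses is our illustrative assumption, used only to reproduce his ‘2.5 or worse’ figure:

```python
# Worked example of Beresford's part-load arithmetic.
# PUE = total facility power / IT power.

def cooling_power_kw(cooling_output_kw: float, cop: float) -> float:
    """Electricity drawn to deliver a given cooling output at a given CoP."""
    return cooling_output_kw / cop

full_load = cooling_power_kw(300.0, cop=3.0)   # 100kW in for 300kW out
part_load = cooling_power_kw(150.0, cop=1.0)   # 150kW in for 150kW out

# With 150kW of IT load, 150kW of cooling electricity and an assumed
# 0.5x overhead for UPS, lighting and distribution losses:
it_load_kw = 150.0
pue = (it_load_kw + part_load + 0.5 * it_load_kw) / it_load_kw
print(f"Part-load PUE: {pue:.1f}")             # -> 2.5
```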
Direct fresh air evaporative cooling, on the other hand, requires very little power, and the EcoCooling units come in much smaller ‘chunks’ – 30kW, 60kW or 90kW. So, as Gibbins explained, “we could have started by installing just two or three units initially, though in fact, as the CapEx is so much lower, we decided to start with six.”
Compared to the 50-100kW that conventional cooling consumes for up to 100kW of IT load, this solution draws a maximum of 4kW per 100kW. “That’s not only a massive energy saving,” explained Gibbins, “it also means I’ve got an extra 1.3MW of my 7MW incoming supply available for revenue-generating IT load.
“I imagine everyone knows just how easy it is to max out the utility power feeds these days – long before a data centre is full. So having an extra 1.3MW for production kit is a major bonus.”
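As a back-of-the-envelope check on that figure, the sketch below assumes the whole 7MW feed is divided between IT load and cooling overhead at the quoted rates: roughly 30kW per 100kW for conventional plant at its best-case 3:1 CoP, against the 4kW per 100kW Gibbins cites. The 30 per cent figure is our assumption; at part load it would be worse.

```python
# How much of a 7MW incoming supply is left for IT load once cooling
# overhead (expressed as a fraction of IT load) is accounted for.

SUPPLY_MW = 7.0

def max_it_load_mw(supply_mw: float, cooling_fraction: float) -> float:
    """Largest IT load where IT plus cooling exactly fills the supply."""
    return supply_mw / (1.0 + cooling_fraction)

conventional = max_it_load_mw(SUPPLY_MW, 0.30)  # best-case CoP 3:1
evaporative = max_it_load_mw(SUPPLY_MW, 0.04)   # 4kW per 100kW quoted

print(f"Conventional: {conventional:.2f}MW")               # ~5.38MW
print(f"Evaporative:  {evaporative:.2f}MW")                # ~6.73MW
print(f"Freed up:     {evaporative - conventional:.2f}MW")  # ~1.35MW
```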
Returning to the lack of space for cooling equipment at the Safehosts City of London site, the question had to be asked: how did Beresford’s team at EcoCooling solve the space problem? Sky hooks? Not far off, as it transpired. “We like to throw away the ‘it’s always done like this’ approach – which frankly is all too prevalent in data centre design,” said Beresford, explaining that, with a little lateral thinking, the matrix of window openings on the rear wall of this old office block proved ideal for exterior wall-mounting of the small, lightweight EcoCoolers. “Each one only weighs around 90kg, well within the wall’s load-bearing strength.”
This writer had noted, with some confusion, that the air-flow routing within the data hall was far from conventional. In the cold aisle, cold air fell down through filter panels in the ceiling rather than coming up through floor tiles. “Hot air rises, cold air falls,” explained Beresford with a wry smile. “Conventional setups push the cool air under floors and upwards through floor grilles, working against natural convection. We work with convection – since it’s free – and not against it.”

That answered one question, but why were there servo-operated louvred flaps between the hot aisles and the cold air inlet from the external cooling units? Strangely, it turns out that, whilst conventional data centre cooling goes to great lengths to keep the expensive cold air from being contaminated by hot air leakage, in the evaporative cooling scenario the incoming air is frequently so cold that hot air needs to be re-circulated and mixed into it in order to keep the servers warm enough! “Many servers, if they get down to 10°C, will actually shut themselves down,” explained Beresford, “and we don’t want outages because the servers are seeing air which is too cold!”
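To make the idea concrete, here is a minimal sketch of the kind of mixing logic such louvres enable; the 18°C target and the simple energy-balance control law are illustrative assumptions, not a description of EcoCooling’s actual patented controller:

```python
# Illustrative mixing-damper logic: blend re-circulated hot-aisle air
# into the cold supply so servers never see dangerously cold air. The
# setpoint and control law are assumptions for illustration only.

def recirculation_fraction(t_supply: float, t_hot: float,
                           t_target: float = 18.0) -> float:
    """Fraction of hot-aisle air to mix in, from a simple energy balance:
    t_target = f * t_hot + (1 - f) * t_supply."""
    if t_hot <= t_supply:
        return 0.0
    f = (t_target - t_supply) / (t_hot - t_supply)
    return min(max(f, 0.0), 1.0)  # clamp to the physically possible range

# A cold winter morning: 5C air off the coolers, 35C in the hot aisle.
f = recirculation_fraction(t_supply=5.0, t_hot=35.0)
print(f"Open the louvres to ~{f:.0%} recirculation")  # ~43%
```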
Of course, three of the big questions around direct air evaporative cooling concern atmospheric contamination, relative humidity and ‘very hot’ days.
On the contamination front, the coolers are wrapped in a filter ‘blanket’ giving an EU4/G4 grade standard as a first line of defence.
Further filters to G4 standard are fitted in place of ceiling tiles above the cold aisles in the Safehosts installation – but these, it turns out, are a defence against particulates from both the re-circulated hot air and the incoming cold air. This gives the same filtration standard as a conventional CRAC installation.
“And using 600mm square ceiling filters saved me the cost of ceiling tiles,” quipped Safehosts’ Gibbins.
“One other misconception that needs to be dispelled,” said Beresford, “is that direct evaporative cooling cannot meet the relative humidity (RH) standards required in a data centre. The unique patented EcoCooling control system manages both temperature and RH. The temperature is stable and the RH never exceeds the allowable limits – so, contrary to rumour, the incoming air is not over-laden with moisture.”
And ‘very hot’ days? Well, of course, explained Beresford, in a temperate climate such as the UK’s there aren’t actually that many – which is not so good for us humans, but great for data centres.
He went on to paint a very interesting picture. “Very hot days are actually quite short-term events. We can always be sure, in the UK for example, that come night-time the temperature of the external air will fall below 20°C. So there is only a limited time when it is technically ‘very hot’.”
Refrigeration units become much less efficient as the external ambient temperature rises. Because the condenser units are out in the sun they get extremely hot – far hotter than ambient. They also suffer from their hot exhaust air being sucked back into the inlet, raising internal temperatures even higher and causing failures.
As readers will know, on very hot days conventional cooling often can’t cope, and it’s quite common to see the data centre doors wide open and massive portable fans deployed to move far more external air through the data centre to try to keep things cool. And, to be honest, just getting more air through like that usually works.
Evaporative direct air cooling actually has two very significant advantages over refrigeration-based cooling on ‘very hot’ days, Beresford claims. Firstly, airflow is not restricted, because EcoCoolers have masses of airflow available. So as the ambient temperature increases, the system controller ramps up the fans, delivering far more cool air to the server rows than CRACs or DX (direct expansion) systems are able to – without having to open the data centre doors to the outside air.
“What’s more, the higher the temperature, the better the cooling, because in the UK the hotter the day, the lower the humidity, so the level of cooling actually increases. So on a hot day in the UK an EcoCooler can reduce the temperature by 12 degrees or more – the air coming off the cooler will never be above 22°C, whatever the outside temperature.”
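The physics behind that claim is the wet-bulb limit: a direct evaporative cooler can approach, but never beat, the incoming air’s wet-bulb temperature, which on a dry day sits far below the dry-bulb reading. A minimal sketch, assuming a typical 90 per cent saturation effectiveness (our figure, not EcoCooling’s):

```python
# Evaporative cooling is bounded by the wet-bulb temperature: the drier
# the air, the further the wet-bulb sits below the dry-bulb, which is
# why hot, dry afternoons cool so well. The 0.9 effectiveness is a
# typical value for direct evaporative media, assumed for illustration.

def supply_temperature(t_dry: float, t_wet: float,
                       effectiveness: float = 0.9) -> float:
    """Supply air temperature off a direct evaporative cooler."""
    return t_dry - effectiveness * (t_dry - t_wet)

# A hot, dry UK day: 32C dry-bulb with a 19C wet-bulb.
print(f"{supply_temperature(32.0, 19.0):.1f}C off the cooler")  # ~20.3C
```

On a humid day the wet-bulb creeps up towards the dry-bulb and the achievable drop shrinks, which is why Beresford ties the claim to the UK pattern of hot days being dry days.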
Although direct air evaporative cooling seems to have many advantages, Beresford is a realistic engineer. “Direct air evaporative cooling isn’t for everyone or everywhere. But in many countries and many operations it offers massive energy savings and significant data hall space savings – allowing better, revenue-earning use of scarce power and sellable space – as Safehosts have demonstrated here.”
From Safehosts’ perspective, Gibbins concludes, “Using evaporative direct air cooling, with its zero rack-space internal footprint, lightweight wall-mounted coolers and 0.04 contribution to PUE, has allowed us to turn an unlikely building into a state-of-the-art co-location centre right in the City of London, and enables us to start with a 1.06 PUE from day one. I’m very happy with that.”