At our recent CPD-accredited evaporative cooling webinar, I received some interesting questions in the Q&A session at the end. I would like to share my answers, which have been edited to make them clearer.
Q: How do you deal with freezing conditions?
A: During colder periods, when freezing outside conditions are expected, the evaporative coolers are not used; they will already have been drained down and isolated as part of a winterisation process. With no water left inside the unit, there is no chance of anything freezing within it.
Freezing conditions are then dealt with by carefully mixing that cooler outside air with warmer waste air from the IT equipment, providing air at the desired conditions.
Q: Any experience outside the UK?
A: We have evaporative coolers installed in data centre environments in Europe. Beyond Europe we have evaporative cooling installations, but not in data centre environments.
As for other locations, we have a worldwide weather database and our own simulation software, which allow us to estimate performance against the ASHRAE guidelines.
So for any particular location around the world, we can use that weather data to simulate the performance of the system – please get in touch if that would be of interest to you.
Q: How does this system manage external smoke and contaminants?
A: Contaminants would be managed by the filtration system, so we would take a view on the location and the types of contaminants expected, and mitigate them with the correct filter selection.
With regard to external smoke, that is obviously a risk that is difficult to design against, because you cannot filter out smoke. I would say it is a very low risk, especially in the UK, but it is one you would have to consider if you wanted to run with this system. You could shut the system off to stop drawing smoke into the data centre. Of course, cool smoke would be unlikely to cause any problems in the data centre apart from the smell. If there were a concern that hot smoke could get into the data centre, you would have to look at a back-up system to recirculate the air internally whenever smoky conditions were expected, but I really consider that to be a very low risk in the UK.
Q: How many coolers would you expect in a typical data centre?
A: The overall system design is based on the maximum volumetric air flow rate requirement per kW of IT equipment, typically defined as a CFM/kW (cubic feet per minute per kilowatt) rate by the IT equipment manufacturer. We then select the quantity of units, based on their individual capacity, to achieve the total rate.
For example, 65 kW of IT equipment at 150 CFM/kW converts to an SI rate of approximately 4.6 m³/s, meaning a single Colt size 16 unit would be capable of meeting the entire airflow requirement.
The air flow is based on the maximum demand of the IT equipment, which of course may not be all the time. Selecting variable speed fans allows you to match the air flow to the data centre requirements at any given time, therefore reducing running costs.
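The conversion in the example above can be sketched as a short calculation. This is a minimal sketch, not Colt's own sizing tool; the function name and selection logic are illustrative, and only the figures (65 kW, 150 CFM/kW) and the standard CFM-to-m³/s conversion factor are taken as given.

```python
# Sketch of the airflow sizing calculation described above.
# Assumes the standard conversion: 1 CFM = 0.00047194745 m^3/s.
CFM_TO_M3S = 0.00047194745

def required_airflow_m3s(it_load_kw: float, cfm_per_kw: float) -> float:
    """Total airflow requirement in m^3/s for a given IT load."""
    return it_load_kw * cfm_per_kw * CFM_TO_M3S

# The worked example from the text: 65 kW at 150 CFM/kW.
flow = required_airflow_m3s(65, 150)
print(f"{flow:.2f} m^3/s")  # prints "4.60 m^3/s"
```

The same function scales linearly, so doubling the IT load simply doubles the required airflow, which is why unit quantity tracks total load so directly.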
Q: Could you recommend any equipment that would help eliminate the 1.1% beyond limit periods?
A: Yes, some additional equipment would mitigate this, but this presentation aims to show that those few hours of the year where the system is slightly outside the ASHRAE limits should be of no concern – remember we never fall out of the temperature requirement, and we have discussed why the dew point/humidity exceedance is not an issue.
Q: How does the system deal with hard water?
A: Particularly hard or soft water has caused no known issues within our coolers, so it is not a concern.
This is principally mitigated by the temperature and speed of the water moving around the system, and the bi-annual maintenance routine which includes a brief clean down of the components.
Q: Please explain the acronym MERV?
A: MERV is an American term standing for Minimum Efficiency Reporting Value, and it relates to standards of filtration.
Q: Have you considered pre-cooling air entering the cooler?
A: The current system using untreated outside air is very capable and provides the conditions we are looking for, with no pre-conditioning of the air required.
Of course, the warmer and drier the air entering the evaporative cooler is, the greater the capacity for cooling; if the air was already cooled before it entered the evaporative cooler, there might not be any need to cool it any further.
Q: In terms of capital costs, is this system cheaper than the standard data design with CRAC units?
A: In our experience, the larger the data centre, the better value the evaporative cooling system is. For a small data centre (say 50kW), the cost would be comparable between our system and the traditional CRAC system. However, our system would have considerably lower energy consumption so the payback period would be much shorter.
When the data centres get larger, the cost of our system would be much lower than the CRAC system.
Furthermore, the maintenance costs would be a lot lower as well.
Q: What is the hourly water consumption of one unit?
A: It is difficult to give an hourly figure, as the water consumption changes throughout the day, as the outside ambient conditions change.
However, water usage for a 1 MW IT load at a London location would be approximately 132 cubic metres (132,000 litres) annually.
Compare that to an indirect evaporative cooling system, which may use up to 3000 cubic metres (3,000,000 litres), more than 20x the amount.
Q: Is evaporative cooling the future of data centre cooling?
A: In my opinion – yes.
The industry as a whole is still moving slowly, and there are still some misconceptions about the performance of evaporative coolers, but we are trying to change that – and I hope this presentation contributed toward that.
Its ability to provide high volumes of air at desirable conditions, at very low capital and operating cost, can only mean it has a big future.
Q: Are the units produced only for Colt or individual customers?
A: We manufacture these units in our own factory and sell them into the construction industry. You can purchase single or multiple units.
Q: Will this system affect the ceiling void and floor void due to the bigger ductwork?
A: The system will provide an equivalent air volume within the data centre to what would have been provided by the CRAC unit. Therefore, the actual size and air supply of the ceiling void and floor void will not change with the use of evaporative cooling.
The only change is that whereas the CRAC unit only circulates air inside the data centre, the evaporative cooling unit will need penetrations and ductwork routes to wherever the evaporative cooler is located, which could be on the roof or a wall, for example.
Q: Is there a risk of smoke triggering fire suppression systems?
A: The way to mitigate that would be to install a duct mounted smoke detector on the intake points of the evaporative cooling system. The system would close off the air intake should smoke be detected, in order to prevent smoke entering the data centre.
Q: What is the biggest capacity of the unit?
A: The biggest unit is nominally 27,000 m³/hr. We have three units: size 10 (nominally 10,000 m³/hr), size 16 (nominally 16,000 m³/hr), and size 27 (nominally 27,000 m³/hr). You can calculate the size and quantity required by applying the general guidance given in an earlier answer.
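Given the nominal capacities quoted above, working out how many units of each size would cover a given demand is a one-line calculation. This is an illustrative sketch only: the "fewest units that meet the demand" rule and the dictionary of sizes are my assumptions, not a Colt selection procedure, though the capacities themselves come from the answer above.

```python
import math

# Nominal capacities in m^3/hr, as quoted in the text.
NOMINAL_M3_HR = {"size 10": 10_000, "size 16": 16_000, "size 27": 27_000}

def units_needed(demand_m3_hr: float) -> dict:
    """How many units of each size would meet the demand on their own.

    Assumption: units run at nominal capacity and quantities simply
    round up (ceil) to cover the demand.
    """
    return {name: math.ceil(demand_m3_hr / cap)
            for name, cap in NOMINAL_M3_HR.items()}

# A hypothetical 50,000 m^3/hr requirement:
print(units_needed(50_000))  # prints {'size 10': 5, 'size 16': 4, 'size 27': 2}
```

In practice the choice between fewer large units and more small units would also weigh redundancy, roof space, and ductwork routing, which the sketch ignores.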
Q: Does evaporative cooling occupy more space than a CRAC unit? Is this a constraint?
A: Compared with a CRAC unit, they are different shapes and sit in completely different locations, so it is difficult to answer this directly.
The evaporative coolers would not be located in the data centre; they would be located externally. This means they take up no space in the data centre, so more space can be given over to IT equipment, which can only be good.
You would need an external location for the evaporative cooling plant, either on the roof, on an external wall, or perhaps in a separate room if you have louvre systems on the outside to allow air to be exchanged.
If given a set of plans with a roof layout and a site layout, we can give guidance on the space they would take up.
Q: What is the USP for this type of system?
A: Several, but the biggest USP, which sums up the system, is ‘capability’.
The system is capable of meeting the airflow needs of your data centre, and beyond that, the massively reduced power consumption, low maintenance cost, long life and the reduced capital cost are all USPs.
Q: What about indirect evaporative cooling solutions?
A: We haven’t developed an indirect evaporative cooling system because the capability of the direct air system is so good.
An indirect system consumes vastly more water and electricity because of the inefficiency of the heat exchange, which takes away the benefits of evaporative cooling.
Q: Tell us about your software controls
A: We use a non-proprietary communication protocol, which is very configurable and capable, and easily integrated with any BMS you may have in your data centre.
Laurence Cockman is a Senior Consultant for Colt UK and specialises in the design and product application of energy efficient HVAC systems.