Not long ago, artificial intelligence (AI) and its associated technologies seemed futuristic, but today there is no shortage of companies offering AI solutions and claiming business benefits from the technology. But is it real? Does AI offer tangible, measurable benefits? Those are the questions I will attempt to tackle as we look at the application of AI in data centers.
Let’s first discuss the overall ambition, which is to apply an algorithm that enables a data center to “self-optimize” its energy usage. Self-optimize, for our purposes, means the data center autonomously adjusts its power and cooling systems. Today, a facility or data center manager may adjust the temperature of the IT room or the water temperature of the chiller plant (used to cool the data center) and monitor the effects of those changes, at various IT workload levels, on energy usage and the utility bill. The manager could then keep track of the benefits at different temperatures and manually adjust the temperature using the controls on the equipment, such as the thermostat. However, your data center could have hundreds of computer room air conditioners (CRACs), each with its own thermostat. It’s a daunting, if not impossible, task to adjust all of them manually. But if an AI system were in place, it could learn (via the algorithm) the optimum temperatures at different times of day and at different IT utilization levels and automatically adjust the cooling systems accordingly. What’s more, it would keep track of the data and continually refine the algorithm to make it more effective over time. Pretty cool stuff!
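To make that loop concrete, here is a minimal sketch of the learn-and-adjust idea in Python. It is illustrative only, not any vendor’s actual product: the candidate setpoints, bands, and the telemetry/control hooks named in the final comments (read_it_utilization, set_crac_setpoint, read_facility_power) are all hypothetical stand-ins for a real site’s integration points.

```python
# Sketch: learn which CRAC setpoint minimizes facility power for each
# (hour of day, IT utilization band), refining the estimate over time.
import random
from collections import defaultdict

SETPOINTS_C = [20.0, 22.0, 24.0, 26.0]   # candidate CRAC supply temperatures (assumed range)
EPSILON = 0.1                            # fraction of decisions spent exploring alternatives

# Learned state: for each (hour, utilization band), the power readings
# observed at each candidate setpoint.
power_history = defaultdict(lambda: defaultdict(list))

def utilization_band(util_pct: float) -> int:
    """Bucket IT utilization (0-100%) into coarse bands so the table stays small."""
    return min(int(util_pct // 25), 3)

def choose_setpoint(hour: int, util_pct: float) -> float:
    """Pick the setpoint with the lowest average observed power for this
    context, occasionally exploring another option to keep learning."""
    history = power_history[(hour, utilization_band(util_pct))]
    if not history or random.random() < EPSILON:
        return random.choice(SETPOINTS_C)  # explore
    return min(history, key=lambda sp: sum(history[sp]) / len(history[sp]))

def record_outcome(hour: int, util_pct: float, setpoint: float, power_kw: float):
    """Feed the measured power back so future choices improve."""
    power_history[(hour, utilization_band(util_pct))][setpoint].append(power_kw)

# One pass of the control loop (telemetry/control calls are placeholders):
#   util = read_it_utilization()
#   sp = choose_setpoint(hour, util)
#   set_crac_setpoint(sp)
#   ...later: record_outcome(hour, util, sp, read_facility_power())
```

A real system would be far more sophisticated, but the shape is the same: observe, adjust, measure the result, and refine.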
The baseline requirement for AI systems is real-time data from all the major components in the entire ecosystem. For data centers, this includes data from all the systems involved with facility cooling (chillers, cooling towers, variable speed drives, condensers, evaporators, air handlers, economizers, etc.), IT room cooling (CRACs, air handlers, containment systems, fans, etc.), and the IT equipment itself (server utilization rate, temperature, power consumption). Getting all this data is not a trivial task: you face multiple communication protocols, data sets that vary across manufacturers, and data locked down by the manufacturers themselves. Also, many devices have local displays but lack the data outputs or remote-control capabilities needed to feed an AI system. For a data center AI system to operate and deliver the tangible benefits some are claiming, it really needs to be a greenfield deployment or a new design with an AI system as a core component.
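The integration problem is essentially one of normalization: readings arrive over different protocols (Modbus registers, BACnet objects, SNMP, vendor REST APIs) with different field names and units, and must be mapped into a single schema before any algorithm can consume them. The sketch below illustrates the idea; the device names, payload fields, and unit conventions are invented for the example, not taken from any real product.

```python
# Sketch: normalize readings from heterogeneous sources into one schema.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class Reading:
    device_id: str       # e.g. "crac-07", "chiller-02" (hypothetical IDs)
    metric: str          # canonical name: "supply_temp_c", "power_kw", ...
    value: float         # always in the canonical unit for that metric
    timestamp: datetime

def from_modbus_crac(device_id: str, raw_register: int) -> Reading:
    """Hypothetical CRAC that reports supply temp as tenths of a degree C."""
    return Reading(device_id, "supply_temp_c", raw_register / 10.0,
                   datetime.now(timezone.utc))

def from_vendor_json(device_id: str, payload: dict) -> Reading:
    """Hypothetical chiller REST payload that reports power in watts."""
    return Reading(device_id, "power_kw", payload["power_w"] / 1000.0,
                   datetime.now(timezone.utc))

# Two very different sources land in the same canonical form:
print(from_modbus_crac("crac-07", 221))                      # 22.1 °C
print(from_vendor_json("chiller-02", {"power_w": 58400.0}))  # 58.4 kW
```

Multiply that adapter-writing effort by dozens of protocols and vendors, and it becomes clear why retrofitting AI onto an existing facility is so hard, and why a greenfield design can bake the data layer in from day one.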
Schneider Electric is deeply involved with many projects and collaborations around AI in data centers, but they are far from mainstream. We have one of the industry’s best IoT-enabled, interoperable architectures, EcoStruxure for Data Centers. We are seeing AI systems deployed as test cases in many data centers, and we’re witnessing major advances in data center metering and data pools in the cloud, with analytics delivering predictions that can be tied to services around improving data center efficiency. It’s early in the game yet, but I do see a future where AI systems play a crucial role in data center operations, driving efficiencies in a way that humans just can’t.