Data centers use less energy than you think

A comprehensive new analysis presents a more nuanced picture of global data center energy use

If the world is using more and more data, then it must be using more and more energy, right? Not so, says a comprehensive new analysis.

Researchers at Northwestern University, Lawrence Berkeley National Laboratory, and Koomey Analytics have developed the most detailed model to date of global data center energy use. With this model, the researchers found that, although demand for data has increased rapidly, massive efficiency gains by data centers have kept energy use roughly flat over the past decade.

This detailed, comprehensive model provides a more nuanced view of data center energy use and its drivers, enabling the researchers to make strategic policy recommendations for better managing this energy use in the future.

"While the historical efficiency progress made by data centers is remarkable, our findings do not mean that the IT industry and policymakers can rest on their laurels," said Eric Masanet, who led the study. "We think there is enough remaining efficiency potential to last several more years. But ever-growing demand for data means that everyone -- including policymakers, data center operators, equipment manufacturers, and data consumers -- must intensify efforts to avoid a possible sharp rise in energy use later this decade." CREDIT David Lohner{module INSIDE STORY}

The paper will be published on Feb. 28 in the journal Science.

Masanet is an adjunct professor at Northwestern's McCormick School of Engineering and the Mellichamp Chair in Sustainability Science for Emerging Technologies at the University of California, Santa Barbara. He conducted the research with coauthor Nuoa Lei, a Ph.D. student at Northwestern.

Filled with computing and networking equipment, data centers are central locations that collect, store and process data. As the world relies more and more on data-intensive technologies, the energy use of data centers is a growing concern.

"Considering that data centers are energy-intensive enterprises in a rapidly evolving industry, we do need to analyze them rigorously," said study co-author Arman Shehabi, a research scientist at Lawrence Berkeley National Laboratory. "Less detailed analyses have predicted rapid growth in data center energy use, but without fully considering the historical efficiency progress made by the industry. When we include that missing piece, a different picture of our digital lifestyles emerges."

To paint that more complete picture, the researchers integrated new data from numerous sources, including information on data center equipment stocks, efficiency trends, and market structure. The resulting model enables a detailed analysis of the energy used by data center equipment (such as servers, storage devices, and cooling systems), by type of data center (including supercomputing centers), and by world region.
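As a rough illustration only (not the published model), a bottom-up estimate of this kind multiplies equipment counts by typical power draws and applies a facility overhead factor. The unit counts, wattages, and PUE value in the sketch below are placeholder assumptions chosen for the example.

```python
# Illustrative bottom-up sketch of data center energy use (not the study's model).
# All equipment counts, per-unit power draws, and the PUE value are placeholders.

EQUIPMENT = {
    # category: (installed units, average power per unit in watts) -- assumed values
    "servers": (1_000_000, 250.0),
    "storage": (400_000, 50.0),
    "network": (200_000, 100.0),
}
PUE = 1.5              # power usage effectiveness: total facility power / IT power
HOURS_PER_YEAR = 8760

def annual_energy_twh(equipment, pue):
    """Estimated annual energy use in terawatt-hours."""
    it_power_w = sum(units * watts for units, watts in equipment.values())
    return it_power_w * pue * HOURS_PER_YEAR / 1e12  # watt-hours -> TWh

print(f"Estimated annual energy: {annual_energy_twh(EQUIPMENT, PUE):.2f} TWh")
```

Efficiency gains enter a model like this by lowering the per-unit wattages and the PUE over time, which is why rising demand for data does not automatically translate into rising energy use.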

The researchers concluded that recent efficiency gains made by data centers have likely been far greater than those observed in other major sectors of the global economy.

"Lack of data has hampered our understanding of global data center energy use trends for many years," said co-author Jonathan Koomey of Koomey Analytics. "Such knowledge gaps make business and policy planning incredibly difficult."

Addressing these knowledge gaps was a major motivation for the research team's work. "We wanted to give the data center industry, policymakers, and the public a more accurate view of data center energy use," said Masanet. "But the reality is that more efforts are needed to better monitor energy use moving forward, which is why we have made our model and datasets publicly available."

By releasing the model, the team hopes to inspire more research into the topic. The researchers also translated their findings into three specific types of policies that can help mitigate future growth in energy use, urging policymakers to act now:

  • Extend the life of current efficiency trends by strengthening IT energy standards such as ENERGY STAR, providing financial incentives and disseminating best energy efficiency practices;
  • Increase research and development investments in next-generation supercomputing, storage, and heat removal technologies to mitigate future energy use, while incentivizing renewable energy procurement to mitigate carbon emissions in parallel;
  • Invest in data collection, modeling, and monitoring activities to eliminate blind spots and enable more robust data center energy policy decisions.

Large exoplanet could have the right conditions for life

Astronomers have found an exoplanet more than twice the size of Earth to be potentially habitable, opening the search for life to planets significantly larger than Earth but smaller than Neptune.

A team from the University of Cambridge used the mass, radius, and atmospheric data of the exoplanet K2-18b and determined that it's possible for the planet to host liquid water at habitable conditions beneath its hydrogen-rich atmosphere. The results are reported in The Astrophysical Journal Letters.

The exoplanet K2-18b, 124 light-years away, is 2.6 times the radius and 8.6 times the mass of Earth and orbits its star within the habitable zone, where temperatures could allow liquid water to exist. The planet was the subject of significant media coverage in the autumn of 2019, as two different teams reported the detection of water vapor in its hydrogen-rich atmosphere. However, the extent of the atmosphere and the conditions of the interior underneath remained unknown.
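As a quick back-of-envelope check using the quoted mass and radius (the Earth reference values below are standard constants, not figures from the study), the planet's bulk density and surface gravity follow directly:

```python
# Back-of-envelope check using the quoted values for K2-18b:
# 8.6 Earth masses and 2.6 Earth radii.
EARTH_DENSITY = 5514.0   # kg/m^3, Earth's mean density
EARTH_GRAVITY = 9.81     # m/s^2

mass_ratio, radius_ratio = 8.6, 2.6

density_ratio = mass_ratio / radius_ratio**3   # bulk density scales as M / R^3
gravity_ratio = mass_ratio / radius_ratio**2   # surface gravity scales as M / R^2

print(f"Bulk density: {density_ratio * EARTH_DENSITY:.0f} kg/m^3 ({density_ratio:.2f} x Earth)")
print(f"Surface gravity: {gravity_ratio * EARTH_GRAVITY:.1f} m/s^2 ({gravity_ratio:.2f} x Earth)")
```

A bulk density of roughly half Earth's is consistent with a planet that is not purely rocky, which is why the thickness of the hydrogen envelope matters so much for habitability.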

"Water vapor has been detected in the atmospheres of a number of exoplanets but, even if the planet is in the habitable zone, that doesn't necessarily mean there are habitable conditions on the surface," said Dr. Nikku Madhusudhan from Cambridge's Institute of Astronomy, who led the new research. "To establish the prospects for habitability, it is important to obtain a unified understanding of the interior and atmospheric conditions on the planet - in particular, whether liquid water can exist beneath the atmosphere."

Given the large size of K2-18b, it has been suggested that it would be more like a smaller version of Neptune than a larger version of Earth. A 'mini-Neptune' is expected to have a significant hydrogen 'envelope' surrounding a layer of high-pressure water, with an inner core of rock and iron. If the hydrogen envelope is too thick, the temperature and pressure at the surface of the water layer beneath would be far too great to support life.

Now, Madhusudhan and his team have shown that despite the size of K2-18b, its hydrogen envelope is not necessarily too thick and the water layer could have the right conditions to support life. They used the existing observations of the atmosphere, as well as the mass and radius, to determine the composition and structure of both the atmosphere and interior using detailed numerical models and statistical methods to explain the data.

The researchers confirmed the atmosphere to be hydrogen-rich with a significant amount of water vapor. They also found that levels of other chemicals such as methane and ammonia were lower than expected for such an atmosphere. Whether these levels can be attributed to biological processes remains to be seen.

The team then used the atmospheric properties as boundary conditions for models of the planetary interior. They explored a wide range of models that could explain the atmospheric properties as well as the mass and radius of the planet. This allowed them to obtain the range of possible conditions in the interior, including the extent of the hydrogen envelope and the temperatures and pressures in the water layer.

"We wanted to know the thickness of the hydrogen envelope - how deep the hydrogen goes," said co-author Matthew Nixon, a Ph.D. student at the Institute of Astronomy. "While this is a question with multiple solutions, we've shown that you don't need much hydrogen to explain all the observations together."

The researchers found that the maximum extent of the hydrogen envelope allowed by the data is around 6% of the planet's mass, though most of the solutions require much less. The minimum amount of hydrogen is about one-millionth by mass, similar to the mass fraction of the Earth's atmosphere. In particular, a number of scenarios allow for an ocean world, with liquid water below the atmosphere at pressures and temperatures similar to those found in Earth's oceans.

This study opens the search for habitable conditions and bio-signatures outside the solar system to exoplanets that are significantly larger than Earth, beyond Earth-like exoplanets. Additionally, planets such as K2-18b are more accessible to atmospheric observations with current and future observational facilities. The atmospheric constraints obtained in this study can be refined using future observations with large facilities such as the upcoming James Webb Space Telescope.

Tokyo Tech builds a novel processor that solves a notoriously complex mathematical problem

Scientists at the Tokyo Institute of Technology have designed a novel processor architecture that can solve combinatorial optimization problems much faster than existing ones. Combinatorial optimization problems are complex problems that arise across many fields of science and engineering and are difficult for conventional computers to handle, making specialized processor architectures very important.

The power of applied mathematics can be seen in the advancements of engineering and other sciences. However, often the mathematical problems used in these applications involve complex calculations that are beyond the capacities of modern computers in terms of time and resources. This is the case for combinatorial optimization problems.

Combinatorial optimization consists of finding an optimal object or solution within a finite set of possibilities. Such problems show up ubiquitously in the real world across different fields. For example, combinatorial optimization problems appear in finance as portfolio optimization, in logistics as the well-known "traveling salesman problem", in machine learning, and in drug discovery. However, current computers cannot cope with these problems when the number of variables is high.

Fortunately, a team of researchers from the Tokyo Institute of Technology, in collaboration with the Hitachi Hokkaido University Laboratory and the University of Tokyo, has designed a novel processor architecture to specifically solve combinatorial optimization problems expressed in the form of an Ising model. The Ising model was originally used to describe the magnetic states of atoms (spins) in magnetic materials. However, it can also serve as an abstraction for solving combinatorial optimization problems, because the evolution of the spins, which tend toward the lowest-energy state, mirrors how an optimization algorithm searches for the best solution. In fact, the spin configuration in the lowest-energy state can be directly mapped to the solution of a combinatorial optimization problem.
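To make the mapping concrete, the following small Python example (an illustration of the Ising abstraction in general, not of STATICA itself) encodes a toy max-cut problem as an Ising energy and finds the lowest-energy spin state by brute force; the graph and its weights are made up for the example.

```python
import itertools

# Toy illustration of the Ising abstraction (not STATICA): a small max-cut
# instance encoded as an Ising energy and solved by brute force.
# Edge weights are made up; cutting edge (i, j) means spins s_i and s_j differ.
edges = {(0, 1): 1.0, (1, 2): 1.0, (2, 3): 1.0, (3, 0): 1.0, (0, 2): 1.0}
n = 4

def ising_energy(spins):
    # E = sum over edges of J_ij * s_i * s_j with J_ij = w_ij,
    # so anti-aligned spins (a cut edge) lower the energy.
    return sum(w * spins[i] * spins[j] for (i, j), w in edges.items())

# The lowest-energy spin configuration corresponds to the maximum cut.
best = min(itertools.product([-1, +1], repeat=n), key=ising_energy)
cut_edges = [(i, j) for (i, j) in edges if best[i] != best[j]]
print("ground state:", best, "cut edges:", cut_edges)
```

The catch is that the number of possible spin configurations doubles with every added spin, which is exactly why brute force fails at scale and specialized annealing hardware becomes attractive.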

The proposed processor architecture, called STATICA, is fundamentally different from existing processors that calculate Ising models, known as annealers. One limitation of most reported annealers is that they only consider spin interactions between neighboring particles. This allows for faster calculation but limits their possible applications. In contrast, STATICA is fully connected, and all spin-to-spin interactions are considered. While STATICA's processing speed is lower than that of similar annealers, its calculation scheme is more efficient: it uses parallel updating.

In most annealers, the evolution of the spins (updating) is calculated iteratively. This process is inherently serial: spin flips are calculated one by one, because flipping one spin affects all the others within the same iteration.

In STATICA, the updating process is instead carried out in parallel using what are known as stochastic cellular automata. Rather than calculating spin states from the spins themselves, STATICA creates replicas of the spins and uses spin-to-replica interactions, which allows all spins to be updated in parallel. This saves a tremendous amount of time by reducing the number of steps needed. "We have proven that conventional approaches and STATICA derive the same solution under certain conditions, but STATICA does so in N times fewer steps, where N is the number of spins in the model," remarks Prof. Masato Motomura, who led this project.

Furthermore, the research team implemented an approach called delta-driven spin updating: because only the spins that flipped in the previous iteration matter when calculating the next one, a selector circuit restricts each update to the spins that actually changed.
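The difference between the two schemes can be sketched in a few lines of Python. The snippet below is a simplified software illustration of the replica-based parallel update and the delta-driven bookkeeping described above, not the STATICA circuit; the problem size, random couplings, temperature schedule, and step count are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simplified illustration of replica-based parallel updating (not the STATICA
# hardware): in each step every spin is updated against a frozen copy of the
# previous state, so all the updates are independent and can run in parallel.
n = 64
J = rng.normal(size=(n, n)); J = (J + J.T) / 2; np.fill_diagonal(J, 0)  # random couplings (assumed)
spins = rng.choice([-1, 1], size=n)
fields = J @ spins          # local field on each spin, computed from the frozen copy
temperature = 2.0           # arbitrary annealing schedule

for step in range(200):
    # Parallel update: every spin decides using only the previous step's fields.
    flip_cost = 2 * spins * fields                     # energy change if each spin flips
    p_flip = 1.0 / (1.0 + np.exp(np.clip(flip_cost / temperature, -50, 50)))
    flipped = rng.random(n) < p_flip
    spins[flipped] *= -1

    # Delta-driven updating: only spins that actually flipped contribute field changes.
    fields += 2 * J[:, flipped] @ spins[flipped]
    temperature *= 0.98

print("final Ising energy:", -0.5 * spins @ J @ spins)
```

In hardware terms, the appeal is that all spins can be updated in the same step, while the selector circuit keeps the field updates proportional to the number of spins that flipped rather than to the full problem size.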

STATICA offers reduced power consumption, higher processing speed, and better accuracy than other annealers. "STATICA aims at revolutionizing annealing processors by solving optimization problems based on the mathematical model of stochastic cell automata. Our initial evaluations have provided strong results," concludes Prof. Motomura. Further refinements will make STATICA an attractive choice for combinatorial optimization.