Berkeley Lab energy efficiency experts also calculate the energy cost of upgrading ventilation.

With winter around the corner, some homeowners may be thinking about plugging all the leaks in their homes to make them less drafty. Imagine if every homeowner in the country did that: how much energy could be saved? Using physics-based modeling of the U.S. housing stock, researchers from Lawrence Berkeley National Laboratory (Berkeley Lab) found in a new study that upgrading airtightness to a uniform level could achieve as much as $33 billion in annual energy savings.

"Currently people who weatherize can get their homes about 20 to 30 percent tighter. But they're not sealing all the cracks. There's still quite a bit left on the table, and those extra leaks and cracks could potentially save a lot of energy," said Berkeley Lab scientist Jennifer Logue, lead author of the study, "Energy impacts of envelope tightening and mechanical ventilation for the U.S. residential sector," which was recently published online in the journal Energy and Buildings. Her co-authors were Berkeley Lab scientists Max Sherman, Iain Walker and Brett Singer.      

Logue said the purpose of their study was to figure out how much energy is wasted by leaky homes and to determine the optimal standard of airtightness: one that would maximize energy savings while minimizing the cost of achieving those savings. This is an important question because the residential sector (113 million homes) uses about 23 percent of total U.S. source energy annually. (Source energy includes site energy, the energy consumed by buildings for heating and electricity, as well as the raw energy required to transmit, deliver and produce it.) Heating and cooling account for about half of the site energy used in residences.
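For a rough sense of scale, the back-of-the-envelope calculation below combines these figures with the 22 quads of annual residential source energy cited later in the article. Applying the roughly 50 percent heating-and-cooling share, which is reported for site energy, to source energy is a simplifying assumption made only for illustration.

```python
# Back-of-the-envelope scale check using figures quoted in the article.
# Assumption: the ~50% heating/cooling share is reported for site energy;
# applying it to source energy here is a rough illustration only.

residential_source_quads = 22.0    # annual residential source energy (quads), from the study
residential_share_of_total = 0.23  # residential fraction of total U.S. source energy

total_us_source_quads = residential_source_quads / residential_share_of_total
heating_cooling_quads = 0.5 * residential_source_quads

print(f"Implied total U.S. source energy: ~{total_us_source_quads:.0f} quads/year")          # ~96
print(f"Rough residential heating/cooling budget: ~{heating_cooling_quads:.0f} quads/year")  # ~11
```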

The largest potential savings are in the hottest and coldest climates. As new air enters homes through leaks and cracks, it has to be cooled or heated. Although the trend has been towards building tighter houses, Logue says the science is still not settled on the best ways to minimize leaks. "More research is needed to figure out what are the most effective ways to weatherize," she said. "There are still benefits to be gained if we can figure out how to weatherize more effectively."      

The Berkeley Lab researchers considered five levels of tightening: "average" tightening, "advanced" tightening, the International Energy Conservation Code (IECC) standard, the R2000 standard (common in Canada, tighter than IECC) and the "passive house" standard, the tightest and most difficult to achieve.      

They found that upgrading all homes to be as airtight as the top 10 percent of similar homes (advanced tightening) would cut energy demand by 2.6 quads annually, out of the total 22 quads of source energy used by the residential housing sector, saving roughly $22 billion on energy bills. Reaching the IECC standard would save 3.83 quads of source energy annually, or about $33 billion.
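A quick check of how those figures relate, using only the values in the paragraph above:

```python
# Relative size of the two tightening scenarios described above.
total_residential_source_quads = 22.0

scenarios = {
    "advanced tightening": (2.60, 22e9),  # (quads saved per year, dollars saved per year)
    "IECC standard":       (3.83, 33e9),
}

for name, (quads_saved, dollars_saved) in scenarios.items():
    share = quads_saved / total_residential_source_quads
    dollars_per_quad = dollars_saved / quads_saved
    print(f"{name}: {share:.0%} of residential source energy, "
          f"~${dollars_per_quad / 1e9:.1f} billion per quad saved")
```

Both scenarios work out to roughly the same dollar savings per quad saved; the IECC case simply saves more quads.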

The study found that the IECC standard offered most of the benefit that the tighter standards would yield. Moreover, this standard is likely more achievable than the tighter ones. According to their analysis, raising the U.S. housing stock to the IECC standard would reduce airflow in homes by a median value of 50 percent.

"As we move forward and look to build better housing stock, we want to know what standards we should enforce," Logue said. "It looks like the IECC standard gets us the majority of the benefit of air sealing. More research is needed to determine the costs of implementing each of these standards in new homes to see which are cost-effective. As we get better at air sealing, we can move towards tighter envelopes in buildings. "      

The analysis in the study factored in the energy costs of increasing ventilation where necessary to maintain good indoor air quality. A separate analysis looked at the energy cost of only bringing the housing stock into compliance with ASHRAE 62.2, a national ventilation standard for homes that ensures sufficient ventilation for human health.      

"We found that the energy burden would be pretty small, only about an additional 0.2 quads of source energy annually"-less than 0.1 percent of total source energy that goes to the residential housing sector-"to get everyone to the level where they're getting enough whole-house ventilation," Logue said.

Cleaner, Cheaper Energy is Goal of Supercomputer Research

University of Utah engineers will use a five-year, $16 million grant to conduct supercomputer simulations aimed at developing a prototype low-cost, low-emissions coal power plant that could electrify a mid-sized city. The goal of this “predictive science” effort is to help power poor nations while reducing greenhouse gas emissions in developed ones.
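As a rough plausibility check of the “mid-sized city” claim, the sketch below treats the 350 megawatts as net electrical output and assumes an average household draw of about 1.2 kilowatts; both values are illustrative assumptions, not figures from the article.

```python
# Rough check: how many households could a 350 MW plant serve on average?
# Assumptions (not from the article): 350 MW is net electrical output,
# and an average U.S. household draws roughly 1.2 kW averaged over a year.

plant_mw = 350
avg_household_kw = 1.2

households = plant_mw * 1000 / avg_household_kw
print(f"~{households:,.0f} households served on average")  # roughly 290,000
```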

The grant by the U.S. Department of Energy’s National Nuclear Security Administration is enabling University of Utah researchers Philip J. Smith and Martin Berzins, along with university President David W. Pershing, to establish the Carbon Capture Multidisciplinary Simulation Center. All three are professors in the university’s College of Engineering.

The researchers will use supercomputers to simulate and predict performance for a proposed 350-megawatt boiler system that burns pulverized coal with pure oxygen rather than air. The design, which has never been built, captures carbon dioxide released during power generation.

Through computer modeling, researchers will address uncertainties that could arise in building such a power plant. They hope their findings will expedite deployment of clean, economical power to developing nations, where nearly 1.2 billion people lack electricity, and help industrialized nations meet increasingly stringent emissions standards, like those recently proposed by the Environmental Protection Agency. The work also will establish the University of Utah as a hub of predictive science.

Plans have yet to be made regarding when or where the European company that designed the new coal power plant will build a prototype.

“This is a very important award for the university – not only because of its size, but because it builds on the strength of our faculty and staff in energy, uncertainty, parallel computing and visualization,” says Pershing, a distinguished professor of chemical engineering. “It will provide critical funding for students to study these topics on some of the world’s largest supercomputers.”

Predictive science, which grew out of the ban on nuclear testing, allows engineers to virtually test and manage nuclear weapons, as well as entirely new technologies such as power plants, before they are built. This enables officials to check the safety, security and reliability of the U.S. nuclear stockpile and of new technologies without physical testing.

“Society made projections off of ‘gut feel,’ and now it’s asking science to compute what will happen, so we’re making decisions off of computed results,” says Smith, a professor of chemical engineering and director of the new center.

Berzins, a professor of computer science, adds: “These simulations and others like them will make use of computers that are expected to be able to perform 1 million-trillion operations per second in the next decade or so, or as many operations per second as a billion personal computers today.”

The majority of the grant will enable students, faculty and staff to develop and run software on Department of Energy supercomputers.

“For perspective, that’s using half a million to 1 million computer processors at once on one problem for several days,” says Smith.
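Putting Berzins’ and Smith’s figures side by side, the sketch below works out the rough scale involved; the rate of one billion operations per second for a personal computer and “several days” taken as three are illustrative assumptions.

```python
# Rough scale of the planned simulations, using the figures in the quotes above.
# Assumptions: ~1e9 operations/second per personal computer; "several days" = 3.

exascale_ops_per_sec = 1e18   # "1 million-trillion operations per second"
pc_ops_per_sec = 1e9          # assumed rate for a single personal computer

print(f"Equivalent personal computers: ~{exascale_ops_per_sec / pc_ops_per_sec:.0e}")  # ~1e+09

days = 3
for processors in (0.5e6, 1.0e6):  # "half a million to 1 million computer processors"
    processor_hours = processors * days * 24
    print(f"{processors:.0e} processors for {days} days: ~{processor_hours:.1e} processor-hours")
```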

The University of Utah will subcontract with experts in coal reactions and advanced mathematics from Brigham Young University and the University of California, Berkeley.

This grant builds upon the University of Utah’s experience in predictive science, particularly its previous 13-year effort – also funded by the National Nuclear Security Administration – for simulation studies of accidental fires and explosions.

Ever wanted to see a nuclear reactor core in action?  A new computer algorithm developed by researchers at the U.S. Department of Energy’s (DOE) Argonne National Laboratory allows scientists to view nuclear fission in much finer detail than ever before.

A team of nuclear engineers and computer scientists at Argonne National Laboratory is developing the neutron transport code UNIC, which enables researchers for the first time to obtain a highly detailed description of a nuclear reactor core.

The code could prove crucial in the development of nuclear reactors that are safe, affordable and environmentally friendly. Modeling the complex geometry of a reactor core requires billions of spatial elements, hundreds of angles and thousands of energy groups, all of which lead to problem sizes with quadrillions of unknowns.

Such calculations exhaust the computer memory of even the largest machines, and therefore reactor modeling codes typically rely on various approximations. But approximations limit the predictive capability of computer simulations and leave considerable uncertainty in crucial reactor design and operational parameters.
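An order-of-magnitude sketch, taking the low end of each range quoted above (the upper ends push the count into the quadrillions), shows why a direct solution strains even the largest machines:

```python
# Order-of-magnitude size of the discretized transport problem described above.
# Low end of each quoted range; larger counts push this into the quadrillions.

spatial_elements = 1e9   # "billions of spatial elements"
angles = 1e2             # "hundreds of angles"
energy_groups = 1e3      # "thousands of energy groups"

unknowns = spatial_elements * angles * energy_groups
bytes_per_unknown = 8    # one double-precision value per unknown (assumption)

print(f"Unknowns: ~{unknowns:.0e}")  # ~1e+14
print(f"Memory for a single copy of the solution: ~{unknowns * bytes_per_unknown / 1e15:.1f} PB")
```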

“The UNIC code is intended to reduce the uncertainties and biases in reactor design calculations by progressively replacing existing multilevel averaging techniques with more direct solution methods based on explicit reactor geometries,” said Andrew Siegel, a computational scientist at Argonne and leader of Argonne’s reactor simulation group.

[Figure caption: An elevation plot of the highest-energy neutron flux distributions from an axial slice of the reactor, superimposed over the same slice of the underlying geometry. The figure shows the rapid spatial variation of the high-energy neutron distribution within each plate, along with the more slowly varying global distribution; UNIC allows researchers to capture both effects simultaneously.]

UNIC has run successfully at DOE leadership computing facilities, home to some of the world’s fastest supercomputers, including the energy-efficient IBM Blue Gene/P at Argonne and the Cray XT5 at Oak Ridge National Laboratory. Although still under development, the code has already produced new scientific results.

In particular, the Argonne team has carried out highly detailed simulations of the Zero Power Reactor experiments on up to 163,840 processor cores of the Blue Gene/P and 222,912 processor cores of the Cray XT5, as well as on 294,912 processor cores of a Blue Gene/P at the Jülich Supercomputing Center in Germany. With UNIC, the researchers have successfully represented the details of the full reactor geometry for the first time and have been able to compare the results directly with the experimental data.

Argonne’s UNIC code provides a powerful new tool for designers of safe, environmentally friendly nuclear reactors – a key component of our nation’s current and future energy needs. By integrating innovative design features with state-of-the-art numerical solvers, UNIC allows researchers not only to better understand the behavior of existing reactor systems but also to predict the behavior of many of the newly proposed systems having untested design characteristics.

Development of the UNIC code is funded principally by DOE’s Office of Nuclear Energy through the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program. The Argonne UNIC project is a key part of the NEAMS efforts to replace the traditional “test-based” approach to nuclear systems design with a new “science-based” approach in which advanced modeling and simulation play a dominant role.

A video of a more detailed simulation of the Zero Power Reactor experiment is also available online.

In the five years since the China Energy Group of the Department of Energy's Lawrence Berkeley National Laboratory (Berkeley Lab) released its last edition of the China Energy Databook, China has achieved two dubious distinctions: it surpassed the United States in both energy consumption and energy-related emissions of carbon dioxide, becoming the world leader on both scores.

With these important shifts in the global energy landscape, the eighth edition of the China Energy Databook is being released this week. The Databook is the most comprehensive publicly available resource covering China's energy and environmental statistics. The China Energy Group researchers have amassed an enormous trove of data from firsthand sources and organized much of it into a relational database, making it far more useful for research and analytical purposes.

"We have gathered statistical information on energy and energy demand drivers from all different resources, such as the China Environment Yearbook, the Transportation Yearbook, the Power Yearbook, the Iron and Steel Yearbook, the Cement Almanac, statistics of oil companies and power companies," said Lynn Price, head of the China Energy Group. "A library collection like this doesn't exist anywhere else in the world."

 

One of the most important changes in the new edition is that it captures the extensive retroactive changes China has made to its energy consumption data. "They used a lot more coal than they originally admitted to, several hundred million tons more," said David Fridley, one of the two lead editors of this and several previous Databooks. "Like many other people, we were writing articles around 2000 about the decline in China's energy consumption in the late 1990s. Now the decline has completely disappeared: they found it. It was underreported."

Another major shift is that China has become a voracious energy importer, especially of coal and liquefied natural gas. "Like the United States, China has become among the world's largest importers of oil, gas and coal," Fridley said. "Compared to 2005-2006, it has probably doubled its presence. Certainly in coal: they imported virtually no coal in 2005 and they're now importing close to 200 million tons. This is reflected in our chapter on imports and exports, which shows the impact of China on world energy markets."

One chapter is devoted to international comparisons, allowing users a quick way to see how the nature of energy and economic activities differs from country to country. The main story in China? "The continued dominance of coal and the continued dominance of industry," Fridley said. Industry consumes two-thirds of China's energy, compared to less than 30 percent in the United States. Relatedly, transportation accounts for only 15 percent of China's energy consumption, compared to 30 percent in the United States.

The other part of the story is China's dramatic jump in renewable energy production. Whereas the last version of the Databook did not even have data on wind and solar power generation, China is now among the top wind power producers in the world.

The Databook contains information on energy and the environment at both the national and provincial levels, mainly from 1980 through 2011, with some data series beginning in 1949. There are bright spots in the data on China's environment. Levels of sulfur dioxide, a pollutant emitted largely by power plants, and chemical oxygen demand, a measure of organic pollution in the water, are down even as CO2 emissions have continued to rise.

"What they don't report is PM2.5 [or fine particulate matter], which has moved to a position of top concern as major pollution events have hit north China," said Fridley. "Instead they just have a general category of particulates."

 

The Databook has undergone several changes in format over the last few decades. The first four editions were paper-based, but when the publication topped 400 pages, distribution was switched to CDs in 2001. At that time, data related to China's energy balance was put in a relational database, and other datasets, such as energy efficiency, were put in spreadsheets.

The latest edition, with more than 140,000 datapoints, is the first that will be available for download. The lead editors were Fridley and John Romankiewicz.

"One of the missions of our group is helping the rest of the world understand what's going on in China," said Price. "Unfortunately, we receive very limited funding to produce this extremely popular publication. Let's just say this edition has been a labor of love."

Advanced supercomputer simulations could be a big hit for truckers and the people who design guardrails, protective barriers and roadway signs. Srdjan Simunovic of Oak Ridge National Laboratory is part of a team conducting a study aimed at gaining a better understanding of crash performance of these safety structures.

"Very limited computational simulation work has been conducted on crash performance of barriers when impacted by medium- and heavy-duty trucks because of the computational cost and complexity of full-scale truck models," Simunovic said.

The design and engineering of these structures strongly influence the injury-causing g-forces experienced by vehicle occupants and whether the trucks are redirected back into traffic, causing additional hazards.

Partners in the study, funded by the National Transportation Research Center Inc., are Battelle Memorial Institute and the University of Tennessee.
