Raytheon leads new supercomputing center dedicated to advancing U.S. weather forecasting

NOAA has announced that Raytheon Intelligence and Space has been chosen to design and develop the Earth Prediction Innovation Center (EPIC), an extramural center that will unite academia, industry and government to help create the most user-friendly and user-accessible comprehensive Earth modeling system.

Raytheon Intelligence and Space brings to EPIC proven expertise in scientific leadership, software engineering, software infrastructure, and delivery of support services to the government, academic, and industry researchers who will collaborate within the EPIC framework.

“Extreme weather events, exacerbated by climate change, are increasing,” said Craig McLean, NOAA’s acting chief scientist and NOAA assistant administrator for Oceanic and Atmospheric Research. “EPIC will help the United States diversify the community that contributes to improving weather forecasting to save lives, protect property, and strengthen our economy.”

“The creation of EPIC is a foundational piece in a major, multi-step effort by NOAA to expand and strengthen community modeling and help us accelerate the improvements in operational weather and climate forecasting,” said Louis W. Uccellini, Ph.D., director, NOAA’s National Weather Service. “This effort will improve forecasts and decision-support activities to ensure communities are ready for, and respond to, oncoming extreme weather, water, and climate events.”

EPIC is a collaborative effort involving the larger research community from academia, public agencies, and private industry — known as the Weather Enterprise — to contribute to the overall development of the operational models used by NOAA’s National Weather Service to meet its mission of saving lives and property. EPIC will facilitate the community modeling approach by making it easier for developers across diverse sectors to contribute to the development of these operational models using a common modeling infrastructure.

These research collaborations are already helping NOAA accelerate development of the nation’s Unified Forecast System (UFS), a community-based, comprehensive Earth modeling system, which is becoming the core of NOAA's operational Numerical Weather Prediction applications.

To support the accelerated drive to improve weather modeling and help save lives, protect property, and strengthen the nation’s economy, NOAA is tripling its operational supercomputing capacity. New supercomputers and cloud computing capabilities are also being leveraged to quickly transition research and development advancements, including those that will occur through EPIC, into operations at the National Weather Service.

UFS is integrating numerous environmental models into a unified Earth modeling system that will be used to predict weather from local to global domains at time scales from minutes to seasons. This unified system allows better collaboration between NOAA and the extramural science community, and will accelerate the development and integration of innovation into NOAA's operational weather forecast systems.

To encourage community collaborations, NOAA has publicly released user-friendly computer codes for medium-range and short-range weather prediction. NOAA has also recently upgraded its flagship global weather model to improve forecasting of high-impact weather events such as hurricanes, severe weather outbreaks, rainfall, and blizzards.

EPIC is a national priority supported by the Weather Research and Forecasting Innovation Act of 2017, which calls for NOAA to prioritize improving weather data, modeling, computing, forecasting, and warnings for the protection of life and property and to enhance the national economy. Congress codified EPIC and further called for NOAA to accelerate community-developed scientific and technological advances to its operational numerical weather prediction in the National Integrated Drought Information System Reauthorization Act of 2018.

Oxford Brookes prof shows how vertical turbines could be the future for wind farms

The now-familiar sight of traditional propeller wind turbines could be replaced in the future with wind farms containing more compact and efficient vertical turbines

New research from Oxford Brookes University has found that the vertical turbine design is far more efficient than traditional turbines in large-scale wind farms, and when set in pairs the vertical turbines increase each other's performance by up to 15%.

A research team from the School of Engineering, Computing and Mathematics (ECM) at Oxford Brookes, led by Professor Iakovos Tzanakis, conducted an in-depth study using more than 11,500 hours of supercomputer simulation to show that wind farms can perform more efficiently by substituting the traditional propeller-type Horizontal Axis Wind Turbines (HAWTs) for compact Vertical Axis Wind Turbines (VAWTs).


The research demonstrates, for the first time at a realistic scale, the potential of large-scale VAWTs to outcompete current HAWT wind farm turbines.

VAWTs spin around an axis perpendicular to the ground, in contrast to the familiar propeller design of HAWTs, which rotate about a horizontal axis. The research found that VAWTs increase each other's performance when arranged in grid formations. Positioning wind turbines to maximize output is critical to the design of wind farms.
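The pairing effect can be sketched with a toy calculation: applying the reported boost of up to 15% to the output of two otherwise isolated turbines. The 100 kW solo output and the `paired_output` helper below are invented for illustration and are not taken from the study.

```python
# Toy model of the paired-VAWT effect: two vertical turbines placed
# close together each boost the other's output by up to ~15%.
# The solo output figure below is an assumption, not a study result.

def paired_output(solo_output_kw, boost_fraction=0.15):
    """Total output of a VAWT pair, each boosted by its neighbor."""
    return 2 * solo_output_kw * (1 + boost_fraction)

solo = 100.0  # assumed kW from one isolated turbine
print(paired_output(solo))  # → 230.0, vs. 200.0 kW for two isolated turbines
```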

Professor Tzanakis comments: "This study evidences that the future of wind farms should be vertical. Vertical axis wind farm turbines can be designed to be much closer together, increasing their efficiency and ultimately lowering the prices of electricity. In the long run, VAWTs can help accelerate the green transition of our energy systems, so that more clean and sustainable energy comes from renewable sources."

With the UK's wind energy capacity expected to almost double by 2030, the findings are a stepping stone towards designing more efficient wind farms, understanding large-scale wind energy harvesting techniques, and ultimately improving the renewable energy technology to more quickly replace fossil fuels as sources of energy.

A cost-effective way to meet wind power targets

According to the Global Wind Report 2021, the world needs to install wind power three times faster over the next decade to meet net-zero targets and avoid the worst impacts of climate change.

Lead author of the report and Bachelor of Engineering graduate Joachim Toftegaard Hansen commented: "Modern wind farms are one of the most efficient ways to generate green energy, however, they have one major flaw: as the wind approaches the front row of turbines, turbulence will be generated downstream. The turbulence is detrimental to the performance of the subsequent rows.

"In other words, the front row will convert about half the kinetic energy of the wind into electricity, whereas for the back row, that number is down to 25-30%. Each turbine costs more than £2 million/MW. As an engineer, it naturally occurred to me that there must be a more cost-effective way."
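The wake-loss figures quoted above can be turned into a back-of-envelope estimate of farm output. The row layout, rated power, and the treatment of the conversion percentages as per-row capacity factors below are all illustrative assumptions, not results from the study.

```python
# Back-of-envelope model of wake losses in a HAWT farm, using the
# figures quoted above: the front row converts ~50% of the wind's
# kinetic energy, downstream rows only ~25-30%. Farm geometry and
# rated power are invented for illustration.

def farm_output_mw(rows, turbines_per_row, rated_mw,
                   front_eff=0.50, back_eff=0.275):
    """Estimate wake-limited farm output in MW."""
    total = 0.0
    for row in range(rows):
        # Only the front row sees undisturbed wind.
        eff = front_eff if row == 0 else back_eff
        total += turbines_per_row * rated_mw * eff
    return total

# A hypothetical 5-row x 10-turbine farm of 3 MW machines:
capacity = 5 * 10 * 3.0  # 150 MW rated
output = farm_output_mw(rows=5, turbines_per_row=10, rated_mw=3.0)
print(f"rated {capacity:.0f} MW, wake-limited estimate {output:.1f} MW")
```

Under these assumptions the four downstream rows drag the whole farm to under a third of rated capacity, which is the cost problem the quote describes.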

The study is the first to comprehensively analyze many aspects of wind turbine performance with regard to array angle, direction of rotation, turbine spacing, and number of rotors. It is also the first to investigate whether the performance improvements hold for three VAWTs set in series.

Dr. Mahak, co-author of the article and Senior Lecturer in ECM, comments: "The importance of using computational methods in understanding flow physics can't be underestimated. These types of design and enhancement studies are a fraction of the cost compared to the huge experimental test facilities. This is particularly important at the initial design phase and is extremely useful for the industries trying to achieve maximum design efficiency and power output."

Cambridge researcher develops cancer algorithm that flags genetic weaknesses in tumors

A new way to identify tumors that could be sensitive to particular immunotherapies has been developed using data from thousands of NHS cancer patient samples sequenced through the 100,000 Genomes Project. The MMRDetect clinical algorithm makes it possible to identify tumors that have 'mismatch repair deficiencies' and then improve the personalization of cancer therapies to exploit those weaknesses.

The study, led by researchers from the University of Cambridge's Department of Medical Genetics and MRC Cancer Unit, identified nine DNA repair genes that act as critical guardians of the human genome, protecting it from damage caused by oxygen and water as well as from errors made during cell division.

The team used the genome editing technology CRISPR-Cas9 to 'knock out' (make inoperative) these repair genes in healthy human stem cells. In doing so, they observed strong mutation patterns, or mutational signatures, which serve as useful markers of the failure of those genes and of the repair pathways in which they are involved.

The study, funded by Cancer Research UK and published today, indicates that these signatures of repair pathway defects are ongoing and could therefore serve as crucial biomarkers in precision medicine.

Senior author, Dr. Serena Nik-Zainal, a Cancer Research UK Advanced Clinician Scientist at Cambridge University's MRC Cancer Unit, said: "When we knock out different DNA repair genes, we find a kind of fingerprint of that gene or pathway being erased. We can then use those fingerprints to figure out which repair pathways have stopped working in each person's tumor, and what treatments should be used specifically to treat their cancer."

The new computer algorithm, MMRDetect, uses the mutational signatures identified in the knock-out experiments and was trained on whole-genome sequencing data from NHS cancer patients in the 100,000 Genomes Project to identify tumors with 'mismatch repair deficiency', which makes them sensitive to checkpoint-inhibitor immunotherapies. Having developed the algorithm on the tumors in this study, the plan now is to roll it out across all cancers picked up by Genomics England.
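The general idea of signature-based detection can be sketched as follows. This is emphatically not the MMRDetect implementation: the mutation channels, reference signature vector, similarity measure, and threshold are all made up to illustrate how a tumor's mutation profile might be matched against a known deficiency signature.

```python
# Minimal sketch of signature-based mismatch-repair-deficiency
# flagging. NOT the MMRDetect algorithm: the channel set, reference
# signature, and threshold below are invented for illustration.

import math

def cosine(a, b):
    """Cosine similarity between two mutation-channel profiles."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Hypothetical reference profile of an MMR-deficiency signature over
# a toy set of four mutation channels.
MMRD_SIGNATURE = [0.05, 0.60, 0.05, 0.30]

def flag_mmr_deficient(tumor_profile, threshold=0.9):
    """Flag a tumor whose profile closely matches the reference."""
    return cosine(tumor_profile, MMRD_SIGNATURE) >= threshold

print(flag_mmr_deficient([0.06, 0.58, 0.06, 0.30]))  # → True (similar)
print(flag_mmr_deficient([0.70, 0.10, 0.15, 0.05]))  # → False (dissimilar)
```

In practice, signature analysis works over many more channels (for example, the 96 trinucleotide substitution contexts) and uses far more sophisticated fitting than a single cosine threshold.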

The breakthrough demonstrates the value of researchers working with the 100,000 Genomes Project, a pioneering national whole-genome sequencing endeavor.

Parker Moss, Chief Commercial and Partnerships Officer at Genomics England, said: "We are very excited to see such impactful research being supported by the 100,000 Genomes Project, and that our data has helped to develop a clinically significant tool. This is a fantastic example of how the sheer size and richness of the 100,000 Genomes Project data can contribute to important research.

"The outcomes from Dr. Nik-Zainal and her team's work demonstrate perfectly how quickly and effectively we can return value to patient care by bringing together a community of leading researchers through Genomics England's platform."

The study offers important insights into where DNA damage comes from in our bodies. Water and oxygen are essential for life but are also the biggest sources of internal DNA damage in humans.

Dr. Nik-Zainal said: "Because we are alive, we need oxygen and water, yet they cause a constant drip of DNA damage in our cells. Our DNA repair pathways are normally working to limit that damage, which is why, when we knocked out some of the crucial genes, we immediately saw lots of mutations."

"Some DNA repair genes are like precision tools, able to fix very specific kinds of DNA damage. Human DNA has four building blocks: adenine, cytosine, guanine, and thymine. As an example, the OGG1 gene has a very specific role in fixing guanine when it is damaged by oxygen. When we knocked out OGG1, this crucial defense was severely weakened resulting in a very specific pattern of guanines that had mutated into thymines throughout the genome."
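The OGG1 "fingerprint" described in the quote can be illustrated with a toy substitution counter: comparing a mutated sequence against its reference and tallying each base change. The sequences below are invented; in a real OGG1-knockout genome the tell-tale excess would be G>T substitutions.

```python
# Toy illustration of a mutational "fingerprint": count base
# substitutions between two aligned sequences. The sequences here
# are invented; an excess of G>T changes is the pattern described
# for OGG1 loss.

def substitution_counts(reference, mutated):
    """Count each substitution type between two aligned sequences."""
    counts = {}
    for ref_base, mut_base in zip(reference, mutated):
        if ref_base != mut_base:
            key = f"{ref_base}>{mut_base}"
            counts[key] = counts.get(key, 0) + 1
    return counts

reference = "ATGGCGTAGGCTAGGA"
mutated   = "ATGTCGTATGCTATGA"
print(substitution_counts(reference, mutated))  # → {'G>T': 3}
```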

To be most effective, the MMRDetect algorithm could be used as soon as a patient has received a cancer diagnosis and their tumor is characterized by genome sequencing. The team believes that this tool could help to transform the way a wide range of cancers are treated and save many lives.

Michelle Mitchell, Chief Executive of Cancer Research UK, said: "Determining the right treatments for patients will give them the best chance of surviving their disease. Immunotherapy in particular can be powerful, but it doesn't work on everyone, so figuring out how to tell when it will work is vital to making it the most useful treatment it can be.

"Our ability to map and mine useful information from the genomes of tumors has improved massively over the past decade. Thanks to initiatives like the 100,000 Genomes Project, we are beginning to see how we might use this information to benefit patients. We look forward to seeing how this research develops, and its possibilities in helping future patients."