Supercomputer simulation of the chorus event at Mars.

Chinese scientists verify Trap-Release-Amplify model by reproducing electromagnetic waves on Mars

The study reveals whistler-mode chorus waves detected on Mars that are similar to Earth's, validating the role of magnetic field inhomogeneity in the frequency sweeping phenomenon.

Chinese scientists have reproduced the whistler-mode chorus waves observed on Mars using data from the Mars Atmosphere and Volatile Evolution (MAVEN) mission and compared them with phenomena on Earth. They found that both Mars and Earth exhibit whistler-mode chorus waves triggered by nonlinear processes, with background magnetic field inhomogeneity playing a key role in frequency sweeping.

The study provides crucial support for understanding chorus waves in the Martian environment and verifies the previously proposed "Trap-Release-Amplify" (TaRA) model under more extreme conditions.

Magnetic field and chorus emissions at Mars and Earth.

Whistler-mode chorus waves are electromagnetic wave emissions widely present in planetary magnetospheres. When their electromagnetic signals are converted into sound, they resemble the harmonious chorus of birds in the early morning, hence the name "chorus waves." Chorus waves can accelerate high-energy electrons in space through resonance, leading to a rapid increase in electron flux in Earth's radiation belts during geomagnetic storms. Additionally, they scatter high-energy electrons into the atmosphere, creating diffuse and pulsating auroras.
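To make the resonance mechanism concrete, the sketch below, our illustration with assumed parameters rather than anything from the study, combines the standard cold-plasma dispersion relation for parallel-propagating whistler waves with the first-order cyclotron resonance condition to estimate the resonant electron velocity. The plasma parameters are hypothetical values loosely typical of Earth's inner magnetosphere.

```python
import numpy as np

# Illustrative estimate of the cyclotron-resonant electron velocity for a
# whistler-mode wave. All parameters are assumptions, not study values.
c = 3.0e8                 # speed of light [m/s]
f_ce = 9.8e3              # electron cyclotron frequency [Hz] (assumed)
f_pe = 5 * f_ce           # plasma frequency (assumed ratio)
f = 0.3 * f_ce            # lower-band chorus wave frequency (assumed)

w, w_ce, w_pe = (2 * np.pi * x for x in (f, f_ce, f_pe))

# Cold-plasma dispersion for parallel-propagating whistlers:
#   (ck/w)^2 = 1 + w_pe^2 / (w * (w_ce - w))
n2 = 1 + w_pe**2 / (w * (w_ce - w))
k = np.sqrt(n2) * w / c

# First-order cyclotron resonance (nonrelativistic): w - k*v = w_ce
v_res = (w - w_ce) / k    # negative: the electron counter-streams the wave
print(f"resonant parallel velocity ~ {v_res / c:+.2f} c")
```

With these assumed parameters the resonant electrons move at roughly a fifth of the speed of light, opposite to the wave, which is why chorus waves interact so effectively with high-energy radiation belt electrons.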

One characteristic of chorus waves is their narrowband frequency sweeping structure. The excitation mechanism of this sweeping structure has been of great interest for decades, and scientists have proposed various theoretical models. However, there has been an ongoing debate regarding why frequency sweeping occurs in chorus waves and how to calculate the sweeping frequency. One central point of contention is whether the background magnetic field inhomogeneity plays a crucial role in frequency sweeping and how it affects the sweeping phenomenon. 

The TaRA model, previously proposed by a team from the University of Science and Technology of China of the Chinese Academy of Sciences, is based on modern plasma physics theories and suggests that the frequency sweeping of chorus waves in the magnetosphere is the result of the combined effects of nonlinear processes and background magnetic field inhomogeneity. The model provides a corresponding formula for calculating the sweeping frequency. However, the variation in magnetic field inhomogeneity in Earth's magnetosphere is limited, making it difficult to test the TaRA model in a larger parameter space. 

Mars and Earth have distinct magnetic field environments. Earth possesses a global dipole-like magnetic field, while Mars has only localized crustal remnant magnetization. In this remnant magnetization environment, similar chorus wave events have also been observed by the MAVEN satellite. Calculations reveal a difference of five orders of magnitude in background magnetic field inhomogeneity between Mars and Earth. By comparing wave events observed on Earth and Mars, the previously proposed TaRA model can therefore be tested under far more extreme conditions.

To validate the model, scientists from the University of Science and Technology of China and their collaborators used MAVEN observations of the particle distribution at Mars, combined with a corresponding model of the Martian crustal remnant magnetic field.

Employing a first-principles particle simulation method, scientists reproduced the observed chorus wave phenomena on Mars. Through the analysis of particle phase space distribution, they confirmed that the sweeping process of these waves is consistent with that of chorus waves on Earth, both triggered by nonlinear processes. 
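First-principles particle simulations of this kind follow individual charged particles in self-consistently computed fields. The study's electromagnetic simulation is far more involved, but the minimal one-dimensional electrostatic particle-in-cell (PIC) toy below, a common textbook reduction and entirely our illustration, shows the basic deposit-solve-push loop such codes are built on.

```python
import numpy as np

# Minimal 1-D electrostatic PIC loop: deposit charge, solve for the field,
# push particles. A textbook toy in normalized units, not the
# electromagnetic code used in the study.
ng, L, n_p, dt = 64, 2 * np.pi, 20000, 0.1
dx = L / ng
rng = np.random.default_rng(0)

x = rng.uniform(0, L, n_p)                       # electron positions
v = np.where(rng.random(n_p) < 0.5, 1.0, -1.0)   # two cold counter-beams
v += 0.01 * rng.standard_normal(n_p)             # small thermal spread

for step in range(200):
    # 1) deposit: electron density on the grid (nearest-grid-point weighting)
    idx = (x / dx).astype(int) % ng
    n_e = np.bincount(idx, minlength=ng) / dx
    rho = n_p / L - n_e                          # net charge with ion background

    # 2) solve Poisson's equation d2(phi)/dx2 = -rho in Fourier space
    kvec = 2 * np.pi * np.fft.fftfreq(ng, d=dx)
    rho_k = np.fft.fft(rho)
    phi_k = np.zeros_like(rho_k)
    phi_k[kvec != 0] = rho_k[kvec != 0] / kvec[kvec != 0] ** 2
    E = np.real(np.fft.ifft(-1j * kvec * phi_k))  # E = -d(phi)/dx

    # 3) push: accelerate (electron charge-to-mass ratio = -1) and move
    v += -E[idx] * dt
    x = (x + v * dt) % L

print("field energy:", 0.5 * np.sum(E**2) * dx)
```

Production codes extend this skeleton to three dimensions, electromagnetic fields, and billions of particles, which is why supercomputers are needed to reproduce chorus wave events.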

Furthermore, scientists used two different methods provided by the TaRA model to calculate the sweeping frequency of chorus waves and compared them with the observation and simulation results. The sweeping frequencies calculated from nonlinear processes and from background magnetic field inhomogeneity both agreed closely with the supercomputer simulation results.

These findings indicate that although Mars and Earth possess distinct magnetic and plasma environments, the chorus wave phenomena observed on Mars follow the same fundamental physical processes as those in Earth's magnetosphere. The study confirms the existence of chorus waves on Mars and validates the wide applicability of the TaRA model in describing the frequency sweeping of chorus waves under extreme conditions spanning a five-order-of-magnitude difference in magnetic field inhomogeneity.

Syracuse University researchers co-authored a study exploring the extent to which human activities are contributing to hydrogeochemical changes in U.S. rivers. The image above is Mills River in Pisgah National Forest, North Carolina.

Syracuse professor Wen uses supercomputer modeling to discover the sources of salinization and alkalinization in watersheds

From protecting biodiversity to ensuring the safety of drinking water, the biochemical makeup of rivers and streams around the United States is critical for human and environmental welfare. Studies have found that human activity and urbanization are driving the salinization (increased salt content) of freshwater sources across the country. In excess, salinity can make water undrinkable, increase the cost of treating water, and harm freshwater fish and wildlife. 

Along with the rise in salinity has come an increase in alkalinity over time, and past research suggests that salinization may enhance alkalinization. But unlike excess salinity, alkalinization can have a positive impact on the environment due to its ability to neutralize water acidity and absorb carbon dioxide from the Earth’s atmosphere – a key component of combating climate change. Understanding the processes affecting salinity and alkalinity therefore has important environmental and health implications.

EES Professor Tao Wen and his co-authors used machine learning to detect sources of salinization and alkalinization in U.S. watersheds.

A team of researchers from Syracuse University and Texas A&M University has applied a machine learning model to explore where, and to what extent, human activities are contributing to hydrogeochemical changes, such as increases in salinity and alkalinity, in U.S. rivers.

The group used data from 226 river monitoring sites across the U.S. and built two machine-learning models to predict monthly salinity and alkalinity levels at each site. These sites were selected because long-term continuous water quality measurements have been recorded there for at least 30 years. From urban to rural settings, the models explored a diverse range of watersheds, which are areas where all flowing surface water converges to a single point, such as a river or lake. They evaluated 32 watershed factors spanning hydrology, climate, geology, soil chemistry, land use, and land cover to pinpoint those contributing to rising salinity and alkalinity. The team’s models identified human activities as major contributors to the salinity of U.S. rivers, while rising alkalinity was attributed more to natural processes than to human activities.
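The press text does not specify the model family, but the workflow it describes, predicting a monthly water-quality target from dozens of watershed attributes and then ranking which attributes matter most, maps naturally onto tree-ensemble regression with feature importances. A minimal, hedged sketch of that workflow, with invented feature names standing in for the study's 32 factors and fully synthetic data:

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

# Hypothetical sketch of the predict-then-rank workflow described above.
# Feature names and data are invented stand-ins; the study's actual
# model family and features are not specified in this text.
rng = np.random.default_rng(0)
n = 1200
X = pd.DataFrame({
    "population_density": rng.lognormal(3.0, 1.0, n),
    "impervious_surface_pct": rng.uniform(0, 80, n),
    "runoff_mm": rng.gamma(2.0, 20.0, n),
    "soil_ph": rng.normal(6.5, 0.7, n),
})
# Synthetic sodium target dominated by the two urban factors
y = (0.5 * X["population_density"] + 0.3 * X["impervious_surface_pct"]
     + rng.normal(0, 5, n))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=300, random_state=0).fit(X_tr, y_tr)
print("held-out R^2:", round(model.score(X_te, y_te), 3))
for name, imp in sorted(zip(X.columns, model.feature_importances_),
                        key=lambda t: -t[1]):
    print(f"{name:24s} importance {imp:.3f}")
```

Ranked importances from a model like this are what let researchers say which watershed factors, human or natural, drive the predicted salinity or alkalinity.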

The team, which included Syracuse University researchers Tao Wen, assistant professor in the College of Arts and Sciences’ Department of Earth and Environmental Sciences (EES), Beibei E, a graduate student in EES, Charles T. Driscoll, University Professor of Environmental Systems and Distinguished Professor in the College of Engineering and Computer Science, and Texas A&M assistant professor Shuang Zhang, recently had their findings published in the journal Science of the Total Environment.

What’s Driving Salinization and Alkalinization?

The results from the group’s sodium prediction model, which identified human activities such as the application of road salt as major contributors to the salinity of U.S. rivers, were consistent with previous studies. This model specifically revealed population density and impervious surface percentage (artificial surfaces such as roads) as the two most important contributors to the higher salt content in U.S. rivers.

According to Wen, the accuracy of the salinity model provided an important proof of concept for the research team.

“Regarding causes of salinity in rivers, the results from our machine learning model matched those of previous studies which focused on field observation, lab work, and statistical analysis,” says Wen. “This proved that our approach was working.”

With the salinity results confirming the accuracy of the team’s model, they then turned their attention to alkalinity. Their model identified natural processes as the predominant contributors to variation in river alkalinity, in contrast to previous research that identified human activities as the main contributor to alkalinization. They found that local climatic and hydrogeological conditions, including runoff, sediment, soil pH, and moisture, were the features most affecting river alkalinity.

Critical to the Carbon Cycle

Their findings have important environmental and climate implications as alkalinity in rivers forms a vital link in the carbon cycle. Consider the movement of carbon during a rainstorm. When it rains, carbon dioxide from the atmosphere combines with water to form carbonic acid. When the carbonic acid reaches the ground and comes into contact with certain rocks, it triggers a chemical reaction that extracts gaseous carbon dioxide from the atmosphere and transports it to the ocean via land water systems like lakes and rivers. Known as rock weathering, this natural process continuously erodes rocks and sequesters atmospheric CO2 over millions of years. It is also a key regulator of greenhouse gases that contribute to global warming.
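The chemistry behind this can be written compactly. These are the standard textbook reactions for carbonic acid formation and silicate weathering (with wollastonite, CaSiO3, as the conventional example mineral); they are our illustration, not equations taken from the study:

\[
\mathrm{CO_2 + H_2O \;\rightleftharpoons\; H_2CO_3}
\]
\[
\mathrm{CaSiO_3 + 2\,CO_2 + H_2O \;\rightarrow\; Ca^{2+} + 2\,HCO_3^- + SiO_2}
\]

The bicarbonate (HCO3-) produced in the second reaction is the alkalinity that rivers carry to the ocean, which is why river alkalinity is a window onto how much atmospheric CO2 weathering has consumed.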

“Rock weathering is the primary source of alkalinity in natural waters and is one of the main ways to bring down carbon dioxide in the air,” says Wen. Think of it as a feedback loop: when there is too much carbon dioxide in the atmosphere, temperatures increase leading to enhanced rock weathering. With more rock being dissolved into watersheds due to enhanced rock weathering, alkalinity rises and in turn, brings down carbon dioxide.

“Alkalinity is a critical component of the carbon cycle,” says Wen. “While we found that natural processes are the primary drivers of alkalinization, these natural factors can still be changed by humans. We can alter the alkalinity level in rivers by changing the natural parameters, so we need to invest more to restore the natural conditions of watersheds and tackle global warming and climate change to deal with alkalinization in U.S. rivers.”

The results from the team’s study can help inform future research about enhanced rock weathering efforts – where rocks are ground up and spread across fields. Distributing rock dust across large areas increases the amount of contact between rain and rock, which enhances carbon removal from the atmosphere. Wen says the team’s model can help answer questions about the evolution of natural conditions in different regions – an important step needed to implement enhanced rock weathering more effectively.

The work was funded through a $460,000 National Science Foundation grant awarded to Wen.

POET Technologies confirms sample availability of POET Infinity, testing with a pair of customers

Alpha samples of the chiplet-based transmitter platform for 400G, 800G, and 1.6T data center solutions are ready for shipment

POET Technologies has announced alpha sample readiness of POET Infinity, a chiplet-based transmitter platform for 400G, 800G, and 1.6T pluggable transceivers and co-packaged optics solutions. Two lead customers have agreed to partner with POET to test the alpha version of the Infinity chiplet.

The POET Infinity chiplet complements the POET 800G 2xFR4 Receiver optical engine that the Company announced in February 2023 and completes the 800G chipset for 2xFR4 QSFP-DD or OSFP applications, with two Infinity chiplets and one Receiver optical engine. Both customers intend to develop 800G 2xFR4 QSFP-DD and OSFP transceiver solutions using the POET Optical Engine chipsets.

The Infinity chiplet is the industry’s first implementation of directly modulated lasers (DMLs) for 100G/lane applications. DMLs are power-efficient and cost-effective, and become a highly scalable solution when paired with the POET Optical Interposer platform. The chiplet incorporates 100G PAM4 DMLs, DML drivers, and an integrated optical multiplexer for a complete 400GBASE-FR4 transmitter solution on a chip. The small size of the chiplet and its daisy-chain architecture enable side-by-side placement of multiple instances to achieve 800G and 1.6T speeds.
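The lane arithmetic implied here is simple: 400GBASE-FR4 carries four wavelengths at 100G each, so placing chiplets side by side scales the aggregate rate. The tiny sketch below is our illustration of that scaling, not a POET specification:

```python
# Lane arithmetic implied by the configurations above (illustrative only;
# chiplet counts are our reading of the text, not a POET specification).
LANE_RATE_G = 100        # 100G PAM4 per DML lane
LANES_PER_CHIPLET = 4    # one 400GBASE-FR4 transmitter = 4 wavelengths

for chiplets in (1, 2, 4):
    aggregate = chiplets * LANES_PER_CHIPLET * LANE_RATE_G
    print(f"{chiplets} chiplet(s) -> {aggregate}G transmit")
```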

The POET Infinity product line carries forward the POET differentiation of all passive alignments and monolithically integrated waveguides, multiplexers and demultiplexers, which translates to lower cost, lower power consumption, and ease-of-assembly benefits for customers.

“The availability of a transmitter solution for 400G, 800G, and 1.6T speeds that is power efficient, cost-effective, and highly scalable for the data center market is a major achievement,” said Dr. Suresh Venkatesan, Chairman & CEO of POET. “Our customers are excited to receive the samples and test them because it simplifies their transceiver design significantly and shortens the design cycle with POET optical engines that incorporate all of the required optical elements as well as the key electronic components, including laser drivers and trans-impedance amplifiers.”

The development of a production version of the POET Infinity chiplet is on track: POET expects to deliver beta samples by Q4 of 2023 and start production in the first half of 2024. The Ethernet transceiver market for data rates of 400G and above is projected to exceed $6 billion by 2028.

Planners must take into account that the combined effects of densely-packed housing and climate change can make living in cities unendurable. Photo: Shutterstock

Norway simulates air currents for more habitable cities

Densely-packed housing makes urban areas vulnerable to overheating, pollution, and dangerous wind gusts. The effects of climate change can aggravate these problems, but we can also work to prevent them. This can be done by simulating microclimates.

Cities and urban areas are characterized by high population densities and widespread man-made environments. In city centers, this means surfaces covered with concrete and asphalt, less vegetation, more air pollution, and large, densely-packed buildings that reduce both air circulation and access to daylight. 

At the same time, and especially in coastal areas, extremely high wind velocities can develop between tall buildings due to airflow channeling effects. This can make pedestrian walkways and cycle paths dangerous and unsafe, especially for vulnerable groups.

Climate change can make cities unbearable to live in

The impact of climate change may be reinforcing the negative effects of densely-packed housing, and many places in the world are at risk of becoming uninhabitable. Europe is probably not the most severely affected, but even here the effects of climate change, such as heat waves, droughts, and flooding, represent a threat to both infrastructure and human life. The European heatwave of 2003 claimed more than 70,000 lives. The summer of 2022 was the hottest ever experienced in Europe, with more than 20,000 people dying as a direct result of the heat. 

For this reason, it is increasingly important to reduce the impact of extreme heat in urban areas, especially in southern Europe, but also in Norway. In Oslo in particular, and in the southern parts of the Østlandet region, we can expect to experience heat waves in the future. This is why we must do what we can to create habitable conditions in our cities, and in urban buildings, when temperatures increase to levels that we are not used to. We must begin during the planning phase of the construction process.

Advanced simulations deliver more accurate airflow calculations

So, how can urban planners take a warmer climate into account? Advanced simulation tools such as Computational Fluid Dynamics (CFD) can calculate air movements in a variety of environments. This can be done as early as during the planning phase of larger building projects and as part of the overall urban planning process.

In brief, CFD offers a method of calculating the movement of fluids, including liquids and gases such as air. In recent decades, there has been a boom in the use of such methods, in step with the increase in the power of modern computers. In the last 15 to 20 years, the method has also been applied in studies of microclimates. Microclimates are climatic conditions that develop at scales of less than two kilometers, which in the past we could only investigate through measurements and observations.
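To make the idea concrete, here is a deliberately tiny sketch, our illustration only, of what a CFD-style calculation means in practice: discretize a transport equation on a grid and march it forward in time. Real urban microclimate CFD solves the full three-dimensional Navier-Stokes equations with turbulence models; this one-dimensional advection-diffusion toy with assumed parameters only shows the numerical skeleton.

```python
import numpy as np

# Toy 1-D advection-diffusion of a pollutant puff in a steady wind.
# Illustrates the grid-and-timestep skeleton of CFD; parameters are
# assumed, and real urban CFD solves full 3-D Navier-Stokes instead.
nx, L = 200, 100.0                     # grid points, domain length [m]
dx = L / nx
u, D = 1.0, 0.5                        # wind speed [m/s], diffusivity [m^2/s]
dt = 0.4 * min(dx / u, dx * dx / (2 * D))   # stable explicit time step

c = np.zeros(nx)
c[nx // 2] = 1.0                       # point release mid-domain

for _ in range(500):
    adv = -u * (c - np.roll(c, 1)) / dx                        # upwind advection
    dif = D * (np.roll(c, -1) - 2 * c + np.roll(c, 1)) / dx**2  # diffusion
    c += dt * (adv + dif)

print(f"peak concentration after transport: {c.max():.4f}")
```

The same deposit-transport logic, scaled up to three dimensions and realistic geometry, is what lets planners predict wind gusts, heat, and pollution between buildings before anything is built.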

Trondheim study demonstrates the effect of different measures

Studies of the microclimate at Gløshaugen in Trondheim have demonstrated the effects of different building materials, the impact that vegetation and natural environments have on local climatic conditions, and how these factors influence the energy consumption of buildings and the thermal comfort of people outdoors.

The simulations also showed that the cumulative evaporation from lawns and trees can reduce the air temperature by several degrees (up to 2.4 °C at Gløshaugen), and thus reduce the need for cooling the buildings in summer. 
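The physical lever here is latent heat. As a back-of-envelope check, our illustration rather than numbers from the Gløshaugen study, evaporating one kilogram of water absorbs about 2.45 MJ, energy that would otherwise warm a surprisingly large volume of air:

```python
# Back-of-envelope latent-heat estimate (standard physical constants;
# the scenario is our illustration, not Gløshaugen simulation output).
L_v = 2.45e6                  # latent heat of vaporization of water [J/kg]
rho_air, c_p = 1.2, 1005.0    # air density [kg/m^3], specific heat [J/(kg K)]

heat_absorbed = 1.0 * L_v                            # J per kg evaporated
volume_kelvin = heat_absorbed / (rho_air * c_p)      # m^3 of air cooled by 1 K
print(f"1 kg of evaporation offsets ~{volume_kelvin:,.0f} m^3-K of air warming")
```

That is roughly 2,000 cubic meters of air cooled by one degree per kilogram of water evaporated, which is why lawns and trees can measurably cool a campus in summer.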

These results can be used to look into different scenarios for urban design or to calculate future urban climates. Having said this, it is difficult to give general recommendations because the systems are highly location-specific, and measures must take into account all the relevant factors that influence the microclimate in a given situation.

Multiple applications

CFD can also be used to make high-resolution calculations of wind conditions, such as turbulence and wind speed. In this way, it is possible to identify sites where urban wind turbines for local renewable energy generation can be profitably located. Moreover, the dispersion of harmful emissions and pollution can be simulated around heavily trafficked roads and industrial plants, as well as in the aftermath of accidents and fires.

In other words, CFD can deliver useful information to inform construction projects, urban planning, and architecture at a level of resolution, precision, and speed that was previously inaccessible.

Lloyd Minor and Fei-Fei Li

Stanford reveals a responsible AI initiative, RAISE-Health

Responsible AI for Safe and Equitable Health will address ethical and safety issues in AI innovation, define standards for the field, and convene experts on the topic.

Responding to rapid advances in artificial intelligence and the urgent need to define its responsible use in health and medicine, Stanford Medicine and the Stanford Institute for Human-Centered Artificial Intelligence (HAI) today announced the launch of RAISE-Health (Responsible AI for Safe and Equitable Health). This pioneering initiative seeks to address critical ethical and safety issues surrounding AI innovation and help others navigate this complex and evolving field. 

Co-led by Stanford School of Medicine dean Lloyd Minor, MD, and Stanford HAI co-director and computer science professor Fei-Fei Li, Ph.D., the new initiative will establish a go-to platform for responsible AI in health and medicine; define a structured framework for ethical standards and safeguards; and regularly convene a diverse group of multidisciplinary innovators, experts and decision-makers on the topic.

Both awareness of AI and skepticism about its use in health care have skyrocketed in the last 12 months. According to a recent Pew survey, a majority of Americans said they would be uncomfortable with their provider using AI in their own health care, underscoring the crossroads at which society finds itself.

“AI has the potential to impact every aspect of health and medicine,” Minor said. “We have to act with urgency to ensure that this technology advances in line with the interests of everyone, from the research bench to the patient bedside and beyond.”

The goals of the RAISE-Health initiative include enhancing clinical care outcomes through responsible integration of AI; accelerating research to solve the biggest challenges in health and medicine; and educating patients, care providers and researchers to navigate AI advances.

“AI is evolving at an incredible pace; so, too, must our capacity to manage, navigate and direct its path,” said Li, whose research includes a focus on ambient intelligence — using AI to monitor and respond to human activity in homes, hospitals, and other environments. “Through this initiative, we are seeking to engage our students, our faculty, and the broader community to help shape the future of AI, ensuring it reflects the interests of all stakeholders — patients, families, and society at large.”

Building on the goals of Stanford HAI, groundbreaking Stanford faculty research, and ongoing collaborations with policymakers and Silicon Valley innovators, RAISE-Health will be a trusted repository for the AI work being done at Stanford University, Stanford Medicine, and well beyond — hosting standards, tools, models, data, research, and best practices.

While AI offers the potential for transforming health globally, decision-makers must first address AI’s safety and ethical use to responsibly harness its full potential and build public trust in these systems.