Amazon rainforest near Manaus, Brazil. Photo: CIFOR, Neil Palmer/CIAT on Flickr

The heat from the Amazon rainforest drives temperatures around the globe

While the Amazon rainforest and the Tibetan Plateau sit on opposite sides of the globe, scientists have now discovered that changes in the South American ecosystem can trigger changes in the vicinity of the Himalayas.

“Logging, road construction, and warming are already today stressing the Amazon rainforest, and will likely do so even more in the future – and while the Amazon region is of course an important Earth system element by itself, it’s also a burning question if and how changes in that region could affect other parts of the world,” explains Jingfang Fan from Beijing Normal University, China, and the Potsdam Institute for Climate Impact Research (PIK) in Germany. “For the first time, we’ve now been able to robustly identify and quantify these so-called teleconnections. Our research confirms that Earth system tipping elements are indeed inter-linked even over long distances, and the Amazon is one key example of how this could play out.”

Analysis of air temperature changes in 65,000 subregions over the past 40 years

The researchers analyzed near-surface air temperature changes in a grid of more than 65,000 subregions, treated as nodes of a network spanning the globe, using data from the past 40 years. By doing so, they could see how changes at one node influenced those at another. They succeeded in detecting a pronounced propagation pathway over more than 20,000 kilometers – from South America via Southern Africa to the Middle East and finally to the Tibetan Plateau. This pathway can be explained by the main atmospheric and oceanic circulation patterns.
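
For a feel for the method, here is a minimal sketch of the network idea: each grid cell becomes a node, and a link is drawn between two nodes whose temperature anomaly series are strongly correlated at some time lag. The toy series, the 30-day delay, and the variable names are illustrative assumptions, not the study's actual data or pipeline.

```python
import numpy as np

# Toy illustration of the climate-network idea (not the study's data): two
# grid-cell "nodes" with daily near-surface temperature anomalies, where one
# series echoes the other after an assumed 30-day propagation delay.
rng = np.random.default_rng(0)
n_days = 14_600  # roughly 40 years of daily data

amazon = rng.standard_normal(n_days)
true_lag = 30  # illustrative delay, in days
tibet = 0.4 * np.roll(amazon, true_lag) + rng.standard_normal(n_days)

def max_lagged_corr(x, y, max_lag=60):
    """Strongest Pearson correlation between x and y over lags 0..max_lag."""
    n = len(x)
    best_r, best_lag = 0.0, 0
    for lag in range(max_lag + 1):
        r = np.corrcoef(x[: n - lag], y[lag:])[0, 1]
        if abs(r) > abs(best_r):
            best_r, best_lag = r, lag
    return best_r, best_lag

r, lag = max_lagged_corr(amazon, tibet)
print(f"link strength r = {r:.2f} at a lag of {lag} days")  # peaks near 30 days
```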

In the next step, the researchers used state-of-the-art climate supercomputer simulations to see how global warming, caused by greenhouse gas emissions from burning fossil fuels, might modify the long-distance linkages through 2100. “We’ve been surprised to see how strongly climate extremes in the Amazon are connected to climate extremes in Tibet,” says Jürgen Kurths from PIK, a co-author of the paper. “When it’s getting warmer in the Amazon, it also does so in Tibet, hence for temperature, there’s a positive correlation. It’s different for precipitation. When we have more rain in the Amazon, there’s less snowfall in Tibet.”

The researchers detected early warning signals in snow-cover data, revealing that the Tibetan Plateau has been losing stability and approaching a tipping point since 2008. “This has been overlooked so far,” says Kurths. Despite its remote location, the Tibetan Plateau is relevant for many people’s livelihoods due to its role as an important water reservoir.

"This is a risk we should rather avoid"

“Our research underlines that tipping cascades are a risk to be taken seriously: inter-linked tipping elements in the Earth system can trigger each other, with potentially severe consequences,” says Hans Joachim Schellnhuber from PIK, also a co-author. “To be clear, it’s unlikely that the climate system as a whole will tip. Yet, over time, sub-continental tipping events can severely affect entire societies and threaten important parts of the biosphere. This is a risk we should rather avoid. And we can do so by rapidly reducing greenhouse gas emissions and by developing nature-based solutions for removing CO2 from the atmosphere.”

UK scientists perform supercomputer simulations to predict the behavior of TBOS as it turns to glass

Research from an international team of scientists has cast new light on the physics of vitrification – the process by which glass forms.

Their findings, which center on the analysis of a common feature of glasses called the boson peak, could help pave the way for new developments in materials science.

The peak can be observed in a glass when special equipment is used to study the vibrations of its constituent atoms: it appears as a spike in the terahertz range. The boson peak also gives glasses a characteristic additional heat capacity over crystals formed from the same material.

The low-frequency vibrations of atoms or molecules that cause the boson peak are believed to play a role in whether a cooling liquid forms a glass or a crystal, but the process is still not fully understood.
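
To make the idea of a spike in the terahertz range concrete, here is a toy sketch, not data from the paper: the boson peak is conventionally seen as an excess of low-frequency modes over the Debye prediction g(ω) ∝ ω², so the reduced density of states g(ω)/ω² is flat for an ideal crystal but shows a bump for a glass. The Gaussian excess and all numbers below are assumptions for illustration.

```python
import numpy as np

# Conceptual sketch (not data from the paper): the boson peak is an excess of
# low-frequency vibrational modes over the Debye prediction g(w) ~ w^2, so it
# stands out as a bump in the "reduced" density of states g(w) / w^2, which
# would be flat for an ideal crystal.
w = np.linspace(0.05, 5.0, 200)                   # frequency in THz (illustrative)
g_debye = w**2                                    # Debye law
excess = 0.8 * np.exp(-((w - 1.0) / 0.25) ** 2)   # assumed excess modes near 1 THz
g_glass = g_debye + excess

reduced = g_glass / w**2
w_bp = w[np.argmax(reduced)]
print(f"toy boson peak sits near {w_bp:.2f} THz")
```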

Researchers from the UK, Slovenia, and Japan worked together to analyze and model how the boson peak emerges in samples of tetrabutyl orthosilicate, a viscous liquid that does not crystallize and is used in the production of some types of glass.

Professor Klaas Wynne, of the University of Glasgow’s School of Chemistry, is one of the paper’s corresponding authors. Prof Wynne said: “This work helps to advance our understanding of vitrification, which is something of a hot topic in physics at the moment.

“When liquids are cooled quickly, they can form either glasses or crystals – a process that is poorly understood but important to applications.

“Glasses can be made from a wide range of materials, and they are used in all kinds of industries outside of the obvious application of windowpanes. Strong, flexible metallic glasses are used in aviation, for example, and others can be used in drugs where they can help control the rate that medication is absorbed into the body.

“However, a process called secondary relaxation can cause crystals to form in glasses after they cool, sometimes years later. It’s still not entirely clear which molecular processes cause this to happen, and a better understanding of how glasses form could help us make better, safer glasses in the future.”

“One of the challenges of investigating the boson peak is that it happens alongside other processes like molecular vibrations and rotations, which makes it difficult to isolate and analyze. We set out to examine how the boson peak functions under different conditions, using a range of techniques, to help expand our understanding of glass formation.”

The researchers chose to study tetrabutyl orthosilicate, or TBOS, because its molecular structure is symmetrical, which makes it easier to isolate the boson peak from all the other contributions. They used a suite of observation techniques, including Raman spectroscopy, to monitor the behavior of TBOS molecules as they cooled from a liquid into a glass under a range of temperature conditions.

They were able to see for the first time that, as TBOS cools to form a glass, it begins but does not complete the process of crystallization, offering a key insight into the molecular process of vitrification.

In parallel with the experimental techniques, researchers at the University of Warwick performed supercomputer simulations that were capable of accurately reflecting the laboratory observations and correctly predicting the behavior of TBOS as it turns to glass. 

Dr. Gabriele Sosso, of the Department of Chemistry at the University of Warwick, is also a corresponding author of the paper. Dr. Sosso added: “The symmetry of the TBOS molecules provided a unique opportunity to make a connection between modeling and experiments.
 
“In the past few years, we have learned a lot about glasses, largely thanks to computer simulations of what we often refer to as ‘simple’ models – think of two- or three-dimensional networks of spherical particles. These simple models are incredibly useful for unraveling the subtleties of disordered systems – TBOS, however, is a whole different beast! It was very rewarding to apply what the community has taught us about model systems to a real-life molecular glass such as TBOS.
 
“The fact that the boson peak in glassy TBOS seems to emerge from very specific structural features represents an incredibly enticing prospect for the computational community. I for one cannot wait to see what these structural features would look like in other types of molecular glasses – exciting times ahead!”

UC Berkeley astronomer Joshua Dillon under one of the HERA radio dishes in 2017. (Photo courtesy of Joshua Dillon)

UC Berkeley astronomers double the sensitivity of the HERA array

An array of 350 radio telescopes in the Karoo desert of South Africa is getting closer to detecting “cosmic dawn” — the era after the Big Bang when stars first ignited and galaxies began to bloom.

The Hydrogen Epoch of Reionization Array (HERA) team says that it has doubled the sensitivity of the array, which was already the most sensitive radio telescope in the world dedicated to exploring this unique period in the history of the universe. 

While they have yet to detect radio emissions from the end of the cosmic dark ages, their results do provide clues to the composition of stars and galaxies in the early universe. In particular, their data suggest that early galaxies contained very few elements besides hydrogen and helium, unlike our galaxies today.

A 13.8-billion-year cosmic timeline indicates the era shortly after the Big Bang observed by the Planck satellite, the era of the first stars and galaxies observed by HERA, and the era of galaxy evolution observed by NASA’s James Webb Space Telescope. Image: HERA

When the radio dishes are fully online and calibrated, ideally this fall, the team hopes to construct a 3D map of the bubbles of ionized and neutral hydrogen as they evolved from about 200 million years to around 1 billion years after the Big Bang. The map could tell us how early stars and galaxies differed from those we see around us today, and how the universe as a whole looked in its adolescence.

“This is moving toward a potentially revolutionary technique in cosmology. Once you can get down to the sensitivity you need, there’s so much information in the data,” said Joshua Dillon, a research scientist in the University of California, Berkeley’s Department of Astronomy. “A 3D map of most of the luminous matter in the universe is the goal for the next 50 years or more.”

Other telescopes also are peering into the early universe. The new James Webb Space Telescope (JWST) has now imaged a galaxy that existed about 325 million years after the birth of the universe in the Big Bang. But the JWST can see only the brightest of the galaxies that formed during the Epoch of Reionization, not the smaller but far more numerous dwarf galaxies whose stars heated the intergalactic medium and ionized most of the hydrogen gas.

HERA seeks to detect radiation from the neutral hydrogen that filled the space between those early stars and galaxies and, in particular, determine when that hydrogen stopped emitting or absorbing radio waves because it became ionized.

The fact that the HERA team has not yet detected these bubbles of ionized hydrogen within the cold hydrogen of the cosmic dark age rules out some theories of how stars evolved in the early universe.

Specifically, the data show that the earliest stars, which may have formed around 200 million years after the Big Bang, contained few elements other than hydrogen and helium. This is different from the composition of today’s stars, which contain a variety of so-called metals, the astronomical term for elements heavier than helium, ranging from lithium to uranium. The finding is consistent with the current model for how stars and stellar explosions produced most of the other elements.

“Early galaxies have to have been significantly different than the galaxies that we observe today in order for us not to have seen a signal,” said Aaron Parsons, principal investigator for HERA and a UC Berkeley associate professor of astronomy. “In particular, their X-ray characteristics have to have changed. Otherwise, we would have detected the signal we’re looking for.”

The atomic composition of stars in the early universe determined how long it took to heat the intergalactic medium once stars began to form. Key to this is the high-energy radiation, primarily X-rays, produced by binary stars where one of them has collapsed into a black hole or neutron star and is gradually eating its companion. With few heavy elements, a lot of the companion’s mass is blown away instead of falling onto the black hole, meaning fewer X-rays and less heating of the surrounding region.

The new data fit the most popular theories of how stars and galaxies first formed after the Big Bang, but not others. Preliminary results from the first analysis of HERA data, reported a year ago, hinted that those alternatives — specifically, cold reionization — were unlikely.

“Our results require that even before reionization and by as late as 450 million years after the Big Bang, the gas between galaxies must have been heated by X-rays. These likely came from binary systems where one star is losing mass to a companion black hole,” Dillon said. “Our results show that if that’s the case, those stars must have been very low ‘metallicity,’ that is, very few elements other than hydrogen and helium in comparison to our sun, which makes sense because we’re talking about a period in time in the universe before most of the other elements were formed.”

The Epoch of Reionization

The origin of the universe in the Big Bang 13.8 billion years ago produced a hot cauldron of energy and elementary particles that cooled for hundreds of thousands of years before protons and electrons combined to form atoms — primarily hydrogen and helium. Looking at the sky with sensitive telescopes, astronomers have mapped in detail the faint variations in temperature from this moment — what’s known as the cosmic microwave background — a mere 380,000 years after the Big Bang.

Aside from this relict heat radiation, however, the early universe was dark. As the universe expanded, the clumpiness of matter seeded galaxies and stars, which in turn produced radiation — ultraviolet and X-rays — that heated the gas between stars. At some point, hydrogen began to ionize — it lost its electron — and formed bubbles within the neutral hydrogen, marking the beginning of the Epoch of Reionization.

To map these bubbles, HERA and several other experiments are focused on a wavelength of light that neutral hydrogen absorbs and emits, but ionized hydrogen does not. Called the 21-centimeter line (a frequency of 1,420 megahertz), it is produced by the hyperfine transition, during which the spins of the electron and proton flip from parallel to antiparallel. Ionized hydrogen, which has lost its only electron, doesn’t absorb or emit this radio frequency.

Since the Epoch of Reionization, the 21-centimeter line has been red-shifted by the expansion of the universe to a wavelength 10 times as long — about 2 meters, or 6 feet. HERA’s rather simple antennas, a construct of chicken wire, PVC pipe, and telephone poles, are 14 meters across to collect and focus this radiation onto detectors.

“At two meters wavelength, a chicken wire mesh is a mirror,” Dillon said. “And all the sophisticated stuff, so to speak, is in the supercomputer backend and all of the data analysis that comes after that.”

The new analysis is based on 94 nights of observing in 2017 and 2018 with about 40 antennas — phase 1 of the array. Last year’s preliminary analysis was based on 18 nights of phase 1 observations.

The new paper’s main result is that the HERA team has improved the sensitivity of the array by a factor of 2.1 for light emitted about 650 million years after the Big Bang (a redshift, or an increase in wavelength, of 7.9), and 2.6 for radiation emitted about 450 million years after the Big Bang (a redshift of 10.4).
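
Those redshifts translate directly into the radio band HERA must observe: the rest-frame 21-centimeter line at 1,420 MHz arrives stretched in wavelength by a factor of 1 + z and lowered in frequency by the same factor. A quick check of the two numbers quoted above:

```python
# The 21-cm line redshifts with cosmic expansion:
#   lambda_obs = lambda_rest * (1 + z),  nu_obs = nu_rest / (1 + z)
NU_REST_MHZ = 1420.405751  # rest frequency of the hydrogen hyperfine transition
LAMBDA_REST_M = 0.2110611  # corresponding rest wavelength in meters

for z in (7.9, 10.4):
    nu_obs = NU_REST_MHZ / (1 + z)
    lam_obs = LAMBDA_REST_M * (1 + z)
    print(f"z = {z}: {nu_obs:.0f} MHz, wavelength {lam_obs:.2f} m")
# z = 7.9  -> ~160 MHz, ~1.88 m
# z = 10.4 -> ~125 MHz, ~2.41 m  (the "about 2 meters" quoted earlier)
```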

The HERA team continues to improve the telescope’s calibration and data analysis in hopes of seeing those bubbles in the early universe, which are about 1 millionth the intensity of the radio noise in the neighborhood of Earth. Filtering out the local radio noise to see the radiation from the early universe has not been easy.

“If it’s Swiss cheese, the galaxies make the holes, and we’re looking for the cheese,” said David DeBoer, a research astronomer in UC Berkeley’s Radio Astronomy Laboratory – so far, unsuccessfully.

Extending that analogy, however, Dillon noted, “What we’ve done is we’ve said the cheese must be warmer than if nothing had happened. If the cheese were really cold, it turns out it would be easier to observe that patchiness than if the cheese were warm.”

That mostly rules out cold reionization theory, which posited a colder starting point. The HERA researchers suspect, instead, that the X-rays from X-ray binary stars heated up the intergalactic medium first.

“The X-rays will effectively heat up the whole block of cheese before the holes will form,” Dillon said. “And those holes are the ionized bits.”

“HERA is continuing to improve and set better and better limits,” Parsons said. “The fact that we’re able to keep pushing through, and we have new techniques that are continuing to bear fruit for our telescope, is great.”

The HERA collaboration is led by UC Berkeley and includes scientists from across North America, Europe, and South Africa. The construction of the array is funded by the National Science Foundation and the Gordon and Betty Moore Foundation, with key support from the government of South Africa and the South African Radio Astronomy Observatory (SARAO).

Illustration of DNA molecules. Credit: KTSDESIGN/SCIENCE PHOTO LIBRARY via Getty Images

Cambridge chemists use Chem-map to lift the veil from the human genome black box

Many life-saving drugs directly interact with DNA to treat diseases such as cancer, but scientists have struggled to detect how and why they work, until now.

Researchers at the University of Cambridge have developed a new DNA sequencing method to detect where and how small-molecule drugs interact with their target genome.

“Understanding how drugs work in the body is essential to creating better, more effective therapies,” said Dr. Zutao Yu from the Yusuf Hamied Department of Chemistry. “But when a therapeutic drug enters a cancer cell with a genome that has three billion bases, it’s like entering a black box.”

The powerful method, called Chem-map, lifts the veil of this genomic black box by enabling researchers to detect where small molecule drugs interact with their targets on the DNA genome.

Each year, millions of cancer patients receive treatment with genome-targeting drugs, such as doxorubicin. But despite decades of clinical use and research, their molecular mode of action on the genome is still not well understood.

“Lots of life-saving drugs directly interact with DNA to treat diseases such as cancer,” said co-first author Dr. Jochen Spiegel. “Our new method can precisely map where drugs bind to the genome, which will help us to develop better drugs in the future.”

Chem-map allows researchers to conduct in situ mapping of small molecule-genome interactions with unprecedented precision, by using a strategy called small-molecule-directed transposase Tn5 tagmentation. This detects the binding site in the genome where a small molecule binds to genomic DNA or DNA-associated proteins.
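
As a purely conceptual sketch of how such binding sites could be read out of tagmentation data (this is illustrative, not the published Chem-map pipeline): if Tn5 insertions are directed by the bound drug, insertion events pile up near binding loci, so binning insertion coordinates along the genome and flagging bins far above background yields candidate sites. The bin width, threshold, and positions below are hypothetical.

```python
from collections import Counter

# Illustrative sketch only (not the published Chem-map pipeline): if Tn5
# tagmentation is directed by the bound small molecule, insertion sites pile
# up near drug-binding loci. Binning insertions along a chromosome and
# flagging bins that far exceed the background turns pileups into candidates.
BIN = 500  # bin width in bases (assumed)

def candidate_binding_bins(insertion_positions, background_per_bin=2.0, fold=5):
    """Return start coordinates of bins whose insertion count is at least
    `fold` times the assumed mean background rate per bin."""
    counts = Counter(pos // BIN for pos in insertion_positions)
    return sorted(b * BIN for b, c in counts.items()
                  if c >= fold * background_per_bin)

# Hypothetical data: sparse background insertions plus a pileup near 12,000.
positions = [1_000, 3_700, 8_200, 20_500] + [12_000 + 7 * i for i in range(15)]
print(candidate_binding_bins(positions))  # -> [12000]
```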

In the study, the researchers used Chem-map to determine the direct binding sites of the widely used anticancer drug doxorubicin in human leukemia cells. The technique also showed how the combined therapy of using doxorubicin on cells already exposed to the histone deacetylase (HDAC) inhibitor tucidinostat could have a potential clinical advantage.

The technique was also used to map the binding sites of certain molecules on DNA G-quadruplexes, known as G4s. G4s are four-stranded secondary structures that have been implicated in gene regulation and could be possible targets for future anti-cancer treatments.

“I am so proud that we have been able to solve this longstanding problem – we have established a highly efficient approach which will open many paths for new research,” said Yu.

Professor Sir Shankar Balasubramanian, who led the research, said: “Chem-map is a powerful new method to detect the site in the genome where a small molecule binds to DNA or DNA-associated proteins. It provides enormous insight into how some drug therapies interact with the human genome and makes it easier to develop more effective and safer drug therapies.”

Yi Xing, PhD, leads the Center for Computational and Genomic Medicine at Children's Hospital of Philadelphia

CHOP researcher Dr. Xing develops more accurate computational tool for long-read RNA sequencing

The tool, called ESPRESSO, will allow for better diagnosis of rare genetic diseases caused by disrupted RNA and for the discovery of potential therapeutic targets in diseases like cancer

On the journey from gene to protein, a nascent RNA molecule can be cut and joined, or spliced, in different ways before being translated into a protein. This process, known as alternative splicing, allows a single gene to encode several different proteins. Alternative splicing occurs in many biological processes, like when stem cells mature into tissue-specific cells. In the context of disease, however, alternative splicing can be dysregulated. Therefore, it is important to examine the transcriptome – that is, all the RNA molecules that might stem from genes – to understand the root cause of a condition.

However, historically it has been difficult to "read" RNA molecules in their entirety because they are usually thousands of bases long. Instead, researchers have relied on so-called short-read RNA sequencing, which breaks RNA molecules into much shorter pieces and sequences those – somewhere between 200 and 600 bases, depending on the platform and protocol. Supercomputer programs are then used to reconstruct the full sequences of RNA molecules. Short-read RNA sequencing can give highly accurate sequencing data, with a low per-base error rate of approximately 0.1% (meaning one base is incorrectly determined for every 1,000 bases sequenced). Nevertheless, it is limited in the information that it can provide due to the short length of the sequencing reads. In many ways, short-read RNA sequencing is like breaking a large picture into jigsaw pieces that are all the same shape and size and then trying to piece the picture back together.

Recently, "long-read" platforms that can sequence RNA molecules over 10,000 bases in length end-to-end have become available. These platforms do not require RNA molecules to be broken up before being sequenced, but they have a much higher per-base error rate, typically between 5% and 20%. This well-known limitation has severely hampered the widespread adoption of long-read RNA sequencing. In particular, the high error rate has made it difficult to determine the validity of novel, previously unknown RNA molecules discovered in a particular condition or disease.

To circumvent this problem, researchers at the Children's Hospital of Philadelphia (CHOP) have developed a new computational tool that can more accurately discover and quantify RNA molecules from these error-prone long-read RNA sequencing data. The tool, called ESPRESSO (Error Statistics PRomoted Evaluator of Splice Site Options), was reported today in Science Advances.

"Long-read RNA sequencing is a powerful technology that will allow us to uncover RNA variation in rare genetic diseases and other conditions, like cancer," said Yi Xing, Ph.D., director of the Center for Computational and Genomic Medicine at CHOP and senior author of the study. "We are probably at an inflection point in how we discover and analyze RNA molecules. The transition from short-read to long-read RNA sequencing represents an exciting technological transformation and computational tools that reliably interpret long-read RNA sequencing data are urgently needed."

ESPRESSO can accurately discover and quantify different RNA molecules from the same gene – known as RNA isoforms – using error-prone long-read RNA sequencing data alone. To do so, the computational tool compares all long RNA sequencing reads of a given gene to its corresponding genomic DNA and then uses the error patterns of individual long reads to confidently identify splice junctions – places where the nascent RNA molecule has been cut and joined – as well as their corresponding full-length RNA isoforms. By finding areas of perfect matches between long RNA sequencing reads and genomic DNA, as well as borrowing information across all long RNA sequencing reads of a gene, the tool can identify highly reliable splice junctions and RNA isoforms, including those that have not been previously documented in existing databases.
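
A minimal sketch of that perfect-match anchoring idea, for illustration only (ESPRESSO's actual statistical model is more sophisticated): a candidate splice junction is trusted only when the aligned read matches the genome exactly for a short window on each side of the junction, so scattered sequencing errors are unlikely to fabricate isoforms. The anchor length and sequences below are made up.

```python
# Minimal sketch of the junction-vetting idea (illustrative, not ESPRESSO's
# actual algorithm): trust a candidate splice junction only if the aligned
# read matches the genome perfectly for ANCHOR bases on each side, so random
# sequencing errors near the junction cannot create spurious isoforms.
ANCHOR = 8  # assumed anchor length in bases

def junction_supported(read_seq, genome_exonic_seq, junction_pos):
    """Both sequences are the aligned, gap-free exonic sequences (same
    length); junction_pos is the index where the two exons are joined."""
    left = slice(junction_pos - ANCHOR, junction_pos)
    right = slice(junction_pos, junction_pos + ANCHOR)
    return (read_seq[left] == genome_exonic_seq[left]
            and read_seq[right] == genome_exonic_seq[right])

# Hypothetical 16-base window around a junction at index 8.
read   = "ACGTACGTTTGCAGGA"
genome = "ACGTACGTTTGCAGGA"
print(junction_supported(read, genome, 8))  # True: perfect match on both sides
```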

The researchers evaluated the performance of ESPRESSO using simulated data and data on real biological samples. They found that ESPRESSO performs better than multiple currently available tools, both in terms of discovering RNA isoforms and quantifying them. The researchers also generated and analyzed over 1 billion long RNA sequencing reads covering 30 human tissue types and three human cell lines, providing a useful resource for studying human transcriptome variation at the resolution of full-length RNA isoforms.

"ESPRESSO addresses a long-standing problem of long-read RNA sequencing and could usher in discovery opportunities," Dr. Xing said. "We envision that ESPRESSO will be a useful tool for researchers to explore the RNA repertoire of cells in various biomedical and clinical settings."

This work was supported in part by the Immuno-Oncology Translational Network (IOTN) of the National Cancer Institute's Cancer Moonshot Initiative (U01CA233074), other National Institutes of Health funding (R01GM088342, R01GM121827, and R56HG012310), along with a National Institutes of Health T32 Training Grant in Computational Genomics (T32HG000046).

Gao et al. "ESPRESSO: Robust discovery and quantification of transcript isoforms from error-prone long-read RNA-seq data," Science Advances, January 20, 2023, DOI: 10.1126/sciadv.abq5072