Dr Tristan Salles from the School of Geosciences. Photo: Stefanie Zingsheim

University of Sydney geoscientist Dr Tristan Salles builds a model that reveals Earth's past 100 million years

The digital tool can help us understand the past and predict Earth's future

For the first time, scientists have a high-resolution model of how today's geophysical landscapes were created and how millions of tonnes of sediment have flowed to the oceans.

Climate, tectonics, and time combine to create powerful forces that craft the face of our planet. Add the gradual sculpting of the Earth’s surface by rivers and what to us seems solid as a rock is constantly changing.

However, our understanding of this dynamic process has at best been patchy. 


Scientists today have published new research revealing a detailed and dynamic model of the Earth’s surface over the past 100 million years. 

Working with scientists in France, University of Sydney geoscientists have published this new model in the prestigious journal Science.

For the first time, it provides a high-resolution understanding of how today’s geophysical landscapes were created and how millions of tonnes of sediment have flowed to the oceans.

Lead author Dr. Tristan Salles from the University of Sydney School of Geosciences said: “To predict the future, we must understand the past. But our geological models have only provided a fragmented understanding of how our planet’s recent physical features formed.

“If you look for a continuous model of the interplay between river basins, global-scale erosion, and sediment deposition at high resolution for the past 100 million years, it just doesn’t exist.

“So, this is a big advance. It’s not only a tool to help us investigate the past but will help scientists understand and predict the future, as well.”

Using a framework that incorporates geodynamic, tectonic and climatic forces together with surface processes, the scientific team has presented a new dynamic model of the past 100 million years at high resolution (down to 10 kilometers), broken into frames of a million years.
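Surface-process models of this kind repeatedly apply simple physical rules over geological time. As a loose illustration only (the published framework couples far more physics, and none of the numbers below come from the study), a one-dimensional hillslope-diffusion step, in which sediment moves downslope and smooths the topography, can be sketched in Python:

```python
# Invented toy example: 1-D hillslope diffusion, dh/dt = kappa * d2h/dx2.
# Elevation spreads from high points to low points each time step,
# mimicking the gradual sculpting of a landscape by erosion.
def diffuse(h, kappa, dt, dx):
    """One explicit time step of hillslope diffusion; endpoints held fixed."""
    new = h[:]
    for i in range(1, len(h) - 1):
        new[i] = h[i] + kappa * dt / dx**2 * (h[i-1] - 2*h[i] + h[i+1])
    return new

elev = [0.0, 0.0, 100.0, 0.0, 0.0]   # a sharp peak (meters)
for _ in range(10):
    elev = diffuse(elev, kappa=1.0, dt=0.1, dx=1.0)
print(elev)  # the peak lowers and spreads to its neighbours
```

A full landscape-evolution model iterates rules like this, plus river incision, tectonic uplift and sediment deposition, over millions of such steps.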

Second author Dr. Laurent Husson from Institut des Sciences de la Terre in Grenoble, France, said: “This unprecedented high-resolution model of Earth’s recent past will equip geoscientists with a more complete and dynamic understanding of the Earth’s surface.

“Critically, it captures the dynamics of sediment transfer from the land to oceans in a way we have not previously been able to.”

Dr. Salles said that understanding the flow of terrestrial sediment to marine environments is vital to comprehend present-day ocean chemistry.

“Given that ocean chemistry is changing rapidly due to human-induced climate change, having a more complete picture can assist our understanding of marine environments,” he said.


The model will allow scientists to test different theories as to how the Earth’s surface will respond to changing climate and tectonic forces.

Further, the research provides an improved model to understand how the transportation of Earth sediment regulates the planet’s carbon cycle over millions of years.

“Our findings will provide a dynamic and detailed background for scientists in other fields to prepare and test hypotheses, such as in biochemical cycles or biological evolution.” 

Authors Dr. Salles, Dr. Claire Mallard, and Ph.D. student Beatriz Hadler Boggiani are members of the EarthColab Group, and Associate Professor Patrice Rey and Dr. Sabin Zahirovic are part of the EarthByte Group. Both groups are in the School of Geosciences at the University of Sydney.

The research was undertaken in collaboration with French geoscientists from CNRS, France, Université Lyon, and ENS Paris.

Dr Marietta Iacucci MD, PhD

Birmingham professor Iacucci builds AI to predict future flares of ulcerative colitis

Ulcerative colitis assessment could be improved after new research shows that an artificial intelligence model could predict flare-ups and complications after reading biopsies.

In a new paper published in Gastroenterology today, researchers supported by the National Institute for Health and Care Research Birmingham Biomedical Research Centre have trialed an AI diagnostic tool that can read digitized biopsies taken during colonoscopy.

The computer-aided diagnostic model was able to predict the risk of flare-ups for ulcerative colitis, a relapsing-remitting condition whose course is therefore uncertain. In the trial, the model predicted which patients were at risk of a disease flare as accurately as humans.

The system was trained on existing digitized biopsies and was able to detect activity related to ulcerative colitis with 89% accuracy for positive results. It was also able to identify markers of inflammation activity and healing in the same area as biopsies were taken with 80% accuracy, similar to human pathologists.
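As an aside on the reported figures, overall accuracy and "accuracy for positive results" are different quantities. A minimal sketch, using invented counts rather than the study's actual confusion matrix, shows how the two diverge if the latter is read as positive predictive value:

```python
# Illustrative only: how "accuracy for positive results" (interpreted here
# as positive predictive value) differs from overall accuracy. The counts
# below are invented, chosen only so that PPV comes out near 0.89.
def confusion_metrics(tp, fp, tn, fn):
    """Return overall accuracy and positive predictive value."""
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    ppv = tp / (tp + fp)  # fraction of positive calls that are correct
    return accuracy, ppv

acc, ppv = confusion_metrics(tp=89, fp=11, tn=80, fn=20)
print(f"overall accuracy = {acc:.2f}, PPV = {ppv:.2f}")
```

The two can differ substantially when positives and negatives are imbalanced, which is why diagnostic studies report several metrics side by side.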

Professor Marietta Iacucci, from the Institute of Immunology and Immunotherapy at the University of Birmingham and University College Cork in Ireland, and co-lead author of the paper, said:

“The power of AI in healthcare is evident in trials like these, where a model can be used to standardize histological assessment of ulcerative colitis disease activity in real time. But most importantly, it provides analytical support and enables clinicians to support those at the greatest risk of relapsing symptoms and disease course.

“Ulcerative colitis is a complex condition to predict, and developing machine learning-derived systems to make this diagnostic job quicker and more accurate could be a game changer. As models like this develop further, their predictive quality is likely to improve even more, and our paper demonstrates how beneficial such technology could be for clinicians and, crucially, patients.”

Dr. John-Jose Nunez

UBC develops AI model that predicts cancer patient survival by reading doctors' notes

A team of researchers from the University of British Columbia and BC Cancer have developed an artificial intelligence (AI) model that predicts cancer patient survival more accurately and with more readily available data than previous tools.

The model uses natural language processing (NLP) – a branch of AI that understands complex human language – to analyze oncologist notes following a patient’s initial consultation visit—the first step in the cancer journey after diagnosis. By identifying characteristics unique to each patient, the model was shown to predict six-month, 36-month, and 60-month survival with greater than 80 percent accuracy. The findings were published today in JAMA Network Open.

“Predicting cancer survival is an important factor that can be used to improve cancer care,” said lead author Dr. John-Jose Nunez, a psychiatrist and clinical research fellow with the UBC Mood Disorders Centre and BC Cancer. “It might suggest health providers make an earlier referral to support services or offer a more aggressive treatment option upfront. Our hope is that a tool like this could be used to personalize and optimize the care a patient receives right away, giving them the best outcome possible.”

Traditionally, cancer survival rates have been calculated retrospectively and categorized by only a few generic factors such as cancer site and tissue type. Despite familiarity with these rates, it can be challenging for oncologists to accurately predict a patient’s survival due to the many complex factors that influence patient outcomes.

The model developed by Dr. Nunez and his collaborators, a team that includes researchers from BC Cancer and UBC’s departments of computer science and psychiatry, is able to pick up on unique clues within a patient’s initial consultation document to provide a more nuanced assessment. It is also applicable to all cancers, whereas previous models have been limited to certain cancer types.

“The AI essentially reads the consultation document similar to how a human would read it,” said Dr. Nunez. “These documents have many details like the age of the patient, the type of cancer, underlying health conditions, past substance use, and family histories. The AI combines all this to paint a complete picture of patient outcomes.”
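The underlying idea, turning free text into features a classifier can weigh, can be loosely illustrated. The toy sketch below uses invented notes, invented labels and a simple nearest-centroid rule, not the neural NLP models the team actually trained on 47,625 patients:

```python
# Invented toy example: represent free-text notes as word-count vectors and
# classify a new note by which class centroid it overlaps with most.
from collections import Counter

def vectorize(text):
    return Counter(text.lower().split())

def centroid(texts):
    """Sum the word counts of all notes in a class."""
    total = Counter()
    for t in texts:
        total.update(vectorize(t))
    return total

# Hypothetical consultation-note snippets, labelled by outcome.
survived = ["early stage tumour fully resected good performance status",
            "small localized lesion no metastasis patient otherwise well"]
deceased = ["widespread metastasis poor performance status declining rapidly",
            "advanced disease multiple metastasis severe weight loss"]

cents = {"survived": centroid(survived), "deceased": centroid(deceased)}

def predict(note):
    v = vectorize(note)
    # Unnormalized dot product between the note and each class centroid.
    return max(cents, key=lambda label: sum(v[w] * cents[label][w] for w in v))

print(predict("localized lesion resected patient well"))  # -> "survived"
```

Real clinical NLP models replace the word counts with learned neural representations, which is what lets them capture context such as negation and family history.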

The researchers trained and tested the model using data from 47,625 patients across all six BC Cancer sites in British Columbia. To protect privacy, all patient data remained stored securely at BC Cancer and was anonymized. Unlike chart reviews by human research assistants, the new AI approach has the added benefit of maintaining complete confidentiality of patient records.

“Because the model is trained on B.C. data, that makes it a potentially powerful tool for predicting cancer survival here in the province,” said Dr. Nunez.

In the future, the technology could be applied in cancer clinics across Canada and around the world.

“The great thing about neural NLP models is that they are highly scalable, portable, and don’t require structured data sets,” said Dr. Nunez. “We can quickly train these models using local data to improve performance in a new region. I would suspect that these models provide a good foundation anywhere in the world where patients are able to see an oncologist.”

Dr. Nunez is a recipient of the 2022/23 UBC Institute of Mental Health Marshall Fellowship and is supported by funding from the BC Cancer Foundation. In another stream of work, Dr. Nunez is examining how to facilitate the best-possible psychiatric and counseling care for cancer patients using advanced AI techniques. He envisions a future where AI is integrated into many aspects of the health system to improve patient care.

“I see AI acting almost like a virtual assistant for physicians,” said Dr. Nunez. “As medicine gets more and more advanced, having AI to help sort through and make sense of all the data will help inform physician decisions. Ultimately, this will help improve quality of life and outcomes for patients.”

The Karl G. Jansky Very Large Array radio telescope consists of 27 giant telescope dishes (each 25 meters in diameter) in New Mexico. Image credit: Shutterstock.

Oxford produces new insights into binary star systems

Researchers from the University of Oxford have contributed to a major international study that has captured a rare and fascinating space phenomenon.

An artist’s impression of an X-ray binary star system, showing the normal star in yellow/red, the disc of rotating matter, the neutron star sitting in its centre, and the ‘radio jet’ shooting out. Credit: Instituto de Astrofísica de Canarias

Scientists have long been intrigued by X-ray binary star systems, in which two stars orbit each other, one of them being either a black hole or a neutron star. Both black holes and neutron stars are created in supernova explosions and are very dense, giving them a massive gravitational pull. This makes them capable of capturing the outer layers of the normal star that orbits them in the binary system, seen as a rotating disc of matter (like a whirlpool) around the black hole or neutron star.

According to theoretical calculations, these rotating discs should show a dynamic instability: about once an hour, the inner parts of the disc rapidly fall onto the black hole/neutron star, after which these inner regions re-fill and the process repeats. Up to now, this violent and extreme process had only been directly observed once, in a black hole binary system. For the first time, it has now been seen in a neutron star binary system, called Swift J1858.6-0814. This discovery demonstrates that this instability is a general property of these discs (and not caused by the presence of a black hole).

The phenomenon was captured by combining data from five ground-based and space-based telescopes, together encompassing multiple wavelengths. The scientific team, an international collaboration of astronomers led by the Instituto de Astrofísica de Canarias, formed ad-hoc when the neutron star system was first discovered in 2018. These telescopes include the Karl G. Jansky Very Large Array: one of the world’s most sensitive radio telescopes, located in New Mexico, consisting of 27 massive (25-meter diameter) telescope dishes.

Dr. Jakob van den Eijnden, from the University of Oxford’s Department of Physics, led the analysis of the data from the Karl G. Jansky Very Large Array. He said: “Our observations of the radio wavelength data highlighted an important property of these instabilities. We found that when the whirlpool empties, some of the gas is shot into space in so-called ‘radio jets’: narrow beams of gas shot out at speeds close to the speed of light.”

Dr Jakob van den Eijnden. Image credit: St Hilda’s College, University of Oxford.

“I think that the international collaboration and involvement of many early-career researchers is one of the most exciting aspects of this work. We analyzed a truly unique dataset that was extremely challenging to collect, because the gas capturing process is ‘transient’: it takes place for only a couple of months, unpredictably, before shutting off again,” said Dr. van den Eijnden.

The brightness of these jets is observed to be variable, which is now explained by blobs of jet material being launched at these extreme speeds whenever the disc starts or finishes emptying (causing peaks in brightness). When the disc stabilizes, the jets cease and the brightness reduces. Drawing this conclusion was only possible by comparing the variability observed with telescopes across the electromagnetic spectrum — from radio to X-ray wavelengths — which simultaneously probes the behavior of the disc and the jet.

Dr van den Eijnden added: "This discovery, only the second example of these instabilities, also highlights the rarity of this behavior. Therefore, finding more examples across different types of binary systems is a first priority. Due to the transient nature of this process, it is unpredictable when we will get another chance. By then, we will need to be prepared to repeat our international observing efforts."

The poster of the FUGIN (FOREST Unbiased Galactic plane Imaging survey with Nobeyama 45-m telescope) project (https://nro-fugin.github.io/). The upper panel shows the distribution of molecular clouds in the Milky Way Galaxy obtained by the Nobeyama 45-m radio telescope. The lower panel shows infrared observation by the Spitzer Space Telescope.

Osaka Metropolitan University uses AI to draw the most accurate map of star birthplaces in the Galaxy

Locations predicted for 140,000 star-forming molecular gas clouds

Stars are formed by molecular gas and dust coalescing in space. These molecular gases are so dilute and cold that they are invisible to the human eye, but they do emit faint radio waves that can be observed by radio telescopes.

Observed from Earth, a great deal of matter lies in front of and behind these molecular clouds, and these overlapping features make it difficult to determine their distances and physical properties such as size and mass. So, even though our Galaxy, the Milky Way, is the only galaxy in the universe close enough for detailed observations of molecular clouds, it has been very difficult to investigate their physical properties in a cohesive manner from large-scale observations.

A research team led by Dr. Shinji Fujita from the Osaka Metropolitan University Graduate School of Science in Japan identified about 140,000 molecular clouds, areas of star formation, in the Milky Way Galaxy from large-scale carbon monoxide data observed in detail by the Nobeyama 45-m radio telescope. Using artificial intelligence, the research team estimated the distance to each of these molecular clouds, determined their sizes and masses, and successfully mapped their distribution across the first quadrant of the Galactic plane in the most detailed manner to date.
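The general approach, learning to predict a cloud's distance from its observed properties, can be loosely sketched. All features, numbers and the 1-nearest-neighbour rule below are invented for illustration; the team's actual AI model and input features are not described in this summary:

```python
# Invented toy example: predict a molecular cloud's distance from
# (galactic longitude, radial velocity) using 1-nearest-neighbour
# regression over clouds with known distances.
training = [
    # ((longitude_deg, velocity_km_s), distance_kpc) -- invented values
    ((30.0, 95.0), 5.2),
    ((31.5, 40.0), 2.8),
    ((45.0, 60.0), 4.1),
    ((50.2, 10.0), 1.3),
]

def predict_distance(lon, vel):
    """Return the distance of the training cloud nearest in feature space."""
    def dist2(sample):
        (l, v), _ = sample
        return (l - lon) ** 2 + (v - vel) ** 2
    nearest = min(training, key=dist2)
    return nearest[1]

print(predict_distance(30.5, 90.0))  # nearest training cloud is (30.0, 95.0)
```

A production model would use many more observed features, proper feature scaling and a learned estimator, but the supervised-prediction structure is the same.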

“The results not only give a bird's eye view of the Galaxy but will also help in various studies of star formation,” explained Dr. Fujita. “In the future, we would like to expand the scope of observations with the Nobeyama 45-m radio telescope and incorporate radio telescope observation data of the sky in the southern hemisphere, which cannot be observed from Japan, for a complete distribution map of the entire Milky Way.”

This study was financially supported by Grants-in-Aid for Scientific Research (KAKENHI) from the Japan Society for the Promotion of Science (grant numbers 17H06740 and JP21H00049) and the “Young interdisciplinary collaboration project” of the National Institutes of Natural Sciences.