A weir on the Koeye River is one location where Wild Salmon Center is partnering with First Nations to pilot the Salmon Vision technology. (PC: Olivia Leigh Nowak/Le Colibri Studio.)

WSC, First Nations develop Salmon Vision, a real-time machine learning model to track salmon returns

The Wild Salmon Center (WSC) has partnered with several First Nations to combine cutting-edge artificial intelligence with traditional Indigenous fishing methods and gain a better understanding of salmon runs in real time. The Salmon Vision deep learning model, which identifies and counts fish species in underwater video, is currently being piloted in rivers around the North and Central Coasts of British Columbia. By 2024, the project aims to provide reliable real-time fish count data to First Nations fisheries managers, strengthening their role in fisheries management decisions.

Fisheries managers on British Columbia’s Central Coast typically do not know how many salmon have returned until after the fishing season is over. They must forecast runs and set harvest targets for commercial and recreational fisheries using modeled data from past years, and decide on emergency closures when salmon populations start to decline. With the unpredictable and accelerating effects of climate change, however, it is increasingly difficult to rely on past data to predict future salmon returns.

Dr. Will Atlas, Wild Salmon Center Senior Watershed Scientist, points to a solution called “Salmon Vision.” A first-of-its-kind technology that pairs artificial intelligence with the ancient technology of fishing weirs, the Salmon Vision deep learning model can identify and count fish species. Developed by WSC in data partnership with the Gitanyow Fisheries Authority and the Skeena Fisheries Commission, Salmon Vision aims to enable real-time salmon population monitoring for First Nations fisheries managers and beyond.

According to many of WSC's First Nations partners, automating fish counting is crucial for making informed decisions while salmon are still running. Underwater video, Dr. Atlas says, gives managers a way to see those salmon as they return to the rivers.

The Salmon Vision pilot study has annotated over 500,000 video frames captured at Indigenous-run fish counting weirs on B.C.'s Kitwanga and Bear Rivers. Early assessments indicate that the technology tracks 12 different fish species passing through custom fish-counting boxes at the two weirs, with accuracy exceeding 90 percent for coho salmon and 80 percent for sockeye, two of the principal species targeted by First Nations, commercial, and recreational fishers.
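As a rough illustration of how such a system can turn frame-by-frame detections into run counts, the sketch below pairs a placeholder detector with simple line-crossing logic. It is not the Salmon Vision code; the detect_fish stub, the counting-line position, and the video file name are assumptions standing in for a trained model and real weir footage.

```python
# Minimal sketch of a frame-by-frame fish counting pipeline (not the actual
# Salmon Vision implementation). A real system would plug a trained deep
# learning detector into detect_fish and use multi-object tracking to avoid
# double counting; this sketch keeps only the previous frame's positions.
from collections import Counter

import cv2  # OpenCV, used here only to decode the weir video


def detect_fish(frame):
    """Hypothetical stand-in for a trained detector.

    A real model would return a list of (species_label, x_center_pixels)
    tuples for every fish visible in the frame.
    """
    return []  # stub: no detections


def count_passages(video_path, count_line_x=0.5):
    """Tally fish that cross a vertical counting line inside the counting box."""
    cap = cv2.VideoCapture(video_path)
    tallies = Counter()
    prev_positions = {}  # species -> last seen normalized x position
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        width = frame.shape[1]
        for species, x_center in detect_fish(frame):
            x_norm = x_center / width
            prev = prev_positions.get(species)
            if prev is not None and prev < count_line_x <= x_norm:
                tallies[species] += 1  # crossed the line moving upstream
            prev_positions[species] = x_norm
    cap.release()
    return tallies


# Example (hypothetical file name): tallies = count_passages("weir_camera.mp4")
```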

The Heiltsuk Nation is running Salmon Vision on a weir on the Koeye River. For First Nations like the Heiltsuk, weirs represent more than the revitalization of an age-old fishing technology. Canada's Department of Fisheries and Oceans banned weirs in the late 1800s as a way to consolidate control of fishery resources, so rebuilding them on rivers like the Koeye is a statement of First Nations sovereignty and of their seat at the table in fisheries management decisions.

"Modern-day expression of Heiltsuk title and rights and an avenue for us to be a part of the latest science," says William Housty, Associate Director of the Heiltsuk Integrated Resource Management Department. "And to make decisions not just for the betterment of this creek, but for the whole ecosystem." 

The Salmon Vision team is implementing automated counting on a trial basis in several rivers around the B.C. North and Central Coasts with partner First Nations. The goal is to provide reliable real-time fish count data to these partners by 2024. Ultimately, Dr. Atlas says, this groundbreaking A.I. technology could be in place in rivers across the North Pacific. 

"How many salmon are returning everywhere that we're fishing for salmon is the information we need," Dr. Atlas says. "You can't tell me with a straight face that you're having a sustainable fishery if you don't know how many fish you have coming back. And that's a problem right around the Pacific Rim." 

It's a problem with a promising solution, one that's just now coming into focus.

Xiaoqian Jiang, PhD, chair of the Department of Health Data Science and Artificial Intelligence at McWilliams School of Biomedical Informatics.

UTHealth Houston wins $6.4M NIH grant to develop deep learning model for Alzheimer’s

UTHealth Houston has been awarded a five-year, $6.4 million grant from the National Institute on Aging to develop an artificial intelligence approach to studying the genetic factors behind Alzheimer's disease. A team of researchers led by Zhongming Zhao, PhD, and Xiaoqian Jiang, PhD, both principal investigators and professors at McWilliams School of Biomedical Informatics at UTHealth Houston, is creating a deep learning system that links brain imaging with cell-specific genetic factors. To validate their AI models, the researchers will use neuroimaging and genetic data from Rush University Medical Center, and they will also join the National Alzheimer's Disease Sequencing Project AI/Machine Learning Consortium.

Zhao, who is also the director of the Center for Precision Health at McWilliams School of Biomedical Informatics, stated that this project will bridge the gap between neuroimaging and genetic studies in Alzheimer's disease research. Although numerous computational analytical approaches have been published in each field, few adequately address the link between neuroimaging and genetic data in a way that deepens understanding of the disease.

A significant amount of data on molecular and neuroimaging biomarkers and clinical information has already been generated for Alzheimer's disease, yet researchers have not been able to connect many of the causal factors associated with the disease. To address this, the team plans to use advanced machine learning and a multimodal AI approach to combine genetic and functional data and characterize the genetic risk of Alzheimer's disease. The researchers call this approach the "deep-learning brain" because it focuses on the brain. The goal is to extend the model to the single-cell level, in what they call the "single-cell deep brain," allowing a more powerful way to dissect the genetic components of Alzheimer's disease.

To address the cognitive decline associated with Alzheimer's disease, researchers plan to integrate neuroimaging data into the deep-learning system. This will involve pairing distinct imaging features with genomic data to visualize their commonalities. Overall, the approach holds great promise for studying Alzheimer's disease and improving our understanding of this neurodegenerative disorder.
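To make the idea of pairing imaging features with genomic data more concrete, here is a generic two-branch fusion network of the kind often used for multimodal prediction. It is only an illustrative sketch: the layer sizes, feature counts, and the binary-diagnosis output are placeholder assumptions, not the UTHealth Houston team's model.

```python
# Illustrative two-branch ("multimodal") network: one branch encodes
# imaging-derived features, the other encodes genetic features, and a fusion
# head maps the concatenated embeddings to a diagnosis probability.
# All dimensions are placeholders, not the actual study architecture.
import torch
import torch.nn as nn


class MultimodalFusionNet(nn.Module):
    def __init__(self, n_imaging_features=256, n_genomic_features=1000):
        super().__init__()
        # Branch 1: features extracted from brain imaging
        self.imaging_branch = nn.Sequential(
            nn.Linear(n_imaging_features, 128), nn.ReLU(), nn.Linear(128, 64)
        )
        # Branch 2: genetic risk features (e.g. variant dosages)
        self.genomic_branch = nn.Sequential(
            nn.Linear(n_genomic_features, 256), nn.ReLU(), nn.Linear(256, 64)
        )
        # Fusion head: concatenated embeddings -> probability of diagnosis
        self.classifier = nn.Sequential(
            nn.Linear(128, 32), nn.ReLU(), nn.Linear(32, 1)
        )

    def forward(self, imaging_x, genomic_x):
        fused = torch.cat(
            [self.imaging_branch(imaging_x), self.genomic_branch(genomic_x)], dim=1
        )
        return torch.sigmoid(self.classifier(fused))


# Forward pass on random placeholder data for a batch of 8 subjects
model = MultimodalFusionNet()
scores = model(torch.randn(8, 256), torch.randn(8, 1000))  # shape: (8, 1)
```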

Researchers will use neuroimaging and genetic data from Rush University Medical Center, led by Christopher Gaiteri, PhD, assistant professor in the Department of Neurosciences, to validate their AI models, and will collaborate with the National Alzheimer's Disease Sequencing Project AI/Machine Learning Consortium. The goal is to identify the links between genes and neuroimaging features and combine them into neuroimaging genetics, which can ultimately help explain the causes of cognitive decline in Alzheimer's disease and point researchers and patients toward better treatment options. The study's co-investigators are Paul Schulz, MD, a professor in the Department of Neurology at McGovern Medical School, and Kai Zhang, PhD, Yejin Kim, PhD, Yulin Dai, PhD, and Xiangning Chen, PhD, of McWilliams School of Biomedical Informatics. The research is funded by NIH grant U01AG079847.

This image was generated with the assistance of AI (DALL·E 2). Credit: Professor James Sprittles, University of Warwick.

Scientists apply the mechanics of giant waves at the nanometer scale

Scientists have shown that the mechanics of rogue waves also apply at the nanometer scale, with potential applications in industries ranging from manufacturing to medicine. The study combined direct simulations of molecules with new mathematical models. The resulting theory could help control when and how thin liquid layers rupture, advancing nanotechnologies and offering insights into dry eye disorders.

Once dismissed as a myth, rogue waves are known to strike oil rigs and ships in their path. Unlike tsunamis, they form through the chance combination of smaller waves in the ocean, which makes them rare events.

Researchers have studied rogue waves for years, but they are now showing how the same phenomenon plays out on a far smaller scale: the nanometer, roughly one hundred thousand times thinner than a page of a book. This new approach to the behavior of liquids at the nanoscale has been published as a letter in Physical Review Fluids.

Scientists have found that the holes and bumps produced by these rogue nano waves can be manipulated to create patterns and structures for nano-manufacturing, that is, manufacturing at scales of a billionth of a meter. For instance, the patterns formed when liquid films rupture can be used to build microelectronic circuits, which could lower the cost of components such as those in solar cells. Furthermore, the behavior of thin liquid layers could help explain why millions of people worldwide suffer from dry eye, which occurs when the tear film covering the eye ruptures.

A study led by the University of Warwick's Mathematics Institute used direct simulations of molecules and new mathematical models to discover how nanoscopic layers of liquid behave in unexpected ways. Spilled coffee on a table may look still, but at the nanoscale the chaotic motion of molecules creates random waves on the liquid's surface. On rare occasions these waves conspire to form a large 'rogue nano wave' that bursts through the layer and opens a hole. The new theory explains how and when such a hole forms, providing insight into an otherwise unpredictable effect with far-reaching potential applications.
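The chance-combination idea can be illustrated with a toy Monte Carlo experiment: superpose many small random-phase waves on a thin film and count how often they happen to line up into a trough deep enough to pierce it. This is only a cartoon of the mechanism, not the paper's fluctuating-hydrodynamics theory; the film thickness, mode amplitudes, and wavenumbers below are arbitrary illustrative values.

```python
# Toy illustration of a "rogue nano wave": many small random waves on a thin
# film occasionally combine into a trough that reaches the substrate,
# rupturing the film. All parameters are arbitrary, for illustration only.
import numpy as np

rng = np.random.default_rng(0)


def rupture_fraction(film_thickness=1.6, n_modes=50, amplitude=0.1,
                     n_trials=5_000, n_points=200):
    """Estimate how often random small waves combine into a film-piercing trough."""
    x = np.linspace(0.0, 2 * np.pi, n_points)
    ruptures = 0
    for _ in range(n_trials):
        wavenumbers = rng.integers(1, 20, size=n_modes)     # random mode numbers
        phases = rng.uniform(0.0, 2 * np.pi, size=n_modes)  # random phases
        # Superpose the modes: each wave is small, but once in a while they align
        surface = film_thickness + amplitude * np.sum(
            np.sin(np.outer(wavenumbers, x) + phases[:, None]), axis=0
        )
        if surface.min() <= 0.0:  # the trough reached the substrate: a hole opens
            ruptures += 1
    return ruptures / n_trials


print(rupture_fraction())  # small but nonzero: rupture is a rare, chance event
```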

Professor James Sprittles from the Mathematics Institute at the University of Warwick said, "We were thrilled to discover that mathematical models initially developed for quantum physics and recently applied to predict rogue ocean waves are essential for predicting the stability of nanoscopic layers of liquid. We hope that in the future, the theory can be used to develop a range of nano-technologies where controlling when and how layers rupture is critical. It could also have implications in related areas, such as the behavior of emulsions in foods or paints, where the stability of thin liquid films determines their shelf-life."

Two neutron stars at the moment of their merger. Credit: Dana Berry, SkyWorks Digital, Inc.

GSI and Queen's University Belfast scientists use supercomputing to better understand the 3D structure of kilonovae

A team of scientists from GSI Helmholtzzentrum für Schwerionenforschung and Queen's University Belfast has produced a 3D supercomputer simulation of the light emitted after the merger of two neutron stars, similar to an observed kilonova. The simulation brings together several areas of physics, including the behavior of matter at high densities, the properties of unstable heavy nuclei, and the atom-light interactions of heavy elements, providing new insights into kilonovae.

Recent observations combining gravitational waves and visible light have pointed to neutron star mergers as the major site where heavy elements are produced. According to Luke Shingles, a scientist at GSI/FAIR and the lead author, the unprecedented agreement between the simulations and the observed kilonova AT2017gfo indicates a broad understanding of the event.

The light we see through telescopes from the material ejected in a neutron-star merger is determined by the interactions between electrons, ions, and photons within that material. Supercomputer simulations of radiative transfer can model these processes and the emitted light. Recently, for the first time, researchers produced a three-dimensional simulation that self-consistently follows the neutron-star merger, neutron-capture nucleosynthesis, the energy deposited by radioactive decay, and radiative transfer with tens of millions of atomic transitions of heavy elements.

The 3D model can predict the observed light for any viewing direction. When viewed almost perpendicular to the orbital plane of the two neutron stars, as observational data indicates for the kilonova AT2017gfo, the model predicts a sequence of spectral distributions that look very similar to what has been observed for AT2017gfo. This research area will help us to understand the origins of elements heavier than iron (such as platinum and gold) that were mainly produced by the rapid neutron capture process in neutron star mergers, says Shingles.
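As a loose illustration of why the emergent light depends on viewing direction, the toy Monte Carlo below sends photon packets through ejecta whose optical depth is highest in the orbital plane and tallies the fraction that escape per angle bin. It is not the team's radiative-transfer code: the angle-dependent opacity profile and every numerical value are assumptions chosen only to show the geometry effect.

```python
# Toy Monte Carlo: photon packets escaping ejecta whose optical depth depends
# on polar angle, so the kilonova looks brighter from some directions.
# The opacity profile below is an illustrative assumption, not a fitted model.
import numpy as np

rng = np.random.default_rng(1)


def optical_depth(mu):
    """Assumed optical depth versus direction cosine mu = cos(polar angle).

    Larger near the orbital plane (mu ~ 0) and smaller toward the poles
    (|mu| ~ 1), mimicking lanthanide-rich ejecta concentrated in the plane.
    """
    return 1.0 + 4.0 * (1.0 - mu**2)


def escape_fraction_by_angle(n_packets=200_000, n_bins=10):
    """Fraction of isotropically emitted packets that escape, binned by |cos(theta)|."""
    mu = rng.uniform(-1.0, 1.0, n_packets)                 # isotropic emission directions
    escaped = rng.random(n_packets) < np.exp(-optical_depth(mu))
    bins = np.linspace(0.0, 1.0, n_bins + 1)
    escaped_counts, _ = np.histogram(np.abs(mu[escaped]), bins=bins)
    emitted_counts, _ = np.histogram(np.abs(mu), bins=bins)
    return bins, escaped_counts / np.maximum(emitted_counts, 1)


bins, fraction = escape_fraction_by_angle()
# fraction rises toward |cos(theta)| = 1: in this toy setup the emission is
# brightest when viewed nearly perpendicular to the orbital plane.
```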

Almost half of the elements heavier than iron are produced in an environment of extreme temperatures and neutron densities, as achieved when two neutron stars merge. When they spiral in and coalesce, the resulting explosion leads to the ejection of matter with the appropriate conditions to produce unstable neutron-rich heavy nuclei by a sequence of neutron captures and beta-decays. These nuclei decay to stability, releasing energy that powers an explosive ‘kilonova’ transient, a bright emission of light that rapidly fades in about a week.

The 3D simulation combines several areas of physics, including the behavior of matter at high densities, the properties of unstable heavy nuclei, and atom-light interactions of heavy elements. However, further challenges remain, such as accounting for the rate at which the spectral distribution changes and the description of material ejected at late times. Future progress in this area will increase the precision with which we can predict and understand features in the spectra and will further our understanding of the conditions in which heavy elements were synthesized. High-quality atomic and nuclear experimental data, provided by the FAIR facility, is a fundamental ingredient for these models.

SwRI has been developing and maintaining the Elastic-Plastic Impact Computations (EPIC) Dynamic Finite Element computational tool since 2007. This tool has proven to be very cost-effective in supporting the design of more effective armor and warheads. The image above depicts a simulation of an impact using EPIC. As part of an Other Transaction Prototype Agreement with the U.S. Army Corps of Engineers, SwRI will continue to advance EPIC.

SwRI updates impact modeling software EPIC for the US Army Corps of Engineers to meet the evolving computational needs of the US Department of Defense

Southwest Research Institute (SwRI) has received $500,000 in first-year funding, which can be extended up to $3.5 million, for continued development of the Elastic-Plastic Impact Computations (EPIC) dynamic finite-element code. EPIC uses finite element and particle methods to simulate complex impact and explosion scenarios, enabling engineers to analyze how a particular design would behave under stress in real-world conditions. It can accurately simulate high-velocity impact events and explosive detonations, supporting cost-effective design of warheads as well as armor that protects vehicles, aircraft, and soldiers against a wide range of threats.

SwRI Staff Engineer Dr. Stephen Beissel, who has been involved in the development of the EPIC project since the mid-1990s, stated that EPIC leverages finite element and particle methods to simulate complex impact and explosion scenarios. The numerical algorithms and material models enable the code to handle highly dynamic and energetic events, allowing engineers to analyze how a particular design for a ground vehicle, ship, or aircraft component would react under stress in real-world conditions.

The EPIC code was initially developed in the 1970s to cost-effectively design warheads, body armor, and armored vehicles, and to model their interactions. In 2007, the EPIC development team joined SwRI, which opened an office in Minneapolis, Minnesota, to support the team and took over maintenance and development of the code at that time.

EPIC uses finite element analysis, an efficient computational technique, to model a full range of impact scenarios, including high-speed impacts that generate large pressures, high strain rates, and permanent deformations in solid materials. It also uses particle methods, an approach similar to the finite element method except that the local regions over which information is exchanged are continuously reassessed rather than fixed by mesh connectivity.

EPIC's distinguishing feature is its accurate transition from finite elements to particle methods when deformations become extreme. That makes it a crucial tool for designing effective warheads, which involve explosive detonations and high-velocity impacts, and for designing armor that protects vehicles, aircraft, and soldiers against a wide range of threats.
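The core of that element-to-particle transition can be sketched in a few lines: elements whose accumulated strain passes a threshold are removed from the mesh and replaced by particles carrying their mass and momentum. This is a conceptual sketch only, not EPIC's algorithm; the data structures, the 2D geometry, and the strain threshold are placeholder assumptions.

```python
# Conceptual sketch of converting heavily deformed finite elements into
# particles so extreme distortion does not break the mesh (not EPIC code).
from dataclasses import dataclass


@dataclass
class Element:
    node_positions: list   # 2D node coordinates of the element
    velocity: tuple        # element-averaged velocity
    mass: float
    plastic_strain: float  # accumulated equivalent plastic strain


@dataclass
class Particle:
    position: tuple
    velocity: tuple
    mass: float


def convert_failed_elements(elements, strain_threshold=1.5):
    """Replace over-strained elements with particles at their centroids.

    Elements below the threshold stay in the mesh; elements above it are
    removed, and their mass, momentum, and position carry over to particles
    that interact through neighbor searches instead of mesh connectivity.
    """
    remaining, particles = [], []
    for elem in elements:
        if elem.plastic_strain < strain_threshold:
            remaining.append(elem)
            continue
        n = len(elem.node_positions)
        centroid = tuple(sum(p[i] for p in elem.node_positions) / n for i in range(2))
        particles.append(Particle(centroid, elem.velocity, elem.mass))
    return remaining, particles


# Example: one intact element and one that has exceeded the strain threshold
mesh = [
    Element([(0, 0), (1, 0), (1, 1), (0, 1)], (0.0, 0.0), 1.0, plastic_strain=0.2),
    Element([(1, 0), (2, 0), (2, 1), (1, 1)], (0.5, 0.0), 1.0, plastic_strain=2.1),
]
mesh, new_particles = convert_failed_elements(mesh)  # keeps 1 element, creates 1 particle
```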

Over the next four years, SwRI aims to improve and update EPIC by increasing its accuracy, expanding the types of problems and scenarios it can handle, and increasing its computational efficiency on supercomputers built with GPUs.

Dr. Beissel noted that as adversaries continue to develop new munitions, such as hypersonic missiles, tools like EPIC become critical to designing new armor and approaches to defeating these threats. Creating physical prototypes and testing them is expensive and time-consuming, especially in destructive events. Simulating these dynamic and explosive large-strain events, instead of repeatedly recreating a physical prototype, makes the design cycle more efficient and cost-effective.