FAU researchers win NSF CAREER awards

Researchers represent the College of Engineering and Computer Science

Two researchers from Florida Atlantic University's College of Engineering and Computer Science have received the coveted National Science Foundation (NSF) Faculty Early Career Development (CAREER) awards. The CAREER program offers the NSF's most prestigious awards in support of early-career faculty who have the potential to serve as academic role models in research and education and to lead advances in the mission of their department or organization.

The researchers are Xiangnan Zhong, Ph.D., and Zhen Ni, Ph.D., both assistant professors in the Department of Computer and Electrical Engineering and Computer Science, who received NSF CAREER awards for research that advances the current wave of artificial intelligence (AI). With these awards, the two will work to close the gap between today's state-of-the-art techniques and artificial general intelligence, improving learning speed, data efficiency, and the generalization of optimization performance.

"Although existing achievements in artificial intelligence and reinforcement learning are exciting, the fundamental research of data aggregation, learning and approximation capability, and the performance generalization during uncertainties is not yet fully developed," said Stella Batalama, Ph.D., dean, College of Engineering and Computer Science. "The prestigious NSF CAREER awards that professors Zhong and Ni have received will help us to close the gap from the current state-of-the-art techniques to the artificial general intelligence that will bring good performance in learning speed, data efficiency, and generalization of the optimization performance. Moreover, the activities resulting from these projects will vigorously contribute to the nation's artificial intelligence workforce development."

Zhong has received a $503,000 NSF CAREER grant to investigate intelligent learning control that enables cyber-physical systems (CPS) to learn autonomously, generalize, and rapidly adapt to unknown situations. The results are expected to transform how agents interact in high-dimensional and heterogeneous environments, and could provide in-depth findings for exploring creativity in frontier AI techniques.

Zhong also will develop cooperative learning strategies in which agents share extended skills, facilitating exploration and preventing agents from getting lost in low-level action details. In addition, the project will develop self-motivated learning structures that contribute to global objectives for team-wide success from a distributed perspective.

"The integration of research and education plans will prepare our future workforce in the fields of cyber-physical systems, AI, learning, and control," said Zhong.

Ni has received a $500,000 NSF CAREER grant for a natural concurrent reinforcement learning framework that has three major advantages over traditional reinforcement learning methods: the capability to simultaneously learn multimodal properties of a complex system; the structural advantage of a personalized learning scheme; and the implementation advantage of a data-driven, sample-efficient design. Within this framework, Ni will design two concurrent reinforcement learning methods to build a learning-in-learning control paradigm. Applications in the smart energy community will support the novel learning framework and theoretical results.
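To make the "traditional reinforcement learning" baseline concrete, here is a minimal tabular Q-learning sketch on a toy chain environment. This is a generic textbook method, not Ni's concurrent framework, and the environment, rewards, and hyperparameters are all hypothetical choices for illustration.

```python
import random

# Toy chain MDP: states 0..4, actions 0 (left) / 1 (right);
# reaching state 4 yields reward 1 and ends the episode.
N_STATES, GOAL = 5, 4

def step(state, action):
    nxt = max(0, state - 1) if action == 0 else min(GOAL, state + 1)
    reward = 1.0 if nxt == GOAL else 0.0
    return nxt, reward, nxt == GOAL

# Classic Q-learning update: Q(s,a) += alpha * (r + gamma * max_a' Q(s',a') - Q(s,a))
Q = [[0.0, 0.0] for _ in range(N_STATES)]
alpha, gamma, epsilon = 0.5, 0.9, 0.1
random.seed(0)

for episode in range(200):
    s, done = 0, False
    while not done:
        # Epsilon-greedy action selection: mostly exploit, sometimes explore.
        a = random.randrange(2) if random.random() < epsilon else max((0, 1), key=lambda x: Q[s][x])
        s2, r, done = step(s, a)
        Q[s][a] += alpha * (r + gamma * max(Q[s2]) - Q[s][a])
        s = s2

# After training, the greedy policy moves right toward the goal from every state.
policy = [max((0, 1), key=lambda x: Q[s][x]) for s in range(N_STATES)]
print(policy)
```

A single agent learning one value table this way is exactly the sample-inefficient setting that concurrent, multi-property learning schemes aim to improve on.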

"Beyond the scientific impacts, this research has broader impacts for a wide range of research disciplines including transportation, rehabilitation, and robotics," said Ni. "Furthermore, the integration of research and education activities will positively impact institutions both regionally and nationally."

NCSU researchers modify air quality models to reflect polluted reality in Latin America

Supercomputational models of air quality have long been used to shed light on pollution control efforts in the United States and Europe, but the tools have not found widespread adoption in Latin America. New work from North Carolina State University and Universidad de La Salle demonstrates how these models can be adapted to offer practical insights into air quality challenges in the Americas outside the U.S.

Supercomputational air quality models can be used in multiple ways. For example, they can be used to determine which sources are responsible for what fraction of air pollution. They can also help authorities predict how air pollution might change if different pollution control methods are adopted.
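As a first-order illustration of source apportionment, the sketch below attributes shares of pollution to emission sources from a simple inventory. Real air quality models add chemistry and atmospheric transport on top of this; the source categories and tonnages here are invented for the example, not taken from the study.

```python
# Hypothetical PM2.5 emissions inventory for a city (tons/year).
# Values and categories are illustrative only.
emissions = {
    "unpaved road dust": 4200.0,
    "heavy-duty vehicles": 3100.0,
    "motorcycles": 900.0,
    "industry": 1500.0,
}

# Fractional contribution of each source to the total inventory.
total = sum(emissions.values())
shares = {src: amt / total for src, amt in emissions.items()}

# Report sources from largest to smallest contributor.
for src, frac in sorted(shares.items(), key=lambda kv: -kv[1]):
    print(f"{src}: {frac:.1%}")
```

Rankings like this are what let authorities compare control strategies: reducing the largest share is usually, though not always, the most cost-effective starting point.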

"Historically, it's been very challenging to apply these modeling tools in Latin America, so it has rarely been done," says Fernando Garcia Menendez, the corresponding author of a paper on the work and an assistant professor of environmental engineering at NC State. "This is important because the region has many areas that are dealing with significant air pollution, and these modeling tools can help governments identify the most cost-effective ways of achieving air quality improvements."

One challenge to using computational air quality models in Latin America is that the relevant modeling frameworks were developed largely in the context of the U.S. and Europe. That means that some of the assumptions that modelers took for granted when developing the tools don't always apply in Latin American cities. Furthermore, computational resources and trained environmental modelers are still scarce in the region.

For example, there is often substantially less air emissions data available. In addition, some contributors to air pollution are common across Latin American metro areas but differ from what is typical in the U.S.: more unpaved roads, older cargo fleets, large numbers of motorcycles, informal economies, and so on.

With that in mind, Garcia Menendez developed a research project with collaborators at the Universidad de La Salle, in Bogotá, Colombia. Specifically, the research team fine-tuned a modeling framework to reflect the air pollution dynamics in Bogotá and investigate the city's air quality problems. The collaborators at Universidad de La Salle also collected air pollution data that allowed the team to assess the accuracy of its modeling results.

"Our paper outlines the techniques we've used to perform computational modeling of air quality issues in a large Latin American city," says James East, first author of the paper and a Ph.D. student at NC State. "This not only demonstrates that it can be done but provides an approach that others can use to provide insights into air pollution in other parts of the region that are experiencing similar issues."

While the paper focuses on an air quality model for fine particulate matter (PM2.5), the researchers say that the model could be used to look at other air pollutants. Exposure to PM2.5 is associated with a wide variety of health problems, including heart and lung disease.

In their proof-of-concept demonstration, the researchers found that the largest local sources of PM2.5 in Bogotá were dust from unpaved roads and emissions from heavy-duty vehicles. However, when the model was used to project future air quality, the study also found that while paving roads would decrease air pollution in some parts of the city, other emission sources would still lead to increased air pollution elsewhere unless additional emission control measures were also implemented.

In short, the model offered practical insights into possible solutions for a complex metropolitan area of 10 million people.

"These findings are of interest to environmental authorities, from the local to the national level, who are pursuing ways to effectively address air pollution in Bogotá and other Colombian cities," says Jorge Pachon, a co-author of the paper and an associate professor at the Universidad de La Salle.

MIT, Sorbonne analysis shows basic cell health systems wear down in Huntington's disease

Using an innovative computational approach to analyze vast brain cell gene expression datasets, researchers at MIT and Sorbonne Université have found that Huntington's disease may progress to advanced stages more because of a degradation of the cells' health maintenance systems than because of increased damage from the disease pathology itself.

The analysis yielded a trove of specific gene networks governing molecular pathways that disease researchers may now be able to target to better sustain brain cell health amid the devastating neurodegenerative disorder, said co-senior author Myriam Heiman, Associate Professor in MIT's Department of Brain and Cognitive Sciences and an investigator at The Picower Institute for Learning and Memory. Christian Neri of the Sorbonne's Centre National de la Recherche Scientifique is the co-senior and co-corresponding author of the study published in eLife.

"If we can maintain the expression of these compensatory mechanisms, it may be a more effective therapeutic strategy than just trying to affect one gene at a time," said Heiman, who is also a member of the Broad Institute of MIT and Harvard.

In the study, the team led by co-corresponding author Lucile Megret created a process called "Geomic" to integrate two large datasets from Heiman's lab and a third from UCLA researcher William Yang. Each dataset highlighted a different aspect of the disease, such as its effect on gene expression over time, how those effects varied by cell type, and the fate of those cells as gene expression varied.

Geomic created plots of the data that mapped differences pertaining to 4,300 genes along dimensions such as mouse age, the extent of Huntington's-causing mutation, and cell type (certain neurons and astrocytes in a region of the brain called the striatum are especially vulnerable in Huntington's). The plots took the form of geometric shapes, like crumpled pieces of paper, whose deformations could be computationally compared to identify genes whose expression changed most consequentially amid the disease. The researchers could then look into how abnormal expression of those genes could affect cellular health and function.
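The core idea of comparing "deformations" can be sketched numerically: treat each gene's expression as a surface over experimental conditions and score how far the surface in a vulnerable cell type diverges from the one in a resistant cell type. This is a simplified stand-in for the Geomic pipeline, and the gene grids below are hypothetical values chosen for illustration.

```python
import math

def deformation(grid_a, grid_b):
    """Root-mean-square difference between two expression surfaces."""
    diffs = [(a - b) ** 2
             for row_a, row_b in zip(grid_a, grid_b)
             for a, b in zip(row_a, row_b)]
    return math.sqrt(sum(diffs) / len(diffs))

# Hypothetical expression grids (rows: mouse ages; columns: mutation extents),
# one grid for a vulnerable cell type and one for a resistant cell type.
genes = {
    "Ndufb10": ([[1.0, 0.9], [0.8, 0.4]], [[1.0, 1.0], [0.9, 0.9]]),
    "Rab8b":   ([[1.0, 0.8], [0.7, 0.3]], [[1.0, 0.9], [0.9, 0.8]]),
    "GeneX":   ([[1.0, 1.0], [1.0, 0.9]], [[1.0, 1.0], [1.0, 1.0]]),
}

# Rank genes by how strongly their expression surfaces diverge between cell types.
scores = {g: deformation(vuln, resist) for g, (vuln, resist) in genes.items()}
ranked = sorted(scores, key=scores.get, reverse=True)
print(ranked)  # genes whose surfaces diverge most come first
```

Genes that rank highest by such a divergence score are the candidates whose loss of expression most plausibly undermines cell health, which is how a shape comparison turns into a target list.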

Big breakdowns

The Geomic analysis highlighted a clear pattern. Over time, the cells' responses to the disease pathology, which is linked to toxic expansions in a protein called Huntingtin, largely continued intact, but certain highly vulnerable cells lost their ability to sustain the gene expression needed for some basic systems that maintain cell health and function. These systems initially leaped into action to compensate for the disease but eventually lost steam.

One of the biggest such breakdowns in an especially vulnerable cell type, Drd-1 expressing neurons, was maintaining the health of energy-producing components called mitochondria. Last year, Heiman's lab published a study in Neuron showing that in some Huntington's-afflicted neurons, RNA leaks out of mitochondria, provoking a misguided immune response that leads to cell death. The new findings affirm a key role for mitochondrial integrity and implicate key genes such as Ndufb10, whose diminished expression may undermine the cell's network of genes supporting the system.

The Geomic approach also highlighted an especially dramatic decline, in the Drd-1 neurons and in astrocytes, in the expression of multiple genes in pathways that govern endosome regulation, an essential process for determining where proteins go within cells and when they are degraded. Here, too, key genes like Rab8b and Rab7 emerged as culprits within broader gene networks.

The researchers went on to validate some of their top findings by confirming that key alterations of gene expression were also present in post-mortem samples of brain tissue from human Huntington's patients.

While mitochondrial integrity and endosome regulation are two particularly strong examples, Heiman said, the study lists many others. The Geomic source code and all the data and visualizations it yielded are publicly accessible on a website produced by the authors.

"We've created a database of future targets to probe," Heiman said.

Neri added: "This database sets a precise basis for studying how to properly re-instate brain cell compensation in Huntington's disease, and possibly in other neurodegenerative diseases that share common compensatory mechanisms with Huntington's disease."

Key among these could be regulators of gene transcription in these affected pathways, Heiman said.

"One promising future direction is that among the genes that we implicate in these network effects, some of these are transcription factors," she said. "They may be key targets to bring back the compensatory responses that decline."

A new way to study disease

While the researchers first applied Geomic's method of "shape deformation analysis" to Huntington's disease, it is likely to be equally useful for studying other neurodegenerative diseases such as Alzheimer's or Parkinson's, or even other brain diseases, the authors said.

"This is a new approach to study systems-level changes, rather than just focusing on a particular pathway or a particular gene," said Heiman. "I think this is a really nice proof of principle and hopefully we can apply this type of methodology to the study of other genomic data from other disease studies."