The cores of massive dying galaxies had already formed 1.5 billion years after the Big Bang

Astrophysics, Galaxies: The most distant dying galaxy discovered so far -- more massive than our Milky Way, with more than a trillion stars -- has revealed that the 'cores' of these systems had already formed 1.5 billion years after the Big Bang, about 1 billion years earlier than previous measurements indicated. The discovery adds to our knowledge of the formation of the Universe more generally, and may force a revision of the supercomputer models astronomers use as one of their most fundamental tools. The result, obtained in close collaboration with Masayuki Tanaka and his colleagues at the National Observatory of Japan, is now published in two works in the Astrophysical Journal Letters and the Astrophysical Journal.

What is a "dead" galaxy?

Galaxies are broadly categorized as dead or alive: dead galaxies are no longer forming stars, while alive galaxies are still bright with star formation activity. A 'quenching' galaxy is a galaxy in the process of dying -- meaning its star formation is significantly suppressed. Quenching galaxies are not as bright as fully alive galaxies, but not as dark as dead ones. Researchers use this spectrum of brightness as the first line of identification when observing galaxies in the Universe.
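This brightness-based classification can be thought of in terms of how fast a galaxy forms stars relative to its stellar mass. Below is a minimal, purely illustrative Python sketch of such a three-way split; the specific star formation rate thresholds and the example numbers are hypothetical round values, not the selection criteria used by the research team.

def classify_galaxy(sfr_msun_per_yr, stellar_mass_msun):
    """Classify a galaxy as 'alive', 'quenching' or 'dead' from its
    specific star formation rate (sSFR = SFR / stellar mass)."""
    ssfr = sfr_msun_per_yr / stellar_mass_msun  # per year
    if ssfr > 1e-10:    # still forming stars vigorously for its mass
        return "alive"
    if ssfr > 1e-11:    # star formation significantly suppressed
        return "quenching"
    return "dead"       # essentially no ongoing star formation

# Example: a vigorously star-forming galaxy comes out "alive"; the slowly
# star-forming Milky Way would sit near the "quenching" boundary.
print(classify_galaxy(10.0, 5e10))   # -> "alive"  (sSFR ~ 2e-10 per year)
print(classify_galaxy(1.5, 6e10))    # -> "quenching" (sSFR ~ 2.5e-11 per year)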

The farthest dying galaxy discovered so far reveals remarkable maturity

A team of researchers at the Cosmic Dawn Center at the Niels Bohr Institute and the National Observatory of Japan recently discovered a massive galaxy dying already 1.5 billion years after the Big Bang, the most distant of its kind. "Moreover, we found that its core seems already fully formed at that time", says Masayuki Tanaka, the author of the letter. "This result pairs up with the fact that, when these dying gigantic systems were still alive and forming stars, they might not have been that extreme compared with the average population of galaxies", adds Francesco Valentino, assistant professor at the Cosmic Dawn Center at the Niels Bohr Institute and author of an article on the past history of dead galaxies that appeared in the Astrophysical Journal.

Why do galaxies die? - One of the biggest and still unanswered questions in astrophysics

"The suppressed star formation tells us that a galaxy is dying, sadly, but that is exactly the kind of galaxy we want to study in detail to understand why it dies", continues Valentino. One of the biggest questions that astrophysics still has not answered is how a galaxy goes from being star-forming to being dead. For instance, the Milky Way is still alive and slowly forming new stars, but not too far away (in astronomical terms), the central galaxy of the Virgo cluster - M87 - is dead and completely different. Why is that? "It might have to do with the presence of gigantic and active black hole at the center of galaxies like M87" Valentino says.

Earth-based telescopes find extremes - but astronomers look for normality

One of the problems in observing galaxies in this much detail is that the telescopes available on Earth today are generally able to find only the most extreme systems. However, the key to describing the history of the Universe is held by the vastly more numerous population of normal objects. "Since we are trying hard to discover this normality, the current observational limitations are an obstacle that has to be overcome."

The James Webb Space Telescope (JWST) represents hope for better data in the near future

The new James Webb Space Telescope, the designated successor of the Hubble Space Telescope with a 6.5-meter primary mirror, is scheduled for launch in 2021 and will be able to provide astronomers with data at a level of detail that should make it possible to map exactly this "normality". The methods developed in close collaboration between the Japanese team and the team at the Niels Bohr Institute have already proven successful, given the recent result. "This is significant, because it will enable us to look for the most promising galaxies from the start, when JWST gives us access to much higher quality data," Francesco Valentino explains.

Combining observations with a fundamental tool: the supercomputer models of the Universe

What has been found observationally is not too far from what the most recent models predict. "Until very recently, we did not have many observations to compare with the models. However, the situation is evolving rapidly, and with JWST we will have valuable, larger samples of 'normal' galaxies in a few years. The more galaxies we can study, the better we are able to understand the properties or situations leading to a certain state - whether a galaxy is alive, quenching or dead. It is basically a question of writing the history of the Universe correctly, and in greater and greater detail. At the same time, we are tuning the computer models to take our observations into account, which will be a huge improvement, not just for our branch of work, but for astronomy in general," Francesco Valentino explains.

Concordia researcher hopes to use big data to make pipelines safer in Canada

Historical records of oil and gas leaks can lead to big improvements for future projects, Fuzhan Nasiri says

Oil and gas pipelines have become a polarizing issue in Canada, but supporters and detractors alike can agree that the safer they are, the better.

Unfortunately, integrity and health are ongoing and serious problems for North America's pipeline infrastructure. According to the US Department of Transportation (DOT), there have been more than 10,000 pipeline failures in that country alone since 2002. Complicating safety efforts are the cost and labor intensity of monitoring the health of the thousands of kilometers of pipelines that criss-cross Canada and the United States.

In a recent paper in the Journal of Pipeline Systems Engineering and Practice, researchers at Concordia and the Hong Kong Polytechnic University look at the methodologies currently used by industry and academics to predict pipeline failure -- and their limitations.

"In many of the existing codes and practices, the focus is on the consequences of what happens when something goes wrong," says Fuzhan Nasiri, associate professor in the Department of Building, Civil and Environmental Engineering at the Gina Cody School of Engineering and Computer Science.

"Whenever there is a failure, investigators look at the pipeline's design criteria. But they often ignore the operational aspects and how pipelines can be maintained to minimize risks."

Nasiri, who runs the Sustainable Energy and Infrastructure Systems Engineering Lab, co-authored the paper with his Ph.D. student Kimiya Zakikhani and Hong Kong Polytechnic professor Tarek Zayed.

Safeguarding against corrosion

The researchers identified five failure types: mechanical, the result of design, material or construction defects; operational, due to errors and malfunctions; natural hazard, such as earthquakes, erosion, frost or lightning; third-party, meaning damage inflicted either accidentally or intentionally by a person or group; and corrosion, the deterioration of the pipeline metal due to environmental effects on pipe materials and acidity of oil and gas impurities. This last one is the most common and the most straightforward to mitigate.

Nasiri and his colleagues found that the existing academic literature and industry practices around pipeline failures need to evolve further to make better use of available maintenance data. They believe the massive amounts of pipeline failure data available via the DOT's Pipeline and Hazardous Materials Safety Administration can be used in the assessment process as a complement to manual in-line inspections.

These predictive models, based on decades' worth of data covering everything from pipeline diameter to metal thickness, pressure, average temperature change, location and timing of failure, could provide failure patterns. These could be used to streamline the overall safety assessment process and reduce costs significantly.

"We can identify trends and patterns based on what has happened in the past," Nasiri says. "And you could assume that these patterns could be followed in the future, but need certain adjustments for climate and operational conditions. It would be a chance-based model: given variables such as location and operational parameters as well as expected climatic characteristics, we could predict the overall chance of corrosion over a set time."

He adds that these models would ideally be consistent and industry-wide, and therefore transferable in the event of a change in pipeline ownership -- and that research like this could influence industry practices.

"Failure prediction models developed based on reliability theory should be realistic. Using historical data (with adjustments) gets you closer to what happens in reality," he says.

"They can close the gap of expectations, so both planners and operators can have a better idea of what they could see over the lifespan of their structure."

Dartmouth deploys big data, AI tools to support research on harmful blue-green algae

Robotic boats and aerial drones combine with water sampling to study eastern lakes

A team of scientists from research centers stretching from Maine to South Carolina will develop and deploy high-tech tools to explore cyanobacteria in lakes across the East Coast.

The multi-year project will combine big data, artificial intelligence and robotics with new and time-tested techniques for lake sampling to understand where, when, and how cyanobacterial blooms develop.

The research team brings together experts in freshwater ecology, computer science, engineering and geospatial science from Bates College, Colby College, Dartmouth, the University of New Hampshire, the University of Rhode Island and the University of South Carolina.

"It is rare to have teams from so many different specialties converge to study a problem like this," said Alberto Quattrini Li, an assistant professor of computer science at Dartmouth and the overall project lead. "By working together, we can increase the amount of data that can be collected and increase prediction capabilities."

Freshwater lakes provide a variety of human and ecological services, such as drinking water and food production. But lakes across the country and the world are increasingly threatened by harmful cyanobacterial blooms.

Sometimes known as blue-green algae, blooms of cyanobacteria impact the quality of lake water and threaten human health through toxins that can damage multiple organ systems.

Scientists know that land-use changes and global climate change are the main drivers of cyanobacterial blooms, but there is still much that is not known about what influences the timing and location of blooms in individual lakes. Researchers are also looking to understand how cyanobacteria are affected by extreme precipitation events.

"We suspect that individual blooms result from a complicated interaction of conditions that include nutrient loading during the past spring, recent trends in temperature and precipitation, and current in-lake conditions," said Kathryn Cottingham, a professor of biology at Dartmouth. "Until now, we haven't had the tools or technologies to track conditions at the right spatial or temporal scales to understand those drivers."

The project will use robotic boats, buoys, and camera-equipped drones to measure physical, chemical, and biological data in lakes where cyanobacteria are detected. When combined, the technology will generate large volumes of data related to the lakes and the development of harmful blooms. The project will also build new algorithmic models to assess the findings.

Lakes in New Hampshire, Maine, Rhode Island, and South Carolina will be studied as part of the project.

Information collected through the research could lead to better predictions of when and where cyanobacterial blooms take place. Those predictions might allow earlier actions to protect public health in recreational lakes and in lakes that supply drinking water.
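As an illustration of how the multi-platform measurements might feed such predictions, here is a minimal sketch that merges boat, buoy and drone data into one table and fits a simple classifier to flag bloom risk. All file names, column names and the choice of model are assumptions made for illustration; the project's actual algorithms are not described in this article.

import pandas as pd
from sklearn.ensemble import RandomForestClassifier

# Hypothetical data from the three sensing platforms, keyed by lake, site and date.
boat = pd.read_csv("robotic_boat_profiles.csv")
buoy = pd.read_csv("buoy_timeseries.csv")
drone = pd.read_csv("drone_surface_imagery.csv")
keys = ["lake_id", "site_id", "date"]
data = boat.merge(buoy, on=keys).merge(drone, on=keys)

features = ["water_temp_c", "chlorophyll_ug_l", "phosphorus_ug_l",
            "recent_rainfall_mm", "surface_bloom_index"]

# Train a classifier on past visits where a bloom was or was not confirmed.
model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(data[features], data["bloom_observed"])

# Rank sites by predicted bloom risk ahead of the next sampling visit.
data["bloom_risk"] = model.predict_proba(data[features])[:, 1]
print(data.sort_values("bloom_risk", ascending=False)[keys + ["bloom_risk"]].head())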

With technology covering the water and air, researchers will also collect information on population and land use around the lakes to determine how those factors might impact bloom formation.

Project technology will be shared with lake managers and citizens so that community members can conduct their own monitoring. Local homeowners will form a corps of "citizen scientists" to support the project.

Undergraduate and graduate students will also participate in the project. This interdisciplinary training is intended to prepare the next generation of scientists to address societal issues.