UTA researchers working to develop better modeling methods to test composite materials

Improved testing of composite materials

A team of University of Texas at Arlington researchers is working to develop better modeling methods for affordable, sustainable testing of composite materials.

Endel Iarve, professor of mechanical and aerospace engineering, has received a $2.3 million grant funded by the Air Force Research Lab to continue a collaborative project with colleagues at Wichita State University.

Iarve is the principal investigator on the project, which includes UTA aerospace engineering Professors Andrew Makeev and Kenneth Reifsnider and Assistant Professor Rassel Raihan as co-principal investigators.

Together, they are creating new tools for the design and certification of various advanced composite structures and materials--including laminated and woven composites--with an emphasis on realistic representation of "as-manufactured" parts that include manufacturing defects. The goal is to bring these tools into the certification framework to reduce costs and increase efficiency across the industry.

"The award is a significant investment by the Air Force Research Lab and enables the continuation of the tremendous work being done by Professor Iarve and his colleagues in an area of critical importance to the use of advanced composite materials," UTA President Vistasp Karbhari said. "This research will undoubtedly advance the state-of-the-art and have a profound effect on the safety of aircraft and the effective assessment of service life and flight envelopes both for military and civil aviation."

There are three components to UTA's research:

  • Iarve, who is also part of the Institute for Performance Prediction Methodology (IPPM) at the UTA Research Institute, will use a discrete damage modeling technique that he developed through previous Department of Defense and NASA research grants to enhance advanced simulation techniques for high-fidelity strength and life prediction of composite structures. Discrete damage modeling simulates matrix cracking at initially unknown locations and directions within a composite structure. These simulations can improve understanding of failure and reduce the amount of physical testing required.
  • Reifsnider, who directs the IPPM, and Raihan, who also works at UTARI, will investigate how to predict the quality and strength of bonded joints from material state data, measured non-invasively. Aerospace, marine and automotive manufacturers use composite materials in structures to reduce weight and increase performance, but they still face challenges in understanding how best to bond different materials to make reliable structures and predict their strength and performance. "Bonded joints in composites are a key enabler for creating efficient unitized structure, but can also be an Achilles heel," Iarve said. "One of the goals of the project is to devise a way to determine the quality of a bond without breaking it through novel experimental techniques."
  • Makeev, who runs UTA's Advanced Materials and Structures Lab, will work on the high-resolution, non-destructive characterization of composite joints and repairs. Aircraft composite structures typically feature joints with tight radii and complex 3D structures, which are uninspectable using traditional non-destructive methods.

Makeev will apply high-resolution computed tomography inspection technology to overcome object size limits and allow automation for critical joint regions and bonded repairs in larger structures, focusing on the detection of disbonds, voids, and foreign objects/materials.

"It is important for the aerospace industry, as well as numerous other industries where composites are used in manufacturing, to have tools that make testing and inspections as reliable and cost-effective as possible," said Erian Armanios, chair of UTA's Mechanical and Aerospace Engineering Department. "This grant is a testament to the excellent research being performed by Drs. Iarve, Reifsnider, Raihan and Makeev, and shows the value of our ability to bring together similar research areas across the department to produce results that will lead to greater confidence in composite testing and certification."

SRON scientists use artificial intelligence to confirm galaxy mergers ignite starbursts

When two galaxies merge, they go through brief periods of stellar baby booms. A group of astronomers led by Lingyu Wang from SRON has now used a sample of over 200,000 galaxies to confirm that galaxy mergers are the driving force behind starbursts. It is the first time that scientists have used artificial intelligence in a galaxy merger study. The study was published in Astronomy & Astrophysics on October 21st.

One of the most pressing questions in astronomy is how and when stars formed in the galaxies we see around us. The Universe contains hundreds of billions of galaxies and they come in many shapes and forms. Take for example the Sombrero Galaxy, the Black Eye Galaxy, the Whirlpool Galaxy or our own Milky Way stretching across the entire sky. Each harbors hundreds of billions of twinkly lights. How and when did all those stars emerge on the cosmic stage?

A popular hypothesis among astrophysicists is that galaxy mergers go hand-in-hand with short starburst phases and an increase in star formation of around a factor of two over the whole duration of the merger. Mergers would produce shock waves in the interstellar gas, igniting significant baby booms of stars. A group of astronomers led by Lingyu Wang from SRON Netherlands Institute for Space Research, including first author William Pearson and co-author Floris van der Tak, now confirms this theory by analyzing a record number of over 200,000 galaxies. They found up to twice the number of starbursts in merging galaxies compared to single galaxies.

Deep learning

Example of two merging galaxies that were identified by AI in this study.

Because their database was so large, the team built a deep learning algorithm that taught itself to identify merging galaxies. Pearson: 'The advantage of artificial intelligence is that it improves the reproducibility of our study because the algorithm is consistent in its definitions of a merger. Also, it's good preparation for upcoming surveys that will image billions of galaxies. Then you inevitably need AI. Even citizen science projects such as Galaxy Zoo cannot deal with those numbers.'
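The article does not describe the network's architecture, but as a rough, hypothetical illustration of the approach, the sketch below shows how a small convolutional "merger vs. non-merger" classifier for galaxy image cutouts could be set up in Python with Keras. The 64x64 cutout size, layer choices, training settings and the synthetic stand-in data are assumptions for illustration, not details taken from the study.

```python
# Hypothetical sketch of a binary "merger vs. non-merger" image classifier.
# The architecture, the 64x64-pixel cutout size and the training settings are
# illustrative assumptions, not the network actually used in the study.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

def build_merger_classifier(input_shape=(64, 64, 1)):
    """Small convolutional network mapping a galaxy cutout to P(merger)."""
    model = keras.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv2D(16, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Conv2D(32, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(64, activation="relu"),
        layers.Dropout(0.5),
        layers.Dense(1, activation="sigmoid"),  # probability the cutout shows a merger
    ])
    model.compile(optimizer="adam",
                  loss="binary_crossentropy",
                  metrics=["accuracy"])
    return model

if __name__ == "__main__":
    # Toy training call on random stand-in data; real inputs would be labelled
    # galaxy cutouts, e.g. visually classified mergers.
    x = np.random.rand(100, 64, 64, 1).astype("float32")
    y = np.random.randint(0, 2, size=(100, 1))
    model = build_merger_classifier()
    model.fit(x, y, epochs=2, batch_size=16, verbose=1)
```

In practice the training labels would come from visually classified mergers (for example from citizen-science projects such as Galaxy Zoo), and the trained network would then be applied consistently across the full sample of more than 200,000 galaxies.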

It is the first time that astronomers have used AI in a merger study. 'This is a milestone in the sense that AI will play an increasingly large role in our field,' says Wang. 'But we have to keep in mind that the power of AI is limited by how it is trained. If we feed it a flawed definition of a galaxy merger, then it won't do its job correctly.'

A climate model developed by ISGlobal provides long-term predictions of 'El Niño' events

Although a number of operational climate models are capable of predicting El Niño events, they cannot produce forecasts more than half a year in advance. Now, a team from the Barcelona Institute for Global Health (ISGlobal), an institution supported by "la Caixa", has developed a new statistical climate model able, for the first time, to predict El Niño episodes up to two-and-a-half years in advance.

The El Niño Southern Oscillation (ENSO) is a climate phenomenon that encompasses a variety of atmospheric and oceanic features over the equatorial Pacific. It occurs every 2-7 years but has an irregular periodicity. It consists of two opposite phases: a warming of the sea-surface temperature in the eastern and central equatorial Pacific Ocean known as El Niño, and an opposite cooling phase known as La Niña. ENSO can cause extreme weather events in many regions of the world through atmospheric teleconnections and therefore has important implications for global seasonal climate prediction.

The study, published in the Journal of Climate, uses a series of predictor variables including sea temperatures at different depths, as well as winds in the tropical Pacific, in a flexible statistical dynamic components model to make retrospective predictions of ENSO events between 1970 and 2016. The model is capable of predicting all the major El Niño episodes that occurred within that period, including the extreme event of 2015-2016, up to two-and-a-half years in advance.
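As a loose, hypothetical illustration of how such a retrospective (hindcast) prediction can be framed in code, the sketch below fits a simple state-space model that relates an ENSO index to predictors observed 30 months earlier, using synthetic data. The study's actual flexible statistical dynamic components model, its predictor set and its observations are considerably richer; every variable name, data series and setting here is an assumption made for illustration.

```python
# Hypothetical sketch: a simple state-space hindcast of an ENSO index from
# predictors observed 30 months earlier. The study's actual model (a flexible
# statistical dynamic components model using subsurface temperatures at
# several depths and tropical Pacific winds) is more elaborate; all names,
# the synthetic data and the local-level formulation here are assumptions.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
months = pd.date_range("1970-01", "2016-12", freq="MS")
n = len(months)
lead = 30  # forecast lead in months (roughly two-and-a-half years)

# Synthetic stand-in series: an ENSO index (e.g. an SST anomaly index) and
# two toy predictors constructed to lead the index by `lead` months.
enso = pd.Series(np.sin(2 * np.pi * np.arange(n) / 48) + 0.3 * rng.standard_normal(n),
                 index=months, name="enso_index")
predictors = pd.DataFrame({
    "subsurface_heat": enso.shift(-lead).fillna(0.0) + 0.2 * rng.standard_normal(n),
    "zonal_wind":      enso.shift(-lead).fillna(0.0) + 0.2 * rng.standard_normal(n),
}, index=months)

# Align: predict the index at time t from predictors observed `lead` months earlier.
X = predictors.shift(lead).dropna()
y = enso.loc[X.index]

# Local-level state-space model with exogenous regressors, a much-simplified
# stand-in for the dynamic components formulation; fit up to 2013, then hindcast.
train = y.index < "2014-01-01"
model = sm.tsa.UnobservedComponents(y[train], level="local level", exog=X[train])
result = model.fit(disp=False)
hindcast = result.forecast(steps=int((~train).sum()), exog=X[~train])
print(hindcast.head())
```

In a real hindcast, the synthetic series would be replaced by observed subsurface temperature and wind records, and the predictions would be evaluated against the 1970-2016 record mentioned above.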

The computational tool developed in this study is an improved version of a statistical dynamic components model already proposed two years ago by the same ISGlobal researchers. Desislava Petrova, the first author of the two studies, says that this is an important advance in the area of climate sciences and ENSO research.

"The analysis shows that the events can be predicted with much more precision since the launching of the Tropical Pacific Observing System (TPOS), which provides better data quality and coverage, also of the subsurface ocean" explains Petrova. "This allows us to make long-term forecasts of this climate phenomenon at a relatively low computational cost," she adds.

ICREA Professor Xavier Rodó, study coordinator and director of the Climate and Health Programme at ISGlobal, points out that other statistical models should be improved by "using available data from under the sea surface, which is key to predicting El Niño-Southern Oscillation events". "This could provide early and useful information about El Niño and La Niña to decision-makers around the world, which could help prevent threats to human lives and avoid billions of dollars in economic costs," he comments.