UT develops model to predict hernia surgery recovery outcomes

Could patients experience less pain and possibly have better recovery outcomes if their fears or emotional issues were addressed before surgery?

Three researchers at the University of Tennessee, Knoxville, recently developed a predictive model to examine that question.

Rebecca Koszalinski, assistant professor in the College of Nursing; Anahita Khojandi, assistant professor in the Department of Industrial and Systems Engineering in the Tickle College of Engineering; and Bruce Ramshaw, a physician and adjunct professor in the Haslam College of Business, examined data collected from 102 patients who underwent ventral hernia repair surgery.

A ventral hernia is a bulge of tissue that pushes through a point of weakness in an abdominal wall muscle, requiring surgical correction. Approximately 350,000 ventral hernia procedures occur each year in the US and are associated with an estimated $3 billion in health care costs.

The predictive model suggests that the emotional status of the patient prior to surgery--levels of depression, anxiety, grief, or anger--influences recovery outcomes. Patients may experience less pain if their fears or emotional issues are addressed before surgery.

"If we begin prehabilitation, which includes a holistic assessment--not limited to physical and emotional condition--of the person prior to the intervention, then we may be able to affect outcomes," Koszalinski said.

The researchers looked at historical patient data, including demographics and details from the surgical procedures, and examined patterns that led to complications following surgery. By associating the information collected before and during the patients' surgeries with their outcomes, the researchers developed a predictive model to identify future at-risk patients.

The predictive model, built in Python, could be used as a decision support tool, allowing practitioners and patients to more easily assess the risks involved in this type of surgery. Using predictive modeling to examine health data sets is one example of how artificial intelligence can transform modern health care.
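The article does not describe the model's internals, but the general approach--learning a mapping from preoperative patient features to the risk of a post-surgical complication--can be sketched with a simple logistic regression. Everything below is illustrative: the features, cohort data, and coefficients are synthetic stand-ins, not the study's.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic cohort of 200 patients: [age, BMI, preoperative anxiety score 0-10].
# Outcome y = 1 means a post-surgical complication. These features are
# hypothetical stand-ins for the demographic and emotional-status variables
# described in the article.
n = 200
X = np.column_stack([
    rng.normal(55, 12, n),   # age
    rng.normal(30, 5, n),    # BMI
    rng.uniform(0, 10, n),   # anxiety score
])
true_w = np.array([0.02, 0.05, 0.4])
logits = (X - X.mean(axis=0)) @ true_w - 1.0
y = (rng.random(n) < 1 / (1 + np.exp(-logits))).astype(float)

# Standardize features, add an intercept column, and fit logistic regression
# by plain gradient descent on the log-loss.
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
Xb = np.column_stack([np.ones(n), Xs])
w = np.zeros(4)
for _ in range(2000):
    p = 1 / (1 + np.exp(-(Xb @ w)))
    w -= 0.1 * Xb.T @ (p - y) / n

# Predicted complication risk for one hypothetical high-anxiety patient,
# the kind of per-patient estimate a decision support tool would surface.
patient = (np.array([60, 32, 9]) - X.mean(axis=0)) / X.std(axis=0)
risk = 1 / (1 + np.exp(-(np.array([1, *patient]) @ w)))
print(f"predicted complication risk: {risk:.2f}")
```

A tool like this can also be queried repeatedly with modified inputs (say, a lower anxiety score after prehabilitation) to see how the estimated risk shifts, which is the "simulate various scenarios" use the study describes.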

"There is a lot of potential for developing decision support tools using data science and artificial intelligence," Khojandi said. "We hear about similar models in the news every day, focused on detecting tumors in chest X-rays, among other things. This is an example of how a tool can be used for shared decision-making and change how individuals interact with the health care system."

The study suggests using the model as a tool for physicians, nurse practitioners, and other clinicians to simulate various scenarios for different patients and examine how risk factors change in each case. The model could also help clinicians avoid overtreatment.

The predictive model could help direct efforts on patient education and quantify the impact lifestyle changes have on patients.

"I focus on the person and how they may be better informed and empowered to share in decision-making," Koszalinski said. "The hope is that predictive modeling, coupled with empowered patients and expert clinical professionals, could result in optimal patient outcomes."

Japanese-built AI tool predicts the structure of the Universe

How the Universe formed its voids and filaments can now be studied in a matter of seconds, after researchers developed an artificial intelligence tool called Dark Emulator.

Advancements in telescopes have enabled researchers to study the Universe in greater detail and to establish a standard cosmological model that explains various observational facts simultaneously. But there is much that researchers still do not understand. Remarkably, the majority of the Universe is made up of dark matter and dark energy, whose nature no one has yet been able to identify. A promising avenue for solving these mysteries is the structure of the Universe. Today’s Universe is made up of filaments, where galaxies cluster together and look like threads from far away, and voids, where there appears to be nothing (image 1). The discovery of the cosmic microwave background has given researchers a snapshot of what the Universe looked like close to its beginning, and understanding how its structure evolved into what it is today would reveal valuable characteristics of what dark matter and dark energy are.

Image 1: The way in which galaxies cluster together in the Universe is made clear in this image of the Universe as observed by the Sloan Digital Sky Survey (SDSS). The yellow dots represent the positions of individual galaxies, while the orange loop shows an area of the Universe spanning 1 billion light-years. At the center is Earth, and around it is a three-dimensional map of where different galaxies are. The image reveals that galaxies are not spread uniformly throughout the Universe: they cluster together in areas called filaments, or are completely absent in areas called voids. (Credit: Tsunehiko Kato, ARC and SDSS, NAOJ Four-Dimensional Digital Universe Project)

A team of researchers, including Kyoto University Yukawa Institute for Theoretical Physics Project Associate Professor Takahiro Nishimichi, and Kavli Institute for the Physics and Mathematics of the Universe (Kavli IPMU) Principal Investigator Masahiro Takada, used the world’s fastest astrophysical simulation supercomputers ATERUI and ATERUI II to develop the Dark Emulator. Using the emulator on data recorded by several of the world’s largest observational surveys allows researchers to study possibilities concerning the origin of cosmic structures, and how dark matter distribution could have changed over time.

“We built an extraordinarily large database using a supercomputer, which took us three years to finish, but now we can recreate it on a laptop in a matter of seconds. I feel like there is great potential in data science. Using this result, I hope we can work our way towards uncovering the greatest mystery of modern physics, which is to uncover what dark energy is. I also think this method we’ve developed will be useful in other fields such as natural sciences or social sciences,” says lead author Nishimichi.

Image 2: The conceptual design of Dark Emulator. Left: an example of the virtual Universe created by the ATERUI II supercomputer, showing the distribution of about 10 billion particles in a volume spanning about 4.9 billion light-years, evolved to the present day. Producing it takes about two days using 800 CPU cores on ATERUI II. Center: the architecture of Dark Emulator. It learns the correspondence between the fundamental cosmological parameters employed at the beginning of a simulation and the simulation’s outcome, using a machine-learning architecture with a hybrid implementation of multiple statistical methods. Once trained, the emulator immediately and accurately predicts the expected observational signals for a new set of cosmological parameters without running a new simulation, drastically reducing the computational cost of extracting cosmological parameters from observational data. (Credit: YITP, NAOJ)

This tool uses an aspect of artificial intelligence called machine learning. By varying several important characteristics of the Universe, such as those of dark matter and dark energy, ATERUI and ATERUI II created hundreds of virtual Universes. Dark Emulator learns from this data and predicts outcomes for new sets of characteristics without having to create an entirely new simulation every time. When tested against real observations, the tool successfully predicted weak gravitational lensing effects in the Hyper Suprime-Cam survey, along with the three-dimensional galaxy distribution patterns recorded in the Sloan Digital Sky Survey, to within 2 to 3 percent accuracy in a matter of seconds. In comparison, running the simulations individually on a supercomputer without the AI would take several days.
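The workflow described above, building a database of expensive simulation runs and then training a fast surrogate on it, can be sketched in miniature. Everything here is illustrative: the "simulation" is a cheap stand-in function rather than an N-body run, and the surrogate is a simple polynomial fit rather than Dark Emulator's actual hybrid machine-learning architecture.

```python
import numpy as np

# Toy stand-in for an expensive simulation: maps two "cosmological parameters"
# (think matter density and fluctuation amplitude) to a single summary
# statistic. The real runs take days on a supercomputer; this stand-in is
# instant so the emulation workflow can be shown end to end.
def simulate(omega_m, sigma8):
    return np.sin(3.0 * omega_m) * sigma8**2 + 0.1 * omega_m * sigma8

def features(p):
    # Tensor-product polynomial features x^i * y^j for i, j = 0..3.
    x, y = p[..., 0], p[..., 1]
    return np.stack([x**i * y**j for i in range(4) for j in range(4)], axis=-1)

rng = np.random.default_rng(1)

# Step 1: build the "database" by running the simulation at sampled
# parameter points (the part that took the researchers three years).
params = rng.uniform([0.2, 0.6], [0.4, 1.0], size=(100, 2))
values = np.array([simulate(*p) for p in params])

# Step 2: train the emulator, here a polynomial surrogate fit by least squares.
coef, *_ = np.linalg.lstsq(features(params), values, rcond=None)

# Step 3: predict for unseen parameters without running the simulation again.
q = np.array([0.3, 0.8])
emulated = features(q) @ coef
print(f"emulated: {emulated:.5f}  true: {simulate(*q):.5f}")
```

The payoff is the same as in the article: once the slow step (building the database) is paid for, step 3 is essentially free, so many candidate parameter sets can be compared against observational data in seconds.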

The researchers hope to apply their tool using data from upcoming surveys in the 2020s, enabling deeper studies of the origin of the Universe.

Details of their study were published in the Astrophysical Journal on 8 October 2019.

Japanese researchers use supercomputing to show that liquid water takes two distinct structures

Researchers at The University of Tokyo have used computational methods and analysis of recent experimental data to demonstrate that water molecules take two distinct structures in the liquid state. The team investigated the scattering of X-ray photons through water samples and showed a bimodal distribution hidden under the first diffraction peak, resulting from tetrahedral and non-tetrahedral arrangements of water molecules. This work may have important implications throughout science, but especially with regard to living systems, like proteins and cell structures, which are strongly affected by their surrounding water molecules.

Given the ubiquity of water on our planet and the central role it plays in all known life, it may be hard to believe that there is anything left to learn about this most familiar fluid. A simple molecule made up of just two hydrogen atoms and one oxygen atom, water still hides fundamental mysteries that remain to be unraveled. For example, water has unusually high melting and boiling points, and it even expands when it freezes (unlike most liquids, which contract). These and other unusual properties make it very different from almost all other liquids, but they also allow life as we know it to exist.

Water, water everywhere

The weirdness of water can be best understood by thinking about the unique interaction between H2O molecules: the hydrogen bond. Water tends to form four hydrogen bonds with its four neighbors, which leads to a tetrahedral arrangement of those neighbors. Thermal fluctuations can strongly distort such arrangements, but whether the distortion leads to the coexistence of distinct tetrahedral and non-tetrahedral arrangements has remained controversial.

Now, scientists at The University of Tokyo have combined supercomputer simulations with the analysis of X-ray scattering data to find the "structure factor" of water--the mathematical function that describes how X-rays disperse when they scatter off the hydrogen and oxygen atoms. The analysis revealed two overlapping peaks hidden within the first diffraction peak of the structure factor. One of these peaks corresponded to the distance between oxygen atoms found in ordinary liquids, while the other indicated a longer distance, as in a tetrahedral arrangement. "The combination of new computational methods and analysis of recent X-ray scattering data allowed us to see what was not visible in previous work," explains first author Rui Shi.
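The idea of two peaks hiding inside one can be illustrated with a toy decomposition: a single asymmetric peak is synthesized as the sum of two overlapping Gaussian components, and the hidden component amplitudes are then recovered by least squares. All numbers here are invented for demonstration; they are not the paper's measured values, and the real analysis works on the full structure factor rather than an idealized two-Gaussian model.

```python
import numpy as np

def gaussian(q, center, width):
    # Unit-amplitude Gaussian component of a diffraction peak.
    return np.exp(-0.5 * ((q - center) / width) ** 2)

# Scattering-vector grid (arbitrary units).
q = np.linspace(1.0, 3.0, 400)

# Synthetic "measured" first diffraction peak: two hidden components,
# standing in for the non-tetrahedral and tetrahedral contributions,
# plus a little measurement noise. To the eye this looks like one
# broad, slightly asymmetric peak.
rng = np.random.default_rng(2)
signal = 1.0 * gaussian(q, 1.8, 0.15) + 0.6 * gaussian(q, 2.1, 0.20)
signal += rng.normal(0.0, 0.01, q.size)

# With the component centers and widths assumed known, recovering the
# amplitudes is a linear least-squares problem.
basis = np.column_stack([gaussian(q, 1.8, 0.15), gaussian(q, 2.1, 0.20)])
amps, *_ = np.linalg.lstsq(basis, signal, rcond=None)
print("recovered amplitudes:", np.round(amps, 2))
```

Even though the two components overlap heavily, the fit separates them cleanly, which is the same reason a careful analysis can resolve a bimodal distribution that a single broad peak appears to hide.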

This discovery may have huge implications across many scientific fields. Knowing the exact structural ordering of water is critical for a complete understanding of molecular biology, chemistry, and even many industrial applications. "It is very satisfying to be able to unravel the liquid structure of such a fundamental substance," senior author Hajime Tanaka says.