Published: 07 June 2012
New toolkit, developed by University of Texas researchers, demonstrates use of data-driven science to plan for future pandemics
In 2009, the H1N1 "swine flu" pandemic struck, infecting millions and killing more than 18,000 people worldwide, according to the World Health Organization. Though less severe than initially feared, the pandemic highlighted the threat of deadly viruses jumping from animals to humans, and the importance of quick and effective public health intervention.
In a globalized world, the probability of a severe pandemic striking is high, according to Lauren Ancel Meyers, an expert in infectious disease epidemiology at The University of Texas at Austin. A biologist by training, Meyers applies mathematical models and computer programs to understand, analyze and predict the transmission of diseases based on a large number of factors.
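To make the approach concrete, the sketch below implements the classic SIR (susceptible-infected-recovered) model, the simplest of the compartmental models that underlie forecasts like these. It is not code from Meyers' toolkit; the Euler time-stepping and every parameter value are illustrative assumptions.

```python
# Classic SIR model with daily Euler steps. All values are illustrative,
# not parameters from the Texas Pandemic Flu Toolkit.

def simulate_sir(beta, gamma, n, i0, days):
    """Return a list of daily (susceptible, infected, recovered) counts.

    beta  -- transmission rate per day; gamma -- recovery rate per day
    (R0 = beta / gamma); n -- population size; i0 -- initially infected.
    """
    s, i, r = n - i0, float(i0), 0.0
    history = []
    for _ in range(days):
        new_infections = beta * s * i / n
        new_recoveries = gamma * i
        s -= new_infections
        i += new_infections - new_recoveries
        r += new_recoveries
        history.append((s, i, r))
    return history

# Example: a flu-like pathogen (R0 = 1.5) seeded in a city of one million.
curve = simulate_sir(beta=0.3, gamma=0.2, n=1_000_000, i0=10, days=180)
peak_day, (_, peak_infected, _) = max(enumerate(curve), key=lambda d: d[1][1])
print(f"Epidemic peaks on day {peak_day} with about {peak_infected:,.0f} infected")
```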
During the H1N1 outbreak, Meyers worked with officials in national and international public health agencies to simulate the spread of the disease based on properties of the flu virus, demographic information, traffic patterns, and other data. Her models illustrated how the disease might spread and how the placement of preventative measures, like vaccines and antivirals, could mitigate the pandemic.
Meyers' efforts to enhance data-driven science were aided by the Texas Advanced Computing Center (TACC), a leading advanced computing center located at the University. Meyers collaborated with TACC staff and used the center's systems to forecast infections as the pandemic progressed, and to develop visualizations that vividly depicted the spread of the disease.
In 2010, Meyers began discussions with public health authorities at the Texas Department of State Health Services (DSHS) to continue the project. "They were interested in working with us to build quantitative tools to support decision making during pandemics," Meyers said.
Working with a team of UT researchers from biology, mathematics, statistics, engineering and computing, Meyers led the development of the Texas Pandemic Flu Toolkit, a web-based service that simulates the spread of pandemic flu through the state, forecasts the number of flu hospitalizations, and determines where and when to place ventilators to minimize fatalities.
The toolkit can be used in emergency situations to guide real-time decision-making. For example, public health officials might use the forecaster tool to determine when a pandemic is likely to crest and how many infections and hospitalizations to expect, information that can then be communicated to local authorities.
"While the forecasts will not be exact, they give a rough idea of how many people will be hospitalized around the state and when an epidemic may peak. Such information can lead to more timely and effective control measures," Meyers said.
The toolkit can also be used proactively to develop scenarios of probable pandemics and to see how they may impact different locations, age groups, and demographics. Various interventions, such as antivirals, vaccines, and public health announcements, can be input into the forecasts to determine their effect at different stages in the pandemic's evolution.
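Here is a minimal sketch of that kind of scenario analysis: the same SIR dynamics as above, rerun with a hypothetical intervention that reduces transmission from a chosen day onward. The 30 percent effect size and the start days are invented for illustration, not drawn from the toolkit.

```python
# Rerun an SIR-style epidemic with a hypothetical intervention (e.g., a
# vaccination campaign) that cuts transmission from a chosen day onward.
# Effect size and timing are invented for illustration.

def infected_curve(beta, gamma, n, i0, days, start_day, reduction):
    s, i = n - i0, float(i0)
    curve = []
    for day in range(days):
        b = beta * (1 - reduction) if day >= start_day else beta
        new_infections = b * s * i / n
        s -= new_infections
        i += new_infections - gamma * i
        curve.append(i)
    return curve

# Compare launching a 30%-effective intervention on day 20 vs. day 60.
for start in (20, 60):
    curve = infected_curve(beta=0.3, gamma=0.2, n=1_000_000, i0=10,
                           days=180, start_day=start, reduction=0.3)
    print(f"Intervention on day {start}: peak infected ~ {max(curve):,.0f}")
```

Comparing the two runs shows the familiar result such tools quantify: an earlier intervention blunts the peak far more than a later one.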
Importantly, TACC's powerful supercomputers allow the data to be crunched and the results distributed to a large number of stakeholders simultaneously.
"Our methods often require rapid evaluation of a large number of possible solutions, which would be intractable without high performance computers," Meyers explained.
Bruce Clements, director of community preparedness with the DSHS, said the project provides Texas with an unprecedented ability for pandemic simulation and forecasting.
"The toolkit allows us to respond more effectively by providing the ability to quickly adjust predictive variables as we gather information. We can use this information to focus response efforts and optimize resources," Clements said. "These same tools may be used in training and exercises to better prepare our public health workforce."
Much of the data that drives the toolkit comes from the Outpatient Influenza-Like Illness Surveillance Network (ILINet). Texas ILINet has 134 enrolled providers who voluntarily report outpatient influenza-like illness by age group to the CDC on a weekly basis. This information is fed into the toolkit and grounds the simulations in real data. Historical information from past pandemics is also included, as are factors like humidity predictions and school calendars, both of which play a large role in the evolution of a pandemic.
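A toy version of that grounding step might look like the following: choose the transmission rate whose simulated weekly infection counts best match reported surveillance counts. The "observed" numbers below are invented, and the toolkit's real estimation machinery is considerably richer.

```python
# Fit a transmission rate to weekly surveillance counts by brute-force
# least squares. The "observed" counts are invented for illustration.

def weekly_new_infections(beta, gamma=0.2, n=1_000_000, i0=50, weeks=10):
    s, i = n - i0, float(i0)
    weekly = []
    for _ in range(weeks):
        total = 0.0
        for _ in range(7):  # seven daily steps per reporting week
            new_infections = beta * s * i / n
            s -= new_infections
            i += new_infections - gamma * i
            total += new_infections
        weekly.append(total)
    return weekly

observed = [120, 250, 520, 980, 1700, 2600, 3400, 3800, 3500, 2900]

def squared_error(beta):
    return sum((sim - obs) ** 2
               for sim, obs in zip(weekly_new_infections(beta), observed))

best_beta = min((b / 100 for b in range(20, 61)), key=squared_error)
print(f"Best-fit transmission rate: beta = {best_beta:.2f}")
```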
David Morton from the UT Cockrell School of Engineering led the Texas ventilator-stockpiling portion of the project. A significant fraction of patients infected during an influenza pandemic require ventilators. There is a strategic national stockpile of about 6,000 ventilators, but no one knows whether it is right-sized for the variety of possible pandemic scenarios.
The tool takes information from the forecaster and projects where ventilators should be optimally placed to limit mortality. In essence, it provides the best solution to a bad situation.
"If we have a mild or moderate pandemic then we have ample ventilators. But if we have a severe pandemic, then we're grossly short," Morton said. "Officials knew that, but the thing they valued here was that we could actually quantify how short are we."
In a severe pandemic, Texas alone would need all 6,000 ventilators; roughly ten times as many would be required nationwide.
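As a rough illustration of the placement problem (not Morton's actual method, which uses formal optimization), the sketch below greedily assigns a fixed stockpile to the regions with the largest remaining forecast shortfall. The regional demand figures are hypothetical.

```python
# Greedy stand-in for the stockpile-placement idea: repeatedly send one
# ventilator to the region with the largest remaining forecast shortfall.
# Morton's group used formal optimization; demand figures are hypothetical.

import heapq

def allocate(demand, stockpile):
    """demand: region -> forecast peak need. Returns region -> units placed."""
    allocation = {region: 0 for region in demand}
    # Min-heap on negated unmet demand, so the neediest region pops first.
    heap = [(-need, region) for region, need in demand.items()]
    heapq.heapify(heap)
    for _ in range(stockpile):
        unmet, region = heapq.heappop(heap)
        if unmet == 0:
            break  # every region's forecast demand is already met
        allocation[region] += 1
        heapq.heappush(heap, (unmet + 1, region))
    return allocation

demand = {"Houston": 2400, "Dallas": 2100, "Austin": 900, "El Paso": 600}
print(allocate(demand, stockpile=6000))
```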
Meyers, Morton, and UT graduate students Tom Hladish and HsinChan (Neo) Huang developed the domain-specific aspects of the toolkit. However, it was TACC research associate Greg P. Johnson who translated the researchers' algorithms into a web-based interface that provides information-rich data visualizations directly to a user's desktop.
"Greg did a phenomenal job of translating our ideas into extremely user-friendly and transparent tools that surpassed all of our expectations," Meyers said.
Meyers and company unveiled the toolkit to DSHS officials in late 2011. Since that time, public health officials in Texas have been able to log on to the portal, enter a series of parameters, and receive clear and dynamic maps, graphs, and charts showing the results based on various conditions and inputs.
Meyers continues to meet with the CDC and the Texas DSHS to discuss ideas for extending the toolkit.
"These tools are designed to be accessible and to help officials make more quantitatively grounded policy decisions," Meyers said. "The user friendly web-based interfaces help to translate basic science into practical inputs for public health preparedness and response."
Data-driven science is a big deal these days. Recently, the U.S. government committed hundreds of millions of dollars toward enabling new applications that can harness data streams effectively. The Texas Pandemic Flu toolkit shows how this new paradigm will affect public health decision-making. For now, it remains one of the few pandemic-fighting tools available to public health officials.
Said Meyers: "We've just scratched the surface in terms of developing quantitative tools that improve our ability to track and control infectious disease outbreaks."