- Published: 06 June 2012
For millennia, mankind has discovered new drugs either through educated guesswork or blind luck. But with the proliferation of advanced computing, a new paradigm has emerged whereby one can find drug targets through simulation and modeling.
A combination of modeling, simulation, analysis and visualization, the process is accomplished through the expert application of biophysical algorithms and the high-performance, parallel-processing supercomputers of the Texas Advanced Computing Center (TACC).
"Computers are a good way to accelerate the process of drug design," said Chandrajit Bajaj, professor of computer science at The University of Texas at Austin. "It takes 10 years to prove out a drug, and a billion dollars or more. Hence computational drug discovery is not only timesaving, but economics tells you this is the way we should be going."
Recent studies suggest the cost and time required to move a drug from discovery to delivery are growing, with fewer drugs approved by the FDA per billion dollars spent. These findings highlight the need for faster, cheaper methods.
The work of discovering a breakthrough new drug begins with a careful analysis of the virus, bacteria or mutation that causes the illness. By shooting powerful beams of electrons through a sample, electron microscopes create nanoscale pictures of the relevant molecules in near-native conditions. The images that emerge from these microscopes are a speckled mess, however, and must be cleaned up considerably to become useful.
Combine 100,000 of these cleaned-up images and you have a three-dimensional model that reveals the structure of the molecule you are exploring. Structure is an important characteristic for drug discovery because it reflects the shape of the relevant molecules, and shape complementarity — the degree to which two molecules fit together — is a major factor in whether a drug will bind: the first step toward accomplishing its mission. When done correctly, the 3D model allows one to understand, identify and test potential binding sites on a virus.
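The cleanup-by-combination idea can be illustrated with a toy sketch (this is not Bajaj's reconstruction code, just the underlying statistics): averaging many independently noisy copies of the same projection cancels the random speckle while the underlying signal survives, with the noise shrinking by roughly the square root of the number of images combined.

```python
import random
import statistics

random.seed(7)

# Toy 1-D "projection": the clean signal the microscope would record
# if there were no noise (a bright bump on a flat background).
truth = [1.0 if 20 <= i < 40 else 0.0 for i in range(64)]

n_images = 10_000
noise_sigma = 5.0  # the noise dwarfs the signal, as in raw micrographs

# Sum many independently noisy copies of the same projection.
running_sum = [0.0] * len(truth)
for _ in range(n_images):
    for i, t in enumerate(truth):
        running_sum[i] += t + random.gauss(0.0, noise_sigma)
average = [s / n_images for s in running_sum]

# Averaging N images shrinks the noise by ~sqrt(N), so sigma drops
# from 5.0 to roughly 0.05 here, and the bump becomes clearly visible.
residual = [a - t for a, t in zip(average, truth)]
print(f"residual noise level: {statistics.pstdev(residual):.3f}")
```

Real single-particle reconstruction is far harder — each image is a projection from an unknown orientation — but the same signal-averaging principle is what makes 100,000 noisy snapshots add up to one usable 3D map.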
"The nice thing about the electron projection is that you can build up volumetric models, so you can not only build a surface model from the outside, but also include information that is inside," Bajaj said.
As a consequence of improvements to the image reconstruction and modeling algorithms, Bajaj and his collaborators can now identify finer structural features of molecules, like individual side chains, the floppy but crucial limbs that extend from a molecule's central backbone. This level of detail is required to accurately predict binding.
A recent paper by Bajaj, Samrat Goswami (Exa Corporation), and Qin Zhang (UT Austin) in the February 2012 edition of the Journal of Structural Biology showed that their algorithms were able to detect the secondary structures (α-helices and β-sheets) of proteins through intermediate (6-10 Å) and coarse (10-15 Å) resolution 3D maps reconstructed primarily from single particle cryo-electron microscopy.
"If you don't get all the factors into simulation, you get the wrong answer and your predictions suffer," said Bajaj. "And if your predictions suffer, you haven't done anything to accelerate the solution. You're just loading yourself down."
Once a target molecule's structure has been determined, it is then necessary to test potential drug compounds to see if any of them might fit a binding site. In the case of HIV, as studied by Bajaj, the goal is to find a molecule that can bind to a specific location on the virus's surface and signal to the virus that it has reached its destination. The potential drug would induce the virus to spill its contents into the extracellular medium, where it would do no harm, rather than injecting its genetic material into a host cell.
But in a world of potential compounds, how does a scientist find the one drug molecule that might match and bind to the target region?
Over the last decade, computer scientists have found faster ways to search for things using computers. Call it the "Googlization" of research. Bajaj has taken these insights and applied them to the problem of drug target screening. Instead of using search algorithms to find the closest coffee shop, Bajaj and his research team use them to rank candidate compounds by the binding energies and biochemical dynamics that arise when two molecules come into contact, putting the most compatible compounds at the top of the list.
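The ranking step can be sketched in a few lines; the compound names and binding energies below are invented for illustration, and a real pipeline would compute the energies with docking simulations rather than read them from a table.

```python
# Hypothetical screening results: compound name -> estimated binding
# energy in kcal/mol (more negative means a tighter predicted fit).
binding_energy = {
    "compound_A": -4.2,
    "compound_B": -9.7,
    "compound_C": -6.1,
    "compound_D": -1.3,
}

def rank_candidates(scores, top_k=3):
    """Return the top_k compounds ordered from strongest to weakest binder."""
    return sorted(scores, key=scores.get)[:top_k]

best = rank_candidates(binding_energy)
print(best)  # strongest predicted binders come first
```

The expensive part is computing the scores, not sorting them; the payoff of the search-style formulation is that only the top of the ordered list needs human attention.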
The molecules with the highest ranking are then visualized by Bajaj and analyzed to reveal where the algorithms have worked and where they can be improved.
"We not only quantify the accuracy, but also the errors that we make," Bajaj explained. "Without knowing the cause of error, how can you go back and improve your model?"
To make matters more difficult, researchers must ensure that the target compounds don't bind to other vital biological molecules, as these can lead to serious side effects. The sheer number of possible combinations and configurations boggles the mind. But as computer hardware and algorithms improve, such problems are becoming tractable.
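A minimal sketch of that selectivity check, again with invented names and numbers: a candidate is kept only if it binds the target tightly while leaving a vital off-target protein alone.

```python
# Hypothetical predicted binding energies (kcal/mol) against the
# intended target and against one vital off-target protein.
candidates = {
    "compound_B": {"target": -9.7, "off_target": -8.9},
    "compound_C": {"target": -6.1, "off_target": -1.0},
    "compound_A": {"target": -4.2, "off_target": -0.5},
}

TARGET_CUTOFF = -5.0      # must bind the target at least this tightly
OFF_TARGET_CUTOFF = -3.0  # must NOT bind the off-target tighter than this

def is_selective(scores):
    """Keep compounds that hit the target but spare the off-target."""
    return (scores["target"] <= TARGET_CUTOFF
            and scores["off_target"] > OFF_TARGET_CUTOFF)

selective = [name for name, s in candidates.items() if is_selective(s)]
print(selective)  # the tightest binder is rejected for off-target risk
```

A production screen would repeat this against many off-target proteins at once, which is where the combinatorics the article mentions comes from.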
Bajaj uses practically every system in TACC's computational ecosystem to solve these problems, including Ranger and Lonestar (TACC's high performance computing systems), Longhorn (TACC's remote visualization system) and Stallion (the super high-resolution tiled display in the TACC/ACES Visualization Laboratory).
"We are blessed at UT Austin with having all of these high performance computers and the TACC organization, because without them we couldn't make progress," Bajaj said.
Utilizing both CPU- and GPU-based methods, and mapping different processes to different architectures to accelerate the computation, Bajaj and his team improved the resolution and accuracy of the drug models tremendously. They have also sped up the docking search by an order of magnitude. "What used to take months is now taking a few days," he said.
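The CPU/GPU mapping itself is beyond a short example, but the basic pattern — fanning independent docking evaluations out across a pool of workers — can be sketched as follows. `dock_score` is a made-up stand-in formula, not a real energy model; a CPU-bound production run would use `ProcessPoolExecutor` or GPU kernels instead of threads.

```python
from concurrent.futures import ThreadPoolExecutor

def dock_score(conformer_id):
    """Stand-in for an expensive docking evaluation (invented formula)."""
    return -(conformer_id % 7) - 0.1 * conformer_id

conformers = list(range(100))

# Each evaluation is independent of the others, so they can all run
# concurrently; Executor.map returns results in submission order.
with ThreadPoolExecutor(max_workers=8) as pool:
    scores = dict(zip(conformers, pool.map(dock_score, conformers)))

best = min(scores, key=scores.get)  # lowest energy = tightest predicted fit
print(best, scores[best])
```

Because the candidates never talk to each other, the speedup scales almost linearly with worker count, which is why this kind of screening maps so naturally onto supercomputers like Ranger and Lonestar.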
According to J. Tinsley Oden, director of UT's Institute for Computational Engineering and Sciences (ICES), Bajaj's research has the potential to transform the way the drug discovery process works.
"Dr. Bajaj's breakthrough work on computer modeling the incredibly complex mechanisms central to the design, behavior, and delivery of drugs for combating infectious diseases is a testament to the power of modern computer modeling and simulation, high performance computing, computer visualization, and to Bajaj's special abilities and deep insight into both computer modeling and the biological processes involved in drug delivery," he said.
Computational drug discovery is a hot topic in academic research centers and industry alike. As chair of a study section of the National Institutes of Health, Bajaj often speaks with individuals from the pharmaceutical industry about changes in the field.
"More and more, they're moving into the computational drug screening arena, and more and more it's teams of people working together," Bajaj said. "The biophysicist, the biochemist and the synthetic chemist are sitting together with the computational expert, and they say it's giving them clues as to what they should be doing next."
Chandrajit Bajaj, professor of computer science at The University of Texas at Austin, has been integrally involved in these efforts for more than 20 years. Holder of the Computational and Applied Mathematics Chair in Visualization and director of the Computational Visualization Center at UT's Institute for Computational Engineering and Sciences, Bajaj has over his career systematically attacked each step of the drug discovery process, improving the speed and accuracy of the algorithms involved in computational drug discovery. The research has been sponsored by the National Institutes of Health, the National Science Foundation, and the Texas Institute for Drug and Diagnostic Development.