Published: 06 June 2012
National Science Foundation grants will bring together what's known about how species are related
A new initiative aims to build a comprehensive tree of life that brings together everything scientists know about how all species are related, from the tiniest bacteria to the tallest tree.
Researchers are working to provide the infrastructure and computational tools to enable automatic updating of the tree of life, as well as develop the analytical and visualization tools to study it.
Scientists have been building evolutionary trees for more than 150 years, since Charles Darwin drew the first sketches in his notebook.
Darwin's theory of evolution explained that millions of species are related and gave biologists and paleontologists the enormous challenge of discovering the branching pattern of the tree of life.
But despite significant progress in fleshing out the major branches of the tree of life, today there is still no central place where researchers can go to visualize and analyze the entire tree.
Now, thanks to grants totaling $13 million from the National Science Foundation's (NSF) Assembling, Visualizing, and Analyzing the Tree of Life (AVAToL) program, three teams of scientists plan to make that a reality.
"The AVAToL awards are an exciting new direction for an area that's a foundation of much of biology," says Alan Townsend, director of NSF's Division of Environmental Biology. "That's critical to understanding a changing relationship between human society and Earth's biodiversity."
Figuring out how the millions of species on Earth are related to one another isn't just important for pinpointing an antelope's closest kin, or determining if tuna are more closely related to starfish or hagfish.
Information about evolutionary relationships is fundamental to comparative biology research. It helps scientists identify promising new medicines; develop hardier, higher-yielding crops; and fight infectious diseases such as HIV, anthrax and influenza.
If evolutionary trees are so widely used, why has assembling them across all life been so hard to achieve?
It's not for lack of research, or data. Advances in DNA sequencing and evolutionary analysis, discovery of pivotal early fossils, and novel methods and tools have enabled thousands of new evolutionary trees to be published in scientific journals each year.
However, most of these focus on specific, disconnected branches of the tree of life.
Part of the difficulty lies in the sheer enormity of the task. The largest evolutionary trees to date contain roughly 100,000 groups of organisms.
Assembling the branches for all species of animals, plants, fungi and microbes--and the countless more still being named or discovered--will require new computational tools for analyzing large data sets, for combining diverse kinds of data, and for connecting vast numbers of published trees into a synthetic whole.
Another difficulty lies in how scientists typically disseminate their results. Of all the evolutionary trees that have been published, researchers estimate a mere four percent end up in a database in digital form.
Most of the knowledge is locked up in figures in static journal articles in file formats that may be difficult for other researchers to download, reanalyze or merge with new information.
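For context, the usual machine-readable form of an evolutionary tree is a Newick string, which encodes the branching pattern with nested parentheses. The sketch below is a minimal, illustrative parser only; real tools such as DendroPy or ETE also handle branch lengths, quoting and comments.

```python
# Minimal sketch: parsing a Newick-format tree (the common digital
# interchange format for phylogenies) into nested Python tuples.
# Illustrative only -- ignores branch lengths, quoting and comments.

def parse_newick(text):
    """Parse a Newick string like '(A,(B,C));' into nested tuples."""
    pos = 0

    def node():
        nonlocal pos
        if text[pos] == '(':          # internal node: a list of children
            pos += 1                  # consume '('
            children = [node()]
            while text[pos] == ',':
                pos += 1
                children.append(node())
            pos += 1                  # consume ')'
            return tuple(children)
        start = pos                   # leaf: read the taxon name
        while text[pos] not in ',();':
            pos += 1
        return text[start:pos]

    tree = node()
    assert text[pos] == ';'
    return tree

# Tuna group with hagfish (fellow chordates), not with starfish:
print(parse_newick('((tuna,hagfish),starfish);'))
# -> (('tuna', 'hagfish'), 'starfish')
```

Trees stored this way can be loaded, compared and merged programmatically, which is exactly what a static figure in a journal PDF does not allow.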
AVAToL aims to change that.
What makes this program different from previous efforts, scientists say, is its scope: its focus on creating an open, dynamic evolutionary framework that can be continually refined as new biodiversity data are collected, and its development of computational and visualization tools to scale up tree-based evolutionary analyses.
Researchers will be able to go online and compare their trees to others that have already been published, or download trees for further study.
They'll also be able to expand the tree, filling in the missing branches and placing newly named or discovered species among their relatives.
The goal is to incorporate new trees automatically, so the complete tree can be continuously updated.
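In the simplest case, placing a newly named species among its relatives means grafting a new leaf next to its closest known relative. The sketch below illustrates that operation on a toy nested-tuple tree; the function and data model are hypothetical, not the project's actual machinery.

```python
# Sketch of "placing a newly named species among its relatives":
# graft a new leaf as the sister taxon of its closest known
# relative. Toy nested-tuple representation, illustrative only.

def graft(tree, relative, new_species):
    """Return a copy of `tree` with `new_species` added as the
    sister taxon of `relative`."""
    if tree == relative:                  # found the relative:
        return (relative, new_species)    # pair it with the newcomer
    if isinstance(tree, tuple):           # internal node: recurse
        return tuple(graft(child, relative, new_species)
                     for child in tree)
    return tree                           # unrelated leaf: unchanged

tree = (('tuna', 'hagfish'), 'starfish')
print(graft(tree, 'tuna', 'salmon'))
# -> ((('tuna', 'salmon'), 'hagfish'), 'starfish')
```

A continuously updated tree of life amounts to applying operations like this automatically, at the scale of millions of species, whenever new results are published.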
In addition to the creation of an updatable tree of life, AVAToL scientists will create new tools for the kinds of research that rely on evolutionary trees and for the collection and analysis of important evolutionary data, including from fossils critical to the placement of many branches in the tree of life.
The three NSF-funded AVAToL projects are:
Automated and Community-Driven Synthesis of the Tree of Life
Principal Investigator: Karen Cranston, Duke University and the National Evolutionary Synthesis Center
This project will produce the first comprehensive online draft tree of all 1.8 million named species, accessible to both the public and scientists. Assembly of the tree will incorporate previously published results, along with efforts to develop, test and improve methods of data synthesis. This initial tree of life, called the Open Tree of Life, will not be static: scientists will develop tools for researchers to update and revise the tree as new data come in.
Arbor: Comparative Analysis Workflows for the Tree of Life
Principal Investigator: Luke Harmon, University of Idaho
Scientists deal with daunting volumes of data. One of the most basic challenges facing researchers is how to organize that information into a usable format that can inspire new scientific insights. This project team is working to develop a way to visually portray evolutionary data so scientists can see, at a glance, how organisms are related. The team will create software tools that will enable researchers to visualize and analyze data across the tree of life, enabling research in all areas of comparative biology at multiple evolutionary, spatial and temporal scales. The results have the potential to transform the way biologists test evolutionary and ecological hypotheses, enabling new research in fields from medicine and public health to agriculture, ecology and genetics.
Next Generation Phenomics for the Tree of Life
Principal Investigator: Maureen O'Leary, SUNY-Stony Brook
This team of biologists, computer scientists and paleontologists will extend and adapt methods from computer vision, machine learning and natural language processing to enable rapid, automated study of species' phenotypes on a vast scale across the tree of life. The team's goal is to develop large phenomic datasets using these new methods, and to provide the scientific community and the public with tools for future work of this kind. Phenomics is an area of biology that measures the physical and biochemical traits of organisms as they change in response to genetic mutations and environmental influences.
Enormous phenomic datasets, many with images, will foster public interest in biodiversity and the fossil record. Phenotypic data allow scientists to reconstruct the evolutionary history of fossil species, in turn crucial for an understanding of the history of life. This project will leverage recent advances in image analysis and natural language processing to develop novel approaches to rapidly advance the collection and analysis of phenotypic data for the tree of life.
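As a toy illustration of the text-mining side of this work: the idea is to pull structured trait statements out of free-running text such as species descriptions. Real phenomics pipelines use trained NLP models and curated ontologies; the pattern below merely stands in for the general concept.

```python
import re

# Toy sketch of mining phenotype statements from free text.
# Real pipelines use trained NLP models, not a single regex;
# the taxa and sentence template here are illustrative only.

SENTENCE = re.compile(
    r'(?P<taxon>[A-Z][a-z]+ [a-z]+) has (?P<trait>[a-z ]+)')

text = ("Homo sapiens has opposable thumbs. "
        "Danio rerio has a lateral line.")

# Map each species name to the trait asserted about it.
traits = {m['taxon']: m['trait'] for m in SENTENCE.finditer(text)}
print(traits)
# -> {'Homo sapiens': 'opposable thumbs', 'Danio rerio': 'a lateral line'}
```

Scaled up with machine learning, the same extraction idea turns decades of published descriptions of living and fossil species into analyzable phenomic data.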