- Bioinformatics profiling identifies a new mammalian clock gene (LATEST) 04-22-2014
'Big data' approach quickens search for human genes related to control of daily biological rhythms
Over the last few decades, researchers have characterized a set of clock genes that drive daily rhythms of physiology and behavior in all types of species, from flies to humans. Over 15 mammalian clock proteins have been identified,...
- NCAR, CU Join Intel Parallel Computing Centers Program (LATEST) 04-22-2014
The National Center for Atmospheric Research (NCAR) and the University of Colorado Boulder (CU-Boulder) announced today that they will join the Intel Parallel Computing Centers program. Participants in the program will develop methods to increase the performance of applications that use advanced microprocessor technologies and will help...
- Mysteries of a nearby planetary system's dynamics now are solved (LATEST) 04-22-2014
Mysteries of one of the most fascinating nearby planetary systems now have been solved, report authors of a scientific paper to be published by the journal Monthly Notices of the Royal Astronomical Society in its early online edition on 22 April 2014. The study, which presents the first viable model for the planetary system...
- Simulating in tiny steps gave birth to long-sought-after method (LATEST) 04-22-2014
Using supercomputer simulations to predict which drug candidates offer the greatest potential has thus far not been very reliable, because both small drug-like molecules and the amino acids of proteins vary so much in their chemistry. Uppsala researchers have now cunningly managed to develop a method that has proven to be precise,...
- Computational method dramatically speeds up estimates of gene expression (LATEST) 04-22-2014
Method developed by CMU and UMD could pay dividends as genomic medicine expands
With gene expression analysis growing in importance for both basic researchers and medical practitioners, researchers at Carnegie Mellon University and the University of Maryland have developed a new computational method that dramatically speeds up...
- Big data poses great opportunities (LATEST) 04-22-2014
Advances at the technology frontier have resulted in major disruptions and transformations of massive data-processing infrastructures. For the past three decades, classical database management systems, data warehousing and data analysis technologies have been well recognized as effective tools for data management and analysis....
- Researchers reconstruct Pacific storm track in climate model (LATEST) 04-17-2014
Model-within-a-model improves accuracy of global simulation of clouds, weather
The first study that combines different scales — cloud-sized and earth-sized — in one model to simulate the effects of Asian pollution on the Pacific storm track shows that Asian pollution can influence weather over much of the world. The results show...
- At the origin of cell division (LATEST) 04-16-2014
The features of living matter emerge from inanimate matter
Droplets of filamentous material enclosed in a lipid membrane: these are the models of a "simplified" cell used by the SISSA physicists Luca Giomi and Antonio DeSimone, who simulated the spontaneous emergence of cell motility and division - that is, features of living...
- Progress in the fight against quantum dissipation (LATEST) 04-16-2014
Scientists at Yale have confirmed a 50-year-old, previously untested theoretical prediction in physics and improved the energy storage time of a quantum switch by several orders of magnitude. They report their results in the April 17 issue of the journal Nature. High-quality quantum switches are essential for the development of...
- A*STAR researchers propose network-based evaluation tool to assess relief operations feasibility (LATEST) 04-16-2014
The United Nations Office for Disaster Risk Reduction reported that disasters affected around 2.9 billion people worldwide from 2000 to 2012, killing more than a million and causing an estimated 1.7 trillion US dollars in damage. Moreover, natural disasters and their damages have been documented to occur with increasing intensity. Given the...
- NEC selects Chelsio for vector supercomputer (LATEST) 04-16-2014
Chelsio Communications has announced that its T4 based 10GbE adapters have been selected by NEC Corporation for NEC's HPC Systems for network and storage connectivity in SX-ACE, the latest SX-series Vector Supercomputer, targeted at large data processing applications. Equipped with the Chelsio T420-CR Storage Adapter, the NEC...
- Verizon Deploys 100G Technology on Metro Network (LATEST) 04-15-2014
To improve network efficiency and meet continued bandwidth demand, Verizon is deploying 100G technology on its U.S. metro network. The company will gain the same benefits of increased capacity, superior latency and improved scalability in its metro network as it has from deploying 100G technology in its long-haul network. Verizon is...
- Earthquake simulation tops one quadrillion flops: Computational record on SuperMUC (LATEST) 04-15-2014
A team of computer scientists, mathematicians and geophysicists at Technische Universitaet Muenchen (TUM) and Ludwig-Maximilians Universitaet Muenchen (LMU) has – with the support of the Leibniz Supercomputing Center of the Bavarian Academy of Sciences and Humanities (LRZ) – optimized the SeisSol earthquake simulation software on...
- New Context, Chef advance IT automation (LATEST) 04-15-2014
New Context has announced a new partnership with Chef, the leader in web-scale IT automation, to further the goal of enabling business transformation through automation customized to organizations' specific needs. The collaboration is intended to address the need for greater flexibility and adaptability in enterprise IT...
- Quantum manipulation: Filling the gap between quantum and classical world (LATEST) 04-14-2014
Quantum superposition is a fundamental and also intriguing property of the quantum world. Because of superposition, a quantum system can be in two different states simultaneously, like a cat that can be both "dead" and "alive" at the same time. However, this anti-intuitive phenomenon cannot be observed directly, because whenever a...
- Combs of light accelerate communication (LATEST) 04-14-2014
Miniaturized optical frequency comb sources allow for transmission of data streams of several terabits per second over hundreds of kilometers – this has now been demonstrated by researchers of Karlsruhe Institute of Technology (KIT) and the Swiss École Polytechnique Fédérale de Lausanne (EPFL) in an experiment presented in the...
- Cloud leaders enlist ISVs to sell horizontal clouds in vertical markets (LATEST) 04-14-2014
According to Technology Business Research Inc.’s (TBR) 4Q13 Public Cloud Benchmark, public cloud revenue grew 47.4% from the year-ago quarter to $6.2 billion across the 50 vendors covered in the report, while market growth for overall public cloud services grew 25.6% year-to-year over the same period to $15.1 billion. Cost savings,...
- Bailing out the world's fresh water bank (LATEST) 04-14-2014
An Australian-based water scientist is testing a new technology to help save imperilled underground water resources in Australia and around the world as climate change tightens its grip on the global food supply. Dr Margaret Shanafield of the National Centre for Groundwater Research and Training at Flinders University has developed a...
- Record number of CU students awarded prestigious access to supercomputing (LATEST) 04-13-2014
A record number of University of Colorado Boulder students have received a prestigious National Science Foundation Graduate Research Fellowship, which recognizes outstanding graduate students in the science, technology, engineering and mathematics (STEM) fields. Thirty CU-Boulder students were among the 2,000 fellowship winners...
- AWI researchers decipher climate paradox from the Miocene (LATEST) 04-12-2014
Growth of Antarctic ice sheet triggered warming in the Southern Ocean
Scientists of the German Alfred Wegener Institute, Helmholtz Centre for Polar and Marine Research (AWI), have deciphered a supposed climate paradox from the Miocene era by means of complex model simulations. When the Antarctic ice sheet grew to its present-day...
- Published: 05 March 2012
Initial Projects Range from Storm Predictions to Stock Market Data
Accurately predicting severe storms, or what Wall Street’s markets will do next, may become just a bit easier in coming months as Gordon, a unique supercomputer at the San Diego Supercomputer Center (SDSC) at the University of California, San Diego, begins helping researchers delve into these and other data-intensive projects.
Following acceptance testing in January, Gordon has now begun serving University of California and national academic researchers as well as industry and government agencies. Named for its massive amounts of flash-based memory, Gordon is part of the National Science Foundation’s (NSF) Extreme Science and Engineering Discovery Environment, or XSEDE program, a nationwide partnership comprising 16 supercomputers and high-end visualization and data analysis resources.
The first round of allocations recently approved by XSEDE includes about a dozen separate research projects that will use a combined 10 million processing hours on Gordon, whose data-intensive supercomputing capabilities will help scientists find, among other things, ways to create more robust molecular “force fields” for discovering new drugs for diseases, and explore how particles in atmospheric aerosols directly affect air quality.
Along with its innovative supernodes that create multi-terabyte shared-memory systems, Gordon, the result of a five-year, $20 million National Science Foundation (NSF) grant, can process data-intensive problems about 10 times faster than other supercomputers because it employs massive amounts of flash-based memory instead of slower spinning disks.
“To enable the basic and breakthrough science and scholarship of this new century, UC San Diego has made a significant investment in research cyberinfrastructure, an investment that will benefit our university, the University of California system, our state and our nation,” said UC San Diego Vice Chancellor for Research Sandra A. Brown. “Gordon is a powerful and promising part of that infrastructure, and should quickly demonstrate its value as a research resource.”
“Gordon is now assisting the research community in a number of very diverse and data-intensive projects, many of which could not be addressed previously because scientists simply didn’t have the [super]computer capability to do so – and some of which are rather unconventional applications that could have a broader, societal impact,” said SDSC Director Michael Norman, the principal investigator for the Gordon project.
Currently the most powerful supercomputer in all of Southern California based on speed and capability, Gordon was recently ranked among the top 50 supercomputers in the world in terms of speed of doing pure math. However, it is the most powerful high-performance computing (HPC) system anywhere when it comes to accessing data, according to Allan Snavely, SDSC’s associate director and project leader along with Norman. In recent validation tests, Gordon achieved an unprecedented 36 million input/output operations per second, or IOPS, a critical measure for doing data-intensive computing such as sifting through huge datasets to find a mere sliver of meaningful information.
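IOPS counts how many small, independent reads or writes a storage system completes per second; it, rather than raw bandwidth, becomes the bottleneck when a workload jumps between scattered records instead of streaming a file sequentially. The idea behind such a measurement can be sketched as follows. This is a toy illustration, not Gordon's actual benchmark; a real measurement tool such as fio bypasses the operating system's page cache and issues many requests concurrently:

```python
import os
import random
import time

def measure_read_iops(path, block_size=4096, n_ops=10_000):
    """Estimate random-read IOPS against a file by timing many
    small reads at random offsets. Illustrative only: results here
    are inflated by the OS page cache, which a real benchmark
    deliberately bypasses."""
    size = os.path.getsize(path)
    fd = os.open(path, os.O_RDONLY)
    try:
        start = time.perf_counter()
        for _ in range(n_ops):
            # Pick a random block-aligned-ish offset and read one block.
            offset = random.randrange(0, max(1, size - block_size))
            os.pread(fd, block_size, offset)
        elapsed = time.perf_counter() - start
    finally:
        os.close(fd)
    return n_ops / elapsed  # operations per second
```

The ratio of this number between flash and spinning disk, rather than sequential throughput, is what makes a flash-based machine attractive for "needle in a haystack" queries over large datasets.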
“Gordon’s unique ability to provide extremely fast random access to very large data sets is opening new doors for researchers across a wide range of disciplines, some outside the traditional realm of science,” said Snavely. “Such areas could include analyzing social-media data and seeing a political event such as the recent Arab Spring emerge in cyberspace before we saw people emerging in the street.”
Here are a few of the projects included in the first round of allocations for Gordon that were recently approved by XSEDE’s Resource Allocation Committee (XRAC). The XRAC consists of researchers and computational scientists who review the submissions and make allocation recommendations for all XSEDE resources:
Gordon Takes on Wall Street
So-called high-frequency trading has become commonplace in the U.S. stock market, and has been considered by the press to have contributed to the “flash crash” on May 6, 2010 that sent the Dow Jones Industrial Average down 998.5 points in just 20 minutes. The Securities and Exchange Commission recently said it is looking to curb high-frequency traders' huge influence on stock trading, and is considering charging fees for buy and sell orders that are later canceled, among other options.
Mao Ye, an assistant professor of finance at the University of Illinois at Urbana-Champaign, is using Gordon to sift through massive amounts of NASDAQ historical ITCH market data to better understand the practices and patterns in high-frequency trading.
According to Ye’s research, many of these traders place an order, only to cancel it within 0.001 second or less. “We want to find out if these cancelled orders are being done to manipulate the market in some way,” said Ye, whose research also has focused on odd lots – or trades of less than 100 shares – which do not have to be reported to the consolidated tape.
“Because the minimum report requirement is 100 shares, we believe that people trade 99 shares for a strategic reason: they may be trying to hide their information,” said Ye. “Since so many orders are cancelled within such a short time, a natural question to ask is whether orders are cancelled due to legitimate reasons, or if these orders are entered for deceptive and manipulative reasons. Also, are these orders contributing to the liquidity and efficiency of the financial market, or are they the causes of the recent flash crash?”
One recent hypothesis, called “quote stuffing,” holds that high-frequency traders send huge amounts of messages to purposely slow down the exchange’s trading system. “We’d like to know if these kinds of orders contribute to liquidity of the market, or whether they lead to abnormal volatility.”
Ye and his research team were allocated 500,000 processing hours on Gordon to process two years of data containing about 400 million order messages per day. Ye’s research could inform recent policy debates on whether a minimum quote life should be set for orders.
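The kind of screen Ye describes can be sketched in a few lines: match each cancellation back to its add message and flag orders whose lifetime falls under the threshold. The record layout below is a hypothetical simplification (real NASDAQ ITCH feeds are binary messages with many more message types and nanosecond timestamps), but the matching logic is the same:

```python
from collections import namedtuple

# Simplified order-message record; real ITCH data is binary and far richer.
Msg = namedtuple("Msg", "timestamp kind order_id")  # timestamp in seconds

def fleeting_orders(messages, max_lifetime=0.001):
    """Return ids of orders added and then cancelled within
    max_lifetime seconds (default 1 millisecond)."""
    add_time = {}
    fleeting = []
    for m in messages:
        if m.kind == "ADD":
            add_time[m.order_id] = m.timestamp
        elif m.kind == "CANCEL" and m.order_id in add_time:
            if m.timestamp - add_time.pop(m.order_id) <= max_lifetime:
                fleeting.append(m.order_id)
    return fleeting
```

At 400 million messages per day, the interesting engineering question is less the logic above than keeping the add-time index and the two-year message stream flowing through memory fast enough, which is where flash-backed random access pays off.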
Weather Warnings and Water Management
Two of the first research projects using Gordon involve climate simulation, severe weather prediction in particular. Ming Xue, a professor at the School of Meteorology and director of the Center for Analysis and Prediction of Storms (CAPS) at the University of Oklahoma, was granted 2,000,000 processor hours on Gordon to better understand and improve the accuracy of predictions of high-impact weather such as severe thunderstorms, tornadoes, and tropical cyclones.
“High-speed computing and data storage will allow us to create very high-resolution simulations of tornadoes for analysis at numerous output times, and allow us to use radar data at precise time intervals to create accurate numerical simulations and predictions of severe weather,” said Xue, whose research is funded by multiple NSF grants. “These data files are extremely large, so using traditional storage devices during the simulations can be extremely time-consuming. Gordon will help us significantly reduce the input/output time, thus speeding-up the overall research process.”
Craig Mattocks, an atmospheric research scientist at the Rosenstiel School of Marine and Atmospheric Science at the University of Miami, was awarded 3,500,000 processor hours on Gordon to deploy the Ocean Land Atmosphere Model (OLAM) Earth System on XSEDE computational resources as a Science Gateways project for teaching graduate-level meteorology, climate, and predictability courses. Mattocks’ research is also focused on generating the most detailed regional climate-change projections and simulations to date, to help guide water resource management decisions throughout South Florida.
Mattocks' group has teamed with hydrology modelers from the South Florida Water Management District (SFWMD), which is responsible for managing the flow of water throughout natural and artificial structures such as rivers, canals, floodgates, and water conservation areas. SFWMD must meet the diverse needs of 7.7 million people along with demands from tourism, industrial, and agricultural concerns – all while simultaneously trying to restore the Florida Everglades to their natural state.
The SFWMD has reconstructed the land-cover distribution of the natural system in order to visualize the dramatic changes that occurred in the Florida Everglades between the 1850s and present day. “When the sheet-water flow through the Everglades – or the flow of a thin layer of water that’s not concentrated into channels – was interrupted by drainage from canals in order to keep the land dry for human development, the entire ecosystem changed for the worse,” he said. “Much of the marshland was converted to agriculture, and runoff containing pesticides, fertilizers, and toxins was flushed into Florida Bay, killing the pristine coral reef systems there.”
According to Mattocks, it has been a challenge to restore the flow to something even remotely resembling the area’s earlier, natural system because of commercial agriculture and other man-made impacts.
“There’s been a war going on between environmentalists and commercial enterprises, so progress is slow and funding for restoration efforts is difficult to maintain,” he said. “SFWMD needs predictions of precipitation and evapotranspiration under different climate scenarios to drive their hydrology models in order to plan future water-management strategies in times of rapid climate change. With Gordon, we hope to be able to zoom these simulations down to a one-kilometer resolution, and run longer climate ‘timeslice’ experiments to provide more accurate information to make south Florida's water supply more resilient.”
More Robust Computer Simulations for Drug Design
Other researchers are relying on Gordon to help expose weaknesses in molecular dynamics simulations, which are instrumental in developing new drugs to combat diseases such as AIDS. Junmei Wang, an assistant professor in biochemistry at the UT Southwestern Medical Center in Dallas, Texas, was awarded 300,000 hours on Gordon to study large-scale biological systems and long-time-scale dynamic events using state-of-the-art molecular dynamics (MD) simulation techniques.
“For us, Gordon is ideal for evaluating our molecular mechanical force fields, which are the foundation of the MD technique, and for revealing any problems with a force field for studying biological systems and events – problems which can only be identified through long-time-scale simulations,” said Wang, who is using Gordon to conduct ab initio (from first principles) calculations to obtain high-quality structural and dynamic data for the model compounds used to develop molecular-mechanical force fields.
“Without a high quality molecular mechanical force field, it is impossible to successfully identify promising inhibitors of a protein or nucleic acid receptor from millions of compounds in libraries,” said Wang. “These promising inhibitors are potential drug leads, and in principle such structure-based drug design can be applied to battle any disease, such as AIDS.”
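To make the force-field idea concrete: a molecular-mechanical force field is a sum of simple energy terms evaluated over atomic coordinates, and fitting those terms well is what Wang's ab initio data supports. The sketch below implements just one standard ingredient, the Lennard-Jones non-bonded term, with illustrative argon-like parameters; it is not code or parameters from Wang's project, and a full force field also includes bond, angle, torsion, and electrostatic terms:

```python
import math

def lennard_jones_energy(coords, epsilon=0.238, sigma=3.4):
    """Total pairwise Lennard-Jones energy over a list of (x, y, z)
    atomic coordinates. Parameters are roughly argon-like for
    illustration: epsilon in kcal/mol, sigma in angstroms."""
    total = 0.0
    n = len(coords)
    for i in range(n):
        for j in range(i + 1, n):
            r = math.dist(coords[i], coords[j])
            sr6 = (sigma / r) ** 6
            # 4*eps*[(sigma/r)^12 - (sigma/r)^6]: repulsion minus attraction
            total += 4.0 * epsilon * (sr6 * sr6 - sr6)
    return total
```

The energy is zero at separation sigma and reaches its minimum of -epsilon at about 1.12*sigma; errors in parameters like these, accumulated over millions of time steps, are exactly the kind of force-field problem that only long simulations expose.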