Published: 05 March 2012
Initial Projects Range from Storm Predictions to Stock Market Data
Accurately predicting severe storms, or what Wall Street’s markets will do next, may become just a bit easier in coming months as Gordon, a unique supercomputer at the San Diego Supercomputer Center (SDSC) at the University of California, San Diego, begins helping researchers delve into these and other data-intensive projects.
Following acceptance testing in January, Gordon has now begun serving University of California and national academic researchers as well as industry and government agencies. Named for its massive amounts of flash-based memory, Gordon is part of the National Science Foundation's (NSF) Extreme Science and Engineering Discovery Environment (XSEDE) program, a nationwide partnership comprising 16 supercomputers and high-end visualization and data analysis resources.
The first round of allocations recently approved by XSEDE includes about a dozen separate research projects that will use a combined 10 million processing hours on Gordon, whose data-intensive supercomputing capabilities will help scientists, among other things, build more robust molecular “force fields” for discovering new drugs, and explore how particles in atmospheric aerosols directly affect air quality.
Along with its innovative supernodes that create multi-terabyte shared-memory systems, Gordon, the result of a five-year, $20 million NSF grant, can process data-intensive problems about 10 times faster than other supercomputers because it employs massive amounts of flash-based memory instead of slower spinning disks.
“To enable the basic and breakthrough science and scholarship of this new century, UC San Diego has made a significant investment in research cyberinfrastructure, an investment that will benefit our university, the University of California system, our state and our nation,” said UC San Diego Vice Chancellor for Research Sandra A. Brown. “Gordon is a powerful and promising part of that infrastructure, and should quickly demonstrate its value as a research resource.”
“Gordon is now assisting the research community in a number of very diverse and data-intensive projects, many of which could not be addressed previously because scientists simply didn’t have the [super]computer capability to do so – and some of which are rather unconventional applications that could have a broader, societal impact,” said SDSC Director Michael Norman, the principal investigator for the Gordon project.
Currently the most powerful supercomputer in Southern California based on speed and capability, Gordon was recently ranked among the top 50 supercomputers in the world in terms of raw computational speed. However, it is the most powerful high-performance computing (HPC) system anywhere when it comes to accessing data, according to Allan Snavely, SDSC’s associate director and project leader along with Norman. In recent validation tests, Gordon achieved an unprecedented 36 million input/output operations per second, or IOPS, a critical measure for data-intensive computing tasks such as sifting through huge datasets to find a mere sliver of meaningful information.
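Gordon’s 36 million IOPS figure came from dedicated validation benchmarks. As a rough illustration of what the metric measures, the following Python sketch (a toy, not SDSC’s actual test harness) times random small reads against a scratch file and reports operations per second:

```python
import os
import random
import tempfile
import time


def measure_read_iops(path, file_size=8 * 1024 * 1024, block=4096, duration=0.5):
    """Issue random block-sized reads against `path` for `duration` seconds
    and return the achieved operations per second."""
    ops = 0
    deadline = time.perf_counter() + duration
    with open(path, "rb") as f:
        while time.perf_counter() < deadline:
            f.seek(random.randrange(0, file_size - block))
            f.read(block)
            ops += 1
    return ops / duration


# Create an 8 MiB scratch file to read from (hypothetical setup for the demo).
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(os.urandom(8 * 1024 * 1024))
    scratch = tmp.name

iops = measure_read_iops(scratch)
print(f"{iops:,.0f} random 4 KiB reads/sec")
os.remove(scratch)
```

Real IOPS benchmarks issue direct, unbuffered I/O from many concurrent threads; this sketch goes through the operating system’s page cache, so its numbers reflect cached reads rather than raw device performance.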
“Gordon’s unique ability to provide extremely fast random access to very large data sets is opening new doors for researchers across a wide range of disciplines, some outside the traditional realm of science,” said Snavely. “Such areas could include analyzing social-media data and seeing a political event such as the recent Arab Spring emerge in cyberspace before we saw people emerging in the street.”
Here are a few of the projects included in the first round of allocations for Gordon that were recently approved by XSEDE’s Resource Allocation Committee (XRAC). The XRAC consists of researchers and computational scientists who review the submissions and make allocation recommendations for all XSEDE resources:
Gordon Takes on Wall Street
So-called high-frequency trading has become commonplace in the U.S. stock market, and press reports have suggested it contributed to the “flash crash” of May 6, 2010 that sent the Dow Jones Industrial Average down 998.5 points in just 20 minutes. The Securities and Exchange Commission recently said it is looking to curb high-frequency traders' huge influence on stock trading, and is considering charging fees for buy and sell orders that are later canceled, among other options.
Mao Ye, an assistant professor of finance at the University of Illinois at Urbana-Champaign, is using Gordon to sift through massive amounts of NASDAQ historical ITCH market data to better understand the practices and patterns in high-frequency trading.
According to Ye’s research, many of these traders place an order, only to cancel it within 0.001 seconds. “We want to find out if these cancelled orders are being done to manipulate the market in some way,” said Ye, whose research also has focused on odd lots – trades of less than 100 shares – which do not have to be reported to the consolidated tape.
“Because the minimum report requirement is 100 shares, we believe that people trade 99 shares for a strategic reason: they may be trying to hide their information,” said Ye. “Since so many orders are cancelled within such a short time, a natural question to ask is whether orders are cancelled due to legitimate reasons, or if these orders are entered for deceptive and manipulative reasons. Also, are these orders contributing to the liquidity and efficiency of the financial market, or are they the causes of the recent flash crash?”
One recent hypothesis is called “quote stuffing,” in which high-frequency traders send huge amounts of messages to purposely slow down the exchange’s trading system. “We’d like to know if these kinds of orders contribute to the liquidity of the market, or whether they lead to abnormal volatility.”
Ye and his research team were allocated 500,000 processing hours on Gordon to process two years of data containing about 400 million order messages per day. Ye’s research could inform current policy debates on whether a minimum quote life should be set for orders.
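The core of such an analysis is matching each cancellation back to its originating order and measuring the elapsed time. A minimal Python sketch of that idea, using a made-up simplified event format (real NASDAQ ITCH messages are binary and far richer), might look like:

```python
from collections import Counter

# Hypothetical simplified order events: (timestamp_sec, msg_type, order_id, shares).
# Real ITCH feeds carry nanosecond timestamps and many more message types.
events = [
    (34200.000000, "ADD", 1, 99),
    (34200.000400, "CANCEL", 1, 99),   # cancelled 0.4 ms after entry
    (34200.100000, "ADD", 2, 100),
    (34205.000000, "CANCEL", 2, 100),  # long-lived order
]

add_time = {}   # order_id -> entry timestamp of orders still on the book
stats = Counter()
for ts, typ, oid, shares in events:
    if typ == "ADD":
        add_time[oid] = ts
        if shares < 100:               # odd lots need not hit the consolidated tape
            stats["odd_lot_adds"] += 1
    elif typ == "CANCEL" and oid in add_time:
        lifetime = ts - add_time.pop(oid)
        if lifetime <= 0.001:          # the sub-millisecond cancels Ye studies
            stats["sub_ms_cancels"] += 1
        else:
            stats["slow_cancels"] += 1

print(dict(stats))  # {'odd_lot_adds': 1, 'sub_ms_cancels': 1, 'slow_cancels': 1}
```

At 400 million messages a day, the `add_time` lookup table alone is enormous, and every cancellation triggers a random access into it – exactly the access pattern where Gordon’s flash-based memory outperforms spinning disks.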
Weather Warnings and Water Management
Two of the first research projects using Gordon involve climate simulation, and severe weather prediction in particular. Ming Xue, a professor in the School of Meteorology and director of the Center for Analysis and Prediction of Storms (CAPS) at the University of Oklahoma, was granted 2,000,000 processor hours on Gordon to better understand and improve the accuracy of predictions of high-impact weather such as severe thunderstorms, tornadoes, and tropical cyclones.
“High-speed computing and data storage will allow us to create very high-resolution simulations of tornadoes for analysis at numerous output times, and allow us to use radar data at precise time intervals to create accurate numerical simulations and predictions of severe weather,” said Xue, whose research is funded by multiple NSF grants. “These data files are extremely large, so using traditional storage devices during the simulations can be extremely time-consuming. Gordon will help us significantly reduce the input/output time, thus speeding up the overall research process.”
Craig Mattocks, an atmospheric research scientist at the Rosenstiel School of Marine and Atmospheric Science at the University of Miami, was awarded 3,500,000 processor hours on Gordon to deploy the Ocean Land Atmosphere Model (OLAM) Earth System on XSEDE computational resources as a Science Gateways project for teaching graduate-level meteorology, climate, and predictability courses. Mattocks’ research is also focused on generating the most detailed regional climate-change projections and simulations to date, to help guide water resource management decisions throughout South Florida.
Mattocks' group has teamed with hydrology modelers from the South Florida Water Management District (SFWMD), which is responsible for managing the flow of water throughout natural and artificial structures such as rivers, canals, floodgates, and water conservation areas. SFWMD must meet the diverse needs of 7.7 million people along with demands from tourism, industrial, and agricultural concerns – all while simultaneously trying to restore the Florida Everglades to their natural state.
The SFWMD has reconstructed the land-cover distribution of the natural system in order to visualize the dramatic changes that occurred in the Florida Everglades between the 1850s and present day. “When the sheet-water flow through the Everglades – or the flow of a thin layer of water that’s not concentrated into channels – was interrupted by drainage from canals in order to keep the land dry for human development, the entire ecosystem changed for the worse,” he said. “Much of the marshland was converted to agriculture, and runoff containing pesticides, fertilizers, and toxins was flushed into Florida Bay, killing the pristine coral reef systems there.”
According to Mattocks, it has been a challenge to restore the flow to something even remotely resembling the area’s earlier, natural system because of commercial agriculture and other man-made impacts.
“There’s been a war going on between environmentalists and commercial enterprises, so progress is slow and funding for restoration efforts is difficult to maintain,” he said. “SFWMD needs predictions of precipitation and evapotranspiration under different climate scenarios to drive their hydrology models in order to plan future water-management strategies in times of rapid climate change. With Gordon, we hope to be able to zoom these simulations down to a one-kilometer resolution, and run longer climate ‘timeslice’ experiments to provide more accurate information to make south Florida's water supply more resilient.”
More Robust Computer Simulations for Drug Design
Other researchers are relying on Gordon to help expose weaknesses in molecular dynamics simulations, which are instrumental in developing new drugs to combat diseases such as AIDS. Junmei Wang, an assistant professor of biochemistry at the UT Southwestern Medical Center in Dallas, Texas, was awarded 300,000 hours on Gordon to study large-scale biological systems and long-time-scale dynamic events using state-of-the-art molecular dynamics (MD) simulation techniques.
“For us, Gordon is ideal for evaluating our molecular mechanical force fields, which are the foundation of the MD technique, and for revealing any problems with a force field used to study biological systems and events – problems that can only be identified through long-time-scale simulations,” said Wang, who is using Gordon to conduct ab initio (from first principles) calculations to obtain high-quality structural and dynamic data for the model compounds used to develop molecular-mechanical force fields.
“Without a high quality molecular mechanical force field, it is impossible to successfully identify promising inhibitors of a protein or nucleic acid receptor from millions of compounds in libraries,” said Wang. “These promising inhibitors are potential drug leads, and in principle such structure-based drug design can be applied to battle any disease, such as AIDS.”
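Molecular-mechanical force fields express a system’s energy as a sum of simple analytic terms whose parameters are fitted to reference ab initio data. As an illustration of one such term (not Wang’s actual force field), here is the standard 12-6 Lennard-Jones nonbonded interaction, sketched in Python with argon-like parameters:

```python
def lennard_jones(r, epsilon=0.238, sigma=3.4):
    """12-6 Lennard-Jones pair energy for two atoms a distance r apart.

    Illustrative argon-like parameters: epsilon (well depth) in kcal/mol,
    sigma and r in angstroms. Returns the energy in kcal/mol.
    """
    sr6 = (sigma / r) ** 6
    return 4.0 * epsilon * (sr6 * sr6 - sr6)


# The potential minimum sits at r = 2**(1/6) * sigma, where the energy is -epsilon.
r_min = 2 ** (1 / 6) * 3.4
print(f"E({r_min:.3f} A) = {lennard_jones(r_min):.3f} kcal/mol")
```

Force-field development consists largely of tuning parameters like epsilon and sigma for every atom type so that the summed energy of millions of such terms reproduces the high-quality quantum-mechanical reference calculations Wang describes.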