Appro has announced that it has been awarded a subcontract from Lockheed Martin to supply 147.5TF of Appro 1U-Tetra supercomputers in support of the DoD High Performance Computing Modernization Program (HPCMP). The HPCMP supports DoD objectives to strengthen national prominence by advancing critical technologies and expertise through the use of High Performance Computing (HPC). Research scientists and engineers benefit from HPC innovation to solve complex US defense challenges.
As a subcontractor to Lockheed Martin, Appro will provide system integration, project management, support and technical expertise for the installation and operation of the supercomputers. Lockheed Martin, as the prime contractor, will provide overall systems administration, computer operations management, applications user support, and data visualization services supporting five major DoD Supercomputing Resource Centers (DSRCs). The agreement is based on a common goal of helping customers reduce the complexity of deploying, managing and servicing commodity High Performance Computing solutions while lowering their total cost of ownership.
The following are the supercomputing centers where Appro clusters will be deployed through the end of 2010: the Army Research Laboratory DSRC at Aberdeen Proving Ground, MD; the US Air Force Research Laboratory DSRC at Wright-Patterson AFB, OH; the US Army Engineer Research and Development Center DSRC in Vicksburg, MS; the Navy DoD Supercomputing Resource Center at Stennis Space Center, MS; and the Arctic Region Supercomputing Center DSRC in Fairbanks, AK.
"We are extremely pleased to work with Lockheed Martin and be part of providing advanced cluster technologies and expertise in High Performance Computing (HPC) in support of the DoD High Performance Computing Modernization Program (HPCMP)," said Daniel Kim, CEO of Appro. "Lockheed Martin leads its industry in innovation and has raised the bar for reducing costs, decreasing development time, and enhancing product quality for this important government program, and our products and solutions are a perfect fit for their demanding expectations."
With a network of more than 5,000 sensors that monitor weather conditions, seismic activity, traffic, bacteria on beaches, water levels and much more, Sensorpedia is a significant resource that continues to expand.
Sensorpedia, developed three years ago by Oak Ridge National Laboratory's Bryan Gorman and David Resseguie, connects first responders, individuals and communities with online sensor data in the United States and beyond.
"Sensorpedia combines the best of Facebook and YouTube and continues to expand and evolve to meet the demands and needs of users," Resseguie said.
Sneak peeks of Sensorpedia, which is in beta testing, are available at http://www.sensorpedia.com. Funding for Sensorpedia is provided by the Department of Homeland Security-sponsored Southeast Region Research Initiative.
The 2009 national meeting of the Centre for High Performance Computing (CHPC), combined with the 5th BELIEF (Bringing European eLectronic e-Infrastructures to Expanding Frontiers) Symposium, takes place from 7 to 9 December 2009 at the Sandton Convention Centre. Some 270 local and international delegates have registered for this event.
This year the CHPC and the South African National Research Network (SANReN) have made significant progress in the ongoing efforts to establish and enhance usage of South Africa's national cyber infrastructure. The Department of Science and Technology (DST) funds both initiatives, which will soon be complemented by the Very Large Datasets initiative.
The Sun Microsystems hybrid supercomputer at the CHPC has recently made it onto the TOP500 list of supercomputers globally. Launched in September 2009, it is now the fastest supercomputer in Africa. The CHPC boasts significant supercomputing resources for the local and African research communities. Since the launch of its operation, the CHPC has achieved significant milestones, most notably the completion of its first round of flagship projects. Flagship projects are a mechanism to drive pockets of collaborative research that require significant computational resources, address problems of relevance to South Africa, and carry a strong human capital development component.
The DST and SANReN recently announced the completion of the national backbone network ahead of schedule. The national backbone interconnects the metros of Tshwane, Johannesburg, Mangaung, Cape Town, Nelson Mandela Bay and eThekwini on a 10-gigabits-per-second fibre-optic ring network and supports projects of national importance. The Tertiary Education Network has acquired international bandwidth from Seacom, which can now be distributed via the SANReN national backbone network. This development bodes well for South Africa's ability to tackle bandwidth-hungry projects such as the Square Kilometre Array (SKA), as well as for collaboration with international partners.
This year the 5th BELIEF symposium has been co-located with the 2009 CHPC national meeting. The CSIR Meraka Institute is part of the BELIEF-II consortium, funded by the Capacities Programme of the European Union's Framework Programme 7. The co-location will offer valuable insights into the trends and visions of the evolving e-Infrastructures ecosystem in South Africa and Europe by providing a co-operation platform to promote knowledge sharing between business, research, government and academic communities.
Speakers at the event have been drawn from the local and international research communities. These include C. Schoultz, Director: Character Matters, who will speak on the impact of cyber infrastructure on the South African animation industry, and J. Jonas, Associate Director: Science & Engineering, SKA South Africa, whose topic is entitled "Dataflow in Radio Telescopes: the Square Kilometre Array and MeerKAT".
A selection of overseas speakers includes Dr C. McIntyre, Senior Vice President, Strategic Operations and High Performance Computing Initiative, Council on Competitiveness (US); Professor Yannis Ioannidis, Professor at the Department of Informatics and Telecommunications of the University of Athens; and Dr Hermann Lederer, Deputy Director of the Garching Computing Centre (RZG) of the Max Planck Society.
Proceedings at the event will be shared with interested parties in Europe via video conferencing.
Silica is one of the most common minerals on Earth. Not only does it make up two-thirds of our planet's crust, it is also used to create a variety of materials from glass to ceramics, computer chips and fiber optic cables. Yet new quantum mechanics results generated by a team of physicists from Ohio State University (OSU) show that this mineral only populates our planet superficially—in other words, silica is relatively uncommon deep within the Earth.
Using several of the largest supercomputers in the nation, including the National Energy Research Scientific Computing Center’s (NERSC) Cray XT4 "Franklin" system, the team simulated the behavior of silica in high-temperature, high-pressure environments that are particularly difficult to study in a lab. These details may one day help scientists predict complex geological processes like earthquakes and volcanic eruptions. Their results were published in the May 10 online early edition of the Proceedings of the National Academy of Sciences (PNAS).
"Silica is one of the simplest and most common minerals, but we still don't understand everything about it. A better understanding of silica on a quantum-mechanical level would be useful to earth science, and potentially to industry as well," says Kevin Driver, an OSU graduate student who was a lead author on the paper. "Silica adopts many different structural forms at different temperatures and pressures, not all of which are easy to study in the lab."
Over the past century, seismology and high-pressure laboratory experiments have revealed a great deal about the general structure and composition of the Earth. For example, such work has shown that the planet's interior structure exists in three layers called the crust, mantle, and core. The outer two layers—the mantle and the crust—are largely made up of silicates, minerals containing silicon and oxygen. Still, the detailed structure and composition of the deepest parts of the mantle remain unclear. These details are important for complex geodynamical modeling that may one day predict large-scale events, such as earthquakes and volcanic eruptions.
Driver notes that even the role that the simplest silicate—silica—plays in the Earth's mantle is not well understood. "Say you're standing on a beach, looking out over the ocean. The sand under your feet is made of quartz, a form of silica containing one silicon atom surrounded by four oxygen atoms. But in millions of years, as the oceanic plate below becomes subducted and sinks beneath the Earth's crust, the structure of the silica changes dramatically," he said.
As pressure increases with depth, the silica molecules crowd closer together, and the silicon atoms start coming into contact with more oxygen atoms from neighboring molecules. Several structural transitions occur, with silicon surrounded by four oxygen atoms in low-pressure forms and by six in higher-pressure forms. With even more pressure, the structure collapses into a very dense form of the mineral with the same structure as alpha-lead dioxide (alpha-PbO2).
Driver notes that it is this form of silica that likely resides deep within the earth, in the lower part of the mantle, just above the planet's core. When scientists try to interpret seismic signals from that depth, they have no direct way of knowing what form of silica they are dealing with. They must rely on high-pressure experiments and computer simulations to constrain the possibilities. Driver and colleagues use a particularly high-accuracy, quantum mechanical simulation method to study the behavior of different silica forms, and then compare the results to the seismic data.
In PNAS, Driver, his advisor John Wilkins, and their coauthors describe how they used a quantum mechanical method to design computer algorithms that would simulate the silica structures. When they did, they found that the behavior of the dense, alpha-lead dioxide form of silica did not match up with any global seismic signal detected in the lower mantle. This result indicates that the lower mantle is relatively devoid of silica, except perhaps in localized areas where oceanic plates have subducted.
"As you might imagine, experiments performed at pressures near those of Earth's core can be very challenging. By using highly accurate quantum mechanical simulations, we can offer reliable insight that goes beyond the scope of the laboratory," said Driver.
Supercomputers Dissect Silica with Quantum Monte Carlo Calculations
Credit: Kevin Driver, Ohio State University
The structure of a six-fold coordinated silica phase called stishovite, a prototype for many more complex mantle minerals. Stishovite is commonly created in diamond anvil cells and often found naturally where meteorites have slammed into Earth. Driver and co-authors computed the shear elastic constant softening of stishovite under pressure with quantum Monte Carlo, using over 3 million CPU hours on Franklin.
The team's work was one of the first to show that quantum Monte Carlo (QMC) methods could be used to study complex minerals deep in the planet's interior. Although the algorithms have been around for over half a century, Driver notes that applying them to silica was simply too labor- and computer-intensive until recently.
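The QMC calculations described here are full many-body simulations of real minerals, far beyond what can be shown in a few lines. As a rough illustration of the variational flavor of the idea, though, the following toy sketch applies Metropolis sampling to a single particle in a harmonic well; the trial wavefunction, parameter names, and step sizes are all illustrative choices, not anything from the paper.

```python
import math
import random

def local_energy(x, a):
    # For H = -1/2 d2/dx2 + x^2/2 and trial psi(x) = exp(-a x^2),
    # the local energy H psi / psi works out to a + x^2 (1/2 - 2 a^2).
    return a + x * x * (0.5 - 2.0 * a * a)

def vmc_energy(a, n_steps=200_000, step=1.0, seed=42):
    """Estimate <E> by Metropolis sampling of |psi|^2 = exp(-2 a x^2)."""
    rng = random.Random(seed)
    x = 0.0
    e_sum = 0.0
    n_kept = 0
    for i in range(n_steps):
        x_new = x + rng.uniform(-step, step)
        # Acceptance ratio |psi(x_new)|^2 / |psi(x)|^2
        ratio = math.exp(-2.0 * a * (x_new * x_new - x * x))
        if rng.random() < ratio:
            x = x_new
        if i > n_steps // 10:  # discard the first 10% as burn-in
            e_sum += local_energy(x, a)
            n_kept += 1
    return e_sum / n_kept

# a = 0.5 is the exact ground state, where the local energy is
# constant (zero variance); other values give higher energies,
# as the variational principle requires.
for a in (0.3, 0.5, 0.8):
    print(a, vmc_energy(a))
```

The zero-variance property at the exact wavefunction is one reason QMC estimates can be made very precise; production codes use correlated many-electron trial wavefunctions and far more samples, which is where the millions of CPU hours go.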
"In total, we used the equivalent of six million CPU hours to model four different states of silica. Three million of those CPU hours involved using NERSC's Franklin system to calculate a shear elastic constant for silica with the QMC method. This is the first time it had ever been done," said Driver.
He notes that the QMC calculations on Franklin were completed during the system's pre-production phase, before the system was formally accepted by NERSC. During this phase, the center's 3,000 science users were encouraged to try out the Cray XT4 system to see if it could withstand the gamut of scientific demands from different disciplines. Driver notes that the Franklin results allow him to measure how silica deforms at different temperatures and pressures.
"From computing hardware to the consulting staff, the resources at NERSC are really excellent. The size and speed of the center’s machines is something that I don’t normally have access to at other places," says Driver.
"This work demonstrates both the superb contributions a single graduate student can make, and that the quantum Monte Carlo method can compute nearly every property of a mineral over a wide range of pressures and temperatures," said Wilkins. "The study will stimulate a broader use of quantum Monte Carlo worldwide to address vital problems."
He and his colleagues expect that quantum Monte Carlo will be used more often in materials science in the future, as the next generation of computers goes online. Coauthors on the paper included Ronald Cohen of the Carnegie Institution of Washington; Zhigang Wu of the Colorado School of Mines; Burkhard Militzer of the University of California, Berkeley; and Pablo López Ríos, Michael Towler, and Richard Needs of the University of Cambridge.
This research was funded by the National Science Foundation and the Department of Energy. In addition to NERSC, computing resources were also provided by the National Center for Atmospheric Research, the National Center for Supercomputing Applications, the Computational Center for Nanotechnology Innovations, the TeraGrid and the Ohio Supercomputer Center. The abstract is available from PNAS.
The use of e-Infrastructures is enabling a new wave of collaborative scientific research through remote access to computing services, instrumentation and resources, bringing "real-world" benefits that impact our daily lives. The Conference on e-Infrastructures across the Mediterranean (http://www.terena.org/eumedevent3/), 30-31 March 2010 in Brussels, Belgium, builds on the success of the previous EU-Med Events and marks an important step in advancing collaboration between the countries of the Mediterranean region and the European Union (EU) in the field of e-Infrastructures and networking for research and education.
These advanced, multi-layered platforms, spanning networking, storage, supercomputing and grids, and access to shared scientific data, not only help researchers to tackle scientific problems more effectively but also enable new scientific communities to form around similar challenges irrespective of geographical location. Economies of scale can and need to be a cornerstone of the e-Infrastructure landscape, fostering further development and reaping rewards now and in the future.
This Conference brings together invited policy makers and civil servants from relevant ministries and institutions; executives and officials from international organisations such as the European Commission, the Arab League, UN agencies, private companies and foundations; and representatives from the user community. They will deliver insights into the state of play and future perspectives, define what is needed to further collaborative work on research and education, and deliberate high-level policy issues. The event is by invitation only, and is organised and sponsored by three European-funded projects chartered with fostering the creation of e-Infrastructures through important inter-related activities:
EUMEDCONNECT2 (http://www.eumedconnect2.net): a research network that connects Europe with Africa, the Mediterranean rim and the Middle East.
EUMEDGRID-Support (http://www.eumedgrid.org): promoting the deployment of e-Infrastructures, and supporting existing and new applications, policy development and training.
GÉANT (http://www.geant.net): a world-leading pan-European research network, bringing together at gigabit speeds the networks of the national research and education networking organisations (NRENs), which are collectively represented by TERENA (Trans-European Research & Education Networking Association, www.terena.org).