- Published: 17 May 2012
Ever-increasing data needs inspire evolutionary and revolutionary advances
More than 500 physicists and computational scientists from around the globe, including many working at the world’s largest and most complex particle accelerators, will meet in New York City to discuss the development of the evolutionary, or even revolutionary, computational tools essential to the future of high-energy and nuclear physics. The 19th International Conference on Computing in High Energy and Nuclear Physics (CHEP) will be hosted by the U.S. Department of Energy’s Brookhaven National Laboratory and New York University — leaders in expanding the frontiers of data-intensive scientific research and supercomputing-based analysis.
WHERE: New York University's Jack H. Skirball Center for the Performing Arts, and Helen and Martin Kimmel Center, both at 60 Washington Square South, NYC.
WHEN: May 21-25, 2012
DETAILS: Supercomputing is often described as an “enabler.” It enables researchers to understand data obtained from extremely complex experiments by teasing out the elusive signals that point to discoveries amid the billions of particles striking myriad detectors at accelerators and reactors, or streaming through the atmosphere from the sun and from cosmic rays. To keep up with ever-increasing needs for data processing and analysis, it is essential for key players to meet periodically to review experiences gained with current technologies and to plot the future path for the field.
This year’s conference will host more than 500 attendees from 45 nations who represent large international physics experiments, including the Relativistic Heavy Ion Collider (RHIC) at Brookhaven Lab and the Large Hadron Collider (LHC) at Europe’s CERN laboratory — particle accelerators that recreate conditions of the early universe in billions of subatomic particle collisions to explore the fundamental forces and properties of matter.
Through an interactive program consisting of presentations, workshops, and poster sessions, participants will share their experiences with data processing at all stages — from how computers closest to experiments such as these “know” which of the billions of events are important to record and analyze, through to the final analyses of petabytes of data that use computing resources distributed worldwide. Experts from the computing industry will also give insight into future technology trends. And in keeping with their formative contributions to the development of the original World Wide Web, physicists involved in data distribution and analysis will also describe new collaborative tools and uses for social media.
The following plenary talks will take place at the Skirball Center on Monday, May 21, starting at 8:30 a.m.
Glen Crawford, Director of the Research and Technology Division of the Office of High Energy Physics (HEP) within the U.S. Department of Energy (DOE) Office of Science, will review the status of the U.S. HEP program, including some recent research highlights, and address the importance of adapting the DOE strategic plan to respond to recent budgetary developments. He will include observations about the importance of computing to the field of HEP, and call attention to programs in DOE that have a role in advancing computing for HEP and other sciences.
Joe Incandela, a physics professor at the University of California, Santa Barbara, and spokesperson for the CMS collaboration at the Large Hadron Collider, will present selected highlights of the status of the LHC accelerator and experiments in 2012, together with some examples of the important physics results obtained with the 2011 data. The latter will include an emphasis on the 2011 search for the Higgs particle, which is thought to impart mass to matter, and prospects for 2012. Perspectives on the changing model and role of computing in hadron collider experiments from the Tevatron to the LHC will also be presented.
Seth W. Pinsky, President of the New York City Economic Development Corporation (NYCEDC), has been working to expand the New York City economy and position the City as an international center for innovation in the 21st century. His programs aim to diversify the City’s economy, help legacy industries transition to 21st-century business models, and expand entrepreneurship to ensure that the City is well represented in the fields of tomorrow, including the high-tech sector. His talk will emphasize the crucial role of high-end computing in economic advancement.
This year’s conference was organized by scientists from Brookhaven Lab’s RHIC and ATLAS Computing Facility (RACF), which provides supercomputing services for RHIC, the U.S.-based collaborators in the ATLAS experiment at the LHC, and the collaborators in the Large Synoptic Survey Telescope (LSST) project. At peak times, data streams into the RACF at a rate of 4 to 5 gigabytes per second. Some 30,000 central processing unit (CPU) cores sift through 2 petabytes of data per day, store up to 80 terabytes on tape, and distribute a fraction of the processed data to collaborators around the U.S. and overseas.
The RACF and research at Brookhaven Lab are funded primarily by the DOE Office of Science.