ACADEMIA
University of Santiago de Compostela participates in the LHC experiments
The first real data from the LHC experiments are beginning to reach supercomputing centers such as CESGA for further analysis.
On March 30, 2010, the largest particle accelerator in the world, the Large Hadron Collider (LHC) installed at CERN (the European Organization for Nuclear Research), reproduced the conditions that existed just after the Big Bang by colliding two proton beams at a combined energy of 7 TeV (teraelectronvolts), an energy never before achieved in an accelerator. This milestone in the history of particle physics marked the beginning of the LHC research program: six experiments, ATLAS, CMS, LHCb, ALICE, TOTEM and LHCf, each associated with its own detector, analyze the particles produced in the collisions. These experiments involve thousands of scientists from research groups in more than 37 countries, and the High Energy Physics Group of the University of Santiago de Compostela (USC) is part of the effort.
Directed by Professor Bernardo Adeva Andany, the Laboratory of High Energy Physics at USC is an integral part of the international team in charge of the Silicon Tracker of the LHCb detector, one of the six experiments running at the LHC at CERN. The USC group invested ten years in the construction of the inner part of the Silicon Tracker, the Inner Tracker, using its own instrumentation and technology, and several of its members hold coordination roles in the experiment at CERN. The group also operates a Tier-2 center for processing LHCb experiment data and, together with the Supercomputing Center of Galicia (CESGA) and the University of Barcelona, has developed a grid computing infrastructure for receiving and analyzing the data.
Silicon detectors
The LHCb experiment investigates the asymmetry between matter and antimatter through the study of particles containing the b quark (the beauty quark). Hadrons built from b and anti-b quarks are unstable and ephemeral, rapidly decaying into other particles. In collisions at the LHC such particles are generated by the billions, and comparing their decays could provide the data needed to explain why nature favored matter over antimatter at the origin of the universe. A possible explanation is the non-conservation of CP symmetry in the interactions that took place in the first moments of the universe: the decays of quarks and of antiquarks, which have opposite helicity orientations, do not occur with equal probability, and that imbalance may help explain the absence of antimatter.
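A standard way to quantify such an imbalance, shown here as the general textbook definition rather than any specific LHCb measurement, is the CP asymmetry of a decay rate:

\[
A_{CP} = \frac{\Gamma(B \to f) - \Gamma(\bar{B} \to \bar{f})}{\Gamma(B \to f) + \Gamma(\bar{B} \to \bar{f})}
\]

Here \(\Gamma\) denotes a decay rate, \(B\) a hadron containing a b quark, \(\bar{B}\) its antiparticle, and \(f\) and \(\bar{f}\) a final state and its CP conjugate; a nonzero \(A_{CP}\) means matter and antimatter decay differently.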
Recent data have made it possible to observe, for the first time, the non-conservation of CP symmetry in the decay of heavy quarks, and they indicate that it reaches considerably high values, up to 30%. The task now is to generate enough statistics to develop this theory, since "the experiment being carried out with the LHCb detector will make it possible to study more decays of b and anti-b quarks than have ever been observed before," said Adeva Andany.
The USC group, together with groups from Zurich, Lausanne and Heidelberg, carried out 25% of the construction and installation of the LHCb Silicon Tracker, of which one of its members, Abraham Gallas Torreira, is co-coordinator. Together with the École Polytechnique Fédérale de Lausanne, the group was also responsible for 50% of the construction of the Inner Tracker (IT), a precision silicon microstrip detector built at the USC Laboratory of High Energy Physics. It is used to detect the traces of the b quarks produced in the collisions and comprises more than 200,000 electronic channels. The trackers are designed specifically to record the trajectory of each particle as it passes through the detector, and joining the tracks left in different parts of the detector is essential to reconstruct the decays of B particles.
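To illustrate the basic idea behind joining hits into a track (a minimal sketch, not LHCb's actual reconstruction software; the plane positions and hit coordinates are invented), the following Python snippet fits a straight line through hits recorded in successive tracker planes:

```python
import numpy as np

def fit_track(z, x):
    """Least-squares fit of the linear track model x = x0 + slope * z.

    z: positions of the detector planes along the beam axis
    x: measured hit coordinate in each plane
    """
    A = np.vstack([np.ones_like(z), z]).T  # design matrix [1, z]
    (x0, slope), *_ = np.linalg.lstsq(A, x, rcond=None)
    return x0, slope

# Hypothetical hits left by one charged particle in four planes (cm)
z_planes = np.array([20.0, 40.0, 60.0, 80.0])
x_hits = np.array([1.02, 2.01, 2.97, 4.05])  # includes measurement scatter

x0, slope = fit_track(z_planes, x_hits)
print(f"reconstructed track: x = {x0:.3f} + {slope:.4f} * z")
```

In the real detector the fit is three-dimensional, the magnetic field bends the trajectories, and the hard part is deciding which hits belong to the same particle, but the principle of combining hits across detector layers into one trajectory is the same.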
Data acquisition for the Silicon Tracker is the responsibility of another member of the Compostela High Energy group, Daniel Esperante. LHCb uses a sophisticated electronic system, the High Level Trigger, to filter the data from the 10 million collisions per second recorded in its detectors. This algorithm, based on the concept of regions of interest, selects data from all the sensors, processing them on roughly 1,000 sixteen-core computers (processing units) located in the LHCb area. After an initial screening that selects one million events per second, the flow is whittled down to a more manageable amount, about 2,000 events per second.
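A minimal Python sketch of this two-stage cascade (the rates are those quoted above; the selection variables and thresholds are invented placeholders, not LHCb's real trigger criteria):

```python
import random

def make_event():
    """Toy event; a real event holds the full detector readout."""
    return {"pt": random.expovariate(1.0),               # momentum proxy
            "displaced_vertex": random.random() < 0.002}

def initial_screening(event):
    # First-stage placeholder: keeps roughly 1 event in 10,
    # mimicking the 10 million -> 1 million events/s reduction.
    return event["pt"] > 2.3

def high_level_trigger(event):
    # Second-stage placeholder: demands a displaced decay vertex,
    # a typical signature of b-hadron decays, mimicking the
    # 1 million -> ~2,000 events/s reduction.
    return event["displaced_vertex"]

events = [make_event() for _ in range(1_000_000)]
kept = [e for e in events if initial_screening(e)]
stored = [e for e in kept if high_level_trigger(e)]
print(f"{len(events):,} -> {len(kept):,} -> {len(stored):,} events kept")
```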
15 petabytes of data per year (15 million gigabytes)
Having reached an energy high enough to reproduce conditions similar to those of the Big Bang, the LHC research program is now under way, and with it the analysis of the estimated 15 petabytes of data per year that the experiments will yield. To distribute, store and analyze these data, CERN created the Worldwide LHC Computing Grid (WLCG), a global partnership that makes the computing resources of more than 100,000 processors at 130 sites in 34 countries available to the high-energy physics community. The grid is organized into four levels (tiers): Tier 0 is CERN's central computing facility, which keeps a first backup of all the raw data from the LHC and distributes them in real time among the eleven Tier 1 centers, which in turn coordinate and pass the data on to the Tier 2 centers. These are centers that provide sufficient storage capacity and computational analysis power for specific tasks, and scientists access them through Tier 3: single computers or clusters housed at universities.
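To put the 15-petabyte figure in perspective, a quick back-of-the-envelope calculation (assuming SI prefixes and data-taking spread uniformly over the year, which a real accelerator schedule does not follow):

```python
PB = 10**15  # bytes, SI petabyte

data_per_year = 15 * PB
seconds_per_year = 365 * 24 * 3600

print(f"per day:   {data_per_year / 365 / 10**12:.1f} TB")               # ~41.1 TB
print(f"sustained: {data_per_year / seconds_per_year / 10**6:.0f} MB/s")  # ~476 MB/s
```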
In collaboration with the Supercomputing Center of Galicia (CESGA) and the University of Barcelona, the High Energy Physics Group of USC operates a Tier-2 center with a cluster dedicated exclusively to processing data from the LHCb experiment. Juan José Saborido Silva, coordinator of the USC grid computing project for CERN and a member of the High Energy Physics group, considers it essential that USC developed a grid computing infrastructure to carry out the experiment: "Both for the analysis of the simulations we ran while the experiment was being commissioned and now that real data are beginning to arrive, the grid infrastructure has allowed us to work with huge volumes of data in real time, with computing resources that would be unthinkable outside the grid." The LHCb experiment will run for about ten years, during which it will collect millions of data points that will help us understand the origin of the universe. According to Bernardo Adeva, it "could redefine the laws of physics contained in the Standard Model of fundamental interactions."