19 March 2012 Electronics channel
The Large Hadron Collider (LHC) collides protons together at very high energy, and the resulting interactions between the constituent quarks and gluons take place under conditions close to those existing at the time of the big bang. These conditions permit the study of the underlying forces of nature, the production of new particles such as the Higgs boson, and the search for new physics processes such as evidence for supersymmetry.
The LHC has been running in full data-taking mode since 2010. Collisions occur every 50 ns, leaving signals in the four large detectors (ATLAS, CMS, LHCb and ALICE), which contain many millions of sensitive electronic channels. This interaction rate gives rise to the "data deluge" of the LHC, requiring the processing of many tens of petabytes of data annually. The complete chain involves the management and storage of raw data, its reconstruction using the worldwide distributed computing infrastructure known as the Grid, the replication and management of processed data, and the final analysis by physicists.
Within the UK the "GridPP" project, in collaboration with the National Grid Service, provides the necessary computing infrastructure, connected to the World Wide Grid though the JANET academic network. There are over 250,000 processing cores in the Grid spread between sites from Brazil to Russia. This complex system works as a result of a very high level of standardisation, and a high level of cooperation with and between national authorities.
This lecture will present the physics motivation for the LHC and then explore the computing and data management environment needed to realise the project.
Peter Clarke is a Professor in the School of Physics and Astronomy at the University of Edinburgh. He has a 1st Class Honours degree in Electronics Engineering (Southampton University, 1980) and a D.Phil. in Particle Physics (Oxford, 1985). He was a CERN Fellow before being appointed as a lecturer, first at Brunel University in 1987, before moving to University College London in 1993.
He was promoted to Reader and then Professor in 2001, and was Head of the Particle Physics Research Group from 2001 to 2004. He moved to the University of Edinburgh in 2004 to take up the Chair of eScience and later became Director of the National eScience Centre, a post he held until 2009. He is a Fellow of the Royal Society of Edinburgh, a Fellow of the Institution of Engineering and Technology, and a Fellow of the Institute of Physics.
His early research work included the first measurements of direct CP violation in the kaon system at the CERN NA31 experiment. Later, at the Stanford Linear Accelerator Center and then at the LEP electron-positron collider at CERN, his research focused upon precision measurements of the electroweak interaction, the properties of the Z and W bosons, and indirect searches for the Higgs boson. At UCL he worked on construction of the ATLAS experiment for the Large Hadron Collider. He now works on studies of CP violation (the asymmetry between matter and anti-matter) as a member of the LHCb experiment at the LHC.
He has been involved in UK e-Science since its inception. He has held roles in international grid computing infrastructure projects, including the management board of the UK grid for particle physics (GridPP), the European Data Grid (EDG) and the EGEE projects. He was a founder of the Centre of Excellence in Networked Systems at UCL and was prominent in advancing national and international networking for research. At the LHCb experiment he is presently responsible for coordinating offline physics data processing using the worldwide computing grid.