Large Hadron Collider Tier-1
The GridPP Tier-1 provides large-scale computing resources for the Large Hadron Collider (LHC) experiments as well as the wider particle physics community. It is one of the largest sites in the Worldwide LHC Computing Grid (WLCG), a collaboration of over 100 universities and research institutes providing computing for the LHC. As a Tier-1 site, we hold a custodial copy of a fraction of the raw data that CERN produces.
The GridPP Tier-1 provides a range of services such as:
- Tape-based archival storage for the entire HEP community
- Large-scale disk storage with tightly integrated compute for data-intensive processing
- State-of-the-art networking with dedicated links to CERN and privileged access to other institutes
- Expertise in data management and transfer software such as Rucio, FTS and XRootD
- Global software distribution service (CVMFS)
- Information providers and accounting
GridPP - Distributed Computing for Data Intensive Research
GridPP manages the UK’s engagement in the Worldwide LHC Computing Grid (WLCG), collaborating with physicists and computer scientists to support particle physics research and EGI initiatives.
A map of all the UK institutes that are part of the GridPP collaboration, together with further information, is available on the GridPP website.
Related Programmes, Projects or Facilities
SWIFT-HEP
SWIFT-HEP modernises HEP codes, optimises performance on evolving hardware, and fosters collaboration to maximise physics returns and address industry skill demands. For more information, please visit the SWIFT-HEP website.
LSST:UK
LSST:UK supports one of the most ambitious science projects planned for the next decade, and a key part of the astronomical landscape in the 2020s and 2030s. For more information, please visit the LSST:UK website.
IRIS - A Community creating Digital Research Infrastructure to support STFC Science
For more information about the IRIS programme, please visit the IRIS website.
Impact
- data stored on disk
- data stored on tape
- logical cores in the batch farm
- jobs processed, using over 43 million CPU hours per month
- data processed per month
- data written to tape per month