LHC community prepares the High Luminosity Upgrades

by D. Contardo, K. Einsweiler

With the approval of the CERN Council in June 2016, the LHC is now moving toward execution of the High Luminosity upgrades. In addition, after a first step of approval by the scientific committees at the end of 2015, the ATLAS and CMS collaborations were encouraged by the Resource Review Board to prepare Technical Design Reports for the experiment upgrades. These TDRs will be delivered in 2017, and the 3rd ECFA High-Luminosity LHC Experiments workshop in October was a new opportunity for the LHC community to broadly address the preparation of the entire programme.

The workshop agenda¹ was organised to allow large attendance from all communities and to provide comprehensive, accessible information to everyone. Overviews of the CERN scientific strategy, the physics potential, and the experimental challenges for the accelerator and detectors were followed by four major discussions: the accelerator operation scenarios and their impact on the experiments; the physics reach and performance; the progress on common experimental R&D; and the updated design and performance of the detector upgrades. The material for the first discussion was prepared by a new working group, formed earlier this year, involving the experiments and the accelerator teams, while theory and experiment colleagues held several dedicated forums to develop the new physics studies that were presented.

Recently, a new baseline for the LHC upgrade was defined, with a reduced number of crab cavities. It was shown that luminosities similar to the original design could be achieved with a relatively limited increase, of 10 to 20%, in the collision density along the beam axis. Two operation scenarios are considered: one with luminosity levelling at 5 × 10³⁴ Hz/cm², corresponding to a mean collision pileup (PU) of 140 and a density ranging from 1.1 to 1.3 collisions per mm, and one with levelling at 7.5 × 10³⁴ Hz/cm², with 200 PU and a density ranging from 1.7 to 1.9 collisions per mm. Provided the experiments can maintain good performance in the latter conditions, a 30% increase in integrated luminosity could be reached for the same operation period. For the first time, experimental simulations of the physics-object reconstruction performance (for leptons, b-quarks, jets and missing transverse energy) were presented as a function of the collision density. A linear degradation of the performance is observed, rather independent of the total number of collisions, indicating that the reconstruction is mostly sensitive to the pileup of tracks produced near the collision of interest and wrongly assigned to its vertex. Options to reduce the collision density, such as the flat optics considered for HL-LHC operation, could therefore improve the experimental performance. Another means of reducing the mis-assignment of tracks to interactions would be to precisely measure their arrival time in the detectors (see below). New simulations will now be performed to estimate how the performance degradation with collision density propagates to benchmark physics analyses.
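The quoted densities are consistent with simple geometry: for a Gaussian luminous region, the peak line density is the mean pileup divided by √(2π) times the rms length of the luminous region. A minimal sketch, assuming a Gaussian shape and a ~45 mm rms (an illustrative round value, not taken from the text):

```python
import math

def peak_line_density(pileup_mean, sigma_z_mm):
    """Peak collision line density (collisions/mm) for a Gaussian
    luminous region of rms length sigma_z_mm along the beam axis."""
    return pileup_mean / (math.sqrt(2 * math.pi) * sigma_z_mm)

# The two baseline scenarios, with an assumed 45 mm luminous-region rms,
# give densities close to the quoted ranges:
print(round(peak_line_density(140, 45), 2))  # ~1.24 /mm (quoted: 1.1-1.3)
print(round(peak_line_density(200, 45), 2))  # ~1.77 /mm (quoted: 1.7-1.9)
```

The spread of each quoted range then reflects variations of the luminous-region length over a fill and between optics options.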

The physics reach at the HL-LHC depends on statistical and systematic uncertainties. The latter originate from theoretical model calculations and from experimental measurements. Tremendous progress on theory uncertainties was presented, mostly from the exploitation of recent data and from new higher-order NLO calculations. The factor-of-2 improvement assumed in earlier projections of physics performance at the HL-LHC was shown to be already achieved. Several new physics projections were presented by the experiments, also including recent Run II analysis criteria and a better assessment of the upgraded detector performance. In particular, new expectations for the precision of the Higgs couplings were shown, along with new fiducial and differential cross-section estimates, in which systematic uncertainties cancel. For new-physics searches, theory perspectives building on the limits set by the current LHC results were discussed. New signal benchmarks were also evaluated, expanding the assessment of the experiments' coverage and, above all, highlighting the importance of maintaining highly efficient event selection and of extending the tracking acceptance in the forward regions.

Several common R&D efforts on technical solutions are ongoing in three main areas: electronic systems; cooling and mechanical structures; and computing and software. These efforts benefit from specific frameworks and workshops, often federated through CERN services to the experiments. Comprehensive progress reports were presented at the workshop, and only a few highlights are mentioned here. Deep investigation of radiation tolerance is of particular importance for ASIC technologies, especially for the 65 nm TSMC technology proposed for the most irradiated regions in the inner pixel detectors. Recently submitted demonstrator chips should soon confirm that radiation doses up to 500 Mrad, or more, can be sustained as expected. Good progress on other common developments was also presented, for new data links, power distribution systems, high-power CO2 cooling systems and light mechanical structures. Computing needs for the HL-LHC are estimated to be 50 to 100 times larger than those for Run 2, depending on pileup conditions. Projections of equipment performance versus cost indicate a gain of about 20% per year. It was however noted that substantial uncertainties on this projection arise from the present concentration and saturation of the market. In any case, the hardware improvements at constant cost will not be sufficient, and another factor of 10 will be needed from the computing usage itself. This appears achievable with emerging techniques for data management, multi-processor and/or accelerator usage, and new algorithmic methods. It will however require a huge effort of adaptation. The HEP Software Foundation (HSF) is proposing a framework open to the entire community to federate these efforts. A primary goal of the HSF is to prepare, by summer 2017, a white paper describing possible strategies and a roadmap toward future computing and software.
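The arithmetic behind the "factor of 10 from software" conclusion can be sketched quickly. Assuming the ~20% per year performance-per-cost gain compounds over a roughly ten-year horizon to HL-LHC operation (the horizon is an assumption, not stated in the text), hardware alone covers only a small part of the 50 to 100 times increase in needs:

```python
def hardware_gain(annual_gain, years):
    """Compound improvement in computing performance per unit cost."""
    return (1 + annual_gain) ** years

# ~20% per year over an assumed ~10-year horizon
gain = hardware_gain(0.20, 10)
print(round(gain, 1))  # ~6.2x at constant cost

# Against the estimated 50-100x increase in needs, the remaining
# factor must come from software and usage improvements:
for needed in (50, 100):
    print(round(needed / gain, 1))  # ~8.1 and ~16.2
```

The remaining factor of roughly 8 to 16 matches the order-of-magnitude software gain quoted in the report.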

The ATLAS and CMS upgrades are driven by similar performance and operational challenges and therefore have similar scope. However, constraints from the original detectors can lead to different and complementary approaches in developing new configurations and technical solutions. It was inspiring at the workshop to compare the design options and their expected performance. Likewise, learning from the experience of ALICE and LHCb in preparing their earlier upgrades, foreseen for Long Shutdown 2, was extremely valuable. This was particularly true for the tracking devices and for the new data acquisition (DAQ) systems they are developing. These systems will select events only at the computing level, paving the path toward future computing and software models. The data volume and complexity in ATLAS and CMS will not allow a full computing-based event selection, and the improvement of the hardware trigger remains a key prerequisite for physics. To achieve the required performance, the new paradigm is to implement track reconstruction in hardware, so that tracks can be ready for use within tens of microseconds. In CMS, the high magnetic field has made it possible to develop a new silicon-strip module concept that provides tracks to the trigger at every bunch crossing. Instead, ATLAS will use a lower-rate pre-trigger based on calorimeter and muon detector information before using tracks. The track information available in the two approaches may differ, and it will be interesting, with more simulations, to assess the impact on performance.

Partly because of this specific CMS trigger concept, the two experiments have adopted different configurations for their new trackers. In the barrel, ATLAS foresees 5 layers of pixels followed by 4 layers of strips, while CMS considers 4 layers of pixels followed by 6 layers of strips. A novelty of the designs is to tilt the modules toward the interaction point at the edges of the layers, in the pixel and/or the strip parts. In the forward regions, both experiments will increase the coverage of the detectors with more rings/disks of pixels farther from the interaction point. Again, comparing the different implementations proposed by ATLAS and CMS is enlightening in the quest for trackers with the lowest possible material budget.

In the other detectors, nearly all the electronics will be replaced to meet the higher trigger rate and latency requirements. With the advent of high-bandwidth data links, the trend in all experiments is to read out the full detectors at each bunch crossing, providing complete information and flexibility for trigger functionality moved to the back-end electronic systems.

Calorimeter upgrades are important, particularly in the forward regions. After deep investigation, ATLAS has decided to keep the existing forward calorimeter as it is, while CMS presented progress in developing a High Granularity Calorimeter for its endcaps. This will be the first implementation in an experiment of a detector capable of a 5D measurement (x, y, z, t, E) of electromagnetic and hadronic showers.

With operation of the HL-LHC likely to reach very high luminosities, precise measurement of the arrival time of particles in the detectors could provide a substantial new handle to mitigate the impact of collision pileup. In the accelerator baseline scenarios, the collision time spread has an rms of about 180 ps, so a 30 ps resolution measurement would allow scanning a much reduced area of the z-t collision space, and therefore a smaller number of collisions. ATLAS is investigating a High Granularity Timing Detector in front of the existing forward calorimeter, with a configuration similar to the CMS HGC but with silicon sensors (LGADs) able to provide high-precision timing measurements for minimum ionizing particles. CMS is instead studying a dedicated precision timing layer with full acceptance coverage in front of the barrel and endcap calorimeters. Encouraging studies of the performance benefits and of the technical feasibility were presented at the workshop.
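The size of the gain can be estimated with a simplified model: take the collision times as Gaussian with 180 ps rms and count the fraction of pileup vertices that fall within a ±2σ window of a 30 ps resolution measurement around the vertex of interest (taken at the centre of the spread, and ignoring any correlation with z). This is a back-of-the-envelope sketch, not an experiment simulation:

```python
import math

def in_window_fraction(sigma_spread_ps, window_ps):
    """Fraction of pileup vertices whose Gaussian-distributed time falls
    within +/- window_ps of the vertex of interest (at the centre)."""
    return math.erf(window_ps / (sigma_spread_ps * math.sqrt(2)))

# 180 ps rms collision spread, +/- 2 x 30 ps resolution window
frac = in_window_fraction(180, 60)
print(round(frac, 2))  # ~0.26: roughly a factor of 4 fewer in-time vertices
```

Even this crude estimate shows why a 30 ps resolution is an interesting target: the effective in-time pileup seen by the reconstruction drops by a large factor.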

As in previous editions, the workshop was an inspiring exchange of information and new ideas, in a convivial atmosphere conducive to strengthening the links across the communities.


Notes: 1.

