Machine Learning based tracking at the trigger level
The LHC collides protons at a rate of 40 MHz to give interesting events a good chance of occurring. The collision energy is also extremely high, 13.6 TeV, since the machine was built to probe high-energy physics.
We can't afford to save every collision, though, so we employ a trigger system that has to very quickly decide which collision events are worth keeping for further analysis. Traditionally, the trigger performs a rough reconstruction of the collision and, if it meets certain criteria, saves all of the data in the detector so that a more detailed reconstruction can be done later on.
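To make the decision step concrete, here is a minimal sketch of what a trigger selection could look like. The real trigger runs in dedicated hardware and software frameworks; the class, function, and threshold below are hypothetical and purely illustrative.

```python
# Illustrative sketch only: names and thresholds are hypothetical,
# not the actual trigger implementation.
from dataclasses import dataclass

@dataclass
class TriggerObject:
    """A coarsely reconstructed object from the trigger (e.g. a jet or muon candidate)."""
    kind: str      # e.g. "jet", "muon"
    pt_gev: float  # transverse momentum in GeV

def trigger_decision(objects: list[TriggerObject]) -> bool:
    """Keep the event if the rough reconstruction passes simple criteria,
    e.g. at least one jet above a transverse-momentum threshold."""
    JET_PT_THRESHOLD_GEV = 100.0  # hypothetical threshold
    return any(o.kind == "jet" and o.pt_gev > JET_PT_THRESHOLD_GEV for o in objects)

# If trigger_decision(...) is True, the full detector readout for this
# collision is saved for detailed offline reconstruction later.
```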
The event rate that we can ultimately save to tape is limited by the readout bandwidth, which is the product of the trigger rate and the event size. We can't afford to increase the trigger rate, but we can shrink the event size, which lets us save orders of magnitude more events. We do this through a method called a "trigger-level analysis", where we save only the objects reconstructed by the trigger and nothing else. This comes at the cost of reduced reconstruction precision, which we have to work around cleverly, but it allows us to explore a phase space of very rare events. The analysis we are currently doing searches for very rare and relatively light dark matter candidates, which would be impossible to reach with a traditional analysis.
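The back-of-the-envelope arithmetic behind this trade-off is simple: at a fixed bandwidth, the writable event rate is the bandwidth divided by the event size. The sketch below uses purely illustrative numbers (the bandwidth and event sizes are assumptions, not real detector figures) to show how a smaller event buys orders of magnitude more events.

```python
# At a fixed readout bandwidth, bandwidth = trigger rate x event size,
# so shrinking the event size raises the rate of events we can write out.
# All numbers below are illustrative assumptions.
bandwidth_mb_per_s = 2_000.0        # assumed fixed output bandwidth (MB/s)

full_event_size_mb = 1.0            # assumed size of a full detector readout
trigger_objects_size_mb = 0.01      # assumed size of trigger-level objects only

full_rate_hz = bandwidth_mb_per_s / full_event_size_mb      # events/s with full readout
tla_rate_hz = bandwidth_mb_per_s / trigger_objects_size_mb  # events/s with trigger objects only

print(f"Full events:          {full_rate_hz:,.0f} Hz")
print(f"Trigger-level events: {tla_rate_hz:,.0f} Hz "
      f"({tla_rate_hz / full_rate_hz:.0f}x more events at the same bandwidth)")
```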
Last updated: 16/04/2023