TrackFormers Part 2: Enhanced Transformer-Based Models for High-Energy Physics Track Reconstruction
Sascha Caron, Nadezhda Dobreva, Maarten Kimpel, Uraz Odyurt, Slav Pshenov, Roberto Ruiz de Austri Bazan, Eugene Shalugin, Zef Wolffs, Yue Zhao
Abstract
High-Energy Physics experiments are rapidly escalating in generated data volume, a trend that will intensify with the upcoming High-Luminosity LHC upgrade. This surge in data necessitates critical revisions across the data processing pipeline, with particle track reconstruction being a prime candidate for improvement. In our previous work, we introduced "TrackFormers", a collection of Transformer-based one-shot encoder-only models that effectively associate hits with expected tracks. In this study, we extend our earlier efforts by conducting detailed investigations into custom Transformer attention mechanisms, a new design combining geometric projection and lightweight clustering, and a joint model that conditions classification on a regressor's predictions. Furthermore, we discuss new datasets that allow training at the hit level for a range of physics processes. These developments collectively aim to boost both the accuracy and potentially the efficiency of our tracking models, offering a robust solution to meet the demands of next-generation high-energy physics experiments.
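The one-shot encoder-only idea mentioned above can be illustrated with a minimal sketch: a single self-attention pass over all hits of an event, followed by a per-hit classification head that assigns each hit a track label. This is an illustrative toy with random weights and hypothetical dimensions (`n_hits`, `d_model`, `n_tracks` are assumptions, not values from the paper), not the authors' architecture.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    # Scaled dot-product self-attention: every hit attends to every other hit,
    # which is what lets a one-shot model reason about whole tracks at once.
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    return softmax(scores) @ V

rng = np.random.default_rng(0)
n_hits, d_xyz, d_model, n_tracks = 6, 3, 16, 4  # hypothetical sizes

# Toy event: each hit is a 3D space point, embedded into the model dimension.
hits = rng.normal(size=(n_hits, d_xyz))
X = hits @ rng.normal(size=(d_xyz, d_model))

# Random (untrained) attention and classifier weights, for shape illustration only.
Wq, Wk, Wv = (rng.normal(size=(d_model, d_model)) for _ in range(3))
H = self_attention(X, Wq, Wk, Wv)
logits = H @ rng.normal(size=(d_model, n_tracks))

# One-shot output: a track label for every hit in a single forward pass.
track_ids = logits.argmax(axis=-1)
print(track_ids.shape)  # one label per hit: (6,)
```

In a trained model the attention weights would be learned so that hits on the same trajectory attend to each other, and the classification head would map the contextualized hit embeddings to consistent track labels.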