David Rousseau

I’m a High Energy Physicist at IJCLab, CNRS/IN2P3 and Université Paris-Saclay, currently working on the ATLAS experiment at the Large Hadron Collider at CERN.

After a dozen years designing and implementing many pieces of the ATLAS experiment software, a chance meeting in 2013 with a Machine Learning computer scientist (what was Machine Learning? I had to find out) set my career on a new path: developing the interface between High Energy Physics (and science in general) and Machine Learning (or Artificial Intelligence).

My current and past research topics are described below (please check the Publications and Talks tab for references).

Current research topics

Items 1, 2 and 3 were discussed at the AI and the Uncertainty Challenge workshop I organised in November 2023.

  • 1 Uncertainty-aware training: physicists ultimately write papers reporting measurements, which always include an assessment of uncertainties, in particular the impact of Nuisance Parameters (the parameterised unknowns) on systematic uncertainties. How do we evaluate the uncertainties of increasingly complex models? How do we build confidence in a complex ML model, and how do we convince our peers of it? How do we deal with uncertainties on the inputs of the models to maximise the overall accuracy? A new competition (to run in summer 2024) is being set up within the FAIR Universe project (in collaboration with Berkeley, Chalearn and U Washington). See also the Talks tab.
  • 2 Simulation Based Inference: in High Energy Physics, we have long been using accurate (but never accurate enough) simulators to eventually extract measurements from data. Machine Learning has increasingly been used to build optimal, or at least better, multivariate features. Simulation Based Inference goes one step further: Machine Learning is used to approximate the Negative Log Likelihood of the measurement (I like to call it an unbinned Maximum Likelihood fit where the pdf is a trained Neural Network). We are working on applying this set of techniques to the off-resonance Higgs to 4 lepton cross-section measurement (see Aishik Ghosh’s thesis in the “Student” tab; more recent developments are internal to ATLAS, to be made public in summer 2024).
  • 3 Machine Learning for fast particle tracking at the Large Hadron Collider: reconstructing the trajectories of charged particles is “just” a matter of connecting the measurement points. However, this needs to be done fast and accurately, and it will become even more difficult with the foreseen increase of luminosity at the High Luminosity LHC. To foster innovation in algorithms, I organised the TrackML challenge (see the TrackML page and the Publications and Talks tab). We are now developing these techniques within the AidaInnova European project (started in 2021) and the ANR ATRAPP project (started in 2022).
  • 4 Generative models for fast detector simulation: in HEP we have accurate simulators, based in particular on Geant4, but they are slow. There is a big movement to use Geant4 simulations to train neural network models (with GAN architectures in particular) that accelerate key elements such as shower generation by orders of magnitude. See Aishik Ghosh’s PhD thesis and an ATLAS publication.
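The “unbinned Maximum Likelihood where the pdf is a trained Neural Network” idea of item 2 can be illustrated with the standard likelihood-ratio trick. This is only a toy sketch, not the ATLAS analysis: the one-dimensional Gaussian “simulator”, the plain-numpy logistic regression, the parameter scan and all names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(theta, n):
    # toy stand-in for an expensive physics simulator:
    # events are 1-D Gaussians whose mean is the parameter of interest
    return rng.normal(theta, 1.0, size=n)

def train_classifier(x_theta, x_ref, epochs=1000, lr=0.5):
    # tiny logistic regression s(x) = sigmoid(w*x + b) trained to
    # separate events simulated at theta from a fixed reference sample
    x = np.concatenate([x_theta, x_ref])
    y = np.concatenate([np.ones(len(x_theta)), np.zeros(len(x_ref))])
    w = b = 0.0
    for _ in range(epochs):
        s = 1.0 / (1.0 + np.exp(-(w * x + b)))
        w -= lr * np.mean((s - y) * x)
        b -= lr * np.mean(s - y)
    return w, b

def nll(theta, data, x_ref):
    # likelihood-ratio trick: s/(1-s) approximates p(x|theta)/p_ref(x),
    # so the NLL of the data is estimated without any histogram binning;
    # for logistic regression, log s/(1-s) is simply w*x + b
    w, b = train_classifier(simulate(theta, 5000), x_ref)
    return -np.sum(w * data + b)

x_ref = simulate(0.0, 5000)        # fixed reference sample
data = simulate(1.0, 200)          # "observed" data, true theta = 1
scan = [0.0, 0.5, 1.0, 1.5, 2.0]
nlls = [nll(t, data, x_ref) for t in scan]
best = scan[int(np.argmin(nlls))]  # the NLL minimum should sit near theta = 1
```

In a real analysis the scan would be replaced by a minimiser and the classifier by a neural network taking the full event as input, but the structure (train per parameter point, sum the per-event log ratios) is the same.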

Past research topics

  • Using an OPU for event classification: the Optical Processing Unit (OPU), built by LightOn, a Paris-based start-up, performs analog multiplication by a very large random matrix at very high throughput. We explored its use in HEP, with event classification at the LHC as a test case (work presented in a talk at the ACAT 2021 conference).
  • evidence for Higgs to tau tau in ATLAS (MMC)
  • software developments for ATLAS
  • B physics in ALEPH
  • ALEPH silicon detector alignment
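The OPU operation mentioned above can be emulated digitally in a few lines: the device optically computes non-linear random features |Rx|² for a fixed complex random matrix R, and only a cheap linear readout is trained on top. A minimal sketch, assuming a toy circle-classification task, 200 random features and ridge-regression readout (none of which come from the ACAT 2021 study):

```python
import numpy as np

rng = np.random.default_rng(1)

# Fixed complex Gaussian random matrix: the OPU realises this product
# optically at very high throughput; here we emulate |Rx|^2 digitally.
R = rng.normal(size=(2, 200)) + 1j * rng.normal(size=(2, 200))

def opu_features(X):
    # non-linear random features, as produced by the analog device
    return np.abs(X @ R) ** 2

# Toy "event classification": label 1 if the point lies outside a circle
X = rng.normal(size=(2000, 2))
y = np.sum(X**2, axis=1) > 2.0

# Only a linear readout is trained on the fixed random features
# (bias column added, weights fitted by ridge regression)
Phi = np.hstack([opu_features(X), np.ones((len(X), 1))])
w = np.linalg.solve(Phi.T @ Phi + np.eye(Phi.shape[1]),
                    Phi.T @ (2 * y - 1))
accuracy = np.mean(((Phi @ w) > 0) == y)
```

The design point is that the random matrix is never trained: all the capacity sits in the fast analog projection, and only the small readout is fitted, which is what makes the approach attractive for high-rate LHC data.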