A number of Master courses in the STEM area require students to find a research project abroad, in which they participate for 3-6 months. Many of the students find projects that spark their interest through internet searches - at least this is the way I got to know a few of them: as I regularly post details of my research progress in this blog (among other places), I am evidently a visible target. I do not complain about this: many of the students who contact me end up contributing to the projects they get embedded in. In return, they usually get to add a few lines to their CV, and maybe authorship of one or two papers.
Given the above, I suppose it is good practice to publish here, every now and then, a list of ongoing projects where there is potential use for additional internship students. As a bonus on top of the advertisement function, by making the effort of putting together the list I am creating a handy pointer I can refer students to, one which collects all the information I otherwise have to supply on demand every time a new guy or gal knocks at my email-door. So here goes, a summary of things I am currently working on.

1) Optimization of the SWGO array

This is a project I started about one year ago, when I joined the SWGO collaboration. SWGO aims to place about 6000 Cherenkov detectors at high altitude in South America, to study high-energy gamma rays from the cosmos. The optimal placement on the ground of these 6000 units is an 11997-dimensional problem, which can be solved with deep learning through gradient descent. I have published preliminary results of this study recently, but we are improving the model and the whole pipeline, so there is certainly room for an extra pair of software-writing hands here. For more information about this, and a couple of nice gifs, see also this other blog post. An earlier discussion is here. The project is in collaboration with Prof. Michele Doro at the University of Padova, and with statisticians at Carnegie Mellon University and computer scientists at RPTU.
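
To give a concrete flavor of what solving the placement by gradient descent means, here is a minimal sketch in PyTorch: the detector coordinates are the free parameters, and a toy differentiable objective - a mere stand-in for the actual surrogate of the physics figure of merit, which I won't reproduce here - is minimized with Adam. The unit count, spacing, and penalty terms below are purely illustrative.

```python
# Toy sketch of array-layout optimization by gradient descent; the loss is a
# placeholder for the differentiable surrogate of the science figure of merit.
import torch

n_units = 500                                                  # the real array has ~6000 units
pos = torch.nn.Parameter(200.0 * torch.randn(n_units, 2))      # (x, y) positions in metres
opt = torch.optim.Adam([pos], lr=1.0)

def toy_objective(xy):
    d = torch.cdist(xy, xy) + 1e6 * torch.eye(len(xy))          # pairwise distances, self-pairs masked
    crowding = torch.relu(10.0 - d).pow(2).sum()                # penalize units closer than ~10 m
    containment = (xy.norm(dim=1) - 300.0).clamp(min=0).pow(2).sum()  # keep units within a 300 m radius
    return crowding + containment

for step in range(1000):
    opt.zero_grad()
    loss = toy_objective(pos)
    loss.backward()
    opt.step()
```

In the actual study the objective of course encodes the array's performance in measuring gamma-ray showers, which is what makes the problem interesting (and hard).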

2) Optimization of an EM calorimeter for a muon collider detector

If we ever build a muon collider, we will have to cope with large beam-induced backgrounds from muons that decay in flight, producing secondary particles (in the end, lots of low-energy photons and neutrons) that hit the detector from the sides. This is a rather new problem in calorimetry, where the challenge of measuring with high precision the energy and direction of a photon from Higgs boson decay is made more complex by the peculiar geometry of the background flux. We are thus studying the problem with a differentiable model, to find the optimal geometry of the calorimeter cells. A big challenge is to create a performant continuous model of the energy release from these soft photons, one which can be embedded in the differentiable pipeline. As the PhD student who is working on this will graduate in about one year, there is ample space for an additional student to help with some of the sub-tasks that the project consists of. The work is in collaboration with the Université Clermont Auvergne, where Prof. Julien Donini is co-supervising the student I mentioned above.
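
The role of the continuous model is easiest to see in a toy sketch: below, a small neural network stands in for the (non-differentiable) shower simulation, and a single geometry parameter is then optimized by gradient descent through the frozen surrogate. The network architecture, the parameter, and the cost terms are illustrative assumptions of mine, not the actual pipeline.

```python
# Hedged sketch: a neural surrogate replaces the shower simulation, so that a
# calorimeter geometry parameter becomes optimizable by gradient descent.
import torch
import torch.nn as nn

surrogate = nn.Sequential(nn.Linear(2, 64), nn.ReLU(), nn.Linear(64, 1))
# ...here one would train `surrogate` on simulated (cell size, photon energy) -> resolution pairs...
for p in surrogate.parameters():
    p.requires_grad_(False)                                # freeze the trained surrogate

cell_size = torch.nn.Parameter(torch.tensor([3.0]))        # transverse cell size (cm), initial guess
opt = torch.optim.Adam([cell_size], lr=0.01)
energies = 100.0 * torch.rand(256, 1)                      # toy benchmark photon spectrum (GeV)

for step in range(500):
    opt.zero_grad()
    x = torch.cat([cell_size.expand(len(energies), 1), energies], dim=1)
    resolution = surrogate(x).mean()                       # surrogate-predicted resolution proxy
    cost = resolution + 0.01 / cell_size.squeeze()         # crude penalty mimicking channel-count cost
    cost.backward()
    opt.step()
```

The hard part, as mentioned above, is precisely building a surrogate that is faithful to the diffuse flux of soft photons and neutrons hitting the detector from the sides.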

3) Open-ended study of granular calorimeters for future collider applications

Thinking more outside the box, still in the area of calorimetry, we need to reckon with the fact that there is a push toward higher and higher granularity in these detectors (we found out 15 years ago that by getting precise images of the detailed subcomponents of hadronic jets we could effectively identify the decay of heavy objects like W and Z bosons, top quarks, or Higgs bosons), and the technology is also improving. The question to me today is: can we build tracking calorimeters, which reconstruct the details of hadronic showers and allow for particle identification? Also, calorimeters are high-density instruments that are placed after low-density tracking detectors. This is a long-standing paradigm which no longer appears necessary now that we have deep learning capabilities for the reconstruction of particle signals. Hence studying variable-density devices seems a compelling new research topic. In this case, there are already a few studies we did in the past, and we want to further them by finding out what is the maximum amount of information we can mine from hadronic showers in an arbitrarily granular instrument: finding this out would guide our hand in optimizing such detectors. This research is in collaboration with RPTU, with Prof. Nicolas Gauger at the Center for Scientific Computing of Kaiserslautern-Landau, and with Carnegie Mellon University, where I am collaborating with statistician Ann Lee and her collaborators.
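
As a rough illustration of what measuring the extractable information could look like in practice (a simplified proxy of my own making, not the actual method of the study), one can re-bin the same simulated showers at different granularities and compare how well identical classifiers perform on each binning:

```python
# Hedged sketch: compare classifier performance on the same simulated showers binned
# at different granularities, as a proxy for the information lost to coarser cells.
import torch
import torch.nn.functional as F

def rebin(showers, factor):
    """showers: (N, 1, Z, Y, X) voxelized energy deposits; sum-pool into coarser cells."""
    return F.avg_pool3d(showers, kernel_size=factor) * factor**3

# fine = ...                      # hypothetical tensor of finely voxelized simulated showers
# coarse = rebin(fine, factor=4)
# One would then train the same classifier (e.g. charged pion vs. proton) on `fine`
# and on `coarse`, and compare the ROC AUCs across binnings.
```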

Notably, Nicolas and I have opened a 1-year research grant position to work in this area, open to Master graduates in scientific disciplines (no PhD necessary to apply!). The grant is offered by INFN with RPTU contributions, and the call is currently open - the deadline for applications is December 13. You can find more information about it in this blog post, or apply at this site (the call number is 26098).

4) Muon Tomography

One of the first applications of differentiable programming for detector optimization I considered, with a group of colleagues mostly based at the Université catholique de Louvain and led by Dr. Andrea Giammanco, is muon tomography. We developed a software package called TomOpt to optimize the layout of detectors that allow the reconstruction of the material map of unknown volumes by scanning the pattern of scattering that cosmic-ray muons undergo when traversing those volumes.
TomOpt is close to being released as a proof of principle, but we are adding functionalities to it, so here is another fertile ground for an internship. The software is described in this recent preprint.
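
For the uninitiated, the basic idea behind scattering tomography fits in a few lines of code. The sketch below implements a textbook point-of-closest-approach (PoCA) assignment of each muon's scattering angle to a 3D point; it is shown only for illustration and is not TomOpt itself, which is concerned with optimizing the detector layout around this kind of inference.

```python
# Illustration of scattering tomography via point of closest approach (PoCA): each
# muon's incoming and outgoing tracks define a scattering angle, assigned to the point
# where the two lines pass closest; dense materials accumulate large scattering angles.
import numpy as np

def poca(p_in, d_in, p_out, d_out):
    """Return (PoCA point, scattering angle) given incoming/outgoing track (point, direction)."""
    d_in, d_out = d_in / np.linalg.norm(d_in), d_out / np.linalg.norm(d_out)
    n = np.cross(d_in, d_out)                          # direction of the common perpendicular
    # Solve p_in + s*d_in + u*n = p_out + t*d_out for (s, t, u)
    A = np.column_stack([d_in, -d_out, n])
    s, t, _ = np.linalg.solve(A, p_out - p_in)
    point = 0.5 * ((p_in + s * d_in) + (p_out + t * d_out))
    angle = np.arccos(np.clip(np.dot(d_in, d_out), -1.0, 1.0))
    return point, angle

# Accumulating the angles (or their variance) of many muons in a voxel grid, at their
# PoCA points, yields a crude density map of the scanned volume.
```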

5) Cosmological neutrinos

Another optimization problem we are studying concerns the optimal placement in 3D of photomultiplier units in an ice or water volume, to look for the signal left by the interaction of ultra-energetic cosmic neutrinos. The problem is similar to the 2D one of SWGO, and is thus also amenable to gradient-based optimization. We are considering this for the Gen2 upgrade of the IceCube detector, as well as for the P-ONE detector. This is work in collaboration with Uppsala University, where the expert is Prof. Christian Glaser.

6) Neuromorphic computing for particle detectors

Finally, a super-cool new idea is to employ nano-photonics in devising neuromorphic computing devices that can be used for in-situ preprocessing and extraction of high-level primitives from finely grained detectors, such as the above-mentioned calorimeters for a future collider. In collaboration with Prof. Fredrik Sandin from Luleå University of Technology, we are starting research in this area, with a joint PhD student who will start soon.