The revolutionary capabilities of X-ray free-electron lasers (FELs) have launched a new field of ultrafast X-ray science. FELs generate femtosecond-duration X-ray pulses with a peak brightness more than a billion times higher than any previous source. This has enabled the first direct measurements of chemistry and catalysis in action at the atomic scale, movies of magnetisation dynamics at the nanoscale, the observation of exotic quantum dynamics (such as squeezed phonons) evolving in solids, the generation and study of extreme states of matter as found in the cores of stars and planets, and the study of atoms stripped of electrons from the inside, giving new insights into atomic structure. FELs have also provided superior images of proteins, free from the radiation-damage effects that plague the conventional methods of X-ray crystallography and cryo-electron microscopy.

The methodologies used in such experiments differ fundamentally from those at conventional sources, in some ways resembling the change that optical lasers brought to optical spectroscopy. Complex measurements are made in single pulses, but the full dataset is often aggregated over many millions of shots, as conditions or settings are scanned or to capture rare transient events. As we transition from proof-of-principle experiments towards work-horse measurements of real systems by non-expert users, we must expand experimental capabilities and reliability to collect enormous datasets at high rates over long periods of time. Such is the promise created by the specialized facilities FLASH, the European XFEL, and LCLS-II, located in Hamburg and California, which produce pulses at up to megahertz rates. But an experiment only works as well as its least reliable component, and to profit from this capacity requires optimizing all sub-systems of the source and instrumentation. Only then can we acquire the datasets needed to explore the full structure and dynamics of complex systems at atomic length and time scales.

In this Helmholtz International Laboratory, we aim to address the reliability and throughput of these subsystems to achieve high-rate FEL measurements of complex systems. The Laboratory is organized into four work packages, each proposing a novel and bold approach to improving reliability. This starts with applying machine learning to the operation of the accelerator and the generation of X-ray pulses, as well as to the detection and analysis of X-ray signals. We also aim to deploy robotic control of sample delivery to avoid interruptions and downtime, and to address challenges in transporting high-power X-ray beams to the experiments. These issues are common to our high-rate facilities and are best addressed collaboratively with pooled resources. Common solutions will enable standardization of experiments and protocols, which will further foster collaboration in other areas and promote reliability and ease of use.
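To give a flavour of what machine-learning-based accelerator tuning looks like in practice, the following minimal sketch uses Gaussian-process Bayesian optimisation (via scikit-optimize) to tune a handful of simulated magnet settings, in the spirit of the online-tuning work listed under Publications below. It is an illustration only, not code from the Laboratory: the beam-size objective, the choice of quadrupole and steering knobs, and their bounds are all assumed for the example.

```python
# Illustrative sketch (not the hir3x codebase): Bayesian optimisation of a
# simulated beam-tuning objective. A real deployment would replace the toy
# objective with a diagnostic readback from the machine.

import numpy as np
from skopt import gp_minimize  # Gaussian-process-based Bayesian optimisation

# Hypothetical tuning knobs: two quadrupole strengths and one steering angle,
# with assumed bounds chosen purely for illustration.
SEARCH_SPACE = [(-5.0, 5.0), (-5.0, 5.0), (-1.0, 1.0)]


def measure_beam_size(settings):
    """Stand-in for a real diagnostic: returns a beam size (to be minimised)
    for the given magnet settings, with simulated measurement noise."""
    q1, q2, steer = settings
    ideal = np.array([1.2, -0.7, 0.1])  # assumed optimum, illustration only
    mismatch = np.array([q1, q2, steer]) - ideal
    return float(np.sum(mismatch**2) + 0.01 * np.random.randn())


result = gp_minimize(
    measure_beam_size,    # objective evaluated on the (simulated) machine
    SEARCH_SPACE,         # bounds for each tuning knob
    n_calls=40,           # total number of machine interactions allowed
    n_initial_points=10,  # random exploration before the GP model takes over
    random_state=0,
)

print("best settings:", result.x)
print("best simulated beam size:", result.fun)
```

The appeal of this class of methods for online tuning is sample efficiency: each objective evaluation costs real beam time, and the surrogate model lets the optimiser choose informative settings rather than scanning parameters exhaustively.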


Publications

  1. Learning to Do or Learning While Doing: Reinforcement Learning and Bayesian Optimisation for Online Continuous Tuning: Jan Kaiser et al., arXiv, doi: 10.48550/arXiv.2306.03739
  2. Learning-based Optimisation of Particle Accelerators Under Partial Observability Without Real-World Training: Jan Kaiser et al., Proceedings of the 39th International Conference on Machine Learning, https://proceedings.mlr.press/v162/kaiser22a.html
  3. Accelerating Linear Beam Dynamics Simulations for Machine Learning Applications: Oliver Stein et al., Proceedings of the 13th International Particle Accelerator Conference, doi: 10.18429/JACoW-IPAC2022-WEPOMS036

Team Members

Ilya Agapov, Scientific areas: accelerator
Annika Eichler, Scientific areas: accelerator
Jan Kaiser, Scientific areas: accelerator
Raimund Kammering, Scientific areas: accelerator

Project Partner