SDL Electrons and Neutrons
SimDataLab Mission
The Simulation and Data Laboratory Electrons & Neutrons (SDLen) forms a bridge between experiment and simulation, with a special focus on data management and data analytics using high-performance computing and methods from the field of artificial intelligence. SDLen offers high-level technology support to users in electron microscopy, neutron scattering, and related fields. In addition, the lab hosts selected research projects targeting the development of novel methods and the usability of state-of-the-art technology.
Introduction
The approach of the Simulation and Data Laboratory Electrons & Neutrons (SDLen) is to enable a thorough understanding of materials and nanoscale structures from two sides:
Applying customized data analysis methods to experimental data obtained from sophisticated probing methods (such as TEM, STEM, EDX, XRCD, SAXS, SANS) allows conclusions to be drawn about the nature of the investigated sample.
This is augmented by the simulation approach: realistic models of the sample are constructed in silico to verify the experimental findings on a qualitative and quantitative level. This includes the application of electronic structure methods, (ab initio or classical) molecular dynamics, and micromagnetic or multislice simulations, to name a few.
The SDLen staff consists of members of the Ernst Ruska Centre for Microscopy with Electrons (ER-C), the Jülich Centre for Neutron Science (JCNS), and the Jülich Supercomputing Centre (JSC) at Forschungszentrum Jülich.
Activities
The activities of SDLen include but are not limited to:
- The development of LiberTEM, a Python-based data analysis package for electron microscopy data;
- Supporting researchers in optimizing and modernizing simulation and analysis codes to run efficiently on state-of-the-art supercomputing architectures;
- Driving research on advanced analysis methods using artificial intelligence, for example in holography;
- The development of simulation packages for the all-electron description of nanoscale structures;
- The maintenance of a platform for electronic lab books and other data management tools;
- Supporting researchers in obtaining HPC benchmarks and writing compute time applications;
- The maintenance of a knowledge base on the usage of HPC resources;
- The organization of training events on machine learning methods.