To build on this collaboration and take it to the next level, creating a synergistic framework at the interface of supercomputing, machine learning and neuroscience, the Joint Lab will:
• develop scalable methods for data analytics and simulation to cope with the enormous size and multiscale complexity of the brain;
• co-design modular supercomputing, interactive supercomputing and scalable data infrastructures towards exascale through use cases from neuroscience;
• develop, provide and maintain scalable tools for simulation, visualization, data acquisition, data management, and processing;
• leverage machine learning and deep learning, developing methods suited to large-scale scientific problems and learning from the brain;
• create productive loops between theory, simulation, data analytics and empirical neuroscience.
A central element is the inclusion of the entire SDL Neuroscience. The Joint Lab SMHB will continue to address the scientific, societal and technical challenges of the digital transformation, focusing on key issues in the analysis, simulation and visualization of extremely large and complex datasets. It establishes a consistent data infrastructure spanning the entire path from the wet lab to web-based big data access, enabled by High-Performance Computing (HPC) infrastructure. In its final form, the Joint Lab SMHB will be able to support other disciplines in adapting to this infrastructure.

The Joint Lab SMHB bundles the activities of the Helmholtz partners in the European Flagship Human Brain Project (HBP). Together with other communities, the HBP also helps to create a critical mass for the development of exascale use cases for EuroHPC, a collaborative effort of European countries and the EU to advance and support exascale supercomputing and data analytics. The Joint Lab SMHB will pioneer and support research on neuro-inspired neural networks and HPC-based solutions as a so-called Local within Helmholtz AI, the Helmholtz Artificial Intelligence Cooperation Unit.