High-Q Club

Highest Scaling Codes on JUQUEEN

Following up on our JUQUEEN Porting and Scaling and Extreme Scaling Workshops, and to promote the idea of exascale capability computing, we established a showcase for codes that at the time could utilise the entire 28-rack BlueGene/Q system at JSC. The goal was to encourage other developers to invest in tuning and scaling their codes, and to show that applications can use all 458,752 cores, with more than 1 million concurrent threads on JUQUEEN.
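For scale, the core and thread counts quoted above follow directly from the machine configuration. A quick illustrative check, assuming the standard BlueGene/Q layout of 1,024 nodes per rack, 16 application cores per node and 4-way SMT:

```python
# Back-of-the-envelope check of JUQUEEN's size,
# assuming the standard BlueGene/Q configuration.
RACKS = 28              # full JUQUEEN installation at JSC
NODES_PER_RACK = 1024   # 2 midplanes x 16 node boards x 32 compute cards
CORES_PER_NODE = 16     # PowerPC A2 cores available to applications
THREADS_PER_CORE = 4    # 4-way simultaneous multithreading (SMT)

cores = RACKS * NODES_PER_RACK * CORES_PER_NODE
threads = cores * THREADS_PER_CORE

print(cores)    # 458752 cores, matching the figure quoted above
print(threads)  # 1835008 hardware threads, i.e. well over 1 million
```

This is why "more than 1 million concurrent threads" was the natural target: fully using the machine with one thread per hardware thread requires 1,835,008 of them.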
The diverse membership of the High-Q Club shows that it is possible to scale to the full JUQUEEN system using a variety of programming languages and parallelisation models, with each code demonstrating its own approach to reaching that goal. High-Q status marked an important milestone in application development towards future HPC systems that will feature even higher core counts.

[Figure: Programming languages used by High-Q Club codes (Venn diagram)]
[Figure: Programming models used by High-Q Club codes (Venn diagram)]
[Figure: I/O methods used by High-Q Club codes]

Analysing the codes, we found that the application benefits extended beyond the BlueGene/Q architecture to other HPC leadership systems. The lessons learned for JSC and the application developers were more widely applicable and provided insights for expected future exascale applications.

References

Codes in the High-Q Club

[Figure: Strong scaling results of the High-Q Club codes]
[Figure: Weak scaling results of the High-Q Club codes]

The following list is kept as a hall of fame for the member codes of the High-Q Club. Since JUQUEEN was decommissioned in spring 2018, the results the codes achieved on this BlueGene/Q system are somewhat dated; we therefore refer to the above publications for further details and keep the list short, indicating only each code's programming model, programming language and I/O method (where relevant).

1D-NEGF

A 1D Non-Equilibrium Green's Function framework for transport phenomena.
MPI + OpenMP / C

CIAO

Multiphysics, multiscale Navier-Stokes solver for turbulent reacting flows in complex geometries.
MPI / Fortran / HDF5

Code_Saturne

An open-source, multiphysics CFD software.
MPI + OpenMP / Fortran + C / MPI-I/O

CoreNeuron

Simulating electrical activity of neuronal networks with morphologically-detailed neurons.
MPI + OpenMP / C + C++ / MPI-I/O

dynQCD

Lattice Quantum Chromodynamics (QCD) with dynamical fermions.
SPI + pthreads / C

FE2TI (ex_nl/FE2)

A scale-bridging approach incorporating micro-mechanics in macroscopic simulations of multi-phase steels.
MPI + OpenMP / C + C++

FEMPAR

A framework for the massively parallel FE simulation of multiphysics problems governed by PDEs.
MPI / Fortran

Gysela

A gyrokinetic code from CEA, France, for modelling fusion core plasmas.
MPI + OpenMP + pthreads / Fortran + C / HDF5

hp-fRG

A hierarchically parallelised code for renormalisation group calculations.
MPI + OpenMP / C + C++

ICON

A solver for fully compressible non-hydrostatic equations of motion at very high horizontal resolution.
MPI + OpenMP / Fortran + C / netCDF

IMD

A software package for classical molecular dynamics simulations.
MPI / C

JURASSIC

A fast solver for infrared radiative transfer in the Earth's atmosphere.
MPI + OpenMP / C / netCDF

JuSPIC

A fully relativistic Particle-in-Cell code for laser-plasma simulations.
MPI + OpenMP / Fortran / MPI-I/O

KKRnano

Korringa-Kohn-Rostoker Green function code for quantum description of nano-materials.
MPI + OpenMP / Fortran / SIONlib

LAMMPS(DCM)

A Dynamic Cutoff Method for a classical molecular dynamics code, the Large-scale Atomic/Molecular Massively Parallel Simulator.
MPI + OpenMP / C++

MP2C

Fluid simulations using a hybrid representation of solvated particles.
MPI / Fortran / SIONlib

MPAS-A

An atmospheric solver for fully compressible non-hydrostatic equations of motion on unstructured Voronoi meshes.
MPI / Fortran + C / SIONlib

μφ (muPhi)

Water flow and solute transport in porous media.
MPI / C++ / SIONlib

Musubi

A multicomponent Lattice Boltzmann solver for flow simulations.
MPI + OpenMP / Fortran / MPI-I/O

NEST

A simulator for spiking neural network models that focus on the dynamics, size and structure of neural systems.
MPI + OpenMP / C++ / SIONlib

OpenTBL

Direct numerical simulation of turbulent flows.
MPI + OpenMP / Fortran / HDF5

ParFlow + p4est

An open-source parallel watershed flow model.
MPI / Fortran + C / MPI-I/O

pe

A massively parallel rigid body dynamics framework.
MPI / C++ / MPI-I/O

PEPC

A particle tree code for solving the N-body problem for Coulomb, gravitational and hydrodynamic systems.
MPI + pthreads / Fortran + C / SIONlib

PMG+PFASST

A space-time parallel solver for systems of ODEs.
MPI + pthreads / Fortran + C

PP-Code

A particle-mesh code for simulating charged and neutral particle dynamics in relativistic and non-relativistic plasmas.
MPI + OpenMP / Fortran / MPI-I/O

psOpen

Direct numerical simulation of fine-scale turbulence.
MPI + OpenMP / Fortran / HDF5

Seven-League Hydro Code

An all Mach number code for fluid dynamics in astrophysics.
MPI + OpenMP / Fortran / MPI-I/O

SHOCK

Direct numerical simulation of compressible flow.
MPI / C / HDF5

Terra-Neo

A multigrid solver for geophysics applications from the University of Erlangen.
MPI + OpenMP / Fortran + C++

waLBerla

A widely applicable Lattice Boltzmann solver from the University of Erlangen.
MPI + OpenMP / C++ / MPI-I/O

ZFS

A multiphysics framework for compressible and incompressible flow, aeroacoustics, and combustion phenomena.
MPI + OpenMP / C++ / netCDF

Contact

Dr. Dirk Brömmel

Member of the Simulation and Data Lab Plasma Physics
Member of the Project Management Office for PRACE-6IP

  • Institute for Advanced Simulation (IAS)
  • Jülich Supercomputing Centre (JSC)

Building 16.4 / Room 320
+49 2461/61-6595

Last Modified: 09.05.2022