Scientific area

Multiphysics computational fluid dynamics (CFD).

Short description

Code_Saturne is an open-source CFD software package that relies on the finite volume method to solve the Navier-Stokes equations. It can handle any type of mesh, with any cell shape and grid structure. Incompressible and compressible flows can be simulated, with or without heat transfer, and a wide range of turbulence models is available. The solver uses a segregated approach, with the velocity-pressure coupling handled by a projection-like method. By default, the velocity is computed with a Jacobi algorithm and the pressure is solved with an algebraic multigrid (AMG) algorithm.
Parallelism is handled by distributing the domain over MPI processes, with an optional second level of shared-memory parallelism based on the OpenMP model. Several partitioning tools are available, either geometry-based (Morton or Hilbert space-filling curves) or graph-based (METIS, ParMETIS, SCOTCH and PT-SCOTCH).
Code_Saturne can be used as a standalone package, but external libraries may also be plugged in, e.g. to read some of the supported mesh formats (CGNS, MED or CCM, for instance), to access graph partitioners (METIS, ParMETIS, SCOTCH, PT-SCOTCH) or to use additional sets of linear solvers (PETSc, for instance).
MPI-IO is used to read meshes, to read and optionally write checkpoint files, and to write postprocessing output in the EnSight Gold format (also readable by ParaView).
The code is one of only two multiphysics CFD packages in the PRACE Unified European Applications Benchmark Suite.

Snapshot of the velocity magnitude in a blood pump.

Scalability


  • 458,752 cores (1,835,008 compute threads) on BlueGene/Q (JUQUEEN)
  • 786,432 cores (3,145,728 compute threads) on BlueGene/Q (MIRA, Argonne)
  • 78,000 cores on Cray XC30 (ARCHER)

Strong scaling tests on JUQUEEN using MPI+OpenMP.

Programming language and model

  • C (50%)
  • Fortran (37%)
  • Python (13%)
  • Hybrid parallelism (MPI + OpenMP)
  • MPI-IO

Tested on platforms

  • BlueGene/Q, /P, and /L
  • Power 5, 7, and 8
  • Cray XT4, XT6, XE6, and XC30
  • iDataPlex, NeXtScale

Application developers and contact

Martin Ferrand, Yvan Fournier, and Erwan Le Coupanec
EDF R&D
6 quai Watier
78001 Chatou
France

External contributor:

Charles Moulinec
STFC Daresbury Laboratory
Keckwick Lane
United Kingdom

(Text and images provided by the developers)