Getting Started

Installation

After unpacking the Benchmark Framework, the following directory structure is available:

  • JUBE/

    • applications/
    • bench/
    • doc/
    • platform/
    • skel/
    • LICENCE

The applications/ subdirectory should contain the individual benchmark applications. The bench/ subdirectory contains the benchmark environment scripts, and the doc/ subdirectory the overall documentation of the benchmark framework. The platform/ subdirectory holds the platform definitions as well as job submission script templates for each defined platform. The skel/ subdirectory contains templates of analysis patterns for the text output of different measurement tools.

Configuration

The platform

Once you have obtained all sources from the applications you want to use in the benchmark, you can start configuring the benchmarks for use on your platform.

A platform is defined through a set of variables in the platform.xml file, which can be found in the platform/ directory. To create a new platform entry, copy an existing platform description and modify it to fit your local setup. The variables defined here are used by the individual applications later in the process. Recommended practice for the platform nomenclature is: <vendor>-<system_type>-<system_name|site>. Additionally, you have to create a template batch submission script, which should be placed in a subdirectory of the platform/ directory with the same name as the platform itself. Although this nomenclature is not enforced by the benchmarking environment, it helps keep track of your templates and minimises the adaptation necessary for the individual application configurations.
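As a purely illustrative sketch, a new platform entry could look like the following. All variable names shown here are hypothetical; use an existing entry in platform.xml as the authoritative reference for the variables your applications actually expect:

  <platform name="vendor-cluster-mysite">
    <!-- hypothetical variables; copy a real entry from platform.xml as a starting point -->
    <params
      nodes="1,2,4"
      taskspernode="4"
      submit_cmd="llsubmit"
    />
  </platform>

The matching batch submission template would then be placed in platform/vendor-cluster-mysite/.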

The applications

Once a platform is defined, each individual application that is to be used in the benchmark needs to be configured for this platform. To configure an application, copy an existing configuration file (e.g. jube-example-system.xml) to the file bench-<your_platform>.xml. Then open the copied file in an editor of your choice and adapt it to your needs. Set the platform parameter to the name of your defined platform; the platform name can then be referenced throughout the benchmarking environment through the $platform variable.
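For a hypothetical platform named vendor-cluster-mysite, this amounts to, e.g. for IMB:

bash$ cd applications/IMB

bash$ cp jube-example-system.xml bench-vendor-cluster-mysite.xml

followed by setting the platform parameter in the copied file to vendor-cluster-mysite.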

Execution

Assuming the Benchmark Suite is installed in a directory that is accessible during job execution, a typical run of a benchmark application consists of two steps:

  1. Compiling and submitting the benchmark to the system scheduler.
  2. Verifying, analysing and reporting the performance data.

Compiling and submitting

If configured correctly, an application benchmark, e.g. IMB, can be compiled and submitted to the system with the following commands:

bash$ cd applications/IMB

bash$ perl ../../bench/jube jube-example-system.xml

The benchmarking environment will then compile the binary for all node/task/thread combinations defined, if those parameters need to be compiled into the binary. It creates a so-called sandbox subdirectory for each job, ensuring conflict-free operation of the individual applications at runtime. If any input files are needed, they are prepared automatically as defined.

Each active benchmark in the application's top-level configuration file will receive an ID, which is used as a reference by JUBE later on.

Verifying, analysing and reporting

After the benchmark jobs have run, an additional call to the benchmarking environment will gather the performance data. For this, the special parameters -update and -result are used.

bash$ cd applications/IMB

bash$ perl ../../bench/jube -update -result <ID>

The ID is the reference number the benchmarking environment has assigned to this run. The performance data will then be output to stdout, and can be post-processed from there.
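Since the report is written to stdout, it can, for example, simply be redirected to a file for post-processing:

bash$ perl ../../bench/jube -update -result <ID> > IMB_result.txt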

