XIOS Library : XML-IO-SERVER

General presentation

XIOS is a library designed to manage the NetCDF outputs of climate models. It has been developed by Yann Meurdesoif at IPSL (https://www.ipsl.fr/)

Main features :

  • Management of output diagnostic and history files
  • Temporal post-processing operations (average, min/max, instantaneous, cumulative)
  • Spatial post-processing operations (arithmetic operations, regridding)

Main advantages :

  • Simplified I/O management in the code
  • Outputting a field requires only an identifier and the data
  • The output definition is externalised in an XML file
  • I/O definitions can be changed without recompiling
  • Performance :
    • Simultaneous writing and computing through asynchronous calls
    • Use of one or more server processes dedicated to I/O management
    • Use of the parallel I/O capabilities of the NetCDF4-HDF5 format

Downloading

The code can be downloaded from the official repository : http://forge.ipsl.jussieu.fr/ioserver

A general presentation is available here : http://forge.ipsl.jussieu.fr/ioserver/raw-attachment/wiki/WikiStart/XIOS_IO_Workshop_Hamburg.pdf

You can read both the XIOS User Guide and the Reference Guide to learn more about XIOS.

Note

The version implemented in MARS is XIOS-2.0, checked out from the branchs/xios-2.0 branch shown below.

Compilation

XIOS has been compiled on Datarmor with the NETCDF/4.3.3.1-mpt-intel2016 library.

cd /home/datawork-mars/TOOLS/XIOS
svn co http://forge.ipsl.jussieu.fr/ioserver/svn/XIOS/branchs/xios-2.0
# Available compilation options/mode
./make_xios --help        # or ./make_xios -h
# Available architectures/machines
./make_xios --avail
# What does an arch configuration file look like ?
cat arch/arch_DATARMOR.*
# Compilation (for example Datarmor)
./make_xios --prod --arch DATARMOR --job 4
# Check that libxios.a and the server executable exist
ls bin/xios_server.exe lib/libxios.a

Warning

The library must be compiled with the same compiler version as the one used for the model.
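On Datarmor this would mean loading the same module stack before compiling XIOS as when compiling MARS. A minimal sketch, assuming Datarmor's module environment and reusing the module named above :

  # Load the same compiler/NetCDF stack for XIOS as for the model
  module load NETCDF/4.3.3.1-mpt-intel2016
  ./make_xios --prod --arch DATARMOR --job 4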

Test using the XIOS tutorial

XIOS in the framework of MARS

The principle is that XIOS replaces the classic output management (output_mng) in the MARS code.

  1. Implementation in the MARS source code

Here is a brief description of how XIOS has been implemented in MARS.

  • Environment and activation

    • First, the XIOS='USE' flag has to be set in the makefile. This copies xios_server.exe and all the XML templates into $RDIR

    • In the Makefile
      • Add the CPP key -Dkey_xios to activate this functionality

      • Make sure the XIOS directory is correct (see the Makefile sketch after this list)

        XIOS       =/home/datawork-mars/TOOLS/XIOS/LIB/trunk/
        
  • Impact on the source code

    • The module xios_module.F90 is the main one; it contains :
      • A subroutine init_xios which initialises the MPI communicator, the grids (both horizontal and vertical) and the calendar
      • A subroutine send_xios_diag, called at each time step, which sends to XIOS the variables to write. All variables are sent from the source code, and the user chooses the actual outputs in the XML files at execution time (see the XML input files below). A Fortran sketch of both subroutines is given after the note below.
    • In main.F90 a call to init_xios is made at the beginning
    • In step.F90 the call to output_mng is replaced by a call to send_xios_diag

    Note

    Restart files are not yet managed by XIOS
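To make the Makefile step above concrete, a hypothetical excerpt is shown below; only the XIOS path and the -Dkey_xios key come from this page, the other variable names and the inc/lib layout are assumptions to be checked against the actual MARS Makefile :

  # Hypothetical Makefile excerpt (variable names are assumptions)
  XIOS      = /home/datawork-mars/TOOLS/XIOS/LIB/trunk/
  CPPFLAGS += -Dkey_xios
  FFLAGS   += -I$(XIOS)/inc
  LDFLAGS  += -L$(XIOS)/lib -lxios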
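Likewise, here is a minimal sketch of what init_xios and send_xios_diag can look like, using the standard XIOS 2.0 Fortran client interface; the context id, field id and time-step value are placeholders, not the actual MARS code :

  MODULE xios_module
     USE xios                    ! XIOS 2.0 Fortran client interface
     IMPLICIT NONE
  CONTAINS

     SUBROUTINE init_xios(local_comm)
        INTEGER, INTENT(OUT) :: local_comm
        TYPE(xios_duration)  :: dtime
        ! Split the MPI ranks between clients (the model) and XIOS servers
        CALL xios_initialize("mars", return_comm=local_comm)
        CALL xios_context_initialize("mars", local_comm)
        ! Calendar and time step (placeholder values)
        CALL xios_define_calendar(type="Gregorian")
        dtime%second = 60
        CALL xios_set_timestep(dtime)
        ! ... horizontal/vertical grid attributes would be set here ...
        CALL xios_close_context_definition()
     END SUBROUTINE init_xios

     SUBROUTINE send_xios_diag(istep, temp)
        INTEGER, INTENT(IN) :: istep
        REAL,    INTENT(IN) :: temp(:,:,:)
        CALL xios_update_calendar(istep)    ! advance the XIOS calendar
        CALL xios_send_field("temp", temp)  ! "temp" must be declared in field_def.xml
     END SUBROUTINE send_xios_diag

     ! xios_context_finalize() and xios_finalize() are called at the end of the run

  END MODULE xios_module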

  2. XML input files

Three input files are used by XIOS (a minimal sketch of each is given after this list) :

  • The main one is iodef.xml, which includes the two following files
    • It contains a context for MARS in which the grid, field and file definitions are included
    • Another context is used for the XIOS parameters (buffer size, info level, printing to a file and the use of the server mode)
  • The second one is field_def.xml
    • There you can define one or several field_group elements containing the variables you want in your outputs
    • Here the variables have been grouped by grid or by feature (e.g. turbulence)
  • The last one is file_def.xml
    • There you can customise your own output files with the variables declared in field_def.xml
    • You can add as many files as you want
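The sketch below illustrates the structure of the three files using XIOS 2.0 elements and attributes; all the ids, the field, the output frequency and the variable values are placeholders, not the actual MARS templates :

  <!-- iodef.xml : top-level file read by XIOS -->
  <simulation>
    <context id="mars">
      <field_definition src="./field_def.xml"/>
      <file_definition  src="./file_def.xml"/>
      <!-- grid, domain and axis definitions also belong to this context -->
    </context>
    <context id="xios">
      <variable_definition>
        <variable id="using_server" type="bool">true</variable>
        <variable id="info_level"   type="int">1</variable>
        <variable id="print_file"   type="bool">true</variable>
      </variable_definition>
    </context>
  </simulation>

  <!-- field_def.xml : the variables the code can send -->
  <field_definition level="1" prec="4" operation="average">
    <field_group id="grid_T" grid_ref="grid_T">
      <field id="temp" long_name="temperature" unit="degC"/>
    </field_group>
  </field_definition>

  <!-- file_def.xml : which variables end up in which output file -->
  <file_definition type="one_file">
    <file id="hourly" name="mars_hourly" output_freq="1h" enabled="true">
      <field field_ref="temp" operation="average"/>
    </file>
  </file_definition>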
  3. How to launch it on Datarmor

    • In your MPI shell script you should add the XIOS server executable

      #PBS -q mpi_1
      #PBS -l mem=6gb
      #PBS -l walltime=01:00:00
      #submit job
      date
      echo "submit MPI job with  $NETCDF_MODULE "
      setenv mpiproc `cat $PBS_NODEFILE  | wc -l`
      echo Number of MPI cpus : $mpiproc
      time $MPI_LAUNCH -np 26 ./mars_exe.$NETCDF_MODULE : -np 2 xios_server.exe >& mars_mpi.out
      date
      
    • XIOS on multiple nodes : to ensure a correct balance, xios_server must be dispatched on each node

      #PBS -q mpi_2
      time $MPI_LAUNCH -np 27 ./mars_exe.$NETCDF_MODULE : -np 1 xios_server.exe \
                       : -np 27 ./mars_exe.$NETCDF_MODULE : -np 1 xios_server.exe >& mars_mpi.out
      

    or try to oversubscribe the nodes : with mpiprocs=29, each 28-core node runs 28 MARS processes plus one lightweight xios_server process

    #PBS -q mpi_2
    #PBS -l select=2:ncpus=28:mpiprocs=29
    time $MPI_LAUNCH -np 28 ./mars_exe.$NETCDF_MODULE : -np 1 xios_server.exe \
                     : -np 28 ./mars_exe.$NETCDF_MODULE : -np 1 xios_server.exe >& mars_mpi.out
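
    Note that when print_file is set to true in the xios context (see the XML input files above), XIOS normally writes its messages to per-process files such as xios_client_*.out and xios_server_*.out in the working directory, which is a convenient first check that the servers started correctly.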