FAQ & ERRORS

Compilation

Warning

cannot stat ‘/home1/datahome/jystanis/MARS/MARS_CONFIG/BRETSUD_500/BRETSUD_500-r-1630/OBST/’: No such file or directory

Answer : You are using a very old version which does not have the OBSTRUCTION module.

Modify the makefile so that OBST = ‘notused’ (instead of OBST = ‘use’).
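
A minimal sketch of the corresponding Makefile line (exact quoting may differ in your Makefile):

# the OBSTRUCTION module is absent in this old version, so do not build it
OBST = notused   # instead of: OBST = use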

Warning

The compilation aborts and *WCONF-CASE* appears in out.compile.out

Answer : This means you created the links with the generic name CONF_CASE and not with YOUR CONF-CASE. Redo the links.

Warning

./smallf90_rank1/cometdyn.f90:115:(.text+0x2388): relocation truncated to fit: R_X86_64_PC32 against symbol `cometdyn_mp_paref_’ defined in COMMON section in ./OBJETS/cometdyn.o

Answer :
There is probably a memory problem due to the large number of static variables during compilation. Try adding these compilation options in your Makefile : -mcmodel=large -shared-intel
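
A minimal sketch, assuming your Makefile gathers the Fortran compilation flags in a variable such as FFLAGS (the variable name is an assumption; adapt it to your own Makefile):

# append the large-memory-model options to the existing Fortran flags
FFLAGS += -mcmodel=large -shared-intel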

Warning

“can’t find toolcpp.h in small90/comaree.f90”

Answer :
Check the include paths in your Makefile : make sure the path to your CONFIG in CDIR is correct, e.g.

CPP = fpp -P -C -noJ -I$(CDIR)/WGIRONDE_CURVI-V11.0/PHYS/TOOL
INCDIR = -I./INC/ -I$(OASISINC) -I$(DIRNC)/include/ -module $(CDIR)/WGIRONDE_CURVI-V11.0/OBJETS/

Submission

Warning

Problem launching a run on 55 procs in 2D MPI with Mars. Is this still possible or is it forbidden? I cannot create a 2D mpi.txt file with 56 procs with the MPI decoupe tool.

I tried with these run options, without success:

#PBS -l select=1:ncpus=28:mpiprocs=28+1:ncpus=27:mpiprocs=27:mem=60g

Answer :

  1. You have no other choice than to reserve the full 56 procs (a fuller batch sketch follows this list)

    #PBS -q mpi_2

    $MPI_LAUNCH -n 55

  2. Best option : hybrid. But wait a little longer for the batch_hyb
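
A minimal batch sketch for option 1, assuming the Datarmor PBS conventions used elsewhere on this page (the walltime value and the executable name are illustrative; adapt them to your run):

#PBS -q mpi_2
#PBS -l walltime=02:00:00

# the mpi_2 queue blocks 56 cores; MARS is then launched on only 55 of them
$MPI_LAUNCH -n 55 ./mars_exe.4.3.3.1-mpt-intel2016 >& mars.out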

The run does not start

Error message in mars.out

Warning

Error when reading the tidal constituents (amp_sa) in mars.out

Correction : Problem reading the HEAD, so the head file is not in the correct format.

Warning

mars_exe.4.3.3.1-mpt-intel2016: No such file or directory
MPT ERROR: could not run executable. If this is a non-MPT application,
you may need to set MPI_SHEPHERD=true.
(SGI MPT 2.15 12/18/16 02:52:47)

Answer :

  1. You compiled with a module different from the one specified in your run submission batch. Check the module load lines in compile and in your batch and make sure they load the same modules (see the sketch below).
  2. You are using a rank, so the executable you actually want to launch is mars_exe_rankX.4.3.3.1-mpt-intel2016. Modify your batch accordingly.

Note : remove the module load lines from env_mars.csh, as they are unused when you go through compile. If you compile interactively, beware of the issue above.
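
A quick way to check point 1, using standard environment-modules commands (the module names below are hypothetical; use exactly the ones loaded by your compile script):

# list what is currently loaded in the batch environment
module list

# load the same compiler/MPI pair as at compile time, e.g. (hypothetical names)
module load intel-comp/2016
module load mpt/2.15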

Warning

Erreur : routine = ionc4_openr
fonction = nf90_open variable = ../../inputs/IC500m15j/FINIS500_20110830T0000Z.nc
ERROR -115 = NetCDF: Error initializing for parallel access

Several answers :

  1. the links under the $DATAWORK directory do not exist anymore on Datarmor (/work has been changed into /homeX/datawork)

  2. the format of the NetCDF file must be compatible with parallel access, so it must be NetCDF-4 :

    to solve this, just convert your file to NetCDF-4 this way : ncks -O -4 in.nc in.nc (you can check the current format as shown below)
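
To see which format a file currently has, the standard netCDF tool ncdump can report its kind (the file path is just the one from the error message above):

# prints e.g. "classic" or "64-bit offset" for NetCDF-3 files, "netCDF-4" for NetCDF-4 files
ncdump -k ../../inputs/IC500m15j/FINIS500_20110830T0000Z.nc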

Warning

** you MUST HAVE ishift_obc_west+3 < limax **

Answer : This means you are probably using 1D MPI (without mpi.txt) and you are requesting too many CPUs for the width of your domain. The solution is to build an mpi.txt file using the decoupe tools in HOMEMARS/../TOOLS/MPI2D_DOMAIN.