When the job is running, I get a "segmentation fault"
The memory required by a job evolves during the run. If the job does not have access to enough memory, it stops with a "segmentation fault" error.
There are several solutions:
1. if you are running on Caparmor, you can increase the memory allocated to your job by adding #PBS -l mem=5gb to your batch script, right under #!/bin/csh (see the batch script sketch after this list)
2. if you run on your own machine, increase the stack size with the command for your shell (see the examples after this list):
   - bash: ulimit -s unlimited
   - csh: limit stacksize unlimited
3. if the job stops while it is writing results (using the ionetcdf library), you can reduce the size of the saved files by adjusting your output.dat file.
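
For solution 1, here is a minimal sketch of a csh batch script with the memory request placed right under the shebang line; the executable name my_model is a placeholder and the other lines stand for whatever your script already contains:

   #!/bin/csh
   #PBS -l mem=5gb
   # ... any other PBS directives of your job stay unchanged ...
   # move to the directory the job was submitted from, then run the code
   cd $PBS_O_WORKDIR
   ./my_model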
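
For solution 2, the same commands with a quick check of the current limit first; adding the relevant line to your ~/.bashrc or ~/.cshrc (an assumption about your setup) makes the setting apply to new shells as well:

   # bash: show the current stack size limit, then remove it
   ulimit -s
   ulimit -s unlimited

   # csh: the equivalent commands
   limit stacksize
   limit stacksize unlimited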