Simulation crash

Submitted by mbaldini on Tue, 02/27/2018 - 10:14

Hi all, I am running a simulation based on the periodic channel tutorial. I have successfully run several variations of it, but now the simulation suddenly crashes at the same point every time. As you can see in the attached log file below, it is not due to a high Courant number or problems converging the pressure field. Any ideas?

Thanks!
Mauro
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Time = 1.7325

Courant Number mean: 0.0210942 max: 0.0681994

Coupling...
Starting up LIGGGHTS
Executing command: 'run 50 '
run 50
Setting up run at Mon Feb 26 19:51:34 2018

Memory usage per processor = 11.2948 Mbytes
Step Atoms KinEng 1 Volume
3464950 2125 0.00024900749 4.2313066e-05 1.8709056e-05
CFD Coupling established at step 3465000
3465000 2125 0.00024909134 4.2217973e-05 1.8709056e-05
Loop time of 0.00398018 on 12 procs for 50 steps with 2125 atoms, finish time Mon Feb 26 19:51:34 2018

Pair time (%) = 0.000447353 (11.2395)
Neigh time (%) = 0 (0)
Comm time (%) = 0.000178079 (4.47414)
Outpt time (%) = 0.00103784 (26.0751)
Other time (%) = 0.00231691 (58.2112)

Nlocal: 177.083 ave 206 max 158 min
Histogram: 1 1 2 3 2 0 2 0 0 1
Nghost: 6.83333 ave 11 max 3 min
Histogram: 1 0 3 3 0 0 2 1 1 1
Neighs: 285.167 ave 390 max 203 min
Histogram: 1 1 2 2 2 1 1 1 0 1

Total # of neighbors = 3422
Ave neighs/atom = 1.61035
Neighbor list builds = 0
Dangerous builds = 0
LIGGGHTS finished

timeStepFraction() = 1
update Ksl.internalField()
TotalForceImp: (1.20892e-07 2.37829e-07 -2.97535e-08)
DILUPBiCG: Solving for Ux, Initial residual = 0.0162437, Final residual = 4.85595e-06, No Iterations 1
DILUPBiCG: Solving for Uy, Initial residual = 0.0163716, Final residual = 4.71705e-06, No Iterations 1
DILUPBiCG: Solving for Uz, Initial residual = 0.00537275, Final residual = 1.5168e-06, No Iterations 1
Pressure gradient source: uncorrected Ubar = 0.270099, pressure gradient = 1.55112
--> FOAM Warning : Pressure gradient force is neglected in this model!!
DICPCG: Solving for p, Initial residual = 0.553665, Final residual = 0.0473892, No Iterations 4
time step continuity errors : sum local = 1.2539e-05, global = -5.33363e-09, cumulative = -0.00243579
Pressure gradient source: uncorrected Ubar = 0.270033, pressure gradient = 4.47137
--> FOAM Warning : Pressure gradient force is neglected in this model!!
DICPCG: Solving for p, Initial residual = 0.0784504, Final residual = 0.00715004, No Iterations 18
time step continuity errors : sum local = 1.91615e-06, global = -5.33363e-09, cumulative = -0.00243579
Pressure gradient source: uncorrected Ubar = 0.270036, pressure gradient = 4.36937
--> FOAM Warning : Pressure gradient force is neglected in this model!!
DICPCG: Solving for p, Initial residual = 0.0113259, Final residual = 0.00111827, No Iterations 126
time step continuity errors : sum local = 3.00691e-07, global = -5.33363e-09, cumulative = -0.0024358
Pressure gradient source: uncorrected Ubar = 0.270038, pressure gradient = 4.26814
--> FOAM Warning : Pressure gradient force is neglected in this model!!
[2] [3]
[3]
[3] --> FOAM FATAL IO ERROR:
[3] error in IOstream "/home/mauro/cK1/cfdemRUN/run/rP/Dphi3.0_Vphi0.100_rP1000/CFD/processor3/1.7325/U" for operation Ostream& operator<<(Ostream&, const Scalar&)
[3]
[3] file: /home/mauro/cK1/cfdemRUN/run/rP/Dphi3.0_Vphi0.100_rP1000/CFD/processor3/1.7325/U at line 5428.
[3]
[3] From function IOstream::check(const char*) const
[3] in file db/IOstreams/IOstreams/IOstream.C at line 99.
[6]
[6]
[6] --> FOAM FATAL IO ERROR:
[6] error in IOstream "/home/mauro/cK1/cfdemRUN/run/rP/Dphi3.0_Vphi0.100_rP1000/CFD/processor6/1.7325/U_0" for operation Ostream& operator<<(Ostream&, const Scalar&)
[6]
[6] file: /home/mauro/cK1/cfdemRUN/run/rP/Dphi3.0_Vphi0.100_rP1000/CFD/processor6/1.7325/U_0 at line 8421.
[6]
[6] From function IOstream::check(const char*) const
[6] in file db/IOstreams/IOstreams/IOstream.C at line 99.[0]
[2]
[2] --> FOAM FATAL IO ERROR:
[2] error in IOstream "/home/mauro/cK1/cfdemRUN/run/rP/Dphi3.0_Vphi0.100_rP1000/CFD/processor2/1.7325/U" for operation Ostream& operator<<(Ostream&, const char)
[2]
[2] file: /home/mauro/cK1/cfdemRUN/run/rP/Dphi3.0_Vphi0.100_rP1000/CFD/processor2/1.7325/U at line 5454.
[2]
[2] From function IOstream::check(const char*) const
[2] in file db/IOstreams/IOstreams/IOstream.C at line 99.
[2]
FOAM parallel run exiting
[2]
[3]
FOAM parallel run exiting
[3]
[11]
[6]
FOAM parallel run exiting
[6]
[4] [9]
[11]
[11] --> FOAM FATAL IO ERROR:
[11] error in IOstream "/home/mauro/cK1/cfdemRUN/run/rP/Dphi3.0_Vphi0.100_rP1000/CFD/processor11/1.7325/U_0" for operation Ostream& operator<<(Ostream&, const Scalar&)
[11]
[11] file: /home/mauro/cK1/cfdemRUN/run/rP/Dphi3.0_Vphi0.100_rP1000/CFD/processor11/1.7325/U_0 at line 8419.
[11]
[4]
[4] --> FOAM FATAL IO ERROR:
[4] error in IOstream "/home/mauro/cK1/cfdemRUN/run/rP/Dphi3.0_Vphi0.100_rP1000/CFD/processor4/1.7325/U" for operation Ostream& operator<<(Ostream&, const Scalar&)
[4]
[4] file: /home/ma

paul | Wed, 02/28/2018 - 09:03

The crash occurs while writing the U field. Are you sure you have enough hard disk space available?
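Since the fatal IO errors all occur on `operator<<` while writing the decomposed `U` and `U_0` files, a full filesystem (or exhausted inode table or user quota) is a likely culprit. A quick sketch for ruling that out on the case directory (the `RUN_DIR` argument is a placeholder; point it at your actual run path):

```shell
#!/bin/sh
# Check the filesystem holding the case directory for the usual
# causes of failed writes: no free space, no free inodes, or a
# per-user quota limit.
RUN_DIR="${1:-.}"

# Human-readable free space on the filesystem containing RUN_DIR
df -h "$RUN_DIR"

# Free inodes - a full inode table also breaks file creation even
# when 'df -h' still shows free space
df -i "$RUN_DIR"

# Many clusters enforce per-user quotas; check if the tool exists
# (it may print nothing when no quota is configured)
command -v quota >/dev/null 2>&1 && quota -s
```

If space runs out mid-write, OpenFOAM typically leaves truncated field files in the latest `processor*/<time>/` directories, which matches the log above.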

shahab.zaman | Mon, 07/09/2018 - 18:40

Dear Mauro,

I just tried to perform a calculation on the cluster.
I modified a script from an OpenFOAM job, but that didn't work.

The call looks like:

bsub < file

In the file there are some settings and, finally, the call of the simulation:

$MPIEXEC $FLAGS_MPI_BATCH Allrun.sh

but this doesn't work.

I also tried to call just:
Allrun.sh

The only thing that happens is the decomposePar of the case.
Any hints?

Thanks a lot
Shahab
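One possible explanation for the behaviour Shahab describes: the tutorial `Allrun.sh` scripts generally run the serial preprocessing (`decomposePar`) and then start the parallel solver via `mpirun` themselves, so wrapping the entire script in `$MPIEXEC` starts one full copy of the script per rank instead of one parallel solver. A minimal LSF job sketch that runs the preprocessing once and launches only the solver in parallel might look like this (the solver name `cfdemSolverPiso`, the rank count of 12, and the job name are assumptions; substitute whatever your `Allrun.sh` actually calls):

```shell
#!/bin/sh
#BSUB -n 12              # number of MPI ranks (assumption: 12, matching the log above)
#BSUB -J cfdem_channel   # job name (placeholder)
#BSUB -o cfdem_%J.log    # combined stdout/stderr log file

# Serial preprocessing: run exactly once, NOT under MPI
cd CFD
decomposePar -force

# Parallel launch of the solver only; the solver name is an
# assumption - use the one your Allrun.sh invokes.
mpirun -np 12 cfdemSolverPiso -parallel
```

Submitting this with `bsub < file` should then produce one decomposition and a single 12-rank solver run, rather than twelve independent copies of the whole workflow.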