Unfinished run

Submitted by VAJIHEH on Mon, 12/19/2011 - 19:07

Hi All,

I am running a coupled CFDEM case with a time step of around 10e-10 and coupling every 10 steps. The simulation stops halfway through with the error below; I would appreciate any help.
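In case it helps, this is roughly how the time stepping and coupling are set up (paraphrased from memory, not copied verbatim from my case files; keyword names and file locations are as I recall them from the CFDEM tutorial cases):

// CFD/system/controlDict -- fluid time step, 10x the DEM step
deltaT           1e-09;

// CFD/constant/couplingProperties -- exchange data every 10 DEM steps
couplingInterval 10;

# DEM input script (LIGGGHTS) -- particle time step
timestep         1e-10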

Thanks
Time = 6.554e-05

Courant Number mean: 2.82015e-05 max: 0.0036778
- evolve()
Starting up LIGGGHTS
Executing command: 'run 10 '
run 10 Setting up run ...
Memory usage per processor = 15.5843 Mbytes
Step Atoms KinEng 1 Volume
655391 1 0 1.3533016e-11 5.4e-09
655401 1 0 1.3533016e-11 5.4e-09
Loop time of 0.000183582 on 2 procs for 10 steps with 1 atoms

Pair time (%) = 1.19209e-05 (6.49351)
Neigh time (%) = 0 (0)
Comm time (%) = 3.00407e-05 (16.3636)
Outpt time (%) = 9.53674e-06 (5.19481)
Other time (%) = 0.000132084 (71.9481)

Nlocal: 0.5 ave 1 max 0 min
Histogram: 1 0 0 0 0 0 0 0 0 1
Nghost: 0.5 ave 1 max 0 min
Histogram: 1 0 0 0 0 0 0 0 0 1
Neighs: 0 ave 0 max 0 min
Histogram: 2 0 0 0 0 0 0 0 0 0

Total # of neighbors = 0
Ave neighs/atom = 0
Neighbor list builds = 0
Dangerous builds = 0
LIGGGHTS finished

timeStepFraction() = 1
alpha limited to 0.3
evolve done
update Ksl.internalField()
totaldragforceEuler calculus
totaldragforceEuler = 2.55414
dv/dt =sum(ddt(voidfraction)) [0 0 -1 0 0 0 0] 0
DILUPBiCG: Solving for Ux, Initial residual = 0.00260401, Final residual = 3.5027e-06, No Iterations 1
DILUPBiCG: Solving for Uy, Initial residual = 0.00200194, Final residual = 8.48266e-07, No Iterations 1
DILUPBiCG: Solving for Uz, Initial residual = 0.00343095, Final residual = 1.6237e-06, No Iterations 1
[0] DICPCG: Solving for p, Initial residual = 0.0253081, Final residual = 0.000519698, No Iterations 2
time step continuity errors : sum local = 1.41043e-06, global = 4.6491e-08, cumulative = 2.14012e-05
DICPCG: Solving for p, Initial residual = 0.000529627, Final residual = 3.83479e-05, No Iterations 3
time step continuity errors : sum local = 1.41042e-06, global = 4.64911e-08, cumulative = 2.14476e-05
DICPCG: Solving for p, Initial residual = 3.83599e-05, Final residual = 3.78337e-06, No Iterations 7
time step continuity errors : sum local = 1.41042e-06, global = 4.64911e-08, cumulative = 2.14941e-05
ExecutionTime = 2150.65 s ClockTime = 2297 s

regIOobject::readIfModified() :
Reading object voidfractionNext from file "/home/vajiheh/CFDEM/cfdem_GIT/cfdemSolverPisoCase_shared/coupled-slot-die-settlingTestMPI/CFD/processor0/6.5e-05/voidfractionNext"
regIOobject::readIfModified() :
Reading object rho from file "/home/vajiheh/CFDEM/cfdem_GIT/cfdemSolverPisoCase_shared/coupled-slot-die-settlingTestMPI/CFD/processor0/6.5e-05/rho"
regIOobject::readIfModified() :
Reading object U from file "/home/vajiheh/CFDEM/cfdem_GIT/cfdemSolverPisoCase_shared/coupled-slot-die-settlingTestMPI/CFD/processor0/6.5e-05/U"
regIOobject::readIfModified() :
Reading object UsNext from file "/home/vajiheh/CFDEM/cfdem_GIT/cfdemSolverPisoCase_shared/coupled-slot-die-settlingTestMPI/CFD/processor0/6.5e-05/UsNext"
regIOobject::readIfModified() :
Reading object voidfractionPrev from file "/home/vajiheh/CFDEM/cfdem_GIT/cfdemSolverPisoCase_shared/coupled-slot-die-settlingTestMPI/CFD/processor0/6.5e-05/voidfractionPrev"
regIOobject::readIfModified() :
Reading object KslNext from file "/home/vajiheh/CFDEM/cfdem_GIT/cfdemSolverPisoCase_shared/coupled-slot-die-settlingTestMPI/CFD/processor0/6.5e-05/KslNext"
regIOobject::readIfModified() :
Reading object UsPrev from file "/home/vajiheh/CFDEM/cfdem_GIT/cfdemSolverPisoCase_shared/coupled-slot-die-settlingTestMPI/CFD/processor0/6.5e-05/UsPrev"
regIOobject::readIfModified() :
Reading object Ksl from file "/home/vajiheh/CFDEM/cfdem_GIT/cfdemSolverPisoCase_shared/coupled-slot-die-settlingTestMPI/CFD/processor0/6.5e-05/Ksl"
regIOobject::readIfModified() :
Reading object p from file "/home/vajiheh/CFDEM/cfdem_GIT/cfdemSolverPisoCase_shared/coupled-slot-die-settlingTestMPI/CFD/processor0/6.5e-05/p"
regIOobject::readIfModified() :
Reading object voidfraction from file "/home/vajiheh/CFDEM/cfdem_GIT/cfdemSolverPisoCase_shared/coupled-slot-die-settlingTestMPI/CFD/processor0/6.5e-05/voidfraction"
regIOobject::readIfModified() :
Reading object Us from file "/home/vajiheh/CFDEM/cfdem_GIT/cfdemSolverPisoCase_shared/coupled-slot-die-settlingTestMPI/CFD/processor0/6.5e-05/Us"
regIOobject::readIfModified() :
Reading object KslPrev from file "/home/vajiheh/CFDEM/cfdem_GIT/cfdemSolverPisoCase_shared/coupled-slot-die-settlingTestMPI/CFD/processor0/6.5e-05/KslPrev"
[0] --> FOAM Serious Error :
[0] From function IOobject::readHeader(Istream&)
[0] in file db/IOobject/IOobjectReadHeader.C at line 89
[0] Reading "/home/vajiheh/CFDEM/cfdem_GIT/cfdemSolverPisoCase_shared/coupled-slot-die-settlingTestMPI/CFD/processor0/6.5e-05/KslPrev" at line 1
[0] First token could not be read or is not the keyword 'FoamFile'
[0]
[0] Check header is of the form:
[0]
/*--------------------------------*- C++ -*----------------------------------*\
| ========= | |
| \\ / F ield | OpenFOAM: The Open Source CFD Toolbox |
| \\ / O peration | Version: 1.7.x |
| \\ / A nd | Web: www.OpenFOAM.com |
| \\/ M anipulation | |
\*---------------------------------------------------------------------------*/
FoamFile
{
version 2.0;
format ascii;
class volScalarField;
location "6.5e-05";
object KslPrev;
}
// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //

[0]
[0] --> FOAM FATAL IO ERROR:
[0] problem while reading header for object KslPrev
[0]
[0] file: /home/vajiheh/CFDEM/cfdem_GIT/cfdemSolverPisoCase_shared/coupled-slot-die-settlingTestMPI/CFD/processor0/6.5e-05/KslPrev at line 1.
[0]
[0] From function regIOobject::readStream()
[0] in file db/regIOobject/regIOobjectRead.C at line 69.
[0]
FOAM parallel run exiting
[0]
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun has exited due to process rank 0 with PID 2722 on
node vajiheh-desktop exiting without calling "finalize". This may
have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).