CFDEM 2.1.0 fails to run

Submitted by skyopener on Thu, 09/01/2011 - 05:58

Dear developers & CFDEMers,
Some strange messages occur when I run the newest version of CFDEM, and none of the cases runs successfully.
My compilation process follows the tutorial setup_LIGGGHTS__OpenFoamR_CFDEM_2p0_on_Ubuntu1004_24052011.txt, and everything went well during compilation. (OF 2.0.1 debug version, LIGGGHTS 1.4.1, CFDEM 2.1.0)

In the settlingTestMPI directory, after running ./Allrun.sh, the output is as follows:
--------------------------------------------------------------------------------------------------------
/*---------------------------------------------------------------------------*\
| ========= | |
| \\ / F ield | OpenFOAM: The Open Source CFD Toolbox |
| \\ / O peration | Version: 2.0.1 |
| \\ / A nd | Web: www.OpenFOAM.com |
| \\/ M anipulation | |
\*---------------------------------------------------------------------------*/
Build : 2.0.1
Exec : decomposePar
Date : Sep 01 2011
Time : 11:21:18
Host : skyopener-desktop
PID : 15765
Case : /home/skyopener/OpenFOAM/skyopener-2.0.1/run/cfdemSolverPisoCase_shared/settlingTestMPI/CFD
nProcs : 1
sigFpe : Enabling floating point exception trapping (FOAM_SIGFPE).
fileModificationChecking : Monitoring run-time modified files using timeStampMaster
allowSystemOperations : Disallowing user-supplied system call operations

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time

Time = 0
Create mesh

Calculating distribution of cells
Selecting decompositionMethod simple

Finished decomposition in 0.11 s

Calculating original mesh data

Distributing cells to processors

Distributing faces to processors

Distributing points to processors

Constructing processor meshes

Processor 0
Number of cells = 4000
Number of faces shared with processor 1 = 400
Number of processor patches = 1
Number of processor faces = 400
Number of boundary faces = 1200

Processor 1
Number of cells = 4000
Number of faces shared with processor 0 = 400
Number of processor patches = 1
Number of processor faces = 400
Number of boundary faces = 1200

Number of processor faces = 400
Max number of cells = 4000 (0% above average 4000)
Max number of processor patches = 1 (0% above average 1)
Max number of faces between processors = 400 (0% above average 400)

Processor 0: field transfer
Processor 1: field transfer

End.

// run_parallel_cfdemSolverPiso_settlingTestMPI_CFDDEM_shared //

/home/skyopener/OpenFOAM/skyopener-2.0.1/run/cfdemSolverPisoCase_shared/settlingTestMPI/CFD

rm: cannot remove 'couplingFiles/*': No such file or directory
/*---------------------------------------------------------------------------*\
| ========= | |
| \\ / F ield | OpenFOAM: The Open Source CFD Toolbox |
| \\ / O peration | Version: 2.0.1 |
| \\ / A nd | Web: www.OpenFOAM.com |
| \\/ M anipulation | |
\*---------------------------------------------------------------------------*/
Build : 2.0.1
Exec : cfdemSolverPiso_shared -parallel
Date : Sep 01 2011
Time : 11:21:20
Host : skyopener-desktop
PID : 15779
Case : /home/skyopener/OpenFOAM/skyopener-2.0.1/run/cfdemSolverPisoCase_shared/settlingTestMPI/CFD
nProcs : 2
Slaves :
1
(
skyopener-desktop.15780
)

Pstream initialized with:
floatTransfer : 0
nProcsSimpleSum : 0
commsType : nonBlocking
sigFpe : Enabling floating point exception trapping (FOAM_SIGFPE).
fileModificationChecking : Monitoring run-time modified files using timeStampMaster
allowSystemOperations : Disallowing user-supplied system call operations

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time

Create mesh for time = 0

Reading field p

Reading physical velocity field U
Note: only if voidfraction at boundary is 1, U is superficial velocity!!!

Reading momentum exchange field Ksl

Reading voidfraction field voidfraction = (Vgas/Vparticle)

Creating dummy density field rho = 1

Reading particle velocity field Us

Reading/calculating face flux field phi

Selecting incompressible transport model Newtonian
Selecting turbulence model type RASModel
Selecting RAS turbulence model laminar

Reading g
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI COMMUNICATOR 3 SPLIT FROM 0
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
Selecting locateModel standard
Selecting dataExchangeModel twoWayMPI
Starting up LIGGGHTS for first time execution
Executing input script '../DEM/in.liggghts_init'
LIGGGHTS 1.4.1 based on lammps-10Mar10

# Pour granular particles into chute container, then induce flow

atom_style granular
atom_modify map array sort 0 0
communicate single vel yes
processors 1 1 2

boundary f f f
newton off

units si

region reg block 0 0.1 0 0.1 0 0.1 units box
create_box 1 reg
Created orthogonal box = (0 0 0) to (0.1 0.1 0.1)
1 by 1 by 2 processor grid

neighbor 0.003 bin
neigh_modify delay 0 binsize 0.01

#Material properties required for new pair styles

fix m1 all property/global youngsModulus peratomtype 5.e6
fix m2 all property/global poissonsRatio peratomtype 0.45
fix m3 all property/global coefficientRestitution peratomtypepair 1 0.3
fix m4 all property/global coefficientFriction peratomtypepair 1 0.5
fix m5 all property/global characteristicVelocity scalar 2.0

#pair style
pair_style gran/hooke 1 0 #Hookean without cohesion
pair_coeff * *

#timestep, gravity
timestep 0.00001
fix gravi all gravity 9.81 vector 0.0 -1.0 0.0

#walls
fix xwalls all wall/gran/hooke 1 0 xplane 0.0 0.1 1
fix ywalls all wall/gran/hooke 1 0 yplane 0 0.1 1
fix zwalls all wall/gran/hooke 1 0 zplane 0 0.01 1

#-import mesh from cad:
#fix cad1 all mesh/gran hopperGenauerSALOME.stl 1 1.0 0. 0. 0. 0. 180. 0.

#-use the imported mesh as granular wall
#fix bucket_wall all wall/gran/hertz/history 1 0 mesh/gran 1 cad1

#particle insertion

#- distributions for insertion using pour
#fix pts1 all particletemplate/sphere 1 atom_type 1 density constant 2500 radius constant 0.001
#fix pts2 all particletemplate/sphere 1 atom_type 1 density constant 2500 radius constant 0.002
#fix pdd1 all particledistribution/discrete 1. 2 pts1 0.3 pts2 0.7
#variable alphastart equal 0.05
#region bc block 0.01 0.09 0.05 0.09 0.001 0.009 units box
#fix ins all pour/dev/packing 1 distributiontemplate pdd1 vol ${alphastart} 200 region bc

#- create single partciles
#create_atoms 1 single 0.05 0.04 0.05 units box
create_atoms 1 single 0.05 0.04 0.046 units box
Created 1 atoms
set group all diameter 0.0001 density 3000
Setting atom values ...
1 settings made for diameter
1 settings made for density

#cfd coupling
fix cfd all couple/cfd/force couple_every 100 mpi
INFO: nevery as specified in LIGGGHTS is overriden by calling external program
variable vx equal vx[1]
variable vy equal vy[1]
variable vz equal vz[1]
variable time equal step*dt
fix extra all print 100 "${time} ${vx} ${vy} ${vz}" file ../DEM/post/velocity.txt title "%" screen no
ERROR on proc 0: Cannot open fix print file ../DEM/post/velocity.txt
--------------------------------------------------------------------------
mpirun has exited due to process rank 0 with PID 15779 on
node skyopener-desktop exiting improperly. There are two reasons this could occur:

1. this process did not call "init" before exiting, but others in
the job did. This can cause a job to hang indefinitely while it waits
for all processes to call "init". By rule, if one process calls "init",
then ALL processes must call "init" prior to termination.

2. this process called "init", but exited without calling "finalize".
By rule, all processes that call "init" MUST call "finalize" prior to
exiting or it will be considered an "abnormal termination"

This may have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
running reconstructPar -noZero in pseudo-parallel mode on 2 processors
do reconstructPar on 1 time directories
making temp dir
mkdir: cannot create directory 'temp.parreconstructPar': File exists
Starting Job 1 - reconstructPar time = 0 through 0
Starting Job 2 - reconstructPar time = constant through constant
-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

The error message shows MPI_ABORT, but I can't figure out where the problem is, so any suggestion or hint would be appreciated.

Thanks.
S.L


ckloss | Thu, 09/01/2011 - 08:05

This is the problem:
ERROR on proc 0: Cannot open fix print file ../DEM/post/velocity.txt
It is nothing serious: you need an empty directory "post" inside the DEM directory. It probably got lost because git does not track empty directories.
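For example (a minimal sketch; adjust the path to your own run directory, layout as in the log above):

cd ~/OpenFOAM/skyopener-2.0.1/run/cfdemSolverPisoCase_shared/settlingTestMPI
mkdir -p DEM/post        # the directory the fix print command writes velocity.txt into
touch DEM/post/.gitkeep  # optional placeholder so git keeps the otherwise-empty directory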

Christoph


cgoniva | Thu, 09/01/2011 - 08:22

Hi!

Christoph is right ... the case/DEM/post directories got lost. I'm going to fix that asap.
In the meantime, simply add an empty directory called post.

Cheers,
Chris


skyopener | Thu, 09/01/2011 - 09:55

Hello Christoph & Chris,

Thanks for your quick replies.
After I created an empty "post" directory, still none of the cases runs successfully; all cases show the same crash messages.
They all stop at the coupling point. After LIGGGHTS finishes, the program hangs with the output "timeStepFraction() = 1".
After a few seconds, it crashes with the following message:
-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Total # of neighbors = 0
Ave neighs/atom = 0
Neighbor list builds = 0
Dangerous builds = 0
LIGGGHTS finished

timeStepFraction() = 1
[1]
[1]
[1] --> FOAM FATAL ERROR:
[1] index -1 out of range 0 ... 3999
[1]
[1] From function UList::checkIndex(const label)
[1] in file /home/skyopener/OpenFOAM/OpenFOAM-2.0.1/src/OpenFOAM/lnInclude/UListI.H at line 109.
[1]
FOAM parallel run aborting
[1]
[1] #0 Foam::error::printStack(Foam::Ostream&) at ~/OpenFOAM/OpenFOAM-2.0.1/src/OSspecific/POSIX/printStack.C:201
[1] #1 Foam::error::abort() at ~/OpenFOAM/OpenFOAM-2.0.1/src/OpenFOAM/lnInclude/error.C:230
[1] #2 Foam::Ostream& Foam::operator<< (Foam::Ostream&, Foam::errorManip) at ~/OpenFOAM/OpenFOAM-2.0.1/src/OpenFOAM/lnInclude/errorManip.H:85
[1] #3 Foam::UList::checkIndex(int) const at ~/OpenFOAM/OpenFOAM-2.0.1/src/OpenFOAM/lnInclude/UListI.H:113
[1] #4 Foam::UList::operator[](int) at ~/OpenFOAM/OpenFOAM-2.0.1/src/OpenFOAM/lnInclude/UListI.H:168
[1] #5 Foam::centreVoidFraction::setvoidFraction(double** const&, double**&, double**&, double**&) const at ~/CFDEM/cfdem_GIT/cfdemParticle/subModels/voidFractionModel/centreVoidFraction/centreVoidFraction.C:138
[1] #6 Foam::cfdemCloud::evolve(Foam::GeometricField&, Foam::GeometricField, Foam::fvPatchField, Foam::volMesh>&, Foam::GeometricField, Foam::fvPatchField, Foam::volMesh>&) at ~/CFDEM/cfdem_GIT/cfdemParticle/cfdemCloud/cfdemCloud.C:346
[1] #7
[1] at ~/CFDEM/cfdem_GIT/cfdemSolverPiso_shared/cfdemSolverPiso.C:80
[1] #8 __libc_start_main in "/lib/tls/i686/cmov/libc.so.6"
[1] #9
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 1 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
It seems that the particles move outside the fluid calculation domain, which leads to this error.
After the error occurred on one computer, I suspected a compilation problem, so I recompiled everything (OF, LIGGGHTS, CFDEM) on another computer, but the same error appeared again.
I really appreciate your kind help. Thanks.
S.L


cgoniva | Thu, 09/01/2011 - 13:40

Hi!

The first error with the missing DEM/post directory should be fixed now. (Thanks for your response!)

The second error, "index -1 out of range 0 ... 3999", looks weird.

We tested the latest version with 1.6-ext and 1.7.x and it worked fine...

Perhaps you could dig a little into
/cfdemParticle/subModels/voidFractionModel/centreVoidFraction/centreVoidFraction.C:138
(it asks for index -1 of voidfractionNext_ ...)

and check what is causing the error.
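For reference, the guard to look for should be roughly like this (just a sketch from my side, not the exact CFDEM source; the accessors are the same ones used in the force models):

label cellID = particleCloud_.cellIDs()[index][0]; // -1 if the particle was not found on this processor
if (cellID >= 0) // only touch mesh and field data for locally found particles
{
    scalar cellVol = particleCloud_.mesh().V()[cellID];
    // ... accumulate the particle's volume contribution into voidfractionNext_[cellID] ...
}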

I'll have a look at that as well.

Cheers,
Chris


skyopener | Tue, 09/06/2011 - 14:49

Hello Chris,
After several days of trying, I have finally moved to OF 1.7.1.

During this time I recompiled several versions of OpenFOAM with two different compile options (OF 2.0, OF 2.0.1, OF 1.7.1, OF 1.6, each in debug and opt mode). Disappointingly, none of them worked on my computer: some could not compile the CFDEM source, others compiled but crashed with strange behaviour, such as hanging at "timeStepFraction() = 1".
While compiling the different OpenFOAM versions with CFDEM, some builds failed with messages saying that different C++ libraries had been linked. Despite some effort, I could not figure out the problem. In the end I had to install a fresh OS and then follow the compilation tutorials to rebuild everything.
Fortunately, the program now works perfectly.

By the way, when running './Allrun' in the cfdemSolverIBCase tutorial, the same phenomenon appears as described in another topic on the forum.
Thanks for your kind help.
S.L


skyopener | Mon, 09/19/2011 - 09:22

Dear CFD-DEM developers,
In my spare time I have compiled the newest CFD-DEM successfully on Ubuntu 11.04, following the tutorials on the forum. (CFDEM 2.1.0, LIGGGHTS 1.4.3, OpenFOAM 2.0.1 in debug mode)
Everything goes well; there is only one mistake, in the twosphereGlowinsik tutorial.
In the case's system/fvSchemes, the entry div((nuEff*dev(grad(U).T()))) Gauss linear; should read div((nuEff*dev(T(grad(U))))) Gauss linear;
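For reference, the corrected entry in the context of the divSchemes block (the other entries here are only illustrative):

divSchemes
{
    default                        none;
    div(phi,U)                     Gauss limitedLinear 1;  // illustrative
    div((nuEff*dev(T(grad(U)))))   Gauss linear;           // corrected entry
}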
Thanks!
S.L


cgoniva | Mon, 04/02/2012 - 10:30

Dear all,

Finally I can reproduce this error message: it appears when I compile with the Debug option (with Opt everything works fine). That matches the stack trace above: in Debug mode, OpenFOAM's bounds checking (UList::checkIndex) traps the index -1 access for a particle that was not found on the local mesh, while in Opt mode the out-of-range read simply goes unnoticed.

To fix the bug, either wait for the next CFDEM version or apply the following modifications:
========================
(A)
In dividedVoidfraction.C, move the statement cellVol = particleCloud_.mesh().V()[cellID]; inside the if (cellID >= 0) block:
scalar cellVol(1);
if (cellID >= 0) // particle centre is in domain
{
    cellVol = particleCloud_.mesh().V()[cellID];
========================
(B)
In Archimedes.C, the body of the setForce function should look like this:
vector force(0,0,0);

for (int index = 0; index < particleCloud_.numberOfParticles(); ++index)
{
    if (mask[index][0])
    {
        label cellI = particleCloud_.cellIDs()[index][0];
        force = vector::zero;

        if (cellI > -1) // particle found on this processor
        {
            scalar dp = 2*particleCloud_.radius(index);

            if (twoDimensional_)
            {
                force = -g_.value()*rho_[cellI]*pow(dp,2)/4*M_PI;
            }
            else
            {
                force = -g_.value()*rho_[cellI]*pow(dp,3)/6*M_PI;
            }
        }

        if (treatDEM_)           for (int j=0; j<3; j++) DEMForces[index][j] += force[j];
        else if (treatExplicit_) for (int j=0; j<3; j++) expForces[index][j] += force[j];
        else                     for (int j=0; j<3; j++) impForces[index][j] += force[j];
    }
}
========================
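The point of modification (B) is that force is now reset to zero for every particle, and the buoyancy term is only evaluated when cellI > -1, i.e. when the particle centre was actually found on the local mesh partition; that way a particle residing on another processor can no longer trigger an index -1 access into rho_.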

Cheers,
Chris

junchen00 | Wed, 09/28/2011 - 12:27

I encountered the same INFO message, but it was followed by an ERROR. Please help me!

/*---------------------------------------------------------------------------*\
| ========= | |
| \\ / F ield | OpenFOAM: The Open Source CFD Toolbox |
| \\ / O peration | Version: 1.7.1 |
| \\ / A nd | Web: www.OpenFOAM.com |
| \\/ M anipulation | |
\*---------------------------------------------------------------------------*/
Build : 1.7.1-03e7e056c215
Exec : cfdemSolverPiso_shared -parallel
Date : Sep 29 2011
Time : 02:13:51
Host : cae-desktop
PID : 22424
Case : /home/cae/OpenFOAM/cae-1.7.1/run/cfdemSolverPisoCase_shared/settlingTestMPI/CFD
nProcs : 2
Slaves :
1
(
cae-desktop.22425
)

Pstream initialized with:
floatTransfer : 0
nProcsSimpleSum : 0
commsType : nonBlocking
SigFpe : Enabling floating point exception trapping (FOAM_SIGFPE).

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time

Create mesh for time = 0

Reading field p

Reading physical velocity field U
Note: only if voidfraction at boundary is 1, U is superficial velocity!!!

Reading momentum exchange field Ksl

Reading voidfraction field voidfraction = (Vgas/Vparticle)

Creating dummy density field rho = 1

Reading particle velocity field Us

Reading/calculating face flux field phi

Selecting incompressible transport model Newtonian
Selecting turbulence model type RASModel
Selecting RAS turbulence model laminar

Reading g
Selecting locateModel standard
Selecting dataExchangeModel twoWayMPI
Starting up LIGGGHTS for first time execution
Executing input script '../DEM/in.liggghts_init'
LIGGGHTS 1.4.2 based on lammps-10Mar10

# Pour granular particles into chute container, then induce flow

atom_style granular
atom_modify map array sort 0 0
communicate single vel yes
processors 1 1 2

boundary f f f
newton off

units si

region reg block 0 0.1 0 0.1 0 0.1 units box
create_box 1 reg
Created orthogonal box = (0 0 0) to (0.1 0.1 0.1)
1 by 1 by 2 processor grid

neighbor 0.003 bin
neigh_modify delay 0 binsize 0.01

#Material properties required for new pair styles

fix m1 all property/global youngsModulus peratomtype 5.e6
fix m2 all property/global poissonsRatio peratomtype 0.45
fix m3 all property/global coefficientRestitution peratomtypepair 1 0.3
fix m4 all property/global coefficientFriction peratomtypepair 1 0.5
fix m5 all property/global characteristicVelocity scalar 2.0

#pair style
pair_style gran/hooke 1 0 #Hookean without cohesion
pair_coeff * *

#timestep, gravity
timestep 0.00001
fix gravi all gravity 9.81 vector 0.0 -1.0 0.0

#walls
fix xwalls all wall/gran/hooke 1 0 xplane 0.0 0.1 1
fix ywalls all wall/gran/hooke 1 0 yplane 0 0.1 1
fix zwalls all wall/gran/hooke 1 0 zplane 0 0.01 1

#-import mesh from cad:
#fix cad1 all mesh/gran hopperGenauerSALOME.stl 1 1.0 0. 0. 0. 0. 180. 0.

#-use the imported mesh as granular wall
#fix bucket_wall all wall/gran/hertz/history 1 0 mesh/gran 1 cad1

#particle insertion

#- distributions for insertion using pour
#fix pts1 all particletemplate/sphere 1 atom_type 1 density constant 2500 radius constant 0.001
#fix pts2 all particletemplate/sphere 1 atom_type 1 density constant 2500 radius constant 0.002
#fix pdd1 all particledistribution/discrete 1. 2 pts1 0.3 pts2 0.7
#variable alphastart equal 0.05
#region bc block 0.01 0.09 0.05 0.09 0.001 0.009 units box
#fix ins all pour/dev/packing 1 distributiontemplate pdd1 vol ${alphastart} 200 region bc

#- create single partciles
#create_atoms 1 single 0.05 0.04 0.05 units box
create_atoms 1 single 0.05 0.04 0.046 units box
Created 1 atoms
set group all diameter 0.0001 density 3000
Setting atom values ...
1 settings made for diameter
1 settings made for density

#cfd coupling
fix cfd all couple/cfd/force couple_every 100 mpi
INFO: nevery as specified in LIGGGHTS is overriden by calling external program
ERROR: Fix property/atom variable dragforce has wrong length (length is 3 but length 0 expected)


cgoniva | Wed, 09/28/2011 - 15:31

Hi!

I just checked the latest versions of LIGGGHTS and CFDEM on my Ubuntu 10.04 with OpenFOAM 1.7.x, and the settlingTestMPI case ran perfectly fine.

Please try to update your code, check the compilation process, and contact us if the error still occurs.

Cheers, Chris

junchen00 | Thu, 09/29/2011 - 05:11

I tried the latest version, LIGGGHTS 1.4.4, and the ERROR disappeared. Thanks!
BTW: I was using LIGGGHTS 1.4.2 when the ERROR occurred.