Need help with CFD-DEM heat transfer: solver blows up when solving the T equation

Submitted by Oliver.pasqual on Tue, 05/15/2012 - 03:02

Hello everyone,
I'm using cfdemSolverPisoScalar_shared, but the solver blows up every time.
I believe the problem must lie in the settings or the mesh structure. I have tested several boundary conditions, inlet velocities and mesh resolutions, but with little effect.
The geometry is a box-shaped container.
The computational domain is 60 mm × 6.4 mm × 600 mm with a 20 × 8 × 200 mesh in the x, y and z directions, and the particle diameter is 0.8 mm.
The boundary condition for the front and back patches is wall.
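For reference, the mesh is generated with blockMesh; a blockMeshDict roughly equivalent to my setup is sketched below (patch names are placeholders and the inlet/outlet and side patches are omitted; only the dimensions, the 20 x 8 x 200 cell counts and the wall patches on front and back reflect what I described above):

FoamFile
{
    version     2.0;
    format      ascii;
    class       dictionary;
    object      blockMeshDict;
}

convertToMeters 0.001;    // vertex coordinates below are in mm

vertices
(
    ( 0  0     0)    // 0
    (60  0     0)    // 1
    (60  6.4   0)    // 2
    ( 0  6.4   0)    // 3
    ( 0  0   600)    // 4
    (60  0   600)    // 5
    (60  6.4 600)    // 6
    ( 0  6.4 600)    // 7
);

blocks
(
    // 20 x 8 x 200 cells in x, y, z; cell size 3 mm x 0.8 mm x 3 mm
    hex (0 1 2 3 4 5 6 7) (20 8 200) simpleGrading (1 1 1)
);

edges
(
);

patches
(
    wall frontAndBack     // wall boundary condition on the front and back
    (
        (0 1 5 4)         // y = 0
        (3 7 6 2)         // y = 6.4 mm
    )
    // inlet, outlet and remaining side patches omitted in this sketch
);

mergePatchPairs
(
);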
I can't understand why the momentum equation is solved correctly even though the number of cells in the y direction is small, while the scalar equation blows up every time.
Could you give me some suggestions on how to tackle this problem?
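For context, my understanding is that the temperature equation the solver assembles is a voidfraction-weighted convection-diffusion equation with a particle-fluid heat source, roughly like the sketch below. This is not copied from cfdemSolverPisoScalar_shared; the field names (voidfraction, DT, Tsource) and the exact form of the terms are my assumptions about what such a solver does.

    // Sketch only: assumed form of the scalar transport step inside the PISO
    // loop of a CFD-DEM solver, not the actual TEqn of the solver.
    //   d(eps*T)/dt + div(phi T) - laplacian(eps*DT, T) = Tsource
    // eps  = fluid volume fraction (voidfraction), phi = volumetric flux,
    // DT   = thermal diffusivity, Tsource = particle-fluid heat exchange.

    fvScalarMatrix TEqn
    (
        fvm::ddt(voidfraction, T)
      + fvm::div(phi, T)
      - fvm::laplacian(DT*voidfraction, T)
     ==
        Tsource
    );

    TEqn.relax();
    TEqn.solve();   // a solve like this ends up in fvMatrix::solve -> PBiCG::solve,
                    // which is where the backtrace below points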
The error message is as follows:
==================================================================
Nlocal: 10742.5 ave 13088 max 9569 min
Histogram: 2 0 1 0 0 0 0 0 0 1
Nghost: 8471.75 ave 9091 max 7440 min
Histogram: 1 0 0 0 0 1 0 0 0 2
Neighs: 209435 ave 249308 max 196029 min
Histogram: 3 0 0 0 0 0 0 0 0 1

Total # of neighbors = 837741
Ave neighs/atom = 19.496
Neighbor list builds = 0
Dangerous builds = 0
LIGGGHTS finished

timeStepFraction() = 1
Total particle volume neglected: 2.65686e-10
evolve done
total convective particle-fluid heat flux [W] (Eulerian) = 41.396
[0] #0 Foam::error::printStack(Foam::Ostream&)
[1] #0 Foam::error::printStack(Foam::Ostream&)
[2] #0 Foam::error::printStack(Foam::Ostream&)
[3] #0 Foam::error::printStack(Foam::Ostream&)
--------------------------------------------------------------------------
An MPI process has executed an operation involving a call to the
"fork()" system call to create a child process. Open MPI is currently
operating in a condition that could result in memory corruption or
other system errors; your MPI job may hang, crash, or produce silent
data corruption. The use of fork() (or system() or other calls that
create child processes) is strongly discouraged.

The process that invoked fork was:

Local host: node20 (PID 8309)
MPI_COMM_WORLD rank: 3

If you are *absolutely sure* that your application will successfully
and correctly survive a call to fork(), you may disable this warning
by setting the mpi_warn_on_fork MCA parameter to 0.
--------------------------------------------------------------------------
in "/home/eric/OpenFOAM/OpenFOAM-2.0.1/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
(The remaining frames from ranks 0, 1 and 2 were interleaved; each rank reports the same call chain, shown once below.)
#1 Foam::sigFpe::sigHandler(int) in "/home/eric/OpenFOAM/OpenFOAM-2.0.1/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
#2 at sigaction.c:0
#3 Foam::PBiCG::solve(Foam::Field<double>&, Foam::Field<double> const&, unsigned char) const in "/home/eric/OpenFOAM/OpenFOAM-2.0.1/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
#4 Foam::fvMatrix<double>::solve(Foam::dictionary const&) in "/home/eric/OpenFOAM/OpenFOAM-2.0.1/platforms/linux64GccDPOpt/lib/libfiniteVolume.so"
#5 Foam::fvMatrix<double>::solve() in "/home/eric/OpenFOAM/OpenFOAM-2.0.1/platforms/linux64GccDPOpt/lib/libfiniteVolume.so"
#6 in "/home/eric/OpenFOAM/eric-2.0.1/platforms/linux64GccDPOpt/bin/cfdemSolverPisoScalar_shared"
#7 __libc_start_main in "/lib64/libc.so.6"
#8 in "/home/eric/OpenFOAM/eric-2.0.1/platforms/linux64GccDPOpt/bin/cfdemSolverPisoScalar_shared"
=======================================================================
Any hint or suggestion would be highly appreciated.
oliver


cgoniva | Tue, 05/15/2012 - 08:46

Hello Oliver,

Are you sure that the equation diverges? (Is a heat flux of 41 W unrealistic?) Perhaps you can track the error down and find out which equation (line of code) is causing it.
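For example, just as a sketch of what I mean (not code from the solver; the field names are taken from the discussion above): you could bracket the individual solves with Info statements and print the field extrema, so the log shows which equation triggers the sigFpe and whether a field has already gone out of range before it, e.g.

    // Sketch: instrument the solver around the suspect solve(s).
    Info << "min/max(T) = "
         << min(T).value() << " / " << max(T).value() << endl;
    Info << "min/max(voidfraction) = "
         << min(voidfraction).value() << " / " << max(voidfraction).value() << endl;

    Info << "solving TEqn" << endl;
    TEqn.solve();
    Info << "TEqn solved" << endl;

Since the backtrace already comes from the sigFpe handler, setting the FOAM_SETNAN environment variable in addition can help: uninitialised values then show up as NaN immediately instead of propagating silently.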

Cheers,
Chris

Oliver.pasqual | Tue, 05/15/2012 - 14:48

Hi Chris,
Thanks for your reply.
The behaviour is strange when running on the cluster.
Recently I have tested many cases with cfdemSolverPisoScalar_shared, and some of them fail with this error message; the point at which they break is random.
I can't figure out where the problem lies.
One case runs successfully with inlet velocities of 0.6, 0.8, 1.0 and 1.3 m/s on 8 CPUs, but it crashed at 1.2 and 1.4 m/s on 4 CPUs. The interesting thing is that the program runs smoothly on 8 CPUs when the inlet velocity is 1.2 or 1.4 m/s.
I will have to dig into this problem in more detail.
Best regards
oliver