Momentum transfer issue

Submitted by tshan on Fri, 01/14/2011 - 08:11

Hi, Chris,

I have a small question about the coupling case that I modified. It is a pouring case, which pours particles from the air into the water. There was no problem when I tried a small number of particles; however, when I increased the number to about 1000, the OpenFOAM side would finish while LIGGGHTS would not stop and kept waiting in the middle.

Why would this happen, and could you let me know how to solve this issue?

Many thanks for your kind help!

Best,
Tong

CFD:
...
wait for file "/home/tong/Software/cfdem/test2/CFD/couplingFiles/radius1"
reading from file: "/home/tong/Software/cfdem/test2/CFD/couplingFiles/radius1"
nr particles = 981
giveVectorData, numberOfParticles=981
wait for file "/home/tong/Software/cfdem/test2/CFD/couplingFiles/dragforce1"
writing to file: "/home/tong/Software/cfdem/test2/CFD/couplingFiles/dragforce1"...
evolve done.
DILUPBiCG: Solving for Ux, Initial residual = 1, Final residual = 11.4979, No Iterations 1001
DILUPBiCG: Solving for Uy, Initial residual = 1, Final residual = 0.0814328, No Iterations 1001
DILUPBiCG: Solving for Uz, Initial residual = 1, Final residual = 8.77976, No Iterations 1001
#0 Foam::error::printStack(Foam::Ostream&) in "/home/tong/OpenFOAM/OpenFOAM-1.6/lib/linux64GccDPOpt/libOpenFOAM.so"
#1 Foam::sigFpe::sigFpeHandler(int) in "/home/tong/OpenFOAM/OpenFOAM-1.6/lib/linux64GccDPOpt/libOpenFOAM.so"
#2 in "/lib/libc.so.6"
#3 Foam::PCG::solve(Foam::Field&, Foam::Field const&, unsigned char) const in "/home/tong/OpenFOAM/OpenFOAM-1.6/lib/linux64GccDPOpt/libOpenFOAM.so"
#4 Foam::fvMatrix::solve(Foam::dictionary const&) in "/home/tong/OpenFOAM/OpenFOAM-1.6/lib/linux64GccDPOpt/libfiniteVolume.so"
#5 in "/home/tong/OpenFOAM/tong-1.6/applications/bin/linux64GccDPOpt/cfdemSolverPisoExplicit"
#6 __libc_start_main in "/lib/libc.so.6"
#7 in "/home/tong/OpenFOAM/tong-1.6/applications/bin/linux64GccDPOpt/cfdemSolverPisoExplicit"
tong@tong-laptop:~/Software/cfdem/test2$

DEM:

CFD Coupling established at step 12800
Fix couple/cfd/file: waiting for file: ../CFD/couplingFiles/x0
Fix couple/cfd/file: waiting for file: ../CFD/couplingFiles/v0
Fix couple/cfd/file: waiting for file: ../CFD/couplingFiles/radius0
Fix couple/cfd/file: waiting for file: ../CFD/couplingFiles/dragforce0
INFO: more than 15 touching neighbor atoms found, growing shear history...done!
CFD Coupling established at step 12900
Fix couple/cfd/file: waiting for file: ../CFD/couplingFiles/x0
Fix couple/cfd/file: waiting for file: ../CFD/couplingFiles/v0
Fix couple/cfd/file: waiting for file: ../CFD/couplingFiles/radius0
Fix couple/cfd/file: waiting for file: ../CFD/couplingFiles/dragforce0
CFD Coupling established at step 13000
Fix couple/cfd/file: waiting for file: ../CFD/couplingFiles/x0
Fix couple/cfd/file: waiting for file: ../CFD/couplingFiles/v0
Fix couple/cfd/file: waiting for file: ../CFD/couplingFiles/radius0
Fix couple/cfd/file: waiting for file: ../CFD/couplingFiles/dragforce0
(keeps waiting forever)

cgoniva | Fri, 01/14/2011 - 09:46

Hi Tong!

The simulation did not stop ... it crashed.

"DILUPBiCG: Solving for Ux, Initial residual = 1, Final residual = 11.4979, No Iterations 1001"

shows you that the solver hit its iteration limit (1001 iterations) without converging (the final residual is even larger than the initial one) ... a clear hint that something went wrong.

Probably the local particle loading is too high -> the interaction terms become too high -> the simulation crashes.

Have you tried smaller time steps?
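For example, a minimal sketch (numbers purely illustrative): halve the CFD and the DEM time step together so they stay consistent:

// system/controlDict (CFD side)
deltaT 0.0001;

# DEM in-file (LIGGGHTS side; DEMts in couplingProperties must be changed accordingly)
timestep 0.000001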

Cheers,
Christoph

tshan | Fri, 01/14/2011 - 11:48

Hi, Chris!

Thanks for your quick and useful reply.

This time I halved dt; there was no problem and the simulation finished. However, I need to pour a large number of particles, so I decreased dt further. Then I got an error in the CFD terminal:

Error - TS bigger than coupling interval!

I do not know why this happened and this is what I modified in CFD part:

controlDict:
deltaT 0.0002;

couplingProperties:
DEMts 0.000002;
couplingInterval 100;

Many thanks again for your help!

Best,
Tong

ckloss | Fri, 01/14/2011 - 12:16

In the CFD dict you specify the DEM time step, which has to be equal to the DEM time step you specify in the LIGGGHTS in-file.

Also, the coupling intervals have to be equal (in LIGGGHTS this is the first argument of the fix couple/cfd command). A sketch of consistent settings is below.
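For illustration only (the numbers are examples; the couple_every keyword follows the syntax quoted later in this thread, while older LIGGGHTS versions may take the interval as the plain first argument):

// CFD/constant/couplingProperties
DEMts            0.000002;   // DEM time step [s]
couplingInterval 100;        // couple every 100 DEM steps

// CFD/system/controlDict
deltaT 0.0002;               // = couplingInterval * DEMts

# DEM in-file (LIGGGHTS)
timestep 0.000002
fix cfd all couple/cfd couple_every 100 file ../CFD/couplingFiles/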

Christoph

tshan | Fri, 01/14/2011 - 12:59

Hi, Christoph,

Thanks for your reply!

But I am sure that they are equal, and the error is still there. The LIGGGHTS part had not even started at that point:

Error - TS bigger than coupling interval!
#0 Foam::error::printStack(Foam::Ostream&) in "/home/tong/OpenFOAM/OpenFOAM-1.6/lib/linux64GccDPOpt/libOpenFOAM.so"
#1 Foam::error::abort() in "/home/tong/OpenFOAM/OpenFOAM-1.6/lib/linux64GccDPOpt/libOpenFOAM.so"
#2 Foam::cfdemCloud::cfdemCloud(Foam::fvMesh const&) in "/home/tong/OpenFOAM/tong-1.6/lib/linux64GccDPOpt/liblagrangianCFDEM_shared.so"
#3 in "/home/tong/OpenFOAM/tong-1.6/applications/bin/linux64GccDPOpt/cfdemSolverPisoExplicit"
#4 __libc_start_main in "/lib/libc.so.6"
#5 in "/home/tong/OpenFOAM/tong-1.6/applications/bin/linux64GccDPOpt/cfdemSolverPisoExplicit"

FOAM aborting

Best,
Tong

cgoniva | Sat, 01/15/2011 - 01:01

Hi Tong!

This error means that you are, e.g., trying to couple every 10 s while the time-step size of the CFD calculation is 11 s.

cheers,
Chris

tshan | Sat, 01/15/2011 - 08:08

Hi, Chris,

I am quite confused. I do know that deltaT should equal DEMts*couplingInterval.

When I adjusted the parameters to deltaT=1e-4, DEMts=1e-6, couplingInterval=100, there should be no problem (1e-4 = 100 * 1e-6). However, I still met this "Error - TS bigger than coupling interval!"

PS: deltaT=5e-4, DEMts=5e-6, couplingInterval=100 runs with no problem!

It is so weird that I really cannot understand it. I looked into the source and found:

if (mesh_.time().deltaT().value() > couplingInterval_*DEMts_)
{
    FatalError << "\nError - TS bigger than coupling interval!\n" << abort(FatalError);
}

Isn't mesh_.time().deltaT().value() the deltaT from above? I am really confused. Did I make some stupid mistake? Could you point it out for me?

Many thanks for your help!

Best,
Tong

cgoniva | Mon, 01/17/2011 - 10:01

Hi Tong!

mesh_.time().deltaT().value() is the value of deltaT set in the controlDict file.

This value is compared to DEMts * couplingInterval (both from the couplingProperties file).

Note that DEMts (from the couplingProperties file) must match the time step specified in the in.* file of the DEM simulation.

Btw, please use "adjustTimeStep no;" in the controlDict file.
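For example, a sketch with the values from your post (not a complete dict):

// system/controlDict
deltaT          1e-4;   // must not exceed couplingInterval * DEMts = 100 * 1e-6
adjustTimeStep  no;     // keep deltaT fixed so the coupling check always holds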

Cheers,
Christoph

tshan | Thu, 01/20/2011 - 16:45

Hi, Christoph,

I wanted to pour a large number of particles (10,000) into the water, but I met this error again and it crashed. This time decreasing dt did not help. Are there other ways to resolve this error?

Another question: why do I keep getting these time step continuity errors at every step, even when the simulation does not crash?

DICPCG: Solving for p, Initial residual = 1, Final residual = 0.0953501, No Iterations 7
time step continuity errors : sum local = 1.35545e+44, global = 1.15125e+43, cumulative = 1.15125e+43
(this was at the last step just before crashing)

Thanks for your reply!

Best,
Tong

cgoniva | Thu, 01/20/2011 - 18:38

Hi!

Probably the momentum you tried to exchange between the DEM particles and the CFD volumes was simply too much!

Try to reduce the particle density and see what happens then...
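A purely hypothetical LIGGGHTS line for illustration (the fix id, region name and values are made up, and the exact insertion syntax depends on your LIGGGHTS version; dens sets the particle density range in a pour-style insertion):

# pour 10000 particles of type 1 with a reduced density of 1500 kg/m3
fix ins all pour 10000 1 12345 region insreg dens 1500 1500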

"time step continuity errors" is a std. output coming from the CFD solver - it is not specific for the coupling.

Cheers,
Chris

tshan | Sat, 01/22/2011 - 11:11

Hi, Chris,

Sorry to bother you with this crashing problem again, but decreasing the density did not work. I have seen the good examples on this website, but I cannot run them without solving this crashing problem.

Could you help me deal with this or just give me some hints?

Many thanks for your kind help.

Best,
Tong

tshan | Mon, 01/24/2011 - 07:19

Hi, Chris!

I have tried, but it did not work. This time I changed the mesh from (9 9 9) to (25 25 25) cells and ran blockMesh, reduced the time steps and the coupling interval in the settlingTest case, and then ran it. Unfortunately, it crashed with the same error again.

One more question: I have seen that you used an immersed boundary method combined with CFD-DEM. Does this first version of CFD-DEM include that functionality?

Many thanks for your kind reply!

Best,
Tong

cgoniva | Mon, 01/24/2011 - 09:38

Hi,

A finer grid will probably not (!) improve convergence if momentum transfer dominates your problem.

We have done some tests with IB and DEM; this is work in progress, based on the coupling from GIT. Several extensions and modifications are needed, and we are currently working on them. So your current version from GIT will not allow IB simulations.

Cheers,
Chris

tshan | Tue, 01/25/2011 - 13:03

Hi, Chris!

Thanks for your reply! Actually, I do want to see the finer-mesh case...

May I ask what 4-way coupling and 2-way coupling mean? I do not know why the surge conveyor example is called 4-way coupled DEM-CFD.

Thanks again!

Best,
Tong

cgoniva | Tue, 01/25/2011 - 15:40

Hi!

It is just loose terminology ;-)

2-way coupling usually refers to momentum exchange (in both directions) between the particulate phase and the fluid phase.

Some people call models that additionally account for particle-particle interaction 4-way coupled.

Note: particle-particle interaction is always included here (via the DEM code)!

Cheers
Chris

cheng1988sjtu | Fri, 07/15/2016 - 18:33

Hi Chris and Tong,

I've been trying to use the twoWayFiles coupling. A quick search on this topic suggests the resources are very limited.

Here is why I need to try twoWayFiles: I have run cases with twoWayMPI coupling, and I found that the particle volume fraction shows peaks near the processor interfaces, and these peaks always coincide with the boundaries between processors, so I think there may be a problem with MPI in CFDEM. I therefore want to run CFDEM in serial, and I chose the twoWayFiles coupling.

However, I had a problem running it, here is the couplingProperties in CFD/constant:

dataExchangeModel twoWayFiles;
twoWayFilesProps
{
    maxNumberOfParticles 1010;
    DEMts 1e-6;
}

In DEM/in.liggghts_run, I have set the number of processors to 1 and set the coupling to:
fix cfd all couple/cfd couple_every 100 file ../CFD/couplingFiles/
fix cfd2 all couple/cfd/force/implicit

When I run the command 'cfdemSolverPiso', it gets stuck at:
Starting time loop

Time = 0.0001

Courant Number mean: 0.00360587 max: 0.00399323

Coupling...
wait for file "/home/zcheng/Desktop/PhD/RandomWalk/Muste/NBS1_beta4_lift2_balance/CFD/couplingFiles/radius1"

and this waits forever. So I think I may need the file radius1 in the CFD/couplingFiles folder. Could you tell me how radius1 gets created and what the format of this file is?

Thank you so much!

Charlie

C.Z. U of D

alice | Wed, 07/20/2016 - 12:51

Hi Charlie,
I would definitely recommend using the MPI coupling. The results obtained with MPI coupling have been validated, and usually there is no trouble. The problem you observe might, for example, stem from particles that are too large for the given mesh...
Best regards,
Alice

cheng1988sjtu | Wed, 07/20/2016 - 19:00

Hi Alice,

Thank you for your reply!

I agree that the MPI coupling usually works without problems (e.g., single-particle settling); however, for cases where particles frequently cross processor boundaries, I found that the result is not correct. For example, for neutrally buoyant particles driven by a mean flow, the volume fraction peaks only near the processor interfaces (analytically, the volume fraction should be more or less uniform).

I've tried your suggestion and increased my cell size from 1 particle diameter to 4 diameters, but the result doesn't change much. If the volume-fraction peak were related to the small cell size, I would expect it not to coincide with the processor boundaries all the time; am I correct?

Best

Charlie

C.Z. U of D