Hello,
I have set up a 24-processor simulation of slurry flow in an open-ended pipe: the slurry flows in at one end and out at the other. A fixed DEM boundary is used. Particles are counted and deleted using fix massflow/mesh with the delete_atoms option. The simulation works fine for a dilute slurry with large particles, but it almost always crashes when I use small particles and a concentrated slurry. (The total number of particles is on the order of millions.)
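For reference, a minimal sketch of the counting/deletion setup described above. The mesh file name, fix IDs, and the vec_side direction are placeholders, and the exact keyword set may differ between LIGGGHTS versions, so check the fix massflow/mesh doc page:

```
# load the outlet plane as a surface mesh (file name is a placeholder)
fix outlet all mesh/surface file meshes/outlet.stl type 1

# count each particle once as it crosses the mesh in the given direction,
# then delete it; vec_side defines the crossing direction (placeholder here)
fix mflow all massflow/mesh mesh outlet vec_side 0. 0. 1. count once delete_atoms yes
```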
The error message looks like the following:
ERROR on proc 13: Failed to reallocate 16719576 bytes for array CfdDatacouplingMPI:data (../memory.cpp:102)
Has anyone ever encountered this problem? Thanks in advance for sharing your experience.
Regards,
CH
---Update--- After reading some of the source code I realize this error happens when LIGGGHTS tries to reallocate a large data array. The array "CfdDatacouplingMPI:data" grows in size in cfd_datacoupling_mpi.cpp, while the memory reallocation error is raised in memory.cpp. Both files are part of LIGGGHTS.
I modified the bin size, but it did not help. The simulation fails when about 1.9M particles have been inserted; the timing of the crash seems unaffected by how many particles are left in the domain. The reported memory usage is 108.342 Mbytes per processor, and I have 64 GB of RAM per core. I would really appreciate it if anyone could share their insights. Thanks.
mbaldini | Tue, 11/01/2016 - 18:29
Hi CH, I had a similar problem running a simulation with 1 million particles. The simulation stopped because the program ran out of RAM. It was demanding so much RAM because the neighbour list had too many neighbours, so the solution was to adjust the skin and bin size as follows:
neighbor 0.0007 bin
neigh_modify delay 0 binsize 0.002
Also, I would like to ask you how you created a dense slurry. Are you using insert/rate/region with a high particle rate?
I am asking because in my simulations I get problems (Courant number instabilities) if I use a high particle insertion rate with insert/rate/region.
Good luck with your simulation!
Cheers,
Mauro
ckloss | Wed, 11/02/2016 - 16:00
Hi all,
the CFD-DEM MPI data coupling allocates an array long enough for all particles on all processors, so it might be that you run out of memory for large simulations.
For creating a dense slurry, you can e.g. use insert/rate/region with a high insertion rate, turn off the overlap check, and relax the packing in the first section of the simulation using fix nve/limit.
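A sketch of that recipe, assuming a particle distribution template pdd1 and an insertion region insreg have already been defined; all IDs, seeds, and numeric values are placeholders, and the exact keywords should be checked against the insert/rate/region and nve/limit doc pages for your LIGGGHTS version:

```
# dense insertion: high particle rate, overlap check disabled
fix ins all insert/rate/region seed 5330 distributiontemplate pdd1 &
    nparticles 1000000 particlerate 200000 insert_every 1000 &
    overlapcheck no vel constant 0. 0. 0. region insreg

# integrate with a per-step displacement cap so the overlapping
# packing relaxes gently instead of exploding
fix integr all nve/limit 0.0005

# once the packing has relaxed, switch to plain integration:
# unfix integr
# fix integr all nve/sphere
```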
Best wishes
Christoph