Computer resource requirements for a calculation

Submitted by alberthappy on Mon, 07/14/2014 - 17:40

Hi everyone,

I need to run a simulation of granular flow within an auger mixer. The total particle count is approximately 400,000. I have a server with 24 CPU cores available. Can this machine handle the simulation? Can anyone give me a hint about this?


richti83 | Tue, 07/15/2014 - 08:14

It depends on the particle diameter and your grain-size distribution.
Normally, fewer than 2 million particles is no problem for 16 to 25 cores (we own three DELL workstations with 16, 20 and 24 cores, and each of them can simulate 60 s of physical time per week for 1 to 2 million particles).
Consider the following:
The neighbor cutoff distance is 2*r_max. So when there are a few big particles and many very small ones, the cutoff is 2*r_big, and the contact list then holds a huge number of possible contacts between the very small particles, which degrades performance a lot. (Think of the "search radius" around a single very small particle with r_search = 2*r_max in a dense packing: hundreds of other small particles will end up in the contact list even though they are not actually in contact.)
The critical timestep is also governed by the smallest particles: t_crit = sqrt(m/k).
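To make the two effects above concrete, here is a minimal back-of-the-envelope sketch in Python (not LIGGGHTS input). The material values and the E*r stiffness scale are illustrative assumptions, so treat the numbers as order-of-magnitude estimates only, not as what LIGGGHTS computes internally.

```python
import math

def critical_timestep(r, density, youngs_modulus):
    """Order-of-magnitude estimate t_crit ~ sqrt(m/k) for the smallest
    particle. The stiffness k is approximated by the scale E*r; the exact
    prefactor depends on the contact model used."""
    m = density * (4.0 / 3.0) * math.pi * r**3   # particle mass
    k = youngs_modulus * r                       # contact stiffness scale
    return math.sqrt(m / k)

def neighbors_in_search_sphere(r_small, r_max, packing_fraction=0.6):
    """Rough count of small particles inside the search sphere of radius
    2*r_max around one small particle in a dense packing."""
    v_search = (4.0 / 3.0) * math.pi * (2.0 * r_max)**3
    v_small = (4.0 / 3.0) * math.pi * r_small**3
    return packing_fraction * v_search / v_small

# Assumed glass-bead-like values: 0.5 mm smallest radius, 2500 kg/m^3,
# artificially softened Young's modulus of 5e6 Pa.
print(critical_timestep(r=0.5e-3, density=2500.0, youngs_modulus=5e6))

# Bidisperse example: smallest radius 0.5 mm, largest 5 mm.
print(neighbors_in_search_sphere(r_small=0.5e-3, r_max=5e-3))
```

With this size ratio of 10, each small particle drags several thousand candidates into the neighbor list, which is exactly why a wide size distribution hurts performance so badly.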

Best,
Christian.

I'm not an associate of DCS GmbH and not a core developer of LIGGGHTS®

alberthappy | Tue, 07/15/2014 - 17:35

Christian, thanks for your reply. My particles are pretty small, ~1 mm. But for now I use a uniform particle size, so I guess the neighbor cutoff distance will not hurt performance.

Fenglei Qi


ckloss | Sat, 07/26/2014 - 20:21

Hi albert,

that simulation size should not be a problem for LIGGGHTS.

Cheers
Christoph