Segmentation Fault During Particle Template Insertion (Dynamic Fiber Insertion)

Submitted by mschramm on Mon, 08/17/2020 - 02:34

Hello,
I am adding code to the insert function of the particleToInsert class.
It works for sparse insertions but hits a segmentation fault once the insertions become dense.

I have started blocking out segments of code and have narrowed down where it breaks. The strange part is that it breaks between two fprintf statements...
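One thing I am trying to rule out first is stdio buffering: an unflushed fprintf can make a crash look like it falls between two prints when the second message simply never left the buffer. A minimal sketch of the kind of flushed debug checkpoint I mean (names are illustrative only, not the actual code):

    // Illustrative sketch only: flushing after every debug print makes the
    // last visible line a reliable progress marker, so a buffered message
    // cannot hide where the crash really happens.
    #include <cstdio>

    static void debug_checkpoint(FILE *out, const char *tag, int index)
    {
        std::fprintf(out, "reached %s (index = %d)\n", tag, index);
        std::fflush(out);   // push the message out before any later crash
    }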

I have a few questions that will hopefully narrow down where the issue is.
As spheres are added via the create_atoms function, do all of the per-atom arrays also grow?
During particle insertion, do other functions run in a different thread?
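The reason I ask about the arrays is the usual growth/reallocation trap. A stand-alone illustration of the failure mode I suspect (std::vector is only a stand-in for the real per-atom arrays, not the LIGGGHTS implementation):

    // Stand-alone illustration: if an array is reallocated when it grows,
    // any raw pointer cached before the growth is left dangling -- which
    // typically only shows up once insertions get dense enough to trigger
    // a resize.
    #include <cstdint>
    #include <cstdio>
    #include <vector>

    int main()
    {
        std::vector<double> x(4, 0.0);   // stand-in for a per-atom array
        std::uintptr_t before = reinterpret_cast<std::uintptr_t>(x.data());

        for (int i = 0; i < 1000; ++i)   // "dense" insertion forces a reallocation
            x.push_back(0.0);

        std::uintptr_t after = reinterpret_cast<std::uintptr_t>(x.data());

        // Any pointer or reference taken before the growth is invalid once the
        // storage has moved -- the classic "works when sparse, crashes when
        // dense" symptom.
        std::printf("array moved during growth: %s\n", before != after ? "yes" : "no");
        return 0;
    }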

If it is helpful, I have made a commit on my GitHub with the code that I am working on.
The commit (https://github.com/schrummy14/LIGGGHTS_Flexible_Fibers/commit/681e04c1fc...)

The file (https://github.com/schrummy14/LIGGGHTS_Flexible_Fibers/blob/master/src/p...)

I have also added an extra example, based on the shear cell, in the bond package section under Single tests. The maximum number of templates that can be inserted is 2143.

Any help is appreciated. Thank you.

mschramm | Tue, 08/18/2020 - 00:00

Found the issue. The error is actually happening after the first fprintf statement; it must be doing some predictive calculation before it reaches the second fprintf statement.
The code is currently working, but testing is needed to make sure that the changes reproduce the results of the master branch.
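For the regression check, something along these lines is what I have in mind (a rough sketch only; the file names and the plain whitespace-separated numeric format are assumptions, not the actual test setup): compare a thermo/dump extract from this branch against the same run on master, value by value, within a tolerance.

    // Rough sketch: compare two whitespace-separated numeric outputs within a
    // relative tolerance; exits non-zero if any value differs.
    #include <algorithm>
    #include <cmath>
    #include <cstdio>
    #include <fstream>
    #include <sstream>
    #include <string>

    int main(int argc, char **argv)
    {
        if (argc != 3) {
            std::fprintf(stderr, "usage: %s master_output.txt branch_output.txt\n", argv[0]);
            return 1;
        }
        std::ifstream master(argv[1]), branch(argv[2]);
        std::string lm, lb;
        const double tol = 1e-10;           // relative tolerance, adjust as needed
        long lines = 0, mismatches = 0;

        while (std::getline(master, lm) && std::getline(branch, lb)) {
            ++lines;
            std::istringstream sm(lm), sb(lb);
            double vm, vb;
            while (sm >> vm && sb >> vb)    // a non-numeric token simply ends the inner loop
                if (std::fabs(vm - vb) > tol * std::max(1.0, std::fabs(vm)))
                    ++mismatches;
        }
        std::printf("%ld lines compared, %ld values outside tolerance\n", lines, mismatches);
        return mismatches == 0 ? 0 : 2;
    }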

Daniel Queteschiner | Tue, 09/08/2020 - 18:49

We've basically done something similar based on particletemplate/multiplespheres with additional specification of the bond partners, creating the bonds in ParticleToInsert, and then using it with fix insert/pack.
We could probably share the code if you are interested (it's not in our public repo at https://github.com/ParticulateFlow yet, but it might well slip in with our next update at the end of this month).
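Very roughly, the bond-creation step boils down to something like the sketch below. This is the core idea only, not our actual implementation; the member names follow the LAMMPS-style per-atom bond arrays and may differ in a given LIGGGHTS branch.

    // Sketch of the core idea only: for each pair of bond partners listed in
    // the template, register the bond on one of the two freshly created atoms.
    #include "atom.h"   // LAMMPS_NS::Atom with num_bond, bond_type, bond_atom, tag

    using namespace LAMMPS_NS;

    // i, j  : local indices of the two spheres just created for this template
    // btype : bond type to assign
    // Returns false if the per-atom bond storage of atom i is already full.
    static bool add_template_bond(Atom *atom, int i, int j, int btype)
    {
        int n = atom->num_bond[i];
        if (n >= atom->bond_per_atom) return false;   // would overflow per-atom storage

        atom->bond_type[i][n] = btype;
        atom->bond_atom[i][n] = atom->tag[j];         // bonds store global IDs, not local indices
        atom->num_bond[i] = n + 1;
        return true;
    }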