LIGGGHTS® - User Forum

LIGGGHTS® related topics can be discussed here: models, installation, feature requests, and general discussion

Computation time and dump local/gran/vtk Issues

Submitted by lumblab227 on Fri, 04/07/2017 - 07:25

Hello everyone~

Issue:
Thanks very much for your tutorial. Appreciated. However, when I use the command " dump dumpnetwork all local/gran/VTK 200 post/dump*.vtk " to export the local contact information and convert it to .vtk for ParaView analysis, it fails with the error named in the title. I am wondering:
(1) Did I use the wrong command (e.g. missing arguments)?
(2) Or was VTK support not enabled when compiling?
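For question (2), one quick sanity check is to inspect the executable's linked libraries. This is a sketch, not an official diagnostic: `lmp_auto` is the default name produced by `make auto` and is an assumption here, and the check only works for a dynamically linked build, where VTK support shows up as linked `libvtk*` libraries.

```shell
# "lmp_auto" is the default "make auto" executable name; adjust to your build.
# For a dynamically linked binary, VTK support appears as libvtk* entries.
ldd ./lmp_auto 2>/dev/null | grep -i vtk || echo "no VTK libraries found"
```

If nothing VTK-related is listed (and the build is not statically linked), the `local/gran/vtk` dump style is most likely unavailable and the binary needs to be recompiled with VTK enabled.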

Importing geometry problem (SolidWorks)

Submitted by neven.marticnevic on Wed, 04/05/2017 - 21:30

Hi all, I am totally new to LIGGGHTS...

Until now I have worked through the tutorials and made the necessary changes to get them running...

Now I want to do something on my own, so I modeled a simple box in SolidWorks, exported it as an ASCII .STL file, and when I import the geometry into LIGGGHTS I get this message:

"Created orthogonal box = (-0.01 -0.06 -1.22) to (0.51 0.01 0.01)
1 by 1 by 1 MPI processor grid
ERROR on proc 0: Cannot open mesh file /meshes/ohisje.stl (../input_mesh_tri.cpp:105)"
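A likely cause worth ruling out: the path in the error starts with "/", so it is absolute and is looked up from the filesystem root, not from the directory the simulation runs in. The snippet below is only an illustration of that path-resolution distinction, using the path from the error message:

```shell
# A leading "/" makes a path absolute (resolved from the filesystem root);
# paths without it are resolved from the directory LIGGGHTS is launched in.
path="/meshes/ohisje.stl"
case "$path" in
  /*) echo "absolute: looked up from the filesystem root" ;;
  *)  echo "relative: looked up from the run directory" ;;
esac
```

If the .stl file actually lives in a `meshes/` folder next to the input script, dropping the leading slash (`meshes/ohisje.stl`) and launching LIGGGHTS from that directory should let it find the file.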

Relaxing a Multisphere Simulation with Initially Large Overlaps

Submitted by estefan31 on Tue, 04/04/2017 - 21:37

What options are available for relaxing a multisphere system? My favorite fix option is nve/limit, but this only works for regular spheres. And I'm running simulations in parallel, so inserting over 100,000 multisphere particles has been very difficult if I use the 'overlap_check yes' option. I'd also prefer to keep my domain as compact as possible, so I want to avoid inserting particles from a very tall height. If there is nothing freely available, does anyone have any recommended code modifications I can implement?

Reducing dump file size

Submitted by mattkesseler on Thu, 03/30/2017 - 11:20

Hi all. I am currently running a simulation that writes dump files of a 1.5 million particle granular slide every 2000 timesteps. Each ASCII .vtk file takes up about 150 MB, and the granular slide concludes in about 5,000,000 timesteps, i.e. the final dataset is roughly 375 GB. I am looking for ways to reduce this file size while preserving the same data, since each save interval corresponds to a frame in my laboratory video recording. Is there any way for LIGGGHTS to dump into a compressed folder, for instance, and how much space would that save?
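One low-effort option is to compress the dumps after (or as) they are written. ASCII VTK is highly redundant text, so gzip typically shrinks it substantially, though the exact ratio depends on the data. The sketch below uses a synthetic file; the `post/dump*.vtk` naming is just the convention from the scripts above:

```shell
# Illustration on a synthetic ASCII file standing in for a .vtk dump;
# real dumps would be compressed the same way (e.g. in a loop over
# post/dump*.vtk, or from a cron job while the run is still going).
mkdir -p post
seq 1 200000 | awk '{print $1, $1*0.5, $1*0.25}' > post/dump2000.vtk
gzip -f post/dump2000.vtk      # replaces the file with post/dump2000.vtk.gz
ls -l post/dump2000.vtk.gz
```

Separately, LAMMPS-derived codes can usually write gzipped dump files directly when the dump filename ends in `.gz`, provided the binary was compiled with gzip support; whether your LIGGGHTS build supports this for the VTK dump styles is worth checking in the documentation for your version.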

Error in running example

Submitted by Tiago on Thu, 03/30/2017 - 10:52

Dear All,
I want to run the heat transfer example to check the heat flux in all directions, but I get the error below:

ERROR: Dump custom fix does not compute per-atom vector (../dump_custom.cpp:1385)

My only change to the script (the heattransfer_2 example) is:
dump dmp1 all custom 800 post/dump*.heatGran id type type x y z ix iy iz vx vy vz fx fy fz omegax omegay omegaz radius f_Temp[0] f_heatSource[0] f_directionalHeatFlux[0] f_directionalHeatFlux[1] f_directionalHeatFlux[2]
