I haven't seen much mention of this on the forums yet, but in case anyone hasn't noticed: LIGGGHTS 2.0 includes the ability to script LIGGGHTS from Python, which can be very useful for building more complex run-time analyses, graphing ongoing simulations, building a GUI to control certain aspects, etc.
From the python directory in LIGGGHTS 2.0 you can simply run

python setup_serial.py build

to build a serial (single-processor) version of LIGGGHTS, which can then be installed with

python setup_serial.py install

From Python you can then import the module and call individual LIGGGHTS script commands, or read fix/compute/particle data, etc.
In [1]: from lammps import *
In [2]: lmp = lammps()
LIGGGHTS (Version LIGGGHTS-PUBLIC 2.0.4, compiled 2012-07-23-13:58:00 by mab based on LAMMPS 20 Apr 2012)
In [3]: lmp.<TAB>
lmp.close            lmp.extract_compute   lmp.extract_variable   lmp.get_natoms   lmp.put_coords
lmp.command          lmp.extract_fix       lmp.file               lmp.lib
lmp.extract_atom     lmp.extract_global    lmp.get_coords         lmp.lmp
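As a quick illustration, here's a minimal sketch of the sort of thing I mean (the input file name and the run length are just placeholders):

from lammps import lammps

lmp = lammps()

# run an existing LIGGGHTS input script (file name is a placeholder)
lmp.file("in.pour")

# individual script commands can also be issued one at a time
lmp.command("run 1000")

# pull data back out of the running simulation
natoms = lmp.get_natoms()
x = lmp.extract_atom("x", 3)      # per-atom positions as a double**
print natoms, x[0][0], x[0][1], x[0][2]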
There are also ways to build in parallel so that you can run with Python scripting but still take advantage of MPI - this is a bit more complex, but it worked out for me on Ubuntu. There are good descriptions of all of these steps on the LAMMPS website.
I don't know how well this has been tested with LIGGGHTS, but it's already really useful for "driving" simulations more easily than the scripting language allows in some cases. Another benefit is integration with other codes; for example, I use the gengeo package to generate particle packings for certain geometries. Right now this works via import/export of ASCII data, but hopefully it can all eventually be done "on the fly" through the Python interface.
I would be interested to hear other people's experiences!
Cheers, Mark
ckloss | Sun, 08/05/2012 - 14:27
Hi Mark,
>>I don't know how well this has been tested with LIGGGHTS
Not too much - but on the other hand, we didn't change a single line of it, so all the features described on the LAMMPS website should be functional.
The main code coupling we do (CFD-DEM with OpenFOAM(R) and, more recently, LB-DEM coupling) is performed via C++. But yes, doing code coupling via Python is interesting because you can get results quickly...
Thanks for the info about gengeo. Just FYI: you can generate particle packings in LIGGGHTS as well by using fix insert/pack with overlapcheck no and letting the system relax with fix nve/limit.
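Roughly like this (just a sketch driven through the Python interface; the region/template/fix names, seeds and numbers are made up, and the exact insert/pack keywords depend on the LIGGGHTS version, so check the documentation rather than trusting this verbatim):

# sketch only - assumes 'lmp' is the lammps() instance from above and that the basic
# setup (units, atom_style granular, pair style, timestep, ...) has already been done
lmp.command("region reg block 0 0.1 0 0.1 0 0.1 units box")
lmp.command("fix pts all particletemplate/sphere 15485863 atom_type 1 density constant 2500 radius constant 0.0025")
lmp.command("fix pdd all particledistribution/discrete 32452843 1 pts 1.0")
lmp.command("fix ins all insert/pack seed 15485867 distributiontemplate pdd insert_every once overlapcheck no volumefraction_region 0.5 region reg")
lmp.command("fix relax all nve/limit 0.0001")
lmp.command("run 50000")   # let the (possibly overlapping) packing relax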
Best,
Christoph
msbentley | Sun, 08/05/2012 - 18:22
So far so good :)
Thanks for your input Christoph! So far it's working well for me - I guess the key with respect to speed is to still let the C++ code do most of the processing (i.e. don't use run 1 in a Python loop, but run many timesteps per call).
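Something like this, for example (a sketch only; the block size, the compute name and the stopping test are invented):

# keep the heavy lifting in C++: run big blocks of timesteps, then inspect the state in Python
# (assumes 'lmp' has loaded an input script that defines something like "compute myke all ke")
for block in xrange(100):
    lmp.command("run 10000")
    ke = lmp.extract_compute("myke", 0, 0)   # global scalar
    if ke < 1e-8:                            # e.g. stop once the packing has settled
        break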
Thanks for the insert/pack hint - can this be used for regions created with union/intersect? For some reason I thought not, but I don't see anything about restrictions on the type of region in the docs. If so, then this works just as well!
Thanks, Mark
ckloss | Sun, 08/05/2012 - 19:38
>>Thanks for the insert/pack hint - can this be used for regions created with union/intersect?
Sure! It also works with region tetmesh in 1.5.3 (region tetmesh not yet available for LIGGGHTS 2.X, but will come at a later point)
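E.g. something like this (a sketch with made-up names and dimensions):

# sketch only - 'lmp' is the lammps() instance from above
lmp.command("region lower cylinder z 0.05 0.05 0.04 0.0 0.05 units box")
lmp.command("region upper block 0 0.1 0 0.1 0.05 0.15 units box")
lmp.command("region both union 2 lower upper")
# ...then pass 'region both' to fix insert/pack exactly as in the packing sketch above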
Cheers, Christoph
jtvanlew | Tue, 08/06/2013 - 02:17
In my continued effort to totally avoid doing any C++ coding, I've been working with this Python wrapper lately too. I've put together a somewhat sophisticated loop (I'm proud of it, at least) that checks every N steps for particles with contact forces larger than some specified F_critical and then does some junk with that knowledge.
What I don't like is the hard-coded loop over IDs I have to do, because I don't know how to handle the pointer-to-a-pointer-to-a-c_double that comes out of LAMMPS. First I get the data by
forceData = lmp.extract_compute("fc", 2, 2)
where fc is the name of the pair/gran/local compute that we've all used in the past for force chains, and which here has 13 columns. Now, I can't just use forceData as a regular array because it comes in as a ctypes pointer that references the LAMMPS data in memory. My solution is to just make a new Python array and fill it with all the data from forceData. BUT, I don't know the length of forceData (and it will change in every loop), so I can't just initialize a new array with the same length. So I hard-code it, because I know that in this sample run it will have fewer than 5000 entries:
import numpy as np
A = np.zeros((5000, 13))   # hard-coded upper bound on the number of contacts
for column in xrange(0, 13):
    for row in xrange(0, 5000):
        A[row, column] = forceData[row][column]
But of course this is far from perfect.
I'm curious whether anyone who knows the Python-C interaction better than I do can suggest how to just pull all the data out of C and throw it into one of my Python arrays.
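One idea I've been toying with (just a sketch; the helper computes and the divide-by-two assumption are guesses on my part, not something from the docs, so treat it with suspicion) is to let the input script count the contacts and size the NumPy array from that:

import numpy as np

# assumes the input script defines, in addition to the pair/gran/local compute "fc":
#   compute nc all contact/atom
#   compute sumnc all reduce sum c_nc
ncols = 13

# each contact is counted once by each of its two partner atoms, so half the
# summed per-atom contact count should match the number of rows in "fc"
nrows = int(lmp.extract_compute("sumnc", 0, 0) / 2)

forceData = lmp.extract_compute("fc", 2, 2)   # double** into LIGGGHTS memory

A = np.empty((nrows, ncols))
for row in xrange(nrows):
    for column in xrange(ncols):
        A[row, column] = forceData[row][column]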
deepakpawar.2310 | Wed, 06/28/2017 - 11:44
The Python interface to LIGGGHTS 3.0
Is the above-mentioned procedure the same for LIGGGHTS 3.0?