Hi all. I am currently running a simulation that writes dump files of a 1.5 million particle granular slide every 2000 timesteps. Each ascii .vtk file takes up about 150 MB, and the granular slide concludes in about 5000000 timesteps; i.e. the final size of the dataset is roughly 375 GB. I am looking for ways to reduce this file size while preserving the same data, as each save interval corresponds to a frame in my laboratory recording video. Is there any way for LIGGGHTS to dump into a compressed folder, for instance, and how much space would that save?
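For reference, the arithmetic behind the 375 GB estimate quoted above (a quick sketch; the figures are taken directly from the post):

```python
# Back-of-the-envelope check of the dataset size quoted above.
n_files = 5_000_000 // 2_000       # one dump every 2000 timesteps -> 2500 files
size_per_file_mb = 150             # ascii .vtk size per dump
total_gb = n_files * size_per_file_mb / 1000   # decimal GB
print(n_files, total_gb)
```

This confirms the stated total of roughly 375 GB for 2500 files.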
j-kerbl | Thu, 03/30/2017 - 13:17
Hi Matt,
have a look at http://www.cfdem.com/media/DEM/docu/dump.html, where it states:
If the filename ends with ".gz", the dump file (or files, if "*" or "%" is also used) is written in gzipped format. A gzipped dump file will be about 3x smaller than the text version, but will also take longer to write.
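A dump command along these lines would use that feature (the dump ID, group, interval, and field list here are illustrative, not taken from the original thread):

```
# Regular custom dump; the ".gz" suffix triggers gzipped output,
# and "*" is replaced by the current timestep in each filename.
dump dmp all custom 2000 post/dump*.liggghts.gz id type x y z
```

Per the documentation quoted above, this typically yields files about 3x smaller than the plain text version, at the cost of slower writes.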
Give it a try!
Cheers,
Josef
mattkesseler | Thu, 03/30/2017 - 14:21
Hi Josef,
Thanks, I thought there would be some common-sense option like this that I'd overlooked. :)
I'll let you know if I have any problems with this.
Matt.
mattkesseler | Thu, 03/30/2017 - 14:24
dump custom/vtk
Just wanted to double-check whether this approach would work with dump custom/vtk? Or would I need to save as a .gz, then unzip and convert to vtk once I have downloaded the files?
Matt.
j-kerbl | Wed, 04/05/2017 - 15:18
Hi Matt,
no, the direct gzip works only with regular dumps; it isn't implemented for the vtk dumps yet. If you post-process on the same machine, you can alternatively try the binary vtk dumps, which also need less space. However, transferring binary files to another machine can be risky.
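As a workaround when direct .gz output isn't available for the vtk dumps, the ascii files can be compressed after the run and decompressed again before post-processing. A minimal Python sketch of the idea, using illustrative numeric text rather than an actual dump file:

```python
import gzip
import os
import tempfile

# Write an illustrative ascii "dump" (repetitive numeric columns, similar
# in character to a coordinate dump) and gzip it to gauge the saving.
lines = "\n".join(f"{i} {i * 0.5:.6f} {i * 0.25:.6f} 0.000000"
                  for i in range(10000))

with tempfile.TemporaryDirectory() as d:
    raw = os.path.join(d, "dump.vtk")
    with open(raw, "w") as f:
        f.write(lines)
    # Compress the ascii file; this is what gzip(1) would do on disk.
    with open(raw, "rb") as f, gzip.open(raw + ".gz", "wb") as g:
        g.write(f.read())
    ratio = os.path.getsize(raw) / os.path.getsize(raw + ".gz")
    print(f"compression ratio: {ratio:.1f}x")
```

The exact ratio depends on the data, but ascii numeric dumps generally compress by a factor of several, consistent with the ~3x figure in the documentation quoted earlier in the thread.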
Cheers,
Josef