restarting CFDEM simulations

Submitted by vinaym on Thu, 04/21/2016 - 12:35

Hello all,

I want to restart my simulations, since the cluster I use does not permit runs beyond a certain walltime.

I configured the case according to ErgunTestMPI_restart.
However, when I restart the simulation, all the files in the processor* dirs are cleaned.
Then, of course, it complains about not finding p in the latest time directory.

How can I prevent this clean-up before the simulation starts?

Thanks.

regards,
Vinay

hunger | Thu, 04/21/2016 - 14:10

Are you using the Allrun script provided in the tutorial? If I remember correctly, the simulation directory is cleaned in a final step unless you quit with Ctrl+C. Maybe this applies in your case.
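If so, the tail of the Allrun script is typically something like this sketch (the exact commands vary from tutorial to tutorial, so check your own script):

echo "press Ctrl+C to skip the clean-up, Enter to continue"
read dummyVar                  # waits here; Ctrl+C at this point keeps the results
rm -rf CFD/processor*          # the clean-up step that removes the decomposed results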

Best regards
Harald

vinaym | Thu, 04/21/2016 - 17:34

Hi Harald,

No, I am not using the Allrun script. I had commented the cleaning part out, just in case I accidentally used it.
As I said, the previous data is cleaned when I 'restart' the simulation.

kind regards,

Vinay

vinaym | Thu, 04/21/2016 - 20:01

I reconstructed the case before restarting the simulations.
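For reference, that is just OpenFOAM's standard utility, run from the CFD directory (the -latestTime option limits the reconstruction to the last written time step):

cd CFD
reconstructPar -latestTime   # rebuild the latest time step from the processor* dirs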

Vinay

mcsk2000 | Sun, 11/01/2020 - 16:05

There is an additional thing to do. Please refer to the example ErgunTestMPI_restart:
check the liggghtsCommands file in CFD/constant and make sure it writes a restart file during the coupled run.
Without this, the restart will read the restart file written by the initial LIGGGHTS run.
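For illustration, that file looks roughly like this (a sketch following the ErgunTestMPI_restart tutorial; keywords and the writeName path may differ between CFDEM versions):

liggghtsCommandModels
(
    runLiggghts
    writeLiggghts      // writes a LIGGGHTS restart file during the coupled run
);

writeLiggghtsProps
{
    writeName "post/restart/liggghts.restartCFDEM"; // file a later restart run can read
}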

Nico | Mon, 11/02/2020 - 14:31

One important observation here: OpenFOAM will not reconstruct the lagrangian folder when reconstructing your case (you can check the lagrangian folder in a time directory inside a processor directory). OpenFOAM will complain about a punctuation token.

The lagrangian folder contains the particle positions for ParaView, so even if you reconstruct your case, you will lose this information. The correct way to restart a simulation is to make a backup of your CFD and DEM results before restarting. This way you prevent any loss of data.
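A minimal sketch of such a backup, assuming the usual tutorial layout with CFD/ and DEM/ next to each other (adapt the paths to your case):

cd <casePath>                                 # top-level directory containing CFD/ and DEM/
tar czf backup_beforeRestart.tar.gz CFD DEM   # keep a copy of all results before restarting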

atul2018 | Thu, 12/02/2021 - 17:53

Hello

I am facing the exact same problem: when I restart the CFDEM simulation from the last time step, the processor directories (and the results they contain) are removed. Reconstructing is not an option, as it loses the lagrangian data, and backing up the data before starting from the latest time step is quite time consuming and not an efficient way to do it.

I was thinking about modifying the source code to remove the section that deletes the processor directories, and then recompiling CFDEM. But this is also very complicated, and recompiling the complete CFDEM on a supercomputer is a challenge.

I was wondering if someone knows a more efficient method.

Best Regards
Atul Jaiswal

atul2018 | Wed, 12/08/2021 - 12:15

I have managed to restart the CFDEM simulation from the previous time step without the previous data being removed from the processor folders. I am providing the solution here in case someone needs it.

The most important steps are:

1. Create a new LIGGGHTS restart script (e.g. in.liggghts_restart) that reads the previously written CFDEM restart file.
2. Tell OpenFOAM to use that script by modifying the couplingProperties file (CFD/constant): just replace the file name in.liggghts_run with in.liggghts_restart in the twoWayMPIProps and twoWayM2MProps sections. It is also better to change the name of the restart file written during the restart run, via the writeName entry.
3. Set startFrom to latestTime in CFD/system/controlDict. (A sketch of the edited entries is at the end of this post.)

Now the case is ready to be started from the previous run. However, using Allrun.sh (which calls the function parCFDDEMrun) would remove all the processor* directories and decompose the case again, because parCFDDEMrun first executes decomposePar -force and only then calls cfdemSolverPiso in parallel. We have to skip the decomposition and use the existing decomposed case. The easiest way is to call cfdemSolverPiso in parallel directly instead of through parCFDDEMrun: just modify parCFDEMrun.sh so that it runs cfdemSolverPiso in parallel as follows:

cd CFD
mpirun -np $nrProcs cfdemSolverPiso -parallel   # $nrProcs = the number of processor* directories

Now, when you execute Allrun.sh, your simulation will start from the last time step without removing the previous data.
I hope it will help others.
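For completeness, here is roughly what the edited entries look like. This is only a sketch following the ErgunTestMPI_restart layout; entry names and paths may differ in your CFDEM version, so compare with your own case:

// CFD/constant/couplingProperties (excerpt)
twoWayMPIProps
{
    liggghtsPath "../DEM/in.liggghts_restart"; // was ../DEM/in.liggghts_run
}

// CFD/system/controlDict (excerpt)
startFrom       latestTime;    // resume from the last written time directory

# DEM/in.liggghts_restart (excerpt) -- like in.liggghts_run, but starting from
# the restart file written by the previous coupled run:
read_restart    post/restart/liggghts.restartCFDEM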

Best Regards
Atul Jaiswal

Akanksha Rajput | Thu, 04/20/2023 - 17:03

I have followed what you suggested. However, I was confused about how and exactly where to add "cd CFD
mpirun -np cfdemSolverPiso -parallel" in parCFDEMrun.sh.
Here is how I have edited it:
________________________________________________________________________________________________________________
#- define variables
casePath="$(dirname "$(readlink -f ${BASH_SOURCE[0]})")"
logpath=$casePath
headerText="run_parallel_cfdemSolverPiso_ErgunTestMPI_CFDDEM"
logfileName="log_$headerText"
solverName="cfdemSolverPiso"
nrProcs="6"
machineFileName="none" # yourMachinefileName | none
debugMode="off" # on | off| strict | profile
testHarnessPath="$CFDEM_TEST_HARNESS_PATH"
runOctave="true"
postproc="false"
#--------------------------------------------------------------------------------#

#- call function to run a parallel CFD-DEM case
parCFDDEMrun $logpath $logfileName $casePath $headerText cd CFD mpirun -np cfdemSolverPiso -parallel $nrProcs $machineFileName $debugMode
______________________________________________________________________________________________________________________
And it is giving me the following error:
______________________________________________________________________________________________________________________
Decomposing mesh region0

Create mesh

Calculating distribution of cells
Selecting decompositionMethod scotch

Finished decomposition in 0.04 s

Calculating original mesh data

Distributing cells to processors

Distributing faces to processors

Distributing points to processors

Constructing processor meshes
/home/rimo/CFDEM/CFDEMcoupling-PUBLIC-5.x/src/lagrangian/cfdemParticle/etc/functions.sh: line 761: 17125 Segmentation fault (core dumped) decomposePar -force
do links
/home/rimo/CFDEM/CFDEMcoupling-PUBLIC-5.x/src/lagrangian/cfdemParticle/etc/functions.sh: line 1306: Fatal: command not found
/home/rimo/CFDEM/CFDEMcoupling-PUBLIC-5.x/src/lagrangian/cfdemParticle/etc/functions.sh: line 1314: Fatal: command not found
there is no dir called CFD at 6 - **check**
mkdir: cannot create directory ‘6/CFD’: No such file or directory
mkdir: cannot create directory ‘oldProcDirs’: File exists
/home/rimo/CFDEM/CFDEMcoupling-PUBLIC-5.x/src/lagrangian/cfdemParticle/etc/functions.sh: line 1334: Fatal: command not found
cp: cannot create symbolic link '6/CFD': No such file or directory
mv: cannot overwrite directory 'oldProcDirs/processor0' with non-directory
ln: failed to create symbolic link 'processor0': File exists
mkdir: cannot create directory ‘postProcessing’: File exists
mv: cannot move 'postProcessing' to '6/CFD': No such file or directory
ln: failed to create symbolic link 'postProcessing/postProcessing': File exists
linking was successful
seq: invalid floating point argument: ‘CFD’
Try 'seq --help' for more information.

// run_parallel_cfdemSolverPiso_ErgunTestMPI_CFDDEM //

/home/rimo/Desktop/fluid_flo_lab/restarting_CFDEM/CFD

rm: cannot remove 'couplingFiles/*': No such file or directory
----------------------------------------------------------------------------
Open MPI has detected that a parameter given to a command line
option does not match the expected format:

Option: np
Param: CFD

This is frequently caused by omitting to provide the parameter
to an option that requires one. Please check the command line and try again.
________________________________________________________________________________________
Kindly help me sort this out. I think I am editing parCFDEMrun.sh incorrectly.

atul2018 | Tue, 05/30/2023 - 17:12

Hello

You should remove everything after the line "#- call function to run a parallel CFD-DEM case" and just call the two commands mentioned below:

cd CFD
mpirun -np $nrProcs cfdemSolverPiso -parallel

As said, there is no need to call parCFDDEMrun.
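In other words, the end of parCFDEMrun.sh would look roughly like this (a sketch; it reuses the casePath and nrProcs variables already defined at the top of the script):

#- call function to run a parallel CFD-DEM case
#  (parCFDDEMrun call removed: it would run decomposePar -force and wipe the processor* dirs)
cd $casePath/CFD
mpirun -np $nrProcs cfdemSolverPiso -parallel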