    Nandini Ramanathan
    @nandini_udt:matrix.org
    [m]
    ```
    [centos@ip-10-23-25-100 /program/precice-2.2.1/precice]$ ll
    total 0
    drwxr-xr-x 2 root root  24 Aug 23 21:18 bin
    drwxr-xr-x 3 root root  21 Aug 23 21:18 include
    drwxr-xr-x 4 root root 107 Aug 23 21:18 lib64
    drwxr-xr-x 6 root root  58 Aug 23 21:18 share
    [centos@ip-10-23-25-100 /program/precice-2.2.1/precice]$ ls bin
    binprecice
    [centos@ip-10-23-25-100 /program/precice-2.2.1/precice]$ ls include/
    precice
    [centos@ip-10-23-25-100 /program/precice-2.2.1/precice]$ ls lib64/
    cmake  libprecice.so  libprecice.so.2  libprecice.so.2.2.1  pkgconfig
    [centos@ip-10-23-25-100 /program/precice-2.2.1/precice]$ ls share
    doc  lintian  man  precice
    ```
    Alexander Jaust
    @ajaust
    Yes, that looks good to me. There should be a few more files in the subdirectories, but you seem to have all you need: the shared library libprecice.so and the header files in include/.
    You could compile the tests and/or solverdummies to see if everything works as expected.
    Nandini Ramanathan
    @nandini_udt:matrix.org
    [m]
    How do I do that?
    Nandini Ramanathan
    @nandini_udt:matrix.org
    [m]
    I saw the make install test. Thanks
    Nandini Ramanathan
    @nandini_udt:matrix.org
    [m]
    The test is successful. Thanks so much for all the support you have provided.
    Alexander Jaust
    @ajaust
    No problem. I hope it works out as expected. Just for the sake of completeness: You can find information about the solver dummies and how to use them in the documentation: https://precice.org/couple-your-code-api.html#minimal-reference-implementations
    The same holds true for running the tests included: https://precice.org/installation-source-testing.html#undefined
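    In practice, that boils down to something like the following (a rough sketch; the build directory and example paths are placeholders based on the linked documentation, so adapt them to your setup). Note that a solver dummy run needs both participants, each started in its own terminal:
    ```sh
    # Run the preCICE test suite from the build directory
    cd ~/precice-build        # hypothetical build directory
    make test                 # or equivalently: ctest

    # Build and run the C++ solver dummy (paths are placeholders)
    cd /path/to/precice/examples/solverdummies/cpp
    cmake . && make
    ./solverdummy ../precice-config.xml SolverOne MeshOne   # terminal 1
    ./solverdummy ../precice-config.xml SolverTwo MeshTwo   # terminal 2
    ```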
    Nandini Ramanathan
    @nandini_udt:matrix.org
    [m]
    Do you have any examples for doing an MPI test using preCICE?
    Alexander Jaust
    @ajaust
    I think there is an MPI test included in the preCICE tests. If you want to do testing using dummy solvers, I have some in my repository: https://github.com/ajaust/precice-parallel-solverdummies
    These are slight adaptations of the dummies provided by preCICE, but they actually run in parallel using MPI.
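    A parallel run would then look roughly like this (a sketch based on the repository's CMake setup; the participant and mesh names are just those from the example config, and both participants have to be launched, each in its own terminal):
    ```sh
    # Build the parallel C++ dummy (assumes MPI and preCICE can be found by CMake)
    cd precice-parallel-solverdummies/cpp
    mkdir build && cd build
    cmake .. && make

    # Launch both participants, each with its own set of MPI ranks
    mpirun -n 4 ./solverdummy-cpp-parallel ../precice-config-parallel.xml SolverOne MeshOne
    mpirun -n 4 ./solverdummy-cpp-parallel ../precice-config-parallel.xml SolverTwo MeshTwo
    ```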
    Nandini Ramanathan
    @nandini_udt:matrix.org
    [m]
    Thanks, will give them a try
    Nandini Ramanathan
    @nandini_udt:matrix.org
    [m]
    I am getting this error when trying to test mpirun for the preCICE install:
    ```
    [udev_PkWWa@ip-10-23-87-204 ~/work/shared/precice-parallel-solverdummies/c/build]$ make
    Scanning dependencies of target solverdummy-c-parallel
    [ 50%] Building C object CMakeFiles/solverdummy-c-parallel.dir/solverdummy-c-parallel.c.o
    /enc/udev_PkWWa/work/shared/precice-parallel-solverdummies/c/solverdummy-c-parallel.c: In function ‘main’:
    /enc/udev_PkWWa/work/shared/precice-parallel-solverdummies/c/solverdummy-c-parallel.c:69:3: error: ‘for’ loop initial declarations are only allowed in C99 mode
       for (int i = 0; i < numberOfVertices; i++) {
       ^
    /enc/udev_PkWWa/work/shared/precice-parallel-solverdummies/c/solverdummy-c-parallel.c:69:3: note: use option -std=c99 or -std=gnu99 to compile your code
    /enc/udev_PkWWa/work/shared/precice-parallel-solverdummies/c/solverdummy-c-parallel.c:70:5: error: ‘for’ loop initial declarations are only allowed in C99 mode
         for (int j = 0; j < dimensions; j++) {
         ^
    /enc/udev_PkWWa/work/shared/precice-parallel-solverdummies/c/solverdummy-c-parallel.c:94:5: error: ‘for’ loop initial declarations are only allowed in C99 mode
         for (int i = 0; i < numberOfVertices * dimensions; i++) {
         ^
    make[2]: *** [CMakeFiles/solverdummy-c-parallel.dir/solverdummy-c-parallel.c.o] Error 1
    make[1]: *** [CMakeFiles/solverdummy-c-parallel.dir/all] Error 2
    make: *** [all] Error 2
    ```
    Alexander Jaust
    @ajaust
    Could you try again? I pushed an update that requests the C standard to C99 explicitly.
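    If you are on an older checkout, a generic CMake-level workaround (standard CMake usage, nothing specific to the repository) is to request C99 yourself at configure time:
    ```sh
    # Force C99 when configuring, then rebuild
    cmake -DCMAKE_C_STANDARD=99 ..     # or: cmake -DCMAKE_C_FLAGS="-std=c99" ..
    make
    ```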
    Nandini Ramanathan
    @nandini_udt:matrix.org
    [m]
    The cpp version was working yesterday and I am not sure what happened today. When I compile, I see these errors.
    Alexander Jaust
    @ajaust
    What do you mean? Is the cpp version broken as well now? Is the C version working at least?
    nandini_udt
    @nandini_udt:matrix.org
    [m]
    I think I messed with PATH and LD_LIBRARY_PATH, and it gives some errors that I am trying to fix.
    Nandini Ramanathan
    @nandini_udt:matrix.org
    [m]
    Do you know what this error means?
    ```
    /program/precice-2.2.1/precice/lib64/libprecice.so.2.2.1: undefined reference to `PetscMPIAbortErrorHandler'
    /program/precice-2.2.1/precice/lib64/libprecice.so.2.2.1: undefined reference to `boost::detail::set_tss_data(void const*, void (*)(void (*)(void*), void*), void (*)(void*), void*, bool)'
    /program/precice-2.2.1/precice/lib64/libprecice.so.2.2.1: undefined reference to `AOApplicationToPetsc'
    /program/precice-2.2.1/precice/lib64/libprecice.so.2.2.1: undefined reference to `boost::log::v2_mt_posix::attribute_set::find(boost::log::v2_mt_posix::attribute_name)'
    /program/precice-2.2.1/precice/lib64/libprecice.so.2.2.1: undefined reference to `boost::filesystem::path::operator/=(boost::filesystem::path const&)'
    /program/precice-2.2.1/precice/lib64/libprecice.so.2.2.1: undefined reference to `MatCreate'
    /program/precice-2.2.1/precice/lib64/libprecice.so.2.2.1: undefined reference to `VecWAXPY'
    /program/precice-2.2.1/precice/lib64/libprecice.so.2.2.1: undefined reference to `boost::log::v2_mt_posix::core::get_logging_enabled() const'
    /program/precice-2.2.1/precice/lib64/libprecice.so.2.2.1: undefined reference to `boost::program_options::options_description::m_default_line_length'
    /program/precice-2.2.1/precice/lib64/libprecice.so.2.2.1: undefined reference to `PetscObjectSetName'
    /program/precice-2.2.1/precice/lib64/libprecice.so.2.2.1: undefined reference to `boost::filesystem::detail::rename(boost::filesystem::path const&, boost::filesystem::path const&, boost::system::error_code*)'
    /program/precice-2.2.1/precice/lib64/libprecice.so.2.2.1: undefined reference to `MatGetInfo'
    /program/precice-2.2.1/precice/lib64/libprecice.so.2.2.1: undefined reference to `PetscRandomSetType'
    ```
    Alexander Jaust
    @ajaust
    It looks like preCICE does not find the PETSc and Boost that you used during compile time. Have you set the LD_LIBRARY_PATH for Boost and PETSc properly?
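    A quick generic check (not preCICE-specific) is to ask the dynamic linker which dependencies of libprecice it can resolve in the current environment:
    ```sh
    # Any line printed here names a library the runtime linker cannot find
    ldd /program/precice-2.2.1/precice/lib64/libprecice.so.2.2.1 | grep "not found"
    ```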
    Nandini Ramanathan
    @nandini_udt:matrix.org
    [m]
    Yes, I have:
    ```
    echo $LD_LIBRARY_PATH
    /program/precice-2.2.1/intel/impi/2019.8.254/intel64//lib:/program/precice-2.2.1/gcc/6.5.0/lib64:/program/precice-2.2.1/precice/lib64:/program/precice-2.2.1/petsc-3.15.3/real-opt/lib:/program/precice-2.2.1/hdf5/lib:/program/precice-2.2.1/boost-1.70.0/lib:/program/precice-2.2.1/gcc/6.5.0/lib64:/program/precice-2.2.1/precice/lib64:/program/precice-2.2.1/petsc-3.15.3/real-opt/lib:/program/precice-2.2.1/hdf5/lib:/program/precice-2.2.1/boost-1.70.0/lib:/usr/local/lib:/program/precice-2.2.1/gcc/6.5.0/lib64:/program/precice-2.2.1/precice/lib64:/program/precice-2.2.1/petsc-3.15.3/real-opt/lib:/program/precice-2.2.1/hdf5/lib:/program/precice-2.2.1/boost-1.70.0/lib:/usr/local/lib:/lib:/lib64:/usr/lib:/usr/lib64:/program/precice-2.2.1/intel/lib64:.:/program/precice-2.2.1/intel/lib64:.
    [udev_RXCtd@ip-10-23-12-72 ~/work/precice-parallel-solverdummies/cpp/build]$
    ```
    Alexander Jaust
    @ajaust
    What command do you use that leads to this error?
    Nandini Ramanathan
    @nandini_udt:matrix.org
    [m]
    I ran cmake followed by make, which leads to this error. Do you know whether the smoke test covers the check for mpirun?
    Alexander Jaust
    @ajaust
    There are some parallel tests included in the preCICE test suite, i.e. they run when you run make test after building preCICE. I am not sure whether that is what you mean.
    JulianSchl
    @JulianSchl
    Hi,
    I have issues understanding the implicit coupling mode.
    Running an FSI simulation in explicit mode, the case works well.
    Now, switching to implicit, the displacement passed from the solid to the fluid solver is so huge that it makes the simulation crash in the second iteration. I therefore must use a smaller initial displacement to keep the simulation running. So I cannot handle displacement ranges above a certain level in implicit mode, while explicit can still handle them. For me this is a huge disadvantage of the implicit coupling mode, since I want to couple fast-moving objects.
    What do you think? Am I thinking the wrong way?
    Kyle Davis
    @KyleDavisSA
    Hi @JulianSchl, are you using serial or parallel coupling? Also, what acceleration scheme are you using?
    JulianSchl
    @JulianSchl
    Hi @KyleDavisSA, I'm in parallel coupling mode, and I currently don't use acceleration.
    Kyle Davis
    @KyleDavisSA
    An acceleration scheme could help significantly here. First, I would try Aitken acceleration; however, quasi-Newton is generally the more robust option. With Aitken acceleration you can specify an initial under-relaxation value, which dampens the initial value exchanged between both solvers in parallel coupling.
    You can add the following:
    <acceleration:aitken>
      <initial-relaxation value="0.1"/>
      <data mesh="{string}" name="{string}"/>
      <data mesh="{string}" name="{string}"/>
    </acceleration:aitken>
    Just add the name of the data field and the mesh that the data lies on.
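    To make the dampening concrete (a standard under-relaxation formula, not quoted from the preCICE docs): in coupling iteration $k$, the value that gets passed on is a blend of the raw solver output $\tilde{x}^{k}$ and the previous iterate,
    $$x^{k} = x^{k-1} + \omega \left( \tilde{x}^{k} - x^{k-1} \right)$$
    where $\omega$ starts at the configured initial-relaxation value (0.1 above); the Aitken scheme then adapts $\omega$ from iteration to iteration based on the coupling residuals.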
    JulianSchl
    @JulianSchl
    Okay, this improves stability, but am I right in assuming that, due to possibly higher displacements, I have to use smaller timesteps with implicit coupling?
    Kyle Davis
    @KyleDavisSA
    If the simulation worked for explicit coupling, it should be fine for implicit coupling with suitable acceleration.
    Aitken acceleration is quite simple, so if there are still instabilities, I would suggest then moving to quasi-Newton coupling.
    JulianSchl
    @JulianSchl
    I've now tried several acceleration methods from here, but the maximum I could get was 4 iterations with an initial Aitken relaxation of 0.01.
    Kyle Davis
    @KyleDavisSA
    Could you try the following scheme:
    <acceleration:IQN-ILS>
      <data name="{data}" mesh="{mesh}" />
      <data name="{data}" mesh="{mesh}" />
      <initial-relaxation value="0.1"/>
      <max-used-iterations value="100"/>
      <time-windows-reused value="10"/>
      <preconditioner type="residual-sum" />
      <filter type="QR2" limit="0.01"/>
    </acceleration:IQN-ILS>
    This is considered a good starting point for the IQN-ILS acceleration scheme.
    JulianSchl
    @JulianSchl
    The problem still remains that the displacement sent to the fluid solver is too big ... now the fluid solver crashes after the 4th cycle.
    Making smaller timesteps doesn't help either.
    Kyle Davis
    @KyleDavisSA
    Does using an under-relaxation of 0.5 work? Sometimes, if the solid solver sends zero displacements in the first iteration, it can cause bad scaling of the preconditioner. Serial-implicit coupling is also more stable, so you could try that to see if it solves the issue, and then move on to parallel-implicit.
    JulianSchl
    @JulianSchl

    Does using an under-relaxation of 0.5 work?

    Nope ...

    I could try serial, but this would mean several changes to my code, since I hard-coded some extra coupling.
    Benjamin Uekermann
    @uekerman
    Sounds to me like something might be odd with how you realize checkpointing,
    or a problem with how you define and treat relative displacements.
    Both are longer stories and better discussed in the forum.
    JulianSchl
    @JulianSchl
    👍
    Nandini Ramanathan
    @nandini_udt:matrix.org
    [m]
    I was able to compile and test mpirun. But this is the output:
    ```
    mpirun -n 4 ./solverdummy-cpp-parallel ../precice-config-parallel.xml SolverOne MeshOne
    Using default -machinefile setting (/enc/udev_XTNxd/mpd.hosts)
    DUMMY (3): Running solver dummy with preCICE config file "../precice-config-parallel.xml", participant name "SolverOne", and mesh name "MeshOne".
    DUMMY (2): Running solver dummy with preCICE config file "../precice-config-parallel.xml", participant name "SolverOne", and mesh name "MeshOne".
    DUMMY (1): Running solver dummy with preCICE config file "../precice-config-parallel.xml", participant name "SolverOne", and mesh name "MeshOne".
    DUMMY (0): Running solver dummy with preCICE config file "../precice-config-parallel.xml", participant name "SolverOne", and mesh name "MeshOne".
    preCICE: This is preCICE version 2.2.1
    preCICE: Revision info: no-info [Git failed/Not a repository]
    preCICE: Configuration: Debug
    preCICE: Configuring preCICE with configuration "../precice-config-parallel.xml"
    preCICE: I am participant "SolverOne"
    preCICE: Connecting Slave #0 to Master
    preCICE: Connecting Master to 3 Slaves
    preCICE: Connecting Slave #1 to Master
    preCICE: Connecting Slave #2 to Master
    preCICE: Setting up master communication to coupling partner/s
    preCICE: Setting up master communication to coupling partner/s
    preCICE: Setting up master communication to coupling partner/s
    preCICE: Setting up master communication to coupling partner/s
    ```
    It just hangs at this point.
    Frédéric Simonis
    @fsimonis
    Can you delete the folder precice-run if it exists and rerun your case?
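    That is, from the directory where you launch the solvers (the folder holds connection information left over from a previous, possibly crashed, run):
    ```sh
    rm -rf precice-run   # remove stale connection files, then rerun the case
    ```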
    Nandini Ramanathan
    @nandini_udt:matrix.org
    [m]
    ```
    [udev_XTNxd@ip-10-23-17-108 ~/work/precice-parallel-solverdummies/cpp]$ ll
    total 52
    drwxrwxrwx 3 udev_XTNxd udev_XTNxd 4096 Aug 30 18:36 build
    -rw-rw-rw- 1 udev_XTNxd udev_XTNxd 482 Aug 30 15:47 CMakeLists.txt
    -rwxrwxrwx 1 udev_XTNxd udev_XTNxd 38608 Aug 30 18:37 solverdummy-cpp-parallel
    -rw-rw-rw- 1 udev_XTNxd udev_XTNxd 3553 Aug 30 15:47 solverdummy-cpp-parallel.cpp
    ```
    There is no precice-run folder.