Periodicity and scalars LES
Hello,
I am currently working on urban atmospheric dispersion simulations. I have a regular geometry which allows me to use periodic conditions for the flow, and I have successfully run simulations with RANS and LES (using my RANS simulation as a restart) with CS v6.0.
I would like to add scalars to my simulation but NOT apply the periodic conditions to them: basically, a periodic flow with scalars that can exit through an outlet face. Currently I am not giving any mass to the species, so the scalars act as tracers.
With RANS, a restart in "frozen flow" mode can work, but what about LES?
Since the faces are no longer considered as "boundaries" after I applied the periodicity, is there a routine I can adapt for this purpose?
Any feedback is welcome.
Thanks in advance,
Marcia
Re: Periodicity and scalars LES
Hello,
We do not have a "direct" way of doing this (I would love to replace the current periodicity logic with an extension of the internal coupling to nonconforming cases and vector or tensor variables, which would allow this, but this is a long-term idea which needs more thought)...
The simplest solution I can think of is to use source terms (instead of boundary conditions) at cells adjacent to the "inlet" and sink terms at those adjacent to the "outlet" (or set the scalar value to 0 at those cells in cs_user_extra_operations.c).
Another, more complex solution would be to use a code/code coupling with 2 identical domains: one for the periodic LES computation, and another without periodicity in which the flow field is "copied" from the first case. This could be done with minor changes, but the documentation for this type of coupling is poor, and the code a bit ancient...
Best regards,
Yvan
Re: Periodicity and scalars LES
Dear Yvan,
I tested the coupling option you proposed, and it does what I was looking for; however, I have encountered issues when launching a coupled simulation using the new stable version 6.0, which was recently compiled on the computational server we use. The error I get in the listing is:
Code:
-----------------------------------------------------------
Unmatched Code_Saturne couplings:
---------------------------------
Code_Saturne coupling:
  coupling id: 0
  local name:  "SCALAR"
../../../code_saturne-6.0.0/src/base/cs_sat_coupling.c:2012: Fatal error.
At least 1 Code_Saturne coupling was defined for which
no communication with a Code_Saturne instance is possible.
You will find in the attachments the results from the test as well as the source and setup.xml files. I saw in this post viewtopic.php?f=3&t=2273&p=13311&hilit= ... rne#p13311 that the issue may be linked to the MPI variable during the compilation. Could you please give me your feedback on the origin of the problem and a possible solution?
Thanks again for the help,
Marcia
Attachment: testcoupling.tar.gz (1.18 MiB)
Re: Periodicity and scalars LES
Hello,
Your case setup seems OK, but checking the compile.log I see no reference to MPI.
Are you sure your installation detected MPI correctly? Can you run a single case in parallel?
Regards,
Yvan
Re: Periodicity and scalars LES
Thanks for the prompt reply,
I do not know about the use of MPI during the installation; however, I launched a test with no coupling and it ran as expected. I am not sure if this is the test you suggested. I have attached the log files as well as the runcase file for your perusal.
Marcia
Attachment: TEST0.tar.gz (31.42 KiB)
Re: Periodicity and scalars LES
Hello,
Yes, it seems your installation does not use MPI, so you cannot run in parallel (which would be faster), and code coupling is not functional (this is something the code should probably check for).
So I recommend reinstalling the code with MPI support. Any up-to-date MPI library (Open MPI, MPICH, or derivatives such as Intel MPI, ...) should do. If you are running on a cluster, use an MPI library already configured/installed for that cluster.
Especially if you are running large LES computations, you really need to be able to run in parallel, even without coupling.
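For reference, a from-source build with MPI support typically just needs the MPI compiler wrappers on the configure line (the prefix and wrapper names below are placeholders; adapt them to your cluster's MPI installation):

```shell
# Placeholder install prefix; use your cluster's MPI wrappers
# (e.g. mpicc/mpicxx/mpif90, or Intel MPI's wrappers).
./configure CC=mpicc CXX=mpicxx FC=mpif90 --prefix=/opt/code_saturne/6.0
make
make install
```

The configuration summary printed at the end of configure (and recorded in config.log) should then show MPI as detected; if it does not, parallel runs and code/code coupling will not be available.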
Best regards,
Yvan
Re: Periodicity and scalars LES
Hello,
My colleague verified the installation and reinstalled it using Intel MPI. Although the installation seemed to pass, we now get an error related to the writing of results when launching a simulation. I have attached the error from a test we performed as well as the config.log files from the compilation of the program (version 6).
Could you please guide us as to what the problem may be?
Thanks again,
Marcia
Attachments: config_log.tar (401 KiB), TESTV6.tar.gz (27.3 KiB)
Re: Periodicity and scalars LES
Hello,
There might be a bug in some specific output. The crash occurs under dvvpst, which adds additional (model-specific) outputs, but the backtrace does not tell us exactly where.
Could you test the case on a single MPI rank, to check whether this is a "ghost cell" related issue? Otherwise, you could obtain more information by installing a separate "debug" build (adding --enable-debug to the configure line, for a separate, 3x slower installation which has additional debug instrumentation).
An alternative solution would be to copy src/base/dvvpst.f90 to your case's SRC folder, and remove parts of the file until the crash stops occurring (basically a simple bisection) to see which part is causing the crash. Attached is a version with some parts not concerning atmospheric flows already removed (an initial simplification). I suspect the first call with "icorio", so you could first try without that "if icorio .eq. 1" section... (otherwise, the remaining parts are for boundary values).
Regards,
Yvan
Attachment: dvvpst.f90 (11.47 KiB)
Re: Periodicity and scalars LES
Hello,
My colleague installed the debug version and commented out the icorio call. I tested a single-domain (no coupling) case and it worked fine, as did a single-domain case in parallel mode. However, when I launch a COUPLING case I still encounter an error, specifically in "error_r03". I have attached the output files where the error was signaled; I suspected a mistake in my boundary condition specifications, but I had no luck. The error is:
Code:
Signal SIGFPE (floating-point exception) intercepted!
Call stack:
1: 0x2ae0e07cb270 <+0x35270>                (libc.so.6)
2: 0x2ae0d8ec4469 <typecl_+0x8e85>          (libsaturne-6.0.so)
3: 0x2ae0d88d819c <condli_+0x1aae2>         (libsaturne-6.0.so)
4: 0x2ae0d8ea6dc9 <tridim_+0x15d09>         (libsaturne-6.0.so)
5: 0x2ae0d8752fec <caltri_+0x9dd4>          (libsaturne-6.0.so)
6: 0x2ae0d840e629 <cs_run+0x554>            (libcs_solver-6.0.so)
7: 0x2ae0d840e943 <main+0x184>              (libcs_solver-6.0.so)
8: 0x2ae0e07b7c05 <__libc_start_main+0xf5>  (libc.so.6)
9: 0x402ff9 <>                              (cs_solver)
End of call stack
application called MPI_Abort(comm=0x84000002, 1) - process 3
Thanks again for the help,
Marcia
Attachment: TESTV6_DBG_COUPL.tar.gz (123.05 KiB)
Re: Periodicity and scalars LES
Hello,
I have limited network connectivity this week, but I'll check.
If you have the possibility of running under a debugger to get the exact line, that would be great.
Best regards,
Yvan