Hi,
Thank you! It's working!
The code now compiles properly. However, during the calculation step (Starting calculation), another library problem showed up:
Code:
Coupling execution between:
o Code_Saturne [1 domain(s)];
o SYRTHES [1 domain(s)];
Code_Saturne is running
***********************
Version: 4.0
Path: /home/rolland/Code_Saturne_4.0.2
Result directory:
/home/rolland/Documents/EXEMPLE3/3disks2D/RESU_COUPLING/20151211-1123
Single processor Code_Saturne simulation.
Single processor SYRTHES simulation.
****************************************
Compiling user subroutines and linking
****************************************
****************************
Preparing calculation data
****************************
SYRTHES4 home directory: /home/rolland/syrthes4.3.0/arch/Linux_x86_64
MPI home directory: /usr
Building the executable file syrthes..
***** SYRTHES compilation and link completed *****
***************************
Preprocessing calculation
***************************
SyrthesCase summary:
Name = solid
Data file = solidcoupling.syd
Update Data file = True
Do preprocessing = True
Debug = False
Case dir. = /home/rolland/Documents/EXEMPLE3/3disks2D/solid
Execution dir. = /home/rolland/Documents/EXEMPLE3/3disks2D/RESU_COUPLING/20151211-1123/solid
Data dir. = /home/rolland/Documents/EXEMPLE3/3disks2D/RESU_COUPLING/20151211-1123/solid
Source dir. = /home/rolland/Documents/EXEMPLE3/3disks2D/RESU_COUPLING/20151211-1123/solid/src
Post dir. = /home/rolland/Documents/EXEMPLE3/3disks2D/RESU_COUPLING/20151211-1123/solid/POST
Conduction mesh dir. = /home/rolland/Documents/EXEMPLE3/3disks2D/solid/
Conduction mesh name = 3rond2d.syr
Total num. of processes = 1
Logfile name = syrthes.log
Echo = True
Parallel run = False
Do preprocessing = True
SyrthesParam summary
Param file name = solidcoupling.syd
Conduction mesh name = 3rond2d.syr
Radiation mesh name = None
Result prefix. = resu1
Restart = False
Coupling = True
Interpreted functions = False
---------------------------
Start SYRTHES preprocessing
---------------------------
Updating the mesh file name..
-> OK
**********************
Starting calculation
**********************
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI COMMUNICATOR 3 SPLIT FROM 0
with errorcode 1.
NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpiexec has exited due to process rank 1 with PID 837 on
node rolland-Precision-WorkStation-T7400 exiting improperly. There are two reasons this could occur:
1. this process did not call "init" before exiting, but others in
the job did. This can cause a job to hang indefinitely while it waits
for all processes to call "init". By rule, if one process calls "init",
then ALL processes must call "init" prior to termination.
2. this process called "init", but exited without calling "finalize".
By rule, all processes that call "init" MUST call "finalize" prior to
exiting or it will be considered an "abnormal termination"
This may have caused other processes in the application to be
terminated by signals sent by mpiexec (as reported here).
--------------------------------------------------------------------------
solver script exited with status 1.
Error running the coupled calculation.
Either Code_Saturne or SYRTHES may have failed.
Check Code_Saturne log (listing) and SYRTHES log (syrthes.log)
for details, as well as error* files.
****************************
Saving calculation results
****************************
Error in calculation stage.
Even after lowering n_procs_max and n_procs_min to one processor in the coupling_parameters.py file, as well as in the Code_Saturne GUI (Number of processes) and the SYRTHES GUI (Scalar/Parallel calculation), the Starting calculation step still fails.
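For reference, the single-processor settings were along the lines of the sketch below (a sketch only: the key names follow the Code_Saturne 4.0 coupling examples as I understand them, and the domain names and SYRTHES data file are taken from this case's log, so double-check against the generated coupling_parameters.py):

```python
# Sketch of coupling_parameters.py forced to one process per code.
# Key names follow the Code_Saturne 4.0 coupling examples; the domain
# names ('fluid', 'solid') and the data file come from this case.

domains = [
    {'solver': 'Code_Saturne',      # CFD domain
     'domain': 'fluid',
     'script': 'runcase',
     'n_procs_weight': None,
     'n_procs_min': 1,
     'n_procs_max': 1},

    {'solver': 'SYRTHES',           # solid conduction domain
     'domain': 'solid',
     'script': 'solidcoupling.syd',
     'n_procs_weight': None,
     'n_procs_min': 1,
     'n_procs_max': 1,
     'opt': ''},                    # additional SYRTHES options
]
```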
I tried to work around this problem by writing mpi USE=yes PATH=/usr/bin instead of mpi USE=yes PATH=/usr in the setup.ini file, so that the path would directly match the one given in the lauch.config file. And the problem got worse:
Code:
Coupling execution between:
o Code_Saturne [1 domain(s)];
o SYRTHES [1 domain(s)];
Code_Saturne is running
***********************
Version: 4.0
Path: /home/rolland/Code_Saturne_4.0.2
Result directory:
/home/rolland/Documents/EXEMPLE3/3disks2D/RESU_COUPLING/20151211-1203
Single processor Code_Saturne simulation.
Single processor SYRTHES simulation.
****************************************
Compiling user subroutines and linking
****************************************
****************************
Preparing calculation data
****************************
SYRTHES4 home directory: /home/rolland/syrthes4.3.0/arch/Linux_x86_64
MPI home directory: /usr/bin
Building the executable file syrthes..
ar: /home/rolland/syrthes4.3.0/arch/Linux_x86_64/lib/libsyrthes_cfd.a: No such file or directory
make: *** [exe] Error 9
Error during the compilation stage
Traceback (most recent call last):
File "/home/rolland/Code_Saturne_4.0.2/bin/code_saturne", line 76, in <module>
retcode = cs.execute()
File "/home/rolland/Code_Saturne_4.0.2/lib/python2.7/site-packages/code_saturne/cs_script.py", line 88, in execute
return self.commands[command](options)
File "/home/rolland/Code_Saturne_4.0.2/lib/python2.7/site-packages/code_saturne/cs_script.py", line 150, in run
return cs_run.main(options, self.package)
File "/home/rolland/Code_Saturne_4.0.2/lib/python2.7/site-packages/code_saturne/cs_run.py", line 304, in main
save_results=save_results)
File "/home/rolland/Code_Saturne_4.0.2/lib/python2.7/site-packages/code_saturne/cs_case.py", line 1764, in run
force_id)
File "/home/rolland/Code_Saturne_4.0.2/lib/python2.7/site-packages/code_saturne/cs_case.py", line 1491, in prepare_data
d.prepare_data()
File "/home/rolland/Code_Saturne_4.0.2/lib/python2.7/site-packages/code_saturne/cs_case_domain.py", line 1139, in prepare_data
retval = self.syrthes_case.prepare_run(exec_srcdir, compile_logname)
File "/home/rolland/syrthes4.3.0/arch/Linux_x86_64/share/syrthes/syrthes.py", line 417, in prepare_run
destdir = self.exec_dir)
File "/home/rolland/syrthes4.3.0/arch/Linux_x86_64/share/syrthes/syrthes.py", line 1348, in build_syrthes
shutil.move(os.path.abspath("syrthes"), exec_name)
File "/usr/lib/python2.7/shutil.py", line 302, in move
copy2(src, real_dst)
File "/usr/lib/python2.7/shutil.py", line 130, in copy2
copyfile(src, dst)
File "/usr/lib/python2.7/shutil.py", line 82, in copyfile
with open(src, 'rb') as fsrc:
IOError: [Errno 2] No such file or directory: '/home/rolland/Documents/EXEMPLE3/3disks2D/RESU_COUPLING/20151211-1203/solid/src/syrthes'
That's odd, as the MPI binaries are installed in /usr/bin, not in /usr, unless SYRTHES derives the path to follow from /usr.
Or did I miss something in the configuration/parameters of the fluid/solid domains?
Sorry for the long reply,
Regards,
QR