Hello Yvan,
What do you think of my installation now? Is it OK or not?
I am very interested in using Code_Saturne and SYRTHES in a coupled way.
Best regards
Julien
installation Salome+CS+Syrthes via VM on ubuntu 15.10
Re: installation Salome+CS+Syrthes via VM on ubuntu 15.10
Hello,
Did you use the ".syd" Syrthes setup from the tutorial examples, or recreate it from the tutorial description? Other users have had issues with this, and the tutorial has not been updated yet, but you can find recent threads on this forum relating to this.
Regards,
Yvan
Re: installation Salome+CS+Syrthes via VM on ubuntu 15.10
Hello,
I use the .syd file from the examples folder of the Code_Saturne installation pack.
To be clear, which files do I need to run this coupled study?
I need 3 folders and 2 files:
- one fluid folder: containing the DATA, RESU, SCRIPTS and SRC folders, with the fluid-3rond2D-coupling.xml file and its associated mesh dependency in DATA
- one solid folder: containing the solid-coupling.syd file with its associated mesh dependency and other files
- one RESU_COUPLING folder (empty)
- the file coupling_parameters.py:
domains = [
    {'solver': 'Code_Saturne',        # fluid domain, launched via its runcase
     'domain': 'fluid',
     'script': 'runcase',
     'n_procs_weight': None,
     'n_procs_min': 1,
     'n_procs_max': None},
    {'solver': 'SYRTHES',             # solid domain, defined by its .syd file
     'domain': 'solid',
     'script': 'solid-coupling.syd',
     'n_procs_weight': None,
     'n_procs_min': 1,
     'n_procs_max': None,
     'opt': '-v med'}
]
- the runcase_coupling file:
#!/bin/bash
# Ensure the correct command is found:
export PATH=/home/julien/Code_Saturne/4.0.5/code_saturne-4.0.5/arch/Linux_x86_64/bin:$PATH
# Run command:
\code_saturne run --coupling coupling_parameters.py
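So the overall case layout looks like this (a rough sketch of what I have; paths shortened):
case1/
  fluid/
    DATA/               (fluid-3rond2D-coupling.xml + mesh)
    RESU/  SCRIPTS/  SRC/
  solid/
    solid-coupling.syd  (+ 3rond2d.syr mesh and other files)
  RESU_COUPLING/        (empty before the run)
  coupling_parameters.py
  runcase_coupling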
After launching the runcase_coupling file, the terminal shows:
Coupling execution between:
o Code_Saturne [1 domain(s)];
o SYRTHES [1 domain(s)];
Code_Saturne is running
***********************
Version: 4.0
Path: /home/julien/Code_Saturne/4.0.5/code_saturne-4.0.5/arch/Linux_x86_64
Result directory:
/home/julien/3disks2D/newtest/4-2Ddisks/case1/RESU_COUPLING/20160708-1124
Single processor Code_Saturne simulation.
Single processor SYRTHES simulation.
****************************
Preparing calculation data
****************************
SYRTHES4 home directory: /home/julien/Syrthes/syrthes4.3.0/arch/Linux_x86_64
MPI home directory: /usr
Building the executable file syrthes..
***** SYRTHES compilation and link completed *****
***************************
Preprocessing calculation
***************************
SyrthesCase summary:
Name = solid
Data file = solid-coupling.syd
Update Data file = True
Do preprocessing = True
Debug = False
Case dir. = /home/julien/3disks2D/newtest/4-2Ddisks/case1/solid
Execution dir. = /home/julien/3disks2D/newtest/4-2Ddisks/case1/RESU_COUPLING/20160708-1124/solid
Data dir. = /home/julien/3disks2D/newtest/4-2Ddisks/case1/RESU_COUPLING/20160708-1124/solid
Source dir. = /home/julien/3disks2D/newtest/4-2Ddisks/case1/RESU_COUPLING/20160708-1124/solid/src
Post dir. = /home/julien/3disks2D/newtest/4-2Ddisks/case1/RESU_COUPLING/20160708-1124/solid/POST
Conduction mesh dir. = /home/julien/3disks2D/newtest/4-2Ddisks/case1/solid/
Conduction mesh name = 3rond2d.syr
Total num. of processes = 1
Logfile name = syrthes.log
Echo = True
Parallel run = False
Do preprocessing = True
SyrthesParam summary
Param file name = solid-coupling.syd
Conduction mesh name = 3rond2d.syr
Radiation mesh name = None
Result prefix. = resu2
Restart = False
Coupling = True
Interpreted functions = False
---------------------------
Start SYRTHES preprocessing
---------------------------
Updating the mesh file name..
-> OK
**********************
Starting calculation
**********************
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI COMMUNICATOR 3 SPLIT FROM 0
with errorcode 1.
NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpiexec.openmpi noticed that process rank 0 with PID 1822 on node julien-VirtualBox exited on signal 11 (Segmentation fault).
--------------------------------------------------------------------------
solver script exited with status 139.
Error running the coupled calculation.
Either Code_Saturne or SYRTHES may have failed.
Check Code_Saturne log (listing) and SYRTHES log (syrthes.log)
for details, as well as error* files.
****************************
Saving calculation results
****************************
Post-processing..
.syrthes --> med..
--> /home/julien/Syrthes/syrthes4.3.0/arch/Linux_x86_64/bin/syrthes4med30 -m "/home/julien/3disks2D/newtest/4-2Ddisks/case1/solid/3rond2d.syr" -r "/home/julien/3disks2D/newtest/4-2Ddisks/case1/RESU_COUPLING/20160708-1124/solid/resu2.res" -o "resu2.med"
Traceback (most recent call last):
File "/home/julien/Code_Saturne/4.0.5/code_saturne-4.0.5/arch/Linux_x86_64/bin/code_saturne", line 76, in <module>
retcode = cs.execute()
File "/home/julien/Code_Saturne/4.0.5/code_saturne-4.0.5/arch/Linux_x86_64/lib/python2.7/site-packages/code_saturne/cs_script.py", line 88, in execute
return self.commands[command](options)
File "/home/julien/Code_Saturne/4.0.5/code_saturne-4.0.5/arch/Linux_x86_64/lib/python2.7/site-packages/code_saturne/cs_script.py", line 150, in run
return cs_run.main(options, self.package)
File "/home/julien/Code_Saturne/4.0.5/code_saturne-4.0.5/arch/Linux_x86_64/lib/python2.7/site-packages/code_saturne/cs_run.py", line 304, in main
save_results=save_results)
File "/home/julien/Code_Saturne/4.0.5/code_saturne-4.0.5/arch/Linux_x86_64/lib/python2.7/site-packages/code_saturne/cs_case.py", line 1779, in run
self.save_results()
File "/home/julien/Code_Saturne/4.0.5/code_saturne-4.0.5/arch/Linux_x86_64/lib/python2.7/site-packages/code_saturne/cs_case.py", line 1694, in save_results
d.copy_results()
File "/home/julien/Code_Saturne/4.0.5/code_saturne-4.0.5/arch/Linux_x86_64/lib/python2.7/site-packages/code_saturne/cs_case_domain.py", line 1200, in copy_results
raise RunCaseError(err_str)
cs_case_domain.RunCaseError:
Error during SYRTHES postprocessing
See also listing and syrthes.log files
Best regards
Julien
Attachments: syrthes.log, listing.txt
Re: installation Salome+CS+Syrthes via VM on ubuntu 15.10
Hello,
It is hard to tell what is wrong from the listing, as the computation seems to start OK (so the files seem to be in the right place).
Could you post the results of "ldd cs_solver" and "ldd syrthes.exe" in "/home/julien/3disks2D/newtest/4-2Ddisks/case1/RESU_COUPLING/20160708-1138/fluid" and "/home/julien/3disks2D/newtest/4-2Ddisks/case1/RESU_COUPLING/20160708-1124/solid" respectively?
Regards,
Yvan
Re: installation Salome+CS+Syrthes via VM on ubuntu 15.10
Hello,
I don't have these files!
In fluid, I have the files:
- error
- fluid_3disks_alone.xml
- fluid-3rond2D-coupling.xml
- listing
- mesh_input
- performance.log
- preprocessor.log
- setup.log
- and a folder: postprocessing
In solid:
- 3rond2d.des
- compile.log
- domaine.case
- domaine.geom
- listing.pak
- Makefile
- resu1_all.res
- resu1_all.syr
- resu2.add
- resu2_cplcd.syr
- solid-alone.syd
- solid-coupling.syd
- solid-coupling_restart.syd
- stderr.txt
- stdout.txt
- syrthes
- syrthes.log
- syrthes.stop
- tmp.data
- 2 folders: POST and src
Best regards
Julien
Re: installation Salome+CS+Syrthes via VM on ubuntu 15.10
Hello,
OK, you can run "ldd syrthes" in solid, and for fluid, check the "cs_solver" path inside the run_solver file in the directory containing "fluid" and "solid", and run ldd on that path.
Regards,
Yvan
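For example, something like this (an untested sketch; adjust the timestamped result directory to match your run):
cd /home/julien/3disks2D/newtest/4-2Ddisks/case1/RESU_COUPLING/20160708-1124
grep cs_solver run_solver      # prints the full path of the cs_solver executable
ldd /path/printed/by/grep      # replace with the cs_solver path printed above
cd solid
ldd syrthes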
Re: installation Salome+CS+Syrthes via VM on ubuntu 15.10
Hello Yvan,
I run "ldd syrthes" the solid folder:
julien@julien-VirtualBox:~/3disks2D/newtest/4-2Ddisks/case1/solid$ ldd syrthes
linux-vdso.so.1 => (0x00007ffc90bef000)
/usr/lib/x86_64-linux-gnu/libgfortran.so.3 (0x00007fcd2459f000)
libm.so.6 => /lib/x86_64-linux-gnu/libm.so.6 (0x00007fcd2426d000)
libple.so.1 => /home/julien/Code_Saturne/4.0.5/code_saturne-4.0.5/arch/Linux_x86_64/lib/libple.so.1 (0x00007fcd2405b000)
libmpi.so.1 => /usr/lib/libmpi.so.1 (0x00007fcd23cd5000)
libpthread.so.0 => /lib/x86_64-linux-gnu/libpthread.so.0 (0x00007fcd23ab6000)
libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x00007fcd236ec000)
libquadmath.so.0 => /usr/lib/x86_64-linux-gnu/libquadmath.so.0 (0x00007fcd234ad000)
libgcc_s.so.1 => /lib/x86_64-linux-gnu/libgcc_s.so.1 (0x00007fcd23295000)
libutil.so.1 => /lib/x86_64-linux-gnu/libutil.so.1 (0x00007fcd23092000)
libhwloc.so.5 => /usr/lib/x86_64-linux-gnu/libhwloc.so.5 (0x00007fcd22e49000)
libltdl.so.7 => /usr/lib/x86_64-linux-gnu/libltdl.so.7 (0x00007fcd22c3e000)
/lib64/ld-linux-x86-64.so.2 (0x00005642f636d000)
libnuma.so.1 => /usr/lib/x86_64-linux-gnu/libnuma.so.1 (0x00007fcd22a33000)
libdl.so.2 => /lib/x86_64-linux-gnu/libdl.so.2 (0x00007fcd2282e000)
In the runcase_coupling file, the Code_Saturne path is OK:
# Ensure the correct command is found:
export PATH=/home/julien/Code_Saturne/4.0.5/code_saturne-4.0.5/arch/Linux_x86_64/bin:$PATH
I am not sure I understand the last step you told me to do.
I ran "ldd $PATH" in the folder containing the solid and fluid folders:
ldd: /home/julien/Syrthes/syrthes4.3.0/arch/Linux_x86_64/bin:/home/julien/Syrthes/syrthes4.3.0/arch/Linux_x86_64/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games: No such file or directory
Then running "ldd /home/julien/Code_Saturne/4.0.5/code_saturne-4.0.5/arch/Linux_x86_64/bin" in the same folder, I get:
julien@julien-VirtualBox:~/3disks2D/newtest/4-2Ddisks/case1$ ldd /home/julien/Code_Saturne/4.0.5/code_saturne-4.0.5/arch/Linux_x86_64/bin
ldd: /home/julien/Code_Saturne/4.0.5/code_saturne-4.0.5/arch/Linux_x86_64/bin: not a regular file
I hope I have done what you wanted.
Best regards
Julien
I run "ldd syrthes" the solid folder:
julien@julien-VirtualBox:~/3disks2D/newtest/4-2Ddisks/case1/solid$ ldd syrthes
linux-vdso.so.1 => (0x00007ffc90bef000)
/usr/lib/x86_64-linux-gnu/libgfortran.so.3 (0x00007fcd2459f000)
libm.so.6 => /lib/x86_64-linux-gnu/libm.so.6 (0x00007fcd2426d000)
libple.so.1 => /home/julien/Code_Saturne/4.0.5/code_saturne-4.0.5/arch/Linux_x86_64/lib/libple.so.1 (0x00007fcd2405b000)
libmpi.so.1 => /usr/lib/libmpi.so.1 (0x00007fcd23cd5000)
libpthread.so.0 => /lib/x86_64-linux-gnu/libpthread.so.0 (0x00007fcd23ab6000)
libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x00007fcd236ec000)
libquadmath.so.0 => /usr/lib/x86_64-linux-gnu/libquadmath.so.0 (0x00007fcd234ad000)
libgcc_s.so.1 => /lib/x86_64-linux-gnu/libgcc_s.so.1 (0x00007fcd23295000)
libutil.so.1 => /lib/x86_64-linux-gnu/libutil.so.1 (0x00007fcd23092000)
libhwloc.so.5 => /usr/lib/x86_64-linux-gnu/libhwloc.so.5 (0x00007fcd22e49000)
libltdl.so.7 => /usr/lib/x86_64-linux-gnu/libltdl.so.7 (0x00007fcd22c3e000)
/lib64/ld-linux-x86-64.so.2 (0x00005642f636d000)
libnuma.so.1 => /usr/lib/x86_64-linux-gnu/libnuma.so.1 (0x00007fcd22a33000)
libdl.so.2 => /lib/x86_64-linux-gnu/libdl.so.2 (0x00007fcd2282e000)
In the runcase_coupling file, CS path is ok:
# Ensure the correct command is found:
export PATH=/home/julien/Code_Saturne/4.0.5/code_saturne-4.0.5/arch/Linux_x86_64/bin:$PATH
I am not sure to understand the last step that you told me to do.
I run "ldd $PATH" in the folder where there are solid and fluid folders:
ldd: /home/julien/Syrthes/syrthes4.3.0/arch/Linux_x86_64/bin:/home/julien/Syrthes/syrthes4.3.0/arch/Linux_x86_64/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games: Aucun fichier ou dossier de ce type
or "ldd /home/julien/Code_Saturne/4.0.5/code_saturne-4.0.5/arch/Linux_x86_64/bin" in the same folder, I have:
julien@julien-VirtualBox:~/3disks2D/newtest/4-2Ddisks/case1$ ldd /home/julien/Code_Saturne/4.0.5/code_saturne-4.0.5/arch/Linux_x86_64/bin
ldd: /home/julien/Code_Saturne/4.0.5/code_saturne-4.0.5/arch/Linux_x86_64/bin: n'est pas un fichier régulier
I hope I have done what you wanted
Best regards
Julien
Re: installation Salome+CS+Syrthes via VM on ubuntu 15.10
Hello,
The second ldd should be:
ldd /home/julien/Code_Saturne/4.0.5/code_saturne-4.0.5/arch/Linux_x86_64/libexec/code_saturne/cs_solver
Regards,
Yvan
Re: installation Salome+CS+Syrthes via VM on ubuntu 15.10
Hello,
In the folder with both fluid and solid, I ran
"ldd /home/julien/Code_Saturne/4.0.5/code_saturne-4.0.5/arch/Linux_x86_64/libexec/code_saturne/cs_solver"
The terminal returns:
julien@julien-VirtualBox:~/3disks2D$ ldd /home/julien/Code_Saturne/4.0.5/code_saturne-4.0.5/arch/Linux_x86_64/libexec/code_saturne/cs_solver
linux-vdso.so.1 => (0x00007ffe6f14a000)
/usr/lib/x86_64-linux-gnu/libgfortran.so.3 (0x00007fb391114000)
libsaturne.so.0 => /home/julien/Code_Saturne/4.0.5/code_saturne-4.0.5/arch/Linux_x86_64/lib/libsaturne.so.0 (0x00007fb38f752000)
libpthread.so.0 => /lib/x86_64-linux-gnu/libpthread.so.0 (0x00007fb38f50b000)
libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x00007fb38f140000)
libquadmath.so.0 => /usr/lib/x86_64-linux-gnu/libquadmath.so.0 (0x00007fb38ef01000)
libm.so.6 => /lib/x86_64-linux-gnu/libm.so.6 (0x00007fb38ebf9000)
libgcc_s.so.1 => /lib/x86_64-linux-gnu/libgcc_s.so.1 (0x00007fb38e9e1000)
libple.so.1 => /home/julien/Code_Saturne/4.0.5/code_saturne-4.0.5/arch/Linux_x86_64/lib/libple.so.1 (0x00007fb38e7cf000)
libmedC.so.1 => /home/julien/salome/Salome-V7_6_0-x86_64/tools/Medfichier-308/lib/libmedC.so.1 (0x00007fb38e55e000)
libxml2.so.2 => /usr/local/lib/libxml2.so.2 (0x00007fb38e1f8000)
libdl.so.2 => /lib/x86_64-linux-gnu/libdl.so.2 (0x00007fb38dff4000)
libmpi.so.1 => /usr/lib/libmpi.so.1 (0x00007fb38dc6d000)
/lib64/ld-linux-x86-64.so.2 (0x0000557f6f8c4000)
libhdf5.so.7.4.0 => /home/julien/salome/Salome-V7_6_0-x86_64/prerequisites/Hdf5-1810/lib/libhdf5.so.7.4.0 (0x00007fb38d7b0000)
libstdc++.so.6 => /usr/lib/x86_64-linux-gnu/libstdc++.so.6 (0x00007fb38d42d000)
libz.so.1 => /lib/x86_64-linux-gnu/libz.so.1 (0x00007fb38d213000)
libutil.so.1 => /lib/x86_64-linux-gnu/libutil.so.1 (0x00007fb38d010000)
libhwloc.so.5 => /usr/lib/x86_64-linux-gnu/libhwloc.so.5 (0x00007fb38cdc6000)
libltdl.so.7 => /usr/lib/x86_64-linux-gnu/libltdl.so.7 (0x00007fb38cbbc000)
librt.so.1 => /lib/x86_64-linux-gnu/librt.so.1 (0x00007fb38c9b4000)
libnuma.so.1 => /usr/lib/x86_64-linux-gnu/libnuma.so.1 (0x00007fb38c7a8000)
What should I do now?
Best regards
Julien
Re: installation Salome+CS+Syrthes via VM on ubuntu 15.10
Hello,
So, the MPI libraries seem to be the same and OK.
From your other logs, it is Syrthes which is crashing, but I don't know why. Based on your syrthes.log file, in the "solid" directory (where the syrthes executable is found), could you try:
addr2line -e syrthes 0x4196eb
addr2line -e syrthes 0x4103fe
addr2line -e syrthes 0x4080c8
addr2line -e syrthes 0x402ce2
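For reference, addr2line maps the raw addresses from the backtrace in syrthes.log back to source files and line numbers, which should show where the segmentation fault occurs; with a standard binutils addr2line, adding -f also prints the function names, e.g.:
addr2line -f -e syrthes 0x4196eb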
Regards,
Yvan