compatible MED headers not found
Posted: Wed Feb 12, 2014 8:46 pm
by euduca
How can I solve this?
I have already tried both a manual install and the installer.
I searched this forum and the web, but have not found a solution.
Code_Saturne version 2.0.5 (I know, old... but they need this version).
I tried, and tried, and tried...
Code:
[root@nova0 installer]# ./install_saturne.py
Installation of Code_Saturne
____________________________
The process will take several minutes.
You can have a look at the log file meanwhile.
Check the setup file and some utilities presence.
o Checking for icc... /opt/intel/CT/Compiler/11.1/072/bin/intel64/icc
o Checking for ifort... /opt/intel/CT/Compiler/11.1/072/bin/intel64/ifort
o Checking for mpicc... /opt/intel/impi/4.0.3.008/intel64/bin/mpicc
o Checking for python... /usr/bin/python
Python version is 2.4
o Checking for make... /usr/bin/make
o Checking for pdflatex... /usr/bin/pdflatex
o Checking for fig2dev... /usr/bin/fig2dev
Installation of ECS
ECS (Code_Saturne Preprocessor)
version: 2.0.2
url: http://code-saturne.org/releases/ecs-2.0.2.tar.gz
package: ecs
source_dir: /home_nfs/local/code_saturne/installer/installer/ecs-2.0.2
install_dir: /home_nfs/local/code_saturne/v.2.0.5/intel/cs-2.0
config_opts: --with-bft=/home_nfs/local/code_saturne/v.2.0.5/intel/cs-2.0 --with-hdf5=/home_nfs/local/hdf5/parallel/hdf5-1.8.12/ --with-cgns=/home_nfs/local/cgnslib/parallel/cgnslib-3.1.4/ --with-med=/home_nfs/local/code_saturne/v.2.0.5/intel/med-3.0.4 --with-metis=/home_nfs/local/metis/metis-5.1.0
o Configure...
Error during configure stage of ECS.
See install_saturne.log for more information.
And the log:
Code:
[...]
checking for bft version >= "1.0.6"... compatible bft version found
checking for ADF_Database_Open in -ladf... no
configure: WARNING: no ADF support
checking hdf5.h usability... no
checking hdf5.h presence... no
checking for hdf5.h... no
checking hdf5.h usability... no
checking hdf5.h presence... no
checking for hdf5.h... no
checking for H5Fopen in -lhdf5... yes
checking for cg_open in -lcgns... yes
MED >= 2.9.0 headers not found
MED >= 2.3.4 headers not found
compatible MED headers not found
configure: error: in `/home_nfs/local/code_saturne/installer/installer/ecs-2.0.2.build':
configure: error: MED support is requested, but test for MED failed!
See `config.log' for more details
Thanks.
Re: compatible MED headers not found
Posted: Thu Feb 13, 2014 2:34 am
by Yvan Fournier
Hello,
Did you try to install MED with the installer, or separately? Could you post your setup file?
Did you read the detailed installation manual? It contains recommendations for the manual install.
Also, even if you need Code_Saturne version 2.0, I recommend at least upgrading to release 2.0.7 for the latest fixes.
Regards,
Yvan
Re: compatible MED headers not found
Posted: Thu Feb 13, 2014 6:40 pm
by euduca
Hello,
Did you try to install MED with the installer, or separately? Could you post your setup file?
Both. Separately, my setup uses med-3.0.7. Failed.
The installer (auto install: download and install) uses med-3.0.4. Failed.
I tried Intel (icc, ifort, mpicc) and GNU (gcc, gfortran) with OpenMPI.
Did you read the detailed installation manual? It contains recommendations for the manual install.
Yes, I did. I read "install-2.0.pdf" and "install-2.3.pdf".
Also, even if you need a Code_Saturne version 2.0, I recommend at least upgrading to release 2.0.7 for the latest fixes.
I'll try.
Setup with gcc (med downloaded by installer):
http://nbcgib.uesc.br/duca/setup-gcc
Setup with intel :
http://nbcgib.uesc.br/duca/setup-intel
Thanks!!!!

Re: compatible MED headers not found
Posted: Fri Feb 14, 2014 1:37 am
by Yvan Fournier
Hello,
I am not sure: did the build work with version 2.0.7?
Otherwise, one possible explanation for the issue might be your use of a parallel HDF5 build.
If you use a parallel build of HDF5, you must compile MED and the Code_Saturne preprocessor with the MPI compiler wrappers, meaning you must set compC to the same value as mpiCompC in your setup file (/opt/intel/impi/4.0.3.008/intel64/bin/mpicc, for example).
Also, if you have several versions of MPI on the cluster (I assume it is a cluster, given your environment), make sure the HDF5 build you use was built with a version of MPI matching the one you use for Code_Saturne.
As Code_Saturne does not currently use parallel I/O for MED or CGNS, if there is a serial build of HDF5 on the cluster, I recommend using that one. Otherwise, let the installer install one for you.
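A quick way to tell whether a given HDF5 install is a parallel build is to look for the H5_HAVE_PARALLEL macro in its generated H5pubconf.h header. A minimal sketch, using the parallel HDF5 prefix from the first post as an example path (substitute your own install prefix):

```shell
# Check whether an HDF5 install is a parallel (MPI) build.
# Parallel builds define H5_HAVE_PARALLEL in the generated H5pubconf.h.
# HDF5_DIR below is the prefix from this thread; point it at your own install.
HDF5_DIR=/home_nfs/local/hdf5/parallel/hdf5-1.8.12
if grep -q '#define H5_HAVE_PARALLEL 1' "$HDF5_DIR/include/H5pubconf.h" 2>/dev/null
then
    hdf5_mode="parallel"
else
    hdf5_mode="serial (or header not found)"
fi
echo "HDF5 build: $hdf5_mode"
```

If this reports a parallel build, both MED and the preprocessor must be compiled with the matching MPI wrapper, as described above.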
In all cases, make sure mpiCompC points to the MPI compiler wrappers (this is the case in one of your setups, but not in the other).
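Concretely, with the Intel MPI environment shown in the installer output above, that would mean both compiler entries in the setup file point to the same wrapper (the mpicc path is the one detected in the first post; treat it as an example and adjust it to your environment):

```
#--------------------------------------------------------
# C compiler (use the MPI wrapper when HDF5 is parallel)
#--------------------------------------------------------
compC /opt/intel/impi/4.0.3.008/intel64/bin/mpicc
#
#--------------------------------------------------------
# MPI wrapper for C compiler
#--------------------------------------------------------
mpiCompC /opt/intel/impi/4.0.3.008/intel64/bin/mpicc
```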
Regards,
Yvan
Re: compatible MED headers not found
Posted: Thu Mar 06, 2014 2:54 pm
by euduca
In all cases, make sure mpiCompC points to the MPI compiler wrappers (this is the case in one of your setups, but not in the other).
Hi, you're right. I used OpenMPI in this case.
I compiled Code_Saturne and its dependencies with mpicc.
I installed version 2.0.5 and the latest version too.
I'll try with Intel MPI now.
Thanks.
Obs.: I can't attach the file (the extension is not allowed).
My setup file:
Code:
#========================================================
# Setup file for Code_Saturne installation
#========================================================
#
#--------------------------------------------------------
# Download packages
#--------------------------------------------------------
download yes
#
#--------------------------------------------------------
# Language
# default: "en" english
# others: "fr" french
#--------------------------------------------------------
language en
#
#--------------------------------------------------------
# Install Code_Saturne with debugging symbols
#--------------------------------------------------------
debug no
#
#--------------------------------------------------------
# Installation directory
#--------------------------------------------------------
prefix /usr/local/saturne/saturne-2.0.5/mpi/openmpi
#
#--------------------------------------------------------
# Architecture Name
#--------------------------------------------------------
use_arch no
arch
#
#--------------------------------------------------------
# C compiler
#--------------------------------------------------------
compC mpicc
#
#--------------------------------------------------------
# Fortran compiler
#--------------------------------------------------------
compF mpif90
#
#--------------------------------------------------------
# MPI wrapper for C compiler
#--------------------------------------------------------
mpiCompC mpicc
#
#--------------------------------------------------------
# Disable Graphical user Interface
#--------------------------------------------------------
disable_gui no
#
#--------------------------------------------------------
# Python is mandatory to launch the Graphical User
# Interface and to use Code_Saturne scripts.
# It has to be compiled with PyQt 4 support.
#
# It is highly recommended to use the Python provided
# by the distribution and to install PyQt through
# the package manager if needed.
#
# If you need to provide your own Python, just set
# the following variable to the bin directory of Python
#--------------------------------------------------------
python
#
#--------------------------------------------------------
# BLAS For hardware-optimized Basic Linear Algebra
# Subroutines. If no system BLAS is used, one reverts
# to an internal BLAS emulation, which may be somewhat
# slower.
#
# ATLAS (or another BLAS) should be available for most
# platforms through the package manager. If using the
# Intel or IBM compilers, IMKL or ESSL may be used in
# place of ATLAS respectively.
# For a fine-tuning of BLAS library support, it may
# be necessary to install Code_Saturne Kernel manually.
#--------------------------------------------------------
blas
#
#--------------------------------------------------------
# Metis is more rarely found in Linux distributions,
# but may already be installed on massively parallel
# machines and on clusters. For good parallel
# performance, it is highly recommended.
# For meshes larger than 15 million cells, Metis 5.0
# beta is recommended, as Metis 4 may fail above
# 20-35 million cells.
#
# Scotch can be used as an alternative.
#
# If both are present, Metis will be the default.
# If none are present, a space-filling-curve algorithm
# will be used.
#--------------------------------------------------------
metis /usr/local/metis/metis-5.1.0
scotch
#
#--------------------------------------------------------
# SYRTHES installation path for an optional coupling.
#
# Only coupling with the SYRTHES thermal code version 3
# is handled at the moment.
#
# SYRTHES has to be installed before Code_Saturne for
# a correct detection. However, it is still possible to
# update the scripts after Code_Saturne installation.
#--------------------------------------------------------
syrthes
#
#========================================================
# Name Path Use Install
#========================================================
#
#--------------------------------------------------------
# Code_Saturne kernel and module libraries
#--------------------------------------------------------
#
bft /usr/local/saturne/saturne-2.0.5/mpi/openmpi/cs-2.0 yes no
fvm /usr/local/saturne/saturne-2.0.5/mpi/openmpi/cs-2.0 yes no
mei /usr/local/saturne/saturne-2.0.5/mpi/openmpi/cs-2.0 yes no
ecs /usr/local/saturne/saturne-2.0.5/mpi/openmpi/cs-2.0 yes no
ncs /usr/local/saturne/saturne-2.0.5/mpi/openmpi/cs-2.0 yes no
#
#--------------------------------------------------------
# Optional packages:
# ------------------
#
# MED / HDF5 For MED file format support
# (used by SALOME and now by Gmsh)
#
# CGNS For CGNS file support
# (used by many meshers)
#
# Open MPI (or MPICH2)
#
# For Linux workstations, MPI, HDF5, and even MED
# packages may be available through the package manager.
# HDF5 is also often available on large systems such as
# IBM Blue Gene or Cray XT.
#
# For massively parallel architectures, it is
# recommended to use the system's default MPI library.
#
# Libxml2 is needed to read xml files output by the
# Graphical User Interface, and swig is needed by the
# Graphical User Interface to handle mathematical
# expressions. Both should be installed by your package
# manager.
#--------------------------------------------------------
#
cgns /usr/local/cgnslib/cgnslib-3.1.3-4/mpi/openmpi yes no
hdf5 /usr/local/hdf5/hdf5-1.8.12/mpi/openmpi yes no
med /usr/local/med/med-3.0.7/mpi/openmpi yes no
mpi /home_nfs/local/mpi/openmpi-1.6.5 yes no
libxml2 None auto no
swig None auto no
#
#========================================================