p.s. it's a bit of a learning curve to set things up with Spack, but
hopefully it does not take too much time, as there is a lot of documentation.
On the positive side, a lot of developers now pay more and more attention
to Spack and maintain their packages there. I know that the PETSc and Trilinos
guys keep a
Hi Alberto,
On Tuesday, April 4, 2017 at 10:58:39 PM UTC+2, Alberto Salvadori wrote:
>
> Sorry, I am a bit confused.
>
> "you need to make sure that building mpich is set to false."
> - where can I set "building mpich" to false? Shall I edit a file
> ~/.spack/packages.yaml that BTW currently I
Bruno is right, you definitely don't want to compile MPI yourself on a cluster. At
the very top of the wiki there is an example of how to configure Spack on a
cluster with external MPI:
https://github.com/dealii/dealii/wiki/deal.II-in-Spack
You probably missed this.
I will edit the wiki to stress this.
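For reference (not quoted from the original mails; the module name below is a
placeholder), the install prefix that goes into packages.yaml can usually be
read straight off the cluster's module system:

  module avail openmpi          # see which openmpi modules the cluster provides
  module show openmpi/2.0.1     # prints the install prefix to put into packages.yaml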
Got it, sorry.
A
*Alberto Salvadori* Dipartimento di Ingegneria Civile, Architettura,
Territorio, Ambiente e di Matematica (DICATAM)
Università di Brescia, via Branze 43, 25123 Brescia
Italy
tel 030 3711239
fax 030 3711312
e-mail:
alberto.salvad...@unibs.it
web-pages:
http://m4lab.unibs.i
Ouch ... something went wrong:
asalvad2 ~/.spack/linux $ cd $SPACK_ROOT
asalvad2 /scratch/asalvad2/spack $ spack install dealii %gcc@6.2
==> Error: : Additional properties are not allowed ('openmpi' was unexpected)
asalvad2 /scratch/asalvad2/spack $ module list
Currently L
Yeah it looks good.
Bruno
2017-04-04 17:25 GMT-04:00 Alberto Salvadori :
> Hi Bruno,
>
> I really appreciate your help.
> I got the right location of my openmpi and just want to double-check before
> running Spack. I shall therefore:
>
> 1 - create a file ~/.spack/linux/packages.yaml as follows:
Hi Bruno,
I really appreciate your help.
I got the right location of my openmpi and just want to double-check before
running Spack. I shall therefore:
1 - create a file ~/.spack/linux/packages.yaml as follows:
openmpi:
  version: [2.0.1]
  paths:
    openmpi@2.0.1%gcc@6.2.0: ...SOMEPATH..
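For reference (a sketch, not quoted from the thread; the path is a placeholder,
not the real prefix from this cluster): Spack's packages.yaml schema expects all
entries to sit under a top-level "packages:" key, and the wiki example also
marks the external MPI as not buildable, so the file would likely need to look
more like:

  packages:
    openmpi:
      version: [2.0.1]
      paths:
        openmpi@2.0.1%gcc@6.2.0: /path/to/external/openmpi-2.0.1   # placeholder prefix
      buildable: False

The "Additional properties are not allowed ('openmpi' was unexpected)" error
shown further up in the thread is the kind of message Spack's schema validator
prints when that top-level "packages:" key is missing.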
2017-04-04 16:58 GMT-04:00 Alberto Salvadori :
> Sorry, I am a bit confused.
>
> "you need to make sure that building mpich is set to false."
> - where can I set "building mpich" to false? Shall I edit a file
> ~/.spack/packages.yaml, which BTW I currently do not have?
Yes, you need to create the file.
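For reference (a sketch following the Spack documentation, not quoted from the
thread), "building mpich set to false" translates to the buildable flag in
~/.spack/packages.yaml:

  packages:
    mpich:
      buildable: False   # never build mpich from source; use an external MPI instead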
Sorry, I am a bit confused.
"you need to make sure that building mpich is set to false."
- where can I set "building mpich" to false? Shall I edit a file
~/.spack/packages.yaml, which BTW I currently do not have?
- if so, would this be OK?
spack providers mpi
intel-parallel-studio@cluster
Alberto,
2017-04-04 16:04 GMT-04:00 Alberto Salvadori :
> If I understand well then, I have to add the mpich/3.2-gcc-6.2.0 compiler
> to spack - I did this just now.
> I checked that the file compilers.yaml includes the compiler and its
> path. The same file still includes the compiler gcc-4.9.2
T
Thanks, Bruno.
If I understand well, then I have to add the mpich/3.2-gcc-6.2.0 compiler
to Spack; I did this just now.
I checked that the file compilers.yaml includes the compiler and its
path. The same file still includes the compiler gcc-4.9.2.
Shall I delete the corresponding section in the compilers.yaml file?
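For reference (not quoted from the thread; module name and paths are
placeholders), compilers are normally registered by loading the corresponding
module and letting Spack detect it, which appends an entry like the one below
to ~/.spack/<platform>/compilers.yaml:

  module load gcc/6.2.0
  spack compiler find     # auto-detects the loaded gcc and records it
  spack compilers         # should now list gcc@6.2.0

  compilers:
  - compiler:
      spec: gcc@6.2.0
      operating_system: ...        # filled in by Spack
      modules: []
      paths:
        cc: /path/to/gcc-6.2.0/bin/gcc         # placeholder paths
        cxx: /path/to/gcc-6.2.0/bin/g++
        f77: /path/to/gcc-6.2.0/bin/gfortran
        fc: /path/to/gcc-6.2.0/bin/gfortran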
Alberto,
you don't want Spack to install MPI on a cluster; take a look here:
https://spack.readthedocs.io/en/latest/getting_started.html#system-packages
Also, gcc 4.9 and gcc 6.2 are not compatible, so you need everything to be
compiled by the same compiler, i.e. gcc 6.2.
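In practice that usually means pinning the compiler on the command line (a
sketch, assuming gcc@6.2.0 has already been registered in compilers.yaml):

  spack spec dealii %gcc@6.2.0      # dry run: check which compiler every dependency resolves to
  spack install dealii %gcc@6.2.0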
Best,
Bruno
On Tuesday
Hi,
your help in this issue is greatly appreciated.
I installed deal.II on a cluster using the Spack distribution, very easily.
I also ran some examples provided with deal.II with no issues. Great job.
I am now using a library for tensor calculus that apparently conflicts
with the gcc compiler
Dear all,
I am trying to reproduce, with my implementation, the results of the
photonic crystal computations performed in [1]. Here, the author uses a
grid with an inner disk of radius R=0.475, and for the FEM the
software Concepts [2] is used, which implements curvilinear elements denoted Blen