> I must reinstall PETSc and deal.II again. Is there a need to reinstall p4est
> as well?
p4est also uses MPI, so yes.
By the way: after you install deal.II, you should run "make test" in
the build directory. I am pretty sure it would have detected this
problem.
--
Timo Heister
http://www.math.
Respected Professor,
Yes, I remember this discussion about not using the --download-mpich flag
while installing PETSc happening on GitHub as well.
Yes, I think I should do it now. Although it still remains a mystery to me
how and why Open MPI got installed and is now clashing with MPICH.
Anyway
On 10/28/2016 11:49 AM, RAJAT ARORA wrote:
I think that Open MPI and MPICH are clashing now. (I didn't even know
that both were installed up to now.)
I think that if I can disable Open MPI somehow, I will be in good shape. This
is because I compiled deal.II with the MPICH library, which was installed by
PETSc.
I am surprised that it is not working. I can't recall what has changed;
I don't remember installing any new libraries.
Also, I have installed PETSc with the --download-mpich flag but do not have
any other MPI installation.
And, more importantly, it was working until Monday.
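One way to double-check which MPI implementation an executable really picks
up is to print the MPI library's version string. A minimal sketch (the file
name is only a placeholder; compile it with the same compiler wrapper that
the project uses):

    // check_mpi.cc -- report which MPI library this binary initializes
    #include <mpi.h>
    #include <cstdio>

    int main(int argc, char *argv[])
    {
      MPI_Init(&argc, &argv);

      char version[MPI_MAX_LIBRARY_VERSION_STRING];
      int  length = 0;
      MPI_Get_library_version(version, &length); // "MPICH ..." or "Open MPI ..."
      std::printf("%s\n", version);

      MPI_Finalize();
      return 0;
    }

If this reports Open MPI even though PETSc's --download-mpich was used, then
a second MPI installation is indeed being picked up.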
>> Should I reinstall my MPI library? Will deal.II also need to be
>> recompiled then?
>
> I don't know. You say that it suddenly stopped working. Did it work
> correctly before? What has changed in the meantime?
This can happen if you have conflicting MPI libraries installed (for
example l
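The typical way the "everyone reports rank 0" symptom arises is that the
executable is linked against one MPI implementation (say the MPICH that
PETSc built) but launched with another implementation's mpirun, so each copy
initializes its own single-process MPI_COMM_WORLD. A tiny test program makes
such a mismatch easy to see; a minimal sketch (the file name is arbitrary;
build it with the same mpicxx used for deal.II and start it with the mpirun
you normally use):

    // rank_check.cc -- every process reports its rank and the world size
    #include <mpi.h>
    #include <cstdio>

    int main(int argc, char *argv[])
    {
      MPI_Init(&argc, &argv);

      int rank = 0, size = 0;
      MPI_Comm_rank(MPI_COMM_WORLD, &rank);
      MPI_Comm_size(MPI_COMM_WORLD, &size);
      std::printf("process %d of %d\n", rank, size);

      MPI_Finalize();
      return 0;
    }

With a consistent installation, "mpirun -np 4" prints ranks 0 through 3 out
of 4; with a mismatched launcher it prints "process 0 of 1" four times, which
is exactly the behaviour described in the messages below.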
On 10/28/2016 09:55 AM, RAJAT ARORA wrote:
I am just calling dealii::Utilities::MPI::MPI_InitFinalize
mpi_initialization(argc, argv, 1) once.
I am not making an explicit call to MPI_Init anywhere.
Yes, calling MPI_InitFinalize is enough.
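A minimal program needs nothing beyond that one object. Roughly (a sketch,
assuming deal.II was configured with MPI; the output line is only there to
make the rank visible):

    #include <deal.II/base/mpi.h>
    #include <iostream>

    int main(int argc, char *argv[])
    {
      // Initializes MPI here and finalizes it when main() ends; the third
      // argument limits each process to one thread, as in the call above.
      dealii::Utilities::MPI::MPI_InitFinalize mpi_initialization(argc, argv, 1);

      const unsigned int rank =
        dealii::Utilities::MPI::this_mpi_process(MPI_COMM_WORLD);
      const unsigned int n_ranks =
        dealii::Utilities::MPI::n_mpi_processes(MPI_COMM_WORLD);

      std::cout << "process " << rank << " of " << n_ranks << std::endl;
      return 0;
    }

Run with "mpirun -np 4", each process should print a different rank out of 4.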
Should I reinstall my MPI library? Does deal.II wil
Respected Prof. Bangerth,
I am just calling dealii::Utilities::MPI::MPI_InitFinalize
mpi_initialization(argc, argv, 1) once.
I am not making an explicit call to MPI_Init anywhere.
Should I reinstall my MPI library? Will deal.II also need to be
recompiled then?
On Friday, October 28, 2
On 10/28/2016 09:47 AM, RAJAT ARORA wrote:
Yes, you are right :).
When running with 4 MPI processes, all processes print their rank as 0.
Why is it broken all of a sudden? What is this problem called? How do I
correct it?
I don't know. Are you calling MPI_Init, for example?
W.
Respected Prof. Bangerth,
Yes, you are right :).
When running with 4 MPI processes, all processes print their rank as 0.
Why is it broken all of a sudden? What is this problem called? How do I
correct it?
On Friday, October 28, 2016 at 11:39:33 AM UTC-4, Wolfgang Bangerth wrote:
>
> On 10/28/20
On 10/28/2016 09:34 AM, RAJAT ARORA wrote:
To run the code on n processes, I used to run the command
mpirun -np n ./<executable>
But now, when I run it using this command, it runs n programs with n
MPI processes each.
It is like I have executed the mpirun -np n ./<executable> command n times
and the compiler is then run
Hello all,
I am working on a 3D solid mechanics problem using deal.II.
To run the code on n processes, I used to run the command
mpirun -np n ./<executable>
But now, when I run it using this command, it runs n programs with n MPI
processes each.
It is like I have executed the mpirun -np n ./<executable> command n times