I'm now using MPICH and Intel compiler 2017. Everything works perfectly
fine. Thanks to everyone ;)
Best regards,
S. A. Mohseni
--
The deal.II project is located at http://www.dealii.org/
For mailing list/forum options, see
https://groups.google.com/d/forum/dealii?hl=en
On 2016-11-30 at 02:40 GMT-05:00, seyedali88 via deal.II User Group wrote:
> Can you tell me which one is faster, MPICH or OpenMPI? I chose Intel MPI
> because I assumed it is the fastest available. I googled it, but the
> opinions are mixed.
There is no difference. That's not where your code spends its time.
Can you tell me which one is faster, MPICH or OpenMPI? I chose Intel MPI
because I assumed it is the fastest available. I googled it, but the
opinions are mixed.
Best Regards,
S. A. Mohseni
On 2016-11-29 at 16:25 GMT-05:00, seyedali88 via deal.II User Group wrote:
> I have only Intel MPI installed at the moment. It's strange that it works
> for 2 or 4 cores. Single core works flawlessly.
> Maybe I should try the older compiler and see if it works...
I don't think it's a compiler problem. However
I have only Intel MPI installed at the moment. It's strange that it works
for 2 or 4 cores. Single core works flawlessly.
Maybe I should try the older compiler and see if it works...
Best regards,
S. A. Mohseni
On Tuesday, November 29, 2016 at 11:13:07 AM UTC-5,
seyedal...@googlemail.com wrote:
>
> I reported the problem to the PETSc team. Now I have set up my desktop
> system with the Intel compiler 2017, which compiled without problems, and
> step-40 works better now, but it still gives an error. On
You have to try this piece-by-piece to get a result. Certainly I am
confused as to which configurations you have tried and which work and
which don't, and if configure or linking fails... :-)
Try as Wolfgang suggested and start with a minimal configuration. Try
this, for example,
./configure
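The piece-by-piece idea above can be sketched as follows. This is a minimal, hedged example: `--with-mpi-dir` is a standard PETSc configure option, but the path given here is a placeholder, not this system's actual MPI location.

```shell
# Step 1: the bare minimum -- let configure pick its own defaults:
./configure

# Step 2: once that works, change exactly one thing and reconfigure,
# e.g. point configure at one specific MPI installation
# (placeholder path -- substitute your own):
./configure --with-mpi-dir=/path/to/your/mpi
```

Adding one option per successful run makes it obvious which ingredient breaks the build.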
The make.log is from a previous successful compilation. First PETSc
configures and checks, then it builds the files, so I assume make.log will
not help in that case. Unfortunately, there is a linking issue with shared
libraries that prevents PETSc from detecting mpiicc correctly. I will
try
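When autodetection of the Intel wrappers fails like this, one sketch worth trying is to name the compilers explicitly and, separately, to disable shared libraries to isolate the linking problem. The flags below are standard PETSc configure options; whether they resolve this particular failure is an assumption.

```shell
# Bypass compiler autodetection by naming the Intel MPI wrappers:
./configure --with-cc=mpiicc --with-cxx=mpiicpc --with-fc=mpiifort

# If configure still fails at link time, build static libraries only
# to rule the shared-library machinery in or out:
./configure --with-cc=mpiicc --with-cxx=mpiicpc --with-fc=mpiifort \
            --with-shared-libraries=0
```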
On 11/25/2016 12:48 AM, seyedali88 via deal.II User Group wrote:
Sorry for the delayed answer; yesterday I was just too busy and sleepy :)
I was only able to find the RDict.log and a make.log, but the make.log is a
corrupt file; it won't open correctly.
Additionally, there is a scaling.log which
On 11/24/2016 09:36 AM, seyedali88 via deal.II User Group wrote:
Surprisingly, my MPI installation from Intel works. I tried some examples
with mpirun and mpiicc to validate it. Even deal.II was compiled completely
without errors using the same Intel compiler. I assume there has to be a
library specified since PETSc cannot detect the compiler; it's a PETSc
problem
On 11/24/2016 08:55 AM, seyedali88 via deal.II User Group wrote:
***
UNABLE to EXECUTE BINARIES for ./configure
---
Cannot ru
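An "UNABLE to EXECUTE BINARIES" report from configure often means the test executables build but cannot find the compiler's shared runtime when run. A hedged sketch of checking that hypothesis with the Intel toolchain (the `compilervars.sh` location is an assumption; 2017-era Intel installs typically ship one, but the path varies per machine):

```shell
# Make the Intel runtime libraries visible to executables that
# configure builds and then tries to run (path is an assumption):
source /opt/intel/bin/compilervars.sh intel64

# Verify that a trivially compiled binary now executes:
echo 'int main(void){return 0;}' > conftest.c
mpiicc conftest.c -o conftest && ./conftest && echo OK
```

If the trivial binary fails to run outside of configure too, the problem is the environment, not PETSc.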
No, of course I have used the same MPI installation both for PETSc and
deal.II. Since I used mpicc, it uses the MPICH wrapper or the OpenMPI
wrapper. Unfortunately, mpicc always works with the GNU compiler gcc,
and I haven't succeeded in using the Intel compiler with them.
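For completeness, a wrapper's underlying compiler can be inspected, and MPICH-derived and Open MPI wrappers each document an environment variable for swapping it per invocation. A sketch (which variables your particular wrapper honors is worth verifying):

```shell
# Show which compiler a wrapper actually invokes:
mpicc -show      # MPICH and Intel MPI syntax
mpicc --showme   # Open MPI syntax

# MPICH-style wrappers let you substitute the backend compiler:
MPICH_CC=icc mpicc hello.c -o hello
# Open MPI uses OMPI_CC instead:
OMPI_CC=icc mpicc hello.c -o hello
```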
Thank you for point 1/.
On 11/24/2016 03:34 AM, seyedali88 via deal.II User Group wrote:
This problem is giving me a headache. PETSc is one of the buggiest
libraries ever...
If I don't touch anything, the compiler is detected as mpicc (I assume it
is some wrapper based on the gcc compiler?) and PETSc compiles fine.
The problem, though, is that I think mpicc is wrong, since I use mpiicc from
I checked out the latest development version of deal.II from the git
repository. It should be compatible.
With regard to my second issue with installing PETSc 3.6.0, I have found
out that with the GNU compiler and MPICH it works fine, but if I use the
Intel compiler and the Intel MPI library it gives errors
Seyed,
On Monday, November 21, 2016 at 6:10:49 AM UTC-5, seyedal...@googlemail.com
wrote:
>
> step-40 works perfectly if I use mpirun with only 1 core, namely -np 1.
> But for 8 cores it gets stuck at cycle 6 without any message. It runs
> forever.
> I assume it is related to PETSc since I am
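When a run deadlocks like this, attaching a debugger to every rank usually shows which MPI call each process is blocked in. A sketch (the binary name and the use of pgrep/gdb are assumptions about the local setup):

```shell
# Launch and wait until the job is visibly stuck:
mpirun -np 8 ./step-40 &
sleep 120

# Grab a backtrace from every rank; mismatched traces (some ranks
# inside a collective, others elsewhere) point at the deadlock:
for pid in $(pgrep -f './step-40'); do
    gdb -batch -p "$pid" -ex 'thread apply all bt' > "bt.$pid.txt" 2>&1
done
```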