Re: [OMPI users] MPI Exit Code:1 on an OpenFoam application

2021-01-10 Thread Kahnbein Kai via users

Hey Tony,
it works without the -parallel flag; all four CPUs are at 100% and
running fine.


Best regards
Kai

On 05.01.21 at 20:36, Tony Ladd via users wrote:

Just run the executable without mpirun and the -parallel flag.

On 1/2/21 11:39 PM, Kahnbein Kai via users wrote:

OK, sorry, what do you mean by the "serial version"?

Best regards
Kai

On 31.12.20 at 16:25, tladd via users wrote:


I did not see the whole email chain before. The problem is not that 
it cannot find the MPI directories. I think this INIT error comes 
when the program cannot start for some reason, for example a missing 
input file. Does the serial version work?



On 12/31/20 6:33 AM, Kahnbein Kai via users wrote:

I compared the etc/bashrc files of both OF versions (v7 and v8) 
and I didn't find any difference.

Here are the lines (the ones I think relate to Open MPI) from both files:

OpenFOAM v7:
Lines 86 to 89:
#- MPI implementation:
#    WM_MPLIB = SYSTEMOPENMPI | OPENMPI | SYSTEMMPI | MPICH | MPICH-GM | HPMPI
#               | MPI | FJMPI | QSMPI | SGIMPI | INTELMPI
export WM_MPLIB=SYSTEMOPENMPI

Lines 169 to 174:
# Source user setup files for optional packages
# ~
_foamSource `$WM_PROJECT_DIR/bin/foamEtcFile config.sh/mpi`
_foamSource `$WM_PROJECT_DIR/bin/foamEtcFile config.sh/paraview`
_foamSource `$WM_PROJECT_DIR/bin/foamEtcFile config.sh/ensight`
_foamSource `$WM_PROJECT_DIR/bin/foamEtcFile config.sh/gperftools`

OpenFOAM v8:
Lines 86 to 89:
#- MPI implementation:
#    WM_MPLIB = SYSTEMOPENMPI | OPENMPI | SYSTEMMPI | MPICH | MPICH-GM | HPMPI
#               | MPI | FJMPI | QSMPI | SGIMPI | INTELMPI
export WM_MPLIB=SYSTEMOPENMPI

Lines 169 to 174:
# Source user setup files for optional packages
# ~
_foamSource `$WM_PROJECT_DIR/bin/foamEtcFile config.sh/mpi`
_foamSource `$WM_PROJECT_DIR/bin/foamEtcFile config.sh/paraview`
_foamSource `$WM_PROJECT_DIR/bin/foamEtcFile config.sh/ensight`
_foamSource `$WM_PROJECT_DIR/bin/foamEtcFile config.sh/gperftools`


Do you think these are the right lines?

I wish you a healthy start into the new year,
Kai

On 30.12.20 at 15:25, tladd via users wrote:
Probably because OF cannot find your MPI installation. Once you 
set your OF environment, where is it looking for mpicc? Note that the 
OF environment overrides your .bashrc once you source the OF 
bashrc; it takes its settings from the src/etc directory in the 
OF source code.
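
A quick way to check this is to source the OF bashrc and see which mpicc 
ends up on the PATH. (A sketch; the /opt/openfoam7 install location is an 
assumption based on the stock openfoam.org packages.)

Code:

source /opt/openfoam7/etc/bashrc   # install path is an assumption
which mpicc                        # shows which compiler wrapper OF picks up
mpicc --version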



On 12/29/20 10:23 AM, Kahnbein Kai via users wrote:

Thank you for this hint. I installed OpenFOAM v8 (the newest) on my
computer and it works ...

With version v7 I still get this MPI error. I don't know why ...

I wish you a healthy start into the new year :)


On 28.12.20 at 19:16, Benson Muite via users wrote:
Have you tried reinstalling OpenFOAM? If you are mostly working on a
desktop, there are pre-compiled versions available:
https://openfoam.com/download/

If you are using a pre-compiled version, do also consider reporting
the error to the packager. It seems unlikely to be an MPI error, more
likely something with OpenFOAM and/or the setup.

On 12/28/20 6:25 PM, Kahnbein Kai via users wrote:

Good morning,
I'm trying to fix this error by myself and I have a little update.
The Open MPI version I use is:
Code:

kai@Kai-Desktop:~/Dokumente$ mpirun --version
mpirun (Open MPI) 4.0.3

If I create a *.c file with the following content:
Code:

#include <mpi.h>
#include <stdio.h>

int main(int argc, char** argv) {
 // Initialize the MPI environment
 MPI_Init(NULL, NULL);

 // Get the number of processes
 int world_size;
 MPI_Comm_size(MPI_COMM_WORLD, &world_size);

 // Get the rank of the process
 int world_rank;
 MPI_Comm_rank(MPI_COMM_WORLD, &world_rank);

 // Get the name of the processor
 char processor_name[MPI_MAX_PROCESSOR_NAME];
 int name_len;
 MPI_Get_processor_name(processor_name, &name_len);

 // Print off a hello world message
 printf("Hello world from processor %s, rank %d out of %d
processors\n",
    processor_name, world_rank, world_size);

 // Finalize the MPI environment.
 MPI_Finalize();
}

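The compile step isn't shown in the thread; with Open MPI it would 
presumably be something like this (the file name hello_world.c is an 
assumption):

Code:

mpicc hello_world.c -o hello_world   # source file name is an assumption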

After I compile it and execute it:
Code:

kai@Kai-Desktop:~/Dokumente$ mpirun -np 4 ./hello_world -parallel
Hello world from processor Kai-Desktop, rank 0 out of 4 processors
Hello world from processor Kai-Desktop, rank 1 out of 4 processors
Hello world from processor Kai-Desktop, rank 2 out of 4 processors
Hello world from processor Kai-Desktop, rank 3 out of 4 processors


In conclusion, MPI works on my computer, doesn't it?

Why doesn't OpenFOAM work with it?


Best regards
Kai

On 27.12.20 at 15:03, Kahnbein Kai wrote

Re: [OMPI users] MPI Exit Code:1 on an OpenFoam application

2021-01-10 Thread Tony Ladd via users

Kai

That means your case directory is mostly OK. Exactly what command did 
you use to run the executable? By serial mode I actually meant a single 
processor, for example:


simpleFoam

But then it's surprising that it uses multiple cores; it may be using 
multithreading by default.


On a multicore node something like

simpleFoam -parallel

might use a default number of cores (probably all of them).

For a proper parallel job you need to decompose the problem first with 
decomposePar. One possible source of error is the decomposeParDict file; 
if you don't get a proper decomposition, that can be a problem. There are 
examples online.

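For reference, a minimal system/decomposeParDict for four subdomains 
could look something like this (a sketch, not taken from the thread; the 
scotch method is one option that needs no explicit processor layout):

FoamFile
{
    version     2.0;
    format      ascii;
    class       dictionary;
    location    "system";
    object      decomposeParDict;
}

numberOfSubdomains 4;   // must match the -np given to mpirun

method          scotch; // scotch needs no processor layout coefficients
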

My typical run script would be something like

decomposePar

mpirun -np 4 simpleFoam -parallel 2>&1 | tee log

reconstructPar

You can check the decomposition with ParaView (in the individual 
processor directories).


Tony

