[OMPI users] Segmentation fault

2023-08-09 Thread Aziz Ogutlu via users
Hi there all, We're using SU2 with OpenMPI 4.0.3 and gcc 8.5.0 on Red Hat 7.9. We compiled all components for use on our HPC system. When I run SU2 with the QuickStart config file under OpenMPI, it gives an error like the one in the attached file. The command is: mpirun -np 8 --allow-run-as-root SU2_CFD inv_NACA0012.cfg

Re: [OMPI users] Segmentation fault

2023-08-09 Thread Jeff Squyres (jsquyres) via users
I'm afraid I don't know anything about the SU2 application. You are using Open MPI v4.0.3, which is fairly old. Many bug fixes have been released since that version. Can you upgrade to the latest version of Open MPI (v4.1.5)?
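[Editor's note: when upgrading, it is worth confirming that the binary is really linked against the new library at runtime. A minimal sketch using the standard MPI_Get_library_version call is below; nothing SU2-specific is assumed, and the file name is arbitrary.]

    /* version_check.c - print the MPI library version actually linked in.
     * Build:  mpicc version_check.c -o version_check
     * Run:    mpirun -np 1 ./version_check
     */
    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        char version[MPI_MAX_LIBRARY_VERSION_STRING];
        int len;

        MPI_Init(&argc, &argv);
        MPI_Get_library_version(version, &len);  /* e.g. "Open MPI v4.1.5, ..." */
        printf("%s\n", version);
        MPI_Finalize();
        return 0;
    }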

Re: [OMPI users] Segmentation fault

2023-08-09 Thread Aziz Ogutlu via users
Hi Jeff, I also tried with OpenMPI 4.1.5 and got the same error.

Re: [OMPI users] Segmentation fault

2023-08-09 Thread Jeff Squyres (jsquyres) via users
Ok, thanks for upgrading. Are you also using the latest version of SU2? Without knowing what that application is doing, it's a little hard to debug the issue from our side. At first glance, it looks like it is crashing when it has completed writing a file and is attempting to close it. [...]

Re: [OMPI users] Segmentation fault

2023-08-09 Thread Jeff Squyres (jsquyres) via users
Without knowing anything about SU2, we can't really help debug the issue. The seg fault stack trace that you provided was quite deep; we don't really have the resources to go learn about how a complex application like SU2 is implemented -- sorry! Can you or they provide a small, simple MPI application that reproduces the issue?
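[Editor's note: a reproducer along the lines Jeff is asking for might look like the sketch below: a minimal MPI-IO program in which each rank writes a block and the file is then closed, mirroring the write-then-close step where the SU2 trace appears to crash. The file name and record size are arbitrary assumptions.]

    /* mpiio_close_repro.c - minimal write-then-close MPI-IO test.
     * Build:  mpicc mpiio_close_repro.c -o mpiio_close_repro
     * Run:    mpirun -np 8 ./mpiio_close_repro
     */
    #include <mpi.h>
    #include <stdio.h>
    #include <string.h>

    int main(int argc, char **argv)
    {
        int rank;
        char buf[64];
        MPI_File fh;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        memset(buf, 0, sizeof(buf));
        snprintf(buf, sizeof(buf), "hello from rank %d\n", rank);

        /* Each rank writes a fixed-size record at its own offset. */
        MPI_File_open(MPI_COMM_WORLD, "repro.out",
                      MPI_MODE_CREATE | MPI_MODE_WRONLY, MPI_INFO_NULL, &fh);
        MPI_File_write_at(fh, (MPI_Offset)rank * sizeof(buf),
                          buf, sizeof(buf), MPI_CHAR, MPI_STATUS_IGNORE);

        /* The SU2 trace appears to crash around the close step. */
        MPI_File_close(&fh);

        MPI_Finalize();
        return 0;
    }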

Re: [OMPI users] Segmentation fault

2023-08-09 Thread Aziz Ogutlu via users
Hi Jeff, I'm also using the latest version of SU2, and I have opened an issue on its GitHub page. They say it could be related to OpenMPI :)

Re: [OMPI users] MPI I/O, Romio vs Ompio on GPFS

2023-08-09 Thread Latham, Robert J. via users
Hah! Look at me resurrecting this old thread... I should check in on my OpenMPI folder more often than every 18 months. Historically, GPFS performs best with block-aligned I/O. Today, the performance difference between unaligned and aligned I/O is not as dramatic as it used to be (at least on ORNL's [...]
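[Editor's note: to experiment with alignment from the application side, one option is to pass an MPI-IO hint before opening the file, as in the sketch below. "striping_unit" is a reserved MPI-IO hint that ROMIO honors; the 16 MB value is only an assumption -- substitute the actual GPFS block size for your filesystem (visible with the GPFS mmlsfs tool). Which I/O component Open MPI uses can be selected via the io MCA framework, e.g. mpirun --mca io ompio, which is the ROMIO-vs-OMPIO comparison this thread is about.]

    /* Sketch: passing an alignment hint to MPI-IO before opening a file
     * on GPFS. The 16 MB block size below is an assumption.
     */
    #include <mpi.h>

    void open_aligned(MPI_Comm comm, const char *path, MPI_File *fh)
    {
        MPI_Info info;
        MPI_Info_create(&info);

        /* Align I/O to the (assumed) 16 MB GPFS block size. */
        MPI_Info_set(info, "striping_unit", "16777216");

        MPI_File_open(comm, path, MPI_MODE_CREATE | MPI_MODE_WRONLY, info, fh);
        MPI_Info_free(&info);
    }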