That's a little odd. Usually, the specific .h files listed as dependencies come from somewhere -- typically from the GNU Autotools dependency analysis.

I'm guessing that /usr/lib/gcc/x86_64-linux-gnu/9/include/float.h doesn't actually exist on your system -- but then how did it get into Open MPI's makefiles? Did you run configure on one machine and make on a different machine, perchance?
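One thing you could check (just a sketch -- the exact .deps layout and the gcc-11 compiler names in the configure line below are guesses on my part): Automake records the per-object header dependencies in .deps/ fragments (*.Plo files for libtool objects) next to each object, so you can grep those for the stale GCC 9 path. If it shows up there, the dependency files were generated against a different (or since-removed) GCC installation, and wiping them out and re-running configure/make with the compiler you actually want usually clears it:

# Look for the stale GCC 9 header path in the Automake-generated
# dependency fragments (directory taken from your error output)
grep -rl '/usr/lib/gcc/x86_64-linux-gnu/9/include/float.h' \
    /home/laytonjb/src/openmpi-4.1.5/ompi/mpi/c/profile/.deps/

# If it's in there, start over with the compiler you intend to use
# (the CC/CXX/FC names are just an example for GCC 11; adjust as needed)
cd /home/laytonjb/src/openmpi-4.1.5
make distclean
./configure CC=gcc-11 CXX=g++-11 FC=gfortran-11
make -j 4

If the path doesn't appear in any .deps fragment, that points more strongly at configure itself having picked up a different GCC than the one make is now using.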
________________________________
From: users <users-boun...@lists.open-mpi.org> on behalf of Jeffrey Layton via users <users@lists.open-mpi.org>
Sent: Monday, July 17, 2023 2:05 PM
To: Open MPI Users <users@lists.open-mpi.org>
Cc: Jeffrey Layton <layto...@gmail.com>
Subject: [OMPI users] Error build Open MPI 4.1.5 with GCC 11.3

Good afternoon,

I'm trying to build Open MPI 4.1.5 using GCC 11.3. However, I get an error that I'm not sure how to correct. The error is,

...
  CC       pscatter.lo
  CC       piscatter.lo
  CC       pscatterv.lo
  CC       piscatterv.lo
  CC       psend.lo
  CC       psend_init.lo
  CC       psendrecv.lo
  CC       psendrecv_replace.lo
  CC       pssend_init.lo
  CC       pssend.lo
  CC       pstart.lo
  CC       pstartall.lo
  CC       pstatus_c2f.lo
  CC       pstatus_f2c.lo
  CC       pstatus_set_cancelled.lo
  CC       pstatus_set_elements.lo
  CC       pstatus_set_elements_x.lo
  CC       ptestall.lo
  CC       ptestany.lo
  CC       ptest.lo
  CC       ptest_cancelled.lo
  CC       ptestsome.lo
  CC       ptopo_test.lo
  CC       ptype_c2f.lo
  CC       ptype_commit.lo
  CC       ptype_contiguous.lo
  CC       ptype_create_darray.lo
make[3]: *** No rule to make target '/usr/lib/gcc/x86_64-linux-gnu/9/include/float.h', needed by 'ptype_create_f90_complex.lo'. Stop.
make[3]: Leaving directory '/home/laytonjb/src/openmpi-4.1.5/ompi/mpi/c/profile'
make[2]: *** [Makefile:2559: all-recursive] Error 1
make[2]: Leaving directory '/home/laytonjb/src/openmpi-4.1.5/ompi/mpi/c'
make[1]: *** [Makefile:3566: all-recursive] Error 1
make[1]: Leaving directory '/home/laytonjb/src/openmpi-4.1.5/ompi'
make: *** [Makefile:1912: all-recursive] Error 1

Here is the configuration output from configure:

Open MPI configuration:
-----------------------
Version: 4.1.5
Build MPI C bindings: yes
Build MPI C++ bindings (deprecated): no
Build MPI Fortran bindings: mpif.h, use mpi, use mpi_f08
MPI Build Java bindings (experimental): no
Build Open SHMEM support: false (no spml)
Debug build: no
Platform file: (none)

Miscellaneous
-----------------------
CUDA support: no
HWLOC support: external
Libevent support: internal
Open UCC: no
PMIx support: Internal

Transports
-----------------------
Cisco usNIC: no
Cray uGNI (Gemini/Aries): no
Intel Omnipath (PSM2): no
Intel TrueScale (PSM): no
Mellanox MXM: no
Open UCX: no
OpenFabrics OFI Libfabric: no
OpenFabrics Verbs: no
Portals4: no
Shared memory/copy in+copy out: yes
Shared memory/Linux CMA: yes
Shared memory/Linux KNEM: no
Shared memory/XPMEM: no
TCP: yes

Resource Managers
-----------------------
Cray Alps: no
Grid Engine: no
LSF: no
Moab: no
Slurm: yes
ssh/rsh: yes
Torque: no

OMPIO File Systems
-----------------------
DDN Infinite Memory Engine: no
Generic Unix FS: yes
IBM Spectrum Scale/GPFS: no
Lustre: no
PVFS2/OrangeFS: no

Any suggestions?

Thanks!

Jeff