Thank you, as always, for the comments! I'd like to follow the first option and install MPI on my machine.
I installed openmpi-2.0.1. Coming back to the original problem, I tried to install Trilinos 12.4.2 again, but now I receive the following error while building it. What might be the problem this time? Do I need other packages for MPI?

[  5%] Building CXX object packages/teuchos/core/src/CMakeFiles/teuchoscore.dir/Teuchos_VerbosityLevel.cpp.o
[  5%] Building CXX object packages/teuchos/core/src/CMakeFiles/teuchoscore.dir/Teuchos_Workspace.cpp.o
[  5%] Building CXX object packages/teuchos/core/src/CMakeFiles/teuchoscore.dir/Teuchos_dyn_cast.cpp.o
[  5%] Building CXX object packages/teuchos/core/src/CMakeFiles/teuchoscore.dir/Teuchos_stacktrace.cpp.o
[  5%] Linking CXX shared library libteuchoscore.dylib
Undefined symbols for architecture x86_64:
  "_MPI_Allgather", referenced from:
      Teuchos::GlobalMPISession::allGather(int, Teuchos::ArrayView<int> const&) in Teuchos_GlobalMPISession.cpp.o
  "_MPI_Allreduce", referenced from:
      Teuchos::GlobalMPISession::sum(int) in Teuchos_GlobalMPISession.cpp.o
  "_MPI_Barrier", referenced from:
      Teuchos::GlobalMPISession::barrier() in Teuchos_GlobalMPISession.cpp.o
  "_MPI_Comm_rank", referenced from:
      Teuchos::GlobalMPISession::initialize(std::__1::basic_ostream<char, std::__1::char_traits<char> >*) in Teuchos_GlobalMPISession.cpp.o
  "_MPI_Comm_size", referenced from:
      Teuchos::GlobalMPISession::initialize(std::__1::basic_ostream<char, std::__1::char_traits<char> >*) in Teuchos_GlobalMPISession.cpp.o
  "_MPI_Finalize", referenced from:
      Teuchos::GlobalMPISession::~GlobalMPISession() in Teuchos_GlobalMPISession.cpp.o
  "_MPI_Get_processor_name", referenced from:
      Teuchos::GlobalMPISession::GlobalMPISession(int*, char***, std::__1::basic_ostream<char, std::__1::char_traits<char> >*) in Teuchos_GlobalMPISession.cpp.o
  "_MPI_Init", referenced from:
      Teuchos::GlobalMPISession::GlobalMPISession(int*, char***, std::__1::basic_ostream<char, std::__1::char_traits<char> >*) in Teuchos_GlobalMPISession.cpp.o
  "_MPI_Initialized", referenced from:
      Teuchos::GlobalMPISession::GlobalMPISession(int*, char***, std::__1::basic_ostream<char, std::__1::char_traits<char> >*) in Teuchos_GlobalMPISession.cpp.o
      Teuchos::GlobalMPISession::initialize(std::__1::basic_ostream<char, std::__1::char_traits<char> >*) in Teuchos_GlobalMPISession.cpp.o
      Teuchos::Time::Time(std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> > const&, bool) in Teuchos_Time.cpp.o
      Teuchos::Time::start(bool) in Teuchos_Time.cpp.o
      Teuchos::Time::Time(std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> > const&, bool) in Teuchos_Time.cpp.o
      Teuchos::Time::wallTime() in Teuchos_Time.cpp.o
      Teuchos::Time::stop() in Teuchos_Time.cpp.o
      ...
  "_MPI_Wtime", referenced from:
      Teuchos::Time::Time(std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> > const&, bool) in Teuchos_Time.cpp.o
      Teuchos::Time::start(bool) in Teuchos_Time.cpp.o
      Teuchos::Time::Time(std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> > const&, bool) in Teuchos_Time.cpp.o
      Teuchos::Time::wallTime() in Teuchos_Time.cpp.o
      Teuchos::Time::stop() in Teuchos_Time.cpp.o
      Teuchos::Time::totalElapsedTime(bool) const in Teuchos_Time.cpp.o
  "_ompi_mpi_comm_world", referenced from:
      Teuchos::GlobalMPISession::initialize(std::__1::basic_ostream<char, std::__1::char_traits<char> >*) in Teuchos_GlobalMPISession.cpp.o
      Teuchos::GlobalMPISession::barrier() in Teuchos_GlobalMPISession.cpp.o
      Teuchos::GlobalMPISession::sum(int) in Teuchos_GlobalMPISession.cpp.o
      Teuchos::GlobalMPISession::allGather(int, Teuchos::ArrayView<int> const&) in Teuchos_GlobalMPISession.cpp.o
  "_ompi_mpi_int", referenced from:
      Teuchos::GlobalMPISession::sum(int) in Teuchos_GlobalMPISession.cpp.o
      Teuchos::GlobalMPISession::allGather(int, Teuchos::ArrayView<int> const&) in Teuchos_GlobalMPISession.cpp.o
  "_ompi_mpi_op_sum", referenced from:
      Teuchos::GlobalMPISession::sum(int) in Teuchos_GlobalMPISession.cpp.o
ld: symbol(s) not found for architecture x86_64
clang: error: linker command failed with exit code 1 (use -v to see invocation)
make[2]: *** [packages/teuchos/core/src/libteuchoscore.12.4.2.dylib] Error 1
make[1]: *** [packages/teuchos/core/src/CMakeFiles/teuchoscore.dir/all] Error 2
make: *** [all] Error 2
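One guess on my side: since 'mpi.h' is apparently found now, maybe the MPI library itself is just not being linked, i.e. cmake never learned where my OpenMPI installation lives. Would I need to point Trilinos at it explicitly, with something like the configure line below? This is only a sketch of what I have in mind: TPL_ENABLE_MPI and MPI_BASE_DIR are the Trilinos cmake options I am aware of, and the OpenMPI path is a placeholder for wherever openmpi-2.0.1 actually ended up on my machine.

    # Sketch of a Trilinos configure, run from a build directory next to the source.
    # /usr/local/openmpi-2.0.1 is a placeholder for the real OpenMPI install prefix.
    cmake \
      -D TPL_ENABLE_MPI:BOOL=ON \
      -D MPI_BASE_DIR:PATH=/usr/local/openmpi-2.0.1 \
      -D CMAKE_INSTALL_PREFIX:PATH=$HOME/bin/trilinos-12.4.2 \
      ../trilinos-12.4.2-Source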
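Also, to rule out a broken OpenMPI installation itself, I assume a minimal test program like the following (my own sketch, unrelated to Trilinos) should compile, link, and run through the mpicxx wrapper:

    // mpi_hello.cpp -- minimal smoke test for the MPI installation
    #include <mpi.h>
    #include <iostream>

    int main(int argc, char** argv)
    {
      MPI_Init(&argc, &argv);                 // would fail to link if the MPI library were missing
      int rank = 0, size = 0;
      MPI_Comm_rank(MPI_COMM_WORLD, &rank);   // this process's rank within MPI_COMM_WORLD
      MPI_Comm_size(MPI_COMM_WORLD, &size);   // total number of ranks
      std::cout << "Hello from rank " << rank << " of " << size << std::endl;
      MPI_Finalize();
      return 0;
    }

compiled and run with

    mpicxx mpi_hello.cpp -o mpi_hello
    mpirun -np 2 ./mpi_hello

If that works, I suppose the problem is only in how I configured Trilinos.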
"_MPI_Wtime", referenced from: Teuchos::Time::Time(std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> > const&, bool) in Teuchos_Time.cpp.o Teuchos::Time::start(bool) in Teuchos_Time.cpp.o Teuchos::Time::Time(std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> > const&, bool) in Teuchos_Time.cpp.o Teuchos::Time::wallTime() in Teuchos_Time.cpp.o Teuchos::Time::stop() in Teuchos_Time.cpp.o Teuchos::Time::totalElapsedTime(bool) const in Teuchos_Time.cpp.o "_ompi_mpi_comm_world", referenced from: Teuchos::GlobalMPISession::initialize(std::__1::basic_ostream<char, std::__1::char_traits<char> >*) in Teuchos_GlobalMPISession.cpp.o Teuchos::GlobalMPISession::barrier() in Teuchos_GlobalMPISession.cpp.o Teuchos::GlobalMPISession::sum(int) in Teuchos_GlobalMPISession.cpp.o Teuchos::GlobalMPISession::allGather(int, Teuchos::ArrayView<int> const&) in Teuchos_GlobalMPISession.cpp.o "_ompi_mpi_int", referenced from: Teuchos::GlobalMPISession::sum(int) in Teuchos_GlobalMPISession.cpp.o Teuchos::GlobalMPISession::allGather(int, Teuchos::ArrayView<int> const&) in Teuchos_GlobalMPISession.cpp.o "_ompi_mpi_op_sum", referenced from: Teuchos::GlobalMPISession::sum(int) in Teuchos_GlobalMPISession.cpp.o ld: symbol(s) not found for architecture x86_64 clang: error: linker command failed with exit code 1 (use -v to see invocation) make[2]: *** [packages/teuchos/core/src/libteuchoscore.12.4.2.dylib] Error 1 make[1]: *** [packages/teuchos/core/src/CMakeFiles/teuchoscore.dir/all] Error 2 make: *** [all] Error 2 2016년 10월 22일 토요일 오후 9시 14분 1초 UTC-5, Wolfgang Bangerth 님의 말: > > > Jaekwang, > > > > */Users/jaekwangjk/Programs/trilinos-12.4.2-Source/packages/teuchos/core/src/Teuchos_GlobalMPISession.cpp:49:12: > > > > **fatal error: * > > > > * 'mpi.h' file not found* > > If you configure Trilinos to use MPI (which you did when you called > Trilinos' > cmake script), then you obviously have to have MPI installed on your > system. > You have two options: You can either install the MPI package on your > machine, > or you can tell Trilinos not to use MPI. That choice is yours. > > Best > W. > -- > ------------------------------------------------------------------------ > Wolfgang Bangerth email: bang...@colostate.edu > <javascript:> > www: http://www.math.colostate.edu/~bangerth/ > > -- The deal.II project is located at http://www.dealii.org/ For mailing list/forum options, see https://groups.google.com/d/forum/dealii?hl=en --- You received this message because you are subscribed to the Google Groups "deal.II User Group" group. To unsubscribe from this group and stop receiving emails from it, send an email to dealii+unsubscr...@googlegroups.com. For more options, visit https://groups.google.com/d/optout.