Hi Antoine,

Yes, I did. It was installed in the correct directory in the correct conda
env. Moreover, the pyarrow build can definitely find the other libraries
under the same directory, such as libarrow.a and libarrow.dylib, but oddly
not libparquet.a or libparquet.dylib.
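
For reference, here is the quick check I'm using to see what actually
landed in the install prefix (just a sketch, assuming $ARROW_HOME is the
same prefix passed to -DCMAKE_INSTALL_PREFIX in my cmake invocation quoted
below):

    ls $ARROW_HOME/lib | grep -E 'arrow|parquet'
    # I would expect both libarrow.a/.dylib and libparquet.a/.dylib here
    # if "make install" also installed the Parquet component.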

Ian

On Thursday, February 3, 2022, Antoine Pitrou <anto...@python.org> wrote:

>
> Hi Ian,
>
> Did you run "make install" as well after compiling Arrow C++? Perhaps
> PyArrow is picking up an old installed version?
>
> Regards
>
> Antoine.
>
> Le 03/02/2022 à 08:35, Ian Joiner a écrit :
>
>> Hi,
>>
>> In order to prevent problematic PRs from happening again, I’m cleaning up
>> my local env.
>>
>> Here is my cmake invocation:
>>
>> cmake -DCMAKE_INSTALL_PREFIX=$ARROW_HOME \
>>        -DCMAKE_INSTALL_LIBDIR=lib \
>>        -DCMAKE_BUILD_TYPE=debug \
>>        -DARROW_WITH_BZ2=ON \
>>        -DARROW_WITH_ZLIB=ON \
>>        -DARROW_WITH_ZSTD=ON \
>>        -DARROW_WITH_LZ4=ON \
>>        -DARROW_WITH_SNAPPY=ON \
>>        -DARROW_WITH_BROTLI=ON \
>>        -DARROW_PARQUET=ON \
>>        -DARROW_PYTHON=ON \
>>        -DARROW_ORC=ON \
>>        -DARROW_BUILD_TESTS=ON \
>>        ..
>>
>> After running make -j4 as in
>> https://github.com/apache/arrow/blob/master/docs/source/developers/python.rst,
>> I ran the C++ unit tests, all of which passed. Parquet clearly exists.
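>>
>> For reference, here is roughly how I confirmed that Parquet was built and
>> its tests pass (just a sketch, assuming Arrow's usual layout of placing
>> build outputs under <build-dir>/debug/ for a debug build):
>>
>>     ctest -R parquet        # run only the Parquet-related C++ tests
>>     ls debug/libparquet*    # the freshly built Parquet libraries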
>>
>> After running make install, I began to build pyarrow. However, unless I
>> turn off Parquet, I get a Parquet error:
>>
>> CMake Error at /Users/karlkatzen/anaconda3/envs/pyarrow-dev/share/cmake-3.21/Modules/FindPackageHandleStandardArgs.cmake:230 (message):
>>    Could NOT find Parquet (missing: PARQUET_INCLUDE_DIR PARQUET_LIB_DIR
>>    PARQUET_SO_VERSION)
>> Call Stack (most recent call first):
>>    /Users/karlkatzen/anaconda3/envs/pyarrow-dev/share/cmake-3.21/Modules/FindPackageHandleStandardArgs.cmake:594 (_FPHSA_FAILURE_MESSAGE)
>>    cmake_modules/FindParquet.cmake:115 (find_package_handle_standard_args)
>>    CMakeLists.txt:447 (find_package)
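>>
>> For completeness, this is how I'm toggling Parquet when building pyarrow
>> (a sketch of the environment-variable switches described in the
>> python.rst guide linked above; the error only goes away if I disable
>> Parquet here):
>>
>>     export PYARROW_WITH_PARQUET=1   # setting this to 0 is how I "turn off" Parquet
>>     export PYARROW_WITH_ORC=1       # matches -DARROW_ORC=ON above
>>     python setup.py build_ext --inplace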
>>
>> What’s the cause of this problem?
>>
>> Ian
>>
>>
>>
>>
