Young Park created ARROW-1130:
---------------------------------

             Summary: io-hdfs-test failure when built with GCC 4.8
                 Key: ARROW-1130
                 URL: https://issues.apache.org/jira/browse/ARROW-1130
             Project: Apache Arrow
          Issue Type: Bug
          Components: C++
         Environment: Ubuntu 16.04, GCC 4.8, Parquet-cpp
            Reporter: Young Park
            Priority: Blocker


Hi,

I have noticed that arrow-cpp's io-hdfs-test fails when compiled with GCC 4.8 
but passes when compiled with GCC 5.4. In the GCC 5.4 build it only "passes" 
because it cannot connect to the HDFS client and therefore skips all of the 
tests.

Looking at the test output log, it asked me to set the ARROW_HDFS_TEST_USER 
variable. After setting that variable to 'root' and ARROW_HDFS_TEST_PORT to 
'9000' (the port I use to connect to my local HDFS), the test passes.
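For reference, the setup described above can be sketched as a few exports 
before re-running the test. The variable names ARROW_HDFS_TEST_USER and 
ARROW_HDFS_TEST_PORT come from the test log; ARROW_HDFS_TEST_HOST and the 
binary path are assumptions about the build layout:

```shell
# Assumption: a local HDFS namenode listens on port 9000 and the
# tests should authenticate as the 'root' Hadoop user.
export ARROW_HDFS_TEST_USER=root
export ARROW_HDFS_TEST_PORT=9000
export ARROW_HDFS_TEST_HOST=localhost   # assumed default-host variable

# Then re-run the test binary from the cpp build directory, e.g.:
# ./debug/io-hdfs-test
```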

Do I need to configure the environment and the variables in a specific way to 
get it to work?

I'm mainly asking because I am trying to use the Arrow and Parquet C++ 
libraries in an external project, where I keep running into segfaults in the 
libhdfs jni_helper. This happens even though I can successfully connect to 
HDFS on my local Hadoop cluster and even read a single Parquet file. I'm 
hoping that resolving this will also help me figure out the issue in my 
external project.
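One thing worth checking for crashes inside jni_helper: libhdfs loads a JVM 
at runtime and needs the Hadoop jars on CLASSPATH, so a missing or incomplete 
CLASSPATH/JAVA_HOME in the external project's environment is a common cause. 
A minimal sketch, assuming HADOOP_HOME points at your Hadoop install (config 
fragment only, not a definitive fix):

```shell
# Assumption: HADOOP_HOME and JAVA_HOME point at your local installs.
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64   # example path
export CLASSPATH=$("$HADOOP_HOME/bin/hadoop" classpath --glob)
```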

Thank you in advance for your help.




--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
