LuciferYang commented on PR #45583:
URL: https://github.com/apache/spark/pull/45583#issuecomment-2255088028

   
   Sorry to disturb everyone, but when I run `OrcEncryptionSuite` on my M2 Max, I see a difference between Hadoop 3.4.0 and Hadoop 3.3.4.
   
   `build/sbt clean "sql/testOnly org.apache.spark.sql.execution.datasources.orc.OrcEncryptionSuite"`
   
   - branch-3.5 (with Hadoop 3.3.4)
   
   ```
   [info] OrcEncryptionSuite:
   14:44:11.580 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
   [info] - Write and read an encrypted file (1 second, 921 milliseconds)
   [info] - Write and read an encrypted table (374 milliseconds)
   [info] - SPARK-35325: Write and read encrypted nested columns (358 milliseconds)
   [info] - SPARK-35992: Write and read fully-encrypted columns with default masking (570 milliseconds)
   14:44:15.461 WARN org.apache.spark.sql.execution.datasources.orc.OrcEncryptionSuite: 
   
   [info] Run completed in 4 seconds, 694 milliseconds.
   [info] Total number of tests run: 4
   [info] Suites: completed 1, aborted 0
   [info] Tests: succeeded 4, failed 0, canceled 0, ignored 0, pending 0
   [info] All tests passed.
   ```
   
   - master (with Hadoop 3.4.0)
   
   ```
   [info] OrcEncryptionSuite:
   14:49:15.267 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
   14:49:17.636 WARN org.apache.hadoop.crypto.OpensslCipher: Failed to load OpenSSL Cipher.
   java.lang.UnsatisfiedLinkError: 'boolean org.apache.hadoop.util.NativeCodeLoader.buildSupportsOpenssl()'
        at org.apache.hadoop.util.NativeCodeLoader.buildSupportsOpenssl(Native Method)
        at org.apache.hadoop.crypto.OpensslCipher.<clinit>(OpensslCipher.java:86)
        at org.apache.hadoop.crypto.OpensslAesCtrCryptoCodec.<init>(OpensslAesCtrCryptoCodec.java:36)
        at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:77)
        at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.base/java.lang.reflect.Constructor.newInstanceWithCaller(Constructor.java:499)
   [info] - Write and read an encrypted file (2 seconds, 343 milliseconds)
   [info] - Write and read an encrypted table (405 milliseconds)
   [info] - SPARK-35325: Write and read encrypted nested columns (308 milliseconds)
   [info] - SPARK-35992: Write and read fully-encrypted columns with default masking (555 milliseconds)
   14:49:19.493 WARN org.apache.spark.sql.execution.datasources.orc.OrcEncryptionSuite: 
   
   [info] Run completed in 5 seconds, 84 milliseconds.
   [info] Total number of tests run: 4
   [info] Suites: completed 1, aborted 0
   [info] Tests: succeeded 4, failed 0, canceled 0, ignored 0, pending 0
   [info] All tests passed.
   ```
   
   When using Hadoop 3.4.0, although no tests fail, an `UnsatisfiedLinkError` is thrown while loading the OpenSSL cipher. Is this expected, or do I need to configure additional dependencies? At the moment, this issue should only occur on Apple Silicon chips. @dongjoon-hyun @steveloughran
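   
   In case it helps with triage, the warning can be reproduced outside the suite by probing Hadoop's native-code loader directly. The snippet below is only a rough sketch (the `NativeOpensslCheck` object name is mine, and it assumes the Hadoop client jars are on the classpath); it calls `NativeCodeLoader.isNativeCodeLoaded()` and `OpensslCipher.getLoadingFailureReason()`, which touches the same static initializer that logs the warning above.
   
   ```scala
   import org.apache.hadoop.crypto.OpensslCipher
   import org.apache.hadoop.util.NativeCodeLoader
   
   // Hypothetical standalone check, not part of the suite: report whether the
   // native libhadoop and its OpenSSL binding are available on this machine.
   object NativeOpensslCheck {
     def main(args: Array[String]): Unit = {
       // On Apple Silicon no native libhadoop is bundled, so this is expected to be false.
       println(s"native-hadoop loaded: ${NativeCodeLoader.isNativeCodeLoaded()}")
   
       // getLoadingFailureReason() returns null when the OpenSSL cipher loaded
       // successfully; otherwise it describes why loading failed. Referencing
       // OpensslCipher here runs the same <clinit> that produced the WARN above.
       val reason = OpensslCipher.getLoadingFailureReason()
       val status = if (reason == null) "available" else s"unavailable: $reason"
       println(s"OpenSSL cipher: $status")
     }
   }
   ```
   
   If a full Hadoop distribution is at hand, `hadoop checknative -a` should report the same openssl status from the command line.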

