[ https://issues.apache.org/jira/browse/HADOOP-19167?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17845202#comment-17845202 ]

ASF GitHub Bot commented on HADOOP-19167:
-----------------------------------------

hadoop-yetus commented on PR #6807:
URL: https://github.com/apache/hadoop/pull/6807#issuecomment-2103901140

   :confetti_ball: **+1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |:----:|----------:|--------:|:--------:|:-------:|
   | +0 :ok: |  reexec  |   0m 33s |  |  Docker mode activated.  |
   |||| _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  0s |  |  No case conflicting files found.  |
   | +0 :ok: |  codespell  |   0m  0s |  |  codespell was not available.  |
   | +0 :ok: |  detsecrets  |   0m  0s |  |  detect-secrets was not available.  |
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain any @author tags.  |
   | +1 :green_heart: |  test4tests  |   0m  0s |  |  The patch appears to include 1 new or modified test files.  |
   |||| _ trunk Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |  44m 48s |  |  trunk passed  |
   | +1 :green_heart: |  compile  |  17m 24s |  |  trunk passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1  |
   | +1 :green_heart: |  compile  |  15m 54s |  |  trunk passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06  |
   | +1 :green_heart: |  checkstyle  |   1m 18s |  |  trunk passed  |
   | +1 :green_heart: |  mvnsite  |   1m 42s |  |  trunk passed  |
   | +1 :green_heart: |  javadoc  |   1m 16s |  |  trunk passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1  |
   | +1 :green_heart: |  javadoc  |   0m 54s |  |  trunk passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06  |
   | +1 :green_heart: |  spotbugs  |   2m 36s |  |  trunk passed  |
   | +1 :green_heart: |  shadedclient  |  35m 28s |  |  branch has no errors when building and testing our client artifacts.  |
   |||| _ Patch Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |   0m 57s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |  16m 36s |  |  the patch passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1  |
   | +1 :green_heart: |  javac  |  16m 36s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |  15m 46s |  |  the patch passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06  |
   | +1 :green_heart: |  javac  |  15m 46s |  |  the patch passed  |
   | +1 :green_heart: |  blanks  |   0m  0s |  |  The patch has no blanks issues.  |
   | +1 :green_heart: |  checkstyle  |   1m 14s |  |  the patch passed  |
   | +1 :green_heart: |  mvnsite  |   1m 40s |  |  the patch passed  |
   | +1 :green_heart: |  javadoc  |   1m 10s |  |  the patch passed with JDK Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1  |
   | +1 :green_heart: |  javadoc  |   0m 53s |  |  the patch passed with JDK Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06  |
   | +1 :green_heart: |  spotbugs  |   2m 46s |  |  the patch passed  |
   | +1 :green_heart: |  shadedclient  |  35m 39s |  |  patch has no errors when building and testing our client artifacts.  |
   |||| _ Other Tests _ |
   | +1 :green_heart: |  unit  |  20m 39s |  |  hadoop-common in the patch passed.  |
   | +1 :green_heart: |  asflicense  |   1m  3s |  |  The patch does not generate ASF License warnings.  |
   |  |   | 224m 13s |  |  |
   
   
   | Subsystem | Report/Notes |
   |----------:|:-------------|
   | Docker | ClientAPI=1.45 ServerAPI=1.45 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6807/2/artifact/out/Dockerfile |
   | GITHUB PR | https://github.com/apache/hadoop/pull/6807 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets |
   | uname | Linux 00683afafac5 5.15.0-101-generic #111-Ubuntu SMP Tue Mar 5 20:16:58 UTC 2024 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | dev-support/bin/hadoop.sh |
   | git revision | trunk / cbd328a2a6aa1b7487409e6620451e91c0e0ae53 |
   | Default Java | Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 |
   | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.22+7-post-Ubuntu-0ubuntu220.04.1 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_402-8u402-ga-2ubuntu1~20.04-b06 |
   | Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6807/2/testReport/ |
   | Max. process+thread count | 1429 (vs. ulimit of 5500) |
   | modules | C: hadoop-common-project/hadoop-common U: hadoop-common-project/hadoop-common |
   | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-6807/2/console |
   | versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 |
   | Powered by | Apache Yetus 0.14.0 https://yetus.apache.org |
   
   
   This message was automatically generated.
   
   




> Change of Codec configuration does not work
> -------------------------------------------
>
>                 Key: HADOOP-19167
>                 URL: https://issues.apache.org/jira/browse/HADOOP-19167
>             Project: Hadoop Common
>          Issue Type: Bug
>          Components: compress
>            Reporter: Zhikai Hu
>            Priority: Minor
>              Labels: pull-request-available
>
> In one of my projects, I need to dynamically adjust the compression level for 
> different files.
> However, I found that in most cases the new compression level does not take 
> effect as expected; the old compression level continues to be used.
> Here is the relevant code snippet:
> ZStandardCodec zStandardCodec = new ZStandardCodec();
> zStandardCodec.setConf(conf);
> conf.set("io.compression.codec.zstd.level", "5"); // level may change dynamically
> conf.set("io.compression.codec.zstd", zStandardCodec.getClass().getName());
> writer = SequenceFile.createWriter(conf,
>     SequenceFile.Writer.file(sequenceFilePath),
>     SequenceFile.Writer.keyClass(LongWritable.class),
>     SequenceFile.Writer.valueClass(BytesWritable.class),
>     SequenceFile.Writer.compression(CompressionType.BLOCK));
> The reason is that the SequenceFile.Writer.init() method calls 
> CodecPool.getCompressor(codec, null) to obtain a compressor.
> If the compressor is a reused instance, the configuration is never applied, 
> because it is passed as null:
> public static Compressor getCompressor(CompressionCodec codec, Configuration conf) {
>   Compressor compressor = borrow(compressorPool, codec.getCompressorType());
>   if (compressor == null) {
>     compressor = codec.createCompressor();
>     LOG.info("Got brand-new compressor [" + codec.getDefaultExtension() + "]");
>   } else {
>     compressor.reinit(conf);   // conf is null here
>     ......
>  
> Please also refer to my unit test to reproduce the bug. 
> To address this bug, I modified the code to ensure that the configuration is 
> read back from the codec when a compressor is reused.
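> As a rough, hypothetical sketch of that idea (not necessarily the exact change in 
> the pull request): when a pooled compressor is reused and no explicit Configuration 
> was passed in, CodecPool.getCompressor() could fall back to the codec's own 
> configuration (via org.apache.hadoop.conf.Configurable), so that codec-level 
> settings such as io.compression.codec.zstd.level are re-applied on reinit:
>
> public static Compressor getCompressor(CompressionCodec codec, Configuration conf) {
>   Compressor compressor = borrow(compressorPool, codec.getCompressorType());
>   if (compressor == null) {
>     compressor = codec.createCompressor();
>     LOG.info("Got brand-new compressor [" + codec.getDefaultExtension() + "]");
>   } else {
>     // Hypothetical fix: a reused compressor would otherwise keep its old settings,
>     // because reinit(null) has no configuration to read the new level from.
>     if (conf == null && codec instanceof Configurable) {
>       conf = ((Configurable) codec).getConf();  // read the conf back from the codec
>     }
>     compressor.reinit(conf);
>     ......
>
> With a change along these lines, the ZStandardCodec configured above would pick up 
> the new compression level even when the compressor instance comes from the pool.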



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
