[ https://issues.apache.org/jira/browse/HIVE-9125?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Jimmy Xiang resolved HIVE-9125.
-------------------------------
    Resolution: Fixed

Looked into the cluster and found that it is a log4j configuration issue, not a code problem. Fixed the configuration.
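For reference, this kind of double logging usually appears when the same console appender is reachable both directly from a named logger and again through the root logger, with additivity left at its default of true. The snippet below is only a sketch of that pattern, not the configuration that was actually on this cluster; the logger name and pattern layout are assumptions chosen to resemble the log lines quoted in the issue description below.

{noformat}
# Hypothetical log4j.properties illustrating the duplicate-output pattern.
# The named logger writes to "console" itself AND, because additivity
# defaults to true, forwards the same event to the root logger, which
# also writes to "console", so every line shows up twice.
log4j.rootLogger=INFO,console

log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{ISO8601} %-5p [%t]: %c{2} (%F:%M(%L)) - %m%n

# Redundant appender reference on the logger used by SparkClientImpl's
# stdout redirector (package name assumed here for illustration).
log4j.logger.org.apache.hive.spark.client=INFO,console

# Fix: remove the redundant appender reference above, or disable additivity:
# log4j.additivity.org.apache.hive.spark.client=false
{noformat}

With either change, each stdout line redirected by SparkClientImpl reaches the console appender exactly once.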
> RSC stdout is logged twice [Spark Branch]
> -----------------------------------------
>
>                 Key: HIVE-9125
>                 URL: https://issues.apache.org/jira/browse/HIVE-9125
>             Project: Hive
>          Issue Type: Sub-task
>          Components: Spark
>    Affects Versions: spark-branch
>            Reporter: Brock Noland
>            Assignee: Jimmy Xiang
>            Priority: Minor
>
> This is quite strange and I don't see the issue at first glance.
> {noformat}
> 2014-12-16 12:44:48,826 INFO [stdout-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(435)) - 2014-12-16T12:44:48.638-0500: [Full GC [PSYoungGen: 111616K->50711K(143360K)] [ParOldGen: 349385K->349385K(349696K)] 461001K->400097K(493056K) [PSPermGen: 58684K->58684K(58880K)], 0.1879000 secs] [Times: user=1.14 sys=0.00, real=0.19 secs]
> 2014-12-16 12:44:48,826 INFO [stdout-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(435)) - 2014-12-16T12:44:48.638-0500: [Full GC [PSYoungGen: 111616K->50711K(143360K)] [ParOldGen: 349385K->349385K(349696K)] 461001K->400097K(493056K) [PSPermGen: 58684K->58684K(58880K)], 0.1879000 secs] [Times: user=1.14 sys=0.00, real=0.19 secs]
> {noformat}
> {noformat}
> 2014-12-16 12:44:48,554 INFO [stdout-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(435)) - "sparkDriver-akka.actor.default-dispatcher-3" daemon prio=10 tid=0x00007f9e3c5cc000 nid=0x3698 runnable [0x00007f9e30376000]
> 2014-12-16 12:44:48,554 INFO [stdout-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(435)) - "sparkDriver-akka.actor.default-dispatcher-3" daemon prio=10 tid=0x00007f9e3c5cc000 nid=0x3698 runnable [0x00007f9e30376000]
> 2014-12-16 12:44:48,554 INFO [stdout-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(435)) - java.lang.Thread.State: RUNNABLE
> 2014-12-16 12:44:48,554 INFO [stdout-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(435)) - java.lang.Thread.State: RUNNABLE
> 2014-12-16 12:44:48,554 INFO [stdout-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(435)) - at org.apache.hive.com.esotericsoftware.kryo.util.DefaultClassResolver.readClass(DefaultClassResolver.java:109)
> 2014-12-16 12:44:48,554 INFO [stdout-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(435)) - at org.apache.hive.com.esotericsoftware.kryo.util.DefaultClassResolver.readClass(DefaultClassResolver.java:109)
> 2014-12-16 12:44:48,554 INFO [stdout-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(435)) - at org.apache.hive.com.esotericsoftware.kryo.Kryo.readClass(Kryo.java:656)
> 2014-12-16 12:44:48,554 INFO [stdout-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(435)) - at org.apache.hive.com.esotericsoftware.kryo.Kryo.readClass(Kryo.java:656)
> 2014-12-16 12:44:48,554 INFO [stdout-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(435)) - at org.apache.hive.com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:767)
> 2014-12-16 12:44:48,554 INFO [stdout-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(435)) - at org.apache.hive.com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:767)
> 2014-12-16 12:44:48,554 INFO [stdout-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(435)) - at org.apache.hive.com.esotericsoftware.kryo.serializers.MapSerializer.read(MapSerializer.java:131)
> 2014-12-16 12:44:48,554 INFO [stdout-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(435)) - at org.apache.hive.com.esotericsoftware.kryo.serializers.MapSerializer.read(MapSerializer.java:131)
> 2014-12-16 12:44:48,554 INFO [stdout-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(435)) - at org.apache.hive.com.esotericsoftware.kryo.serializers.MapSerializer.read(MapSerializer.java:17)
> 2014-12-16 12:44:48,554 INFO [stdout-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(435)) - at org.apache.hive.com.esotericsoftware.kryo.serializers.MapSerializer.read(MapSerializer.java:17)
> 2014-12-16 12:44:48,555 INFO [stdout-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(435)) - at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:694)
> 2014-12-16 12:44:48,555 INFO [stdout-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(435)) - at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:694)
> 2014-12-16 12:44:48,555 INFO [stdout-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(435)) - at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:106)
> 2014-12-16 12:44:48,555 INFO [stdout-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(435)) - at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:106)
> 2014-12-16 12:44:48,555 INFO [stdout-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(435)) - at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:507)
> 2014-12-16 12:44:48,555 INFO [stdout-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(435)) - at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:507)
> 2014-12-16 12:44:48,555 INFO [stdout-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(435)) - at org.apache.hive.com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:776)
> 2014-12-16 12:44:48,555 INFO [stdout-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(435)) - at org.apache.hive.com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:776)
> 2014-12-16 12:44:48,555 INFO [stdout-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(435)) - at org.apache.hive.com.esotericsoftware.kryo.serializers.MapSerializer.read(MapSerializer.java:139)
> 2014-12-16 12:44:48,555 INFO [stdout-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(435)) - at org.apache.hive.com.esotericsoftware.kryo.serializers.MapSerializer.read(MapSerializer.java:139)
> 2014-12-16 12:44:48,555 INFO [stdout-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(435)) - at org.apache.hive.com.esotericsoftware.kryo.serializers.MapSerializer.read(MapSerializer.java:17)
> 2014-12-16 12:44:48,555 INFO [stdout-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(435)) - at org.apache.hive.com.esotericsoftware.kryo.serializers.MapSerializer.read(MapSerializer.java:17)
> 2014-12-16 12:44:48,555 INFO [stdout-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(435)) - at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:694)
> 2014-12-16 12:44:48,555 INFO [stdout-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(435)) - at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:694)
> 2014-12-16 12:44:48,555 INFO [stdout-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(435)) - at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:106)
> 2014-12-16 12:44:48,555 INFO [stdout-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(435)) - at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:106)
> 2014-12-16 12:44:48,555 INFO [stdout-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(435)) - at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:507)
> 2014-12-16 12:44:48,555 INFO [stdout-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(435)) - at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:507)
> 2014-12-16 12:44:48,555 INFO [stdout-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(435)) - at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:672)
> 2014-12-16 12:44:48,555 INFO [stdout-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(435)) - at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:672)
> 2014-12-16 12:44:48,555 INFO [stdout-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(435)) - at org.apache.hadoop.hive.ql.exec.Utilities.deserializeObjectByKryo(Utilities.java:1050)
> 2014-12-16 12:44:48,555 INFO [stdout-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(435)) - at org.apache.hadoop.hive.ql.exec.Utilities.deserializeObjectByKryo(Utilities.java:1050)
> 2014-12-16 12:44:48,555 INFO [stdout-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(435)) - at org.apache.hadoop.hive.ql.exec.Utilities.deserializePlan(Utilities.java:941)
> 2014-12-16 12:44:48,555 INFO [stdout-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(435)) - at org.apache.hadoop.hive.ql.exec.Utilities.deserializePlan(Utilities.java:941)
> 2014-12-16 12:44:48,555 INFO [stdout-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(435)) - at org.apache.hadoop.hive.ql.exec.Utilities.deserializePlan(Utilities.java:955)
> 2014-12-16 12:44:48,555 INFO [stdout-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(435)) - at org.apache.hadoop.hive.ql.exec.Utilities.deserializePlan(Utilities.java:955)
> 2014-12-16 12:44:48,555 INFO [stdout-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(435)) - at org.apache.hadoop.hive.ql.exec.Utilities.getBaseWork(Utilities.java:397)
> 2014-12-16 12:44:48,555 INFO [stdout-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(435)) - at org.apache.hadoop.hive.ql.exec.Utilities.getBaseWork(Utilities.java:397)
> 2014-12-16 12:44:48,555 INFO [stdout-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(435)) - at org.apache.hadoop.hive.ql.exec.Utilities.getMapWork(Utilities.java:287)
> 2014-12-16 12:44:48,555 INFO [stdout-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(435)) - at org.apache.hadoop.hive.ql.exec.Utilities.getMapWork(Utilities.java:287)
> 2014-12-16 12:44:48,555 INFO [stdout-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(435)) - at org.apache.hadoop.hive.ql.io.CombineHiveInputFormat$CombineHiveInputSplit.<init>(CombineHiveInputFormat.java:101)
> 2014-12-16 12:44:48,555 INFO [stdout-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(435)) - at org.apache.hadoop.hive.ql.io.CombineHiveInputFormat$CombineHiveInputSplit.<init>(CombineHiveInputFormat.java:101)
> 2014-12-16 12:44:48,555 INFO [stdout-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(435)) - at org.apache.hadoop.hive.ql.io.CombineHiveInputFormat.getCombineSplits(CombineHiveInputFormat.java:441)
> 2014-12-16 12:44:48,555 INFO [stdout-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(435)) - at org.apache.hadoop.hive.ql.io.CombineHiveInputFormat.getCombineSplits(CombineHiveInputFormat.java:441)
> 2014-12-16 12:44:48,555 INFO [stdout-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(435)) - at org.apache.hadoop.hive.ql.io.CombineHiveInputFormat.getSplits(CombineHiveInputFormat.java:508)
> 2014-12-16 12:44:48,555 INFO [stdout-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(435)) - at org.apache.hadoop.hive.ql.io.CombineHiveInputFormat.getSplits(CombineHiveInputFormat.java:508)
> {noformat}

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)