[ https://issues.apache.org/jira/browse/HIVE-15887?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15863160#comment-15863160 ]

Rui Li commented on HIVE-15887:
-------------------------------

Hi [~KaiXu], the stack trace you posted is just a warning; by itself it shouldn't make the query fail:
{code}
  @Override
  public String getAppID() {
    Future<String> getAppID = sparkClient.run(new GetAppIDJob());
    try {
      return getAppID.get(sparkClientTimeoutInSeconds, TimeUnit.SECONDS);
    } catch (Exception e) {
      // Any failure here (including TimeoutException) is logged as a warning
      // and swallowed; the caller simply gets a null app ID.
      LOG.warn("Failed to get APP ID.", e);
      return null;
    }
  }
{code}
Please check your log again to find the real cause of the failure. Btw, this has 
nothing to do with Spark itself, so there's no need to file a Spark JIRA.
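To see why the timeout is non-fatal on its own, the pattern above can be reproduced in isolation. This is a minimal sketch, not Hive code: the {{AppIdTimeoutDemo}} class, the executor, and the hard-coded sleep are hypothetical stand-ins for {{sparkClient.run(new GetAppIDJob())}}.

{code:java}
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.concurrent.TimeUnit;

public class AppIdTimeoutDemo {

    // Stand-in for RemoteSparkJobStatus.getAppID(): a timed Future.get()
    // whose TimeoutException is downgraded to a warning plus a null result.
    static String getAppId(Future<String> future, long timeoutSeconds) {
        try {
            return future.get(timeoutSeconds, TimeUnit.SECONDS);
        } catch (Exception e) {
            System.err.println("Failed to get APP ID (non-fatal): " + e);
            return null; // the caller proceeds without an app ID
        }
    }

    public static void main(String[] args) {
        ExecutorService pool = Executors.newSingleThreadExecutor();
        // A job that outlives the timeout, forcing the TimeoutException path.
        Future<String> slow = pool.submit(() -> {
            Thread.sleep(10_000);
            return "application_0001";
        });
        String id = getAppId(slow, 1);
        System.out.println(id == null ? "no app ID, query continues" : id);
        pool.shutdownNow();
    }
}
{code}

The point is that the catch block only logs and returns null, so a slow driver start-up degrades the status reporting but does not, on its own, abort the SparkTask.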

> could not get APP ID and cause failed to connect to spark driver on 
> yarn-client mode
> ------------------------------------------------------------------------------------
>
>                 Key: HIVE-15887
>                 URL: https://issues.apache.org/jira/browse/HIVE-15887
>             Project: Hive
>          Issue Type: Bug
>          Components: Hive, Spark
>    Affects Versions: 2.2.0
>         Environment: Hive2.2
> Spark2.0.2
> hadoop2.7.1
>            Reporter: KaiXu
>
> {noformat}
> 2017-02-13T03:10:01,639 INFO [stderr-redir-1] client.SparkClientImpl: 17/02/13 03:10:01 INFO yarn.Client: Application report for application_1486905599813_0046 (state: ACCEPTED)
> 2017-02-13T03:10:06,640 INFO [stderr-redir-1] client.SparkClientImpl: 17/02/13 03:10:06 INFO yarn.Client: Application report for application_1486905599813_0046 (state: ACCEPTED)
> 2017-02-13T03:10:08,176 WARN [c807cf48-301a-47b4-96df-495b2827d6ba main] impl.RemoteSparkJobStatus: Failed to get APP ID.
> java.util.concurrent.TimeoutException
>         at io.netty.util.concurrent.AbstractFuture.get(AbstractFuture.java:49) ~[netty-all-4.0.29.Final.jar:4.0.29.Final]
>         at org.apache.hadoop.hive.ql.exec.spark.status.impl.RemoteSparkJobStatus.getAppID(RemoteSparkJobStatus.java:65) ~[hive-exec-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
>         at org.apache.hadoop.hive.ql.exec.spark.SparkTask.execute(SparkTask.java:114) ~[hive-exec-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
>         at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:199) ~[hive-exec-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
>         at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:100) ~[hive-exec-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
>         at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:2168) ~[hive-exec-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
>         at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1824) ~[hive-exec-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
>         at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1511) ~[hive-exec-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
>         at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1222) ~[hive-exec-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
>         at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1212) ~[hive-exec-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
>         at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:233) ~[hive-cli-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
>         at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:184) ~[hive-cli-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
>         at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:400) ~[hive-cli-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
>         at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:336) ~[hive-cli-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
>         at org.apache.hadoop.hive.cli.CliDriver.processReader(CliDriver.java:430) ~[hive-cli-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
>         at org.apache.hadoop.hive.cli.CliDriver.processFile(CliDriver.java:446) ~[hive-cli-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
>         at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:749) ~[hive-cli-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
>         at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:715) ~[hive-cli-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
>         at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:642) ~[hive-cli-2.2.0-SNAPSHOT.jar:2.2.0-SNAPSHOT]
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_60]
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_60]
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_60]
>         at java.lang.reflect.Method.invoke(Method.java:497) ~[?:1.8.0_60]
>         at org.apache.hadoop.util.RunJar.run(RunJar.java:221) ~[hadoop-common-2.7.1.jar:?]
>         at org.apache.hadoop.util.RunJar.main(RunJar.java:136) ~[hadoop-common-2.7.1.jar:?]
> 2017-02-13T03:10:11,641 INFO [stderr-redir-1] client.SparkClientImpl: 17/02/13 03:10:11 INFO yarn.Client: Application report for application_1486905599813_0046 (state: ACCEPTED)
> 2017-02-13T03:10:16,643 INFO [stderr-redir-1] client.SparkClientImpl: 17/02/13 03:10:16 INFO yarn.Client: Application report for application_1486905599813_0046 (state: ACCEPTED)
> {noformat}



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
