Others have run into this on Mac as well.
The Spark binary is downloaded to itests/thirdparty, then unpacked and 
copied to itests/qtest-spark/target/spark. Maybe you can run those steps 
manually and check which one goes wrong.

Cheers,
Rui Li

-----Original Message-----
From: Sergey Shelukhin [mailto:ser...@hortonworks.com] 
Sent: Friday, July 03, 2015 6:32 AM
To: dev@hive.apache.org
Subject: Re: problems running spark tests

I was able to get the tests to run with the parameter Hari suggested, on a 
different (Linux) machine.
However, on my Mac laptop, the bin/ subdirectory of the spark directory is 
not regenerated. I guess next time I need it I will do the usual shamanic 
dances: nuking the maven repo, re-cloning the code, and so on (sketched 
below). If that doesn’t work I might file a bug or revive this thread.
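For the record, the dances would be something like this (paths are guesses, 
adjust to your setup):

  rm -rf ~/.m2/repository/org/apache/hive     # drop hive artifacts from the local maven repo
  rm -rf itests/thirdparty itests/qtest-spark/target   # force the spark download/unpack to rerun
  mvn clean install -DskipTests               # rebuild from the root
  cd itests && mvn clean install -DskipTests  # then rebuild itests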

On 15/7/2, 11:40, "Szehon Ho" <sze...@cloudera.com> wrote:

>This works for me:
>
>mvn test -Dtest=TestSparkCliDriver -Dqfile=join1.q -Phadoop-2
>
>For multiple tests you might need to add quotes around the comma-separated 
>list.
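>
>For example (join2.q here is just a stand-in for whatever .q files you 
>want to run):
>
>mvn test -Dtest=TestSparkCliDriver -Dqfile="join1.q,join2.q" -Phadoop-2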
>
>I haven't seen that error. Did you run from the itests directory?  There 
>are some steps in the pom that copy over the spark scripts needed to run; 
>it looks like they were skipped, since that script is not available in your run.
>
>Thanks
>Szehon
>
>On Thu, Jul 2, 2015 at 10:31 AM, Sergey Shelukhin 
><ser...@hortonworks.com>
>wrote:
>
>> Hi. I am trying to run TestSparkCliDriver.
>>
>> 1) Spark tests do not appear to support specifying a query like other 
>> tests do. When I run mvn test -Phadoop-2 -Dtest=TestSparkCliDriver, the 
>> tests run, but with mvn test -Phadoop-2 -Dtest=TestSparkCliDriver 
>> -Dqfile=foo.q,bar.q,.. the test just instantly succeeds without running 
>> any queries. Is there some other way to specify those?
>>
>> 2) When I run all the tests, they fail with the exception below. I’ve 
>> done a full regular build (mvn clean install … in root and then 
>> itests). Are more steps necessary?
>> The itests/qtest-spark/../../itests/qtest-spark/target/spark directory 
>> exists and has a bunch of stuff, but the bin/ subdirectory that it tries 
>> to run from is indeed empty.
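>>
>> A quick way to see the problem (run from the source root):
>>
>> ls itests/qtest-spark/target/spark/bin/
>> # empty on this machine, though spark-submit should be in there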
>>
>> 2015-07-02 10:11:58,678 ERROR [main]: spark.SparkTask (SessionState.java:printError(987)) - Failed to execute spark task, with exception 'org.apache.hadoop.hive.ql.metadata.HiveException(Failed to create spark client.)'
>> org.apache.hadoop.hive.ql.metadata.HiveException: Failed to create spark client.
>> at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:57)
>> at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionManagerImpl.getSession(SparkSessionManagerImpl.java:114)
>> at org.apache.hadoop.hive.ql.exec.spark.SparkUtilities.getSparkSession(SparkUtilities.java:127)
>> at org.apache.hadoop.hive.ql.exec.spark.SparkTask.execute(SparkTask.java:101)
>> at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
>> at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:89)
>> at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1672)
>> at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1431)
>> at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1212)
>> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1063)
>> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1053)
>> at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:213)
>> at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:165)
>> at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:376)
>> at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:311)
>> at org.apache.hadoop.hive.ql.QTestUtil.createSources(QTestUtil.java:840)
>> at org.apache.hadoop.hive.cli.TestSparkCliDriver.<clinit>(TestSparkCliDriver.java:59)
>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> at java.lang.reflect.Method.invoke(Method.java:497)
>> at org.junit.internal.runners.SuiteMethod.testFromSuiteMethod(SuiteMethod.java:35)
>> at org.junit.internal.runners.SuiteMethod.<init>(SuiteMethod.java:24)
>> at org.junit.internal.builders.SuiteMethodBuilder.runnerForClass(SuiteMethodBuilder.java:11)
>> at org.junit.runners.model.RunnerBuilder.safeRunnerForClass(RunnerBuilder.java:59)
>> at org.junit.internal.builders.AllDefaultPossibilitiesBuilder.runnerForClass(AllDefaultPossibilitiesBuilder.java:26)
>> at org.junit.runners.model.RunnerBuilder.safeRunnerForClass(RunnerBuilder.java:59)
>> at org.junit.internal.requests.ClassRequest.getRunner(ClassRequest.java:26)
>> at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:262)
>> at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
>> at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
>> at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
>> at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
>> at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
>> Caused by: java.io.IOException: Cannot run program "[snip]/itests/qtest-spark/../../itests/qtest-spark/target/spark/bin/spark-submit": error=2, No such file or directory
>> at java.lang.ProcessBuilder.start(ProcessBuilder.java:1048)
>> at org.apache.hive.spark.client.SparkClientImpl.startDriver(SparkClientImpl.java:415)
>> at org.apache.hive.spark.client.SparkClientImpl.<init>(SparkClientImpl.java:94)
>> at org.apache.hive.spark.client.SparkClientFactory.createClient(SparkClientFactory.java:80)
>> at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.<init>(RemoteHiveSparkClient.java:91)
>> at org.apache.hadoop.hive.ql.exec.spark.HiveSparkClientFactory.createHiveSparkClient(HiveSparkClientFactory.java:65)
>> at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:55)
>> ... 33 more
>> Caused by: java.io.IOException: error=2, No such file or directory
>> at java.lang.UNIXProcess.forkAndExec(Native Method)
>> at java.lang.UNIXProcess.<init>(UNIXProcess.java:248)
>> at java.lang.ProcessImpl.start(ProcessImpl.java:134)
>> at java.lang.ProcessBuilder.start(ProcessBuilder.java:1029)
>> ... 39 more
>>
>>
>>
