Alternatively, setting spark.driver.extraClassPath should work.
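For example, something along these lines should work (untested sketch; the jar paths are the ones from your spark-shell command, so adjust as needed):

```shell
# Pass the Azure jars to the thrift server driver via extraClassPath
# (colon-separated on Linux). Paths below are from the original command.
./sbin/start-thriftserver.sh --master yarn \
  --conf spark.driver.extraClassPath=/home/hdiuser/azureclass/azure-storage-1.2.0.jar:/home/hdiuser/azureclass/hadoop-azure-2.7.0.jar \
  --num-executors 4
```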

Cheers

On Fri, Jul 3, 2015 at 2:59 AM, Steve Loughran <ste...@hortonworks.com>
wrote:

>
>> On Thu, Jul 2, 2015 at 7:38 AM, Daniel Haviv <
>> daniel.ha...@veracity-group.com> wrote:
>>
>>> Hi,
>>> I'm trying to start the thrift-server and passing it azure's blob
>>> storage jars but I'm failing on :
>>>  Caused by: java.io.IOException: No FileSystem for scheme: wasb
>>>         at
>>> org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2584)
>>>         at
>>> org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2591)
>>>         at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:91)
>>>         at
>>> org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2630)
>>>         at
>>> org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2612)
>>>         at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:370)
>>>         at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:169)
>>>         at
>>> org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:342)
>>>         ... 16 more
>>>
>>>  If I start the spark-shell the same way, everything works fine.
>>>
>>>  spark-shell command:
>>>   ./bin/spark-shell --master yarn --jars
>>> /home/hdiuser/azureclass/azure-storage-1.2.0.jar,/home/hdiuser/azureclass/hadoop-azure-2.7.0.jar
>>> --num-executors 4
>>>
>>>  thrift-server command:
>>>  ./sbin/start-thriftserver.sh --master yarn --jars
>>> /home/hdiuser/azureclass/azure-storage-1.2.0.jar,/home/hdiuser/azureclass/hadoop-azure-2.7.0.jar
>>> --num-executors 4
>>>
>>>  How can I pass dependency jars to the thrift server?
>>>
>>>  Thanks,
>>> Daniel
>>>
>>
>>
>
>  you should be able to add the JARs to the environment variable
> SPARK_SUBMIT_CLASSPATH or SPARK_CLASSPATH and have them picked up when
> bin/compute-classpath.{cmd,sh} builds up the classpath
>
>
>
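For the environment-variable route Steve describes above, a minimal sketch would look like this (untested; jar paths taken from the earlier commands, colon-separated on Linux):

```shell
# Export the extra jars before launching, so compute-classpath picks them up.
export SPARK_CLASSPATH=/home/hdiuser/azureclass/azure-storage-1.2.0.jar:/home/hdiuser/azureclass/hadoop-azure-2.7.0.jar

./sbin/start-thriftserver.sh --master yarn --num-executors 4
```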
