shell? Our app has a number of jars that
I don't particularly want to have to upload each time I want to run a small
ad-hoc spark-shell session.

Also, should I keep using the same SparkContext or should I create a new one
each time my app needs to run a job?

Thanks,
Ishaaq

--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/launching-concurrent-jobs-programmatically-tp4990.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
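For context, the single-shared-SparkContext approach asked about can be sketched as follows. This is a minimal sketch, not a definitive answer: the app name, local master, and the two toy jobs are illustrative, and it relies on SparkContext being safe to use from multiple threads, with Spark's scheduler interleaving concurrently submitted jobs.

```scala
import org.apache.spark.{SparkConf, SparkContext}

import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.duration.Duration
import scala.concurrent.{Await, Future}

object SharedContextSketch {
  def main(args: Array[String]): Unit = {
    // One long-lived SparkContext for the whole application,
    // rather than a new one per job.
    val conf = new SparkConf()
      .setAppName("shared-context-sketch") // illustrative name
      .setMaster("local[*]")               // illustrative: local mode
    val sc = new SparkContext(conf)

    // Two independent jobs submitted concurrently from separate
    // threads, both against the same shared context.
    val sumJob   = Future { sc.parallelize(1 to 1000).reduce(_ + _) }
    val countJob = Future { sc.parallelize(1 to 1000).count() }

    println(Await.result(sumJob, Duration.Inf))
    println(Await.result(countJob, Duration.Inf))

    // Tear the context down once, when the app exits.
    sc.stop()
  }
}
```

Creating a fresh SparkContext per job would instead pay executor start-up and jar-distribution costs on every job, which is usually what reusing a context is meant to avoid.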
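On the jar question above, one commonly used option (the jar paths here are illustrative, not from the original message) is to hand spark-shell the jars once per invocation, or to persist them in Spark's configuration so every session picks them up without retyping:

```shell
# Per-invocation: distribute the app's jars to the cluster with --jars.
spark-shell --jars /opt/myapp/lib/app-core.jar,/opt/myapp/lib/app-util.jar

# Persistent alternative: add the equivalent line to conf/spark-defaults.conf,
# so every spark-shell (and spark-submit) session includes them:
#   spark.jars  /opt/myapp/lib/app-core.jar,/opt/myapp/lib/app-util.jar
```

Whether this removes the re-upload cost entirely depends on the cluster setup; jars on a shared filesystem or already present on the workers avoid per-session copying.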