Actually, one more question along the same lines. This one is about .getOrCreate().

JavaStreamingContext doesn't seem to have a way to accept a SparkSession
object, so does that mean a streaming context is not required? If so, how
do I pass a lambda to .getOrCreate when using SparkSession? I mean the
lambda we normally pass when we call StreamingContext.getOrCreate.
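
To make this concrete, here is roughly what I am imagining (an untested
sketch; the checkpoint path, app name, and batch interval are made up),
where the SparkSession is created inside the factory and the streaming
context is derived from its SparkContext:

import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.function.Function0;
import org.apache.spark.sql.SparkSession;
import org.apache.spark.streaming.Durations;
import org.apache.spark.streaming.api.java.JavaStreamingContext;

Function0<JavaStreamingContext> factory = () -> {
    // build (or reuse) the SparkSession first
    SparkSession spark = SparkSession.builder()
        .appName("my-application")
        .getOrCreate();
    // derive the streaming context from the session's SparkContext
    JavaStreamingContext context = new JavaStreamingContext(
        JavaSparkContext.fromSparkContext(spark.sparkContext()),
        Durations.seconds(1));
    context.checkpoint("/tmp/checkpoint");
    // ... define the DStream operations here before returning ...
    return context;
};

JavaStreamingContext jssc =
    JavaStreamingContext.getOrCreate("/tmp/checkpoint", factory);
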
On Thu, Apr 27, 2017 at 8:47 AM, kant kodali <kanth...@gmail.com> wrote:

> Ahhh, thanks much! I miss my sparkConf.setJars function, though; this
> hacky comma-separated list of jar names is a step down.
>
> On Thu, Apr 27, 2017 at 8:01 AM, Yanbo Liang <yblia...@gmail.com> wrote:
>
>> Could you try the following way?
>>
>> val spark = SparkSession.builder
>>   .appName("my-application")
>>   .config("spark.jars", "a.jar,b.jar")
>>   .getOrCreate()
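>>
>> (If I remember correctly, spark.jars is the same property that
>> sparkConf.setJars populates under the hood, as a comma-separated list,
>> so the two should be equivalent.)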
>>
>>
>> Thanks
>>
>> Yanbo
>>
>>
>> On Thu, Apr 27, 2017 at 9:21 AM, kant kodali <kanth...@gmail.com> wrote:
>>
>>> I am using Spark 2.1 BTW.
>>>
>>> On Wed, Apr 26, 2017 at 3:22 PM, kant kodali <kanth...@gmail.com> wrote:
>>>
>>>> Hi All,
>>>>
>>>> I am wondering how to create a SparkSession using a SparkConf object.
>>>> Most of the key-value pairs we set on SparkConf can also be set on
>>>> SparkSession or SparkSession.Builder; however, I don't see an
>>>> equivalent of sparkConf.setJars, which is required, right? We want the
>>>> driver's jars to be distributed across the cluster whether we run in
>>>> client mode or cluster mode, so how is this possible?
>>>>
>>>> Thanks!
