w.r.t. the Kafka library, see

https://repository.apache.org/content/repositories/orgapachespark-1104/org/apache/spark/spark-streaming-kafka_2.11/1.4.0-rc2/
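Note the `_2.11` suffix in the artifact ID above: Spark publishes one artifact per Scala binary version. A hypothetical sbt fragment that would resolve this staging artifact (coordinates taken from the URL above; the version and resolver name are illustrative) might look like:

```scala
// build.sbt sketch (illustrative, not from the thread).
// %% appends the project's Scala binary suffix automatically, so
// "spark-streaming-kafka" resolves to spark-streaming-kafka_2.11
// when scalaVersion is 2.11.x.
scalaVersion := "2.11.6"

// Staging repo from the URL above; 1.4.0-rc2 is a release candidate,
// not a final release.
resolvers += "apache-staging" at
  "https://repository.apache.org/content/repositories/orgapachespark-1104/"

libraryDependencies += "org.apache.spark" %% "spark-streaming-kafka" % "1.4.0-rc2"
```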

FYI

On Tue, May 26, 2015 at 8:33 AM, Ritesh Kumar Singh <
riteshoneinamill...@gmail.com> wrote:

> Yes, the recommended version is 2.10, as not all features are supported by
> Scala 2.11 yet. The Kafka libraries and JDBC components have yet to be
> ported to 2.11. So if your project doesn't depend on those components, you
> can give 2.11 a try.
>
> Here's a link
> <https://spark.apache.org/docs/1.2.0/building-spark.html#building-for-scala-211>
> for building with Scala 2.11.
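From memory of the 1.x build docs linked above, the steps are roughly the following; treat the exact script and property names as assumptions and defer to the linked page:

```shell
# Sketch of building Spark 1.x against Scala 2.11 (per the linked docs;
# script and property names here are from memory, verify against the page).
./dev/change-version-to-2.11.sh
mvn -Dscala-2.11 -DskipTests clean package
```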
>
> That said, you won't run into any issues with 2.10 as of now. But future
> releases will have to shift to 2.11 once support for 2.10 ends in the long
> run.
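One practical consequence of the 2.10/2.11 split is that every Scala artifact on the classpath must share one binary version. A minimal sketch (names are illustrative, not from the thread) for reporting which binary version your code is actually running on, so it can be compared against the `_2.10`/`_2.11` suffix of the Spark artifacts in use:

```scala
// Hypothetical helper: derive the Scala binary version (e.g. "2.11")
// from the full runtime version string (e.g. "2.11.6").
object ScalaVersionCheck {
  def binaryVersion: String =
    scala.util.Properties.versionNumberString.split('.').take(2).mkString(".")

  def main(args: Array[String]): Unit =
    // Should match the _2.x suffix of the Spark jars on the classpath.
    println(s"Running on Scala binary version $binaryVersion")
}
```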
>
>
> On Tue, May 26, 2015 at 8:21 PM, Punyashloka Biswal <
> punya.bis...@gmail.com> wrote:
>
>> Dear Spark developers and users,
>>
>> Am I correct in believing that the recommended version of Scala to use
>> with Spark is currently 2.10? Is there any plan to switch to 2.11 in
>> future? Are there any advantages to using 2.11 today?
>>
>> Regards,
>> Punya