Which version of Spark do you use? I've found the current version supports
"lz4" by default, so it looks like you don't have to set anything. If you
want to use another type of compression, you can set it in the interpreter
settings tab instead of using conf.set. Actually, that doesn't
How could I do that? Could you please send me any reference link?

On Feb 17, 2017 1:17 AM, Jeff Zhang wrote:
zeppelin will create SparkContext implicitly for users, so it might be too
late to set it after interpreter is opened. You can try to set that in
interpreter setting.
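
For reference, a minimal sketch of verifying the interpreter-level setting
from a notebook paragraph, assuming the property in question is
spark.io.compression.codec and that sc is the SparkContext Zeppelin provides:

%spark
// sc already exists at this point, so this only reads back the configuration
// that the interpreter setting (or spark-defaults.conf) actually applied;
// "lz4" is Spark's documented default for this property.
println(sc.getConf.get("spark.io.compression.codec", "lz4"))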
Muhammad Rezaul Karim wrote on Thursday, February 16, 2017 at 11:52 PM:
Hi Lee,
Thanks for the info; that really helped. I set the compression codec on the
Spark side, i.e. inside SPARK_HOME, and now the problem is resolved. However,
I was wondering if it's possible to set the same thing from the Zeppelin
notebook. I tried it in the following way:
%spark conf.set("spark.io.c
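
A short sketch of why this in-notebook attempt doesn't take effect, assuming
the truncated call above was to spark.io.compression.codec (the original line
is cut off, so that property name is an assumption):

%spark
// Zeppelin created sc before this paragraph runs, and sc.getConf returns a
// copy of its configuration, so setting the codec here is a no-op for the
// running context ("snappy" is just an illustrative value):
sc.getConf.set("spark.io.compression.codec", "snappy")
// To actually change it, the property has to be in place before the
// interpreter starts, either as an interpreter-setting property or in
// SPARK_HOME/conf/spark-defaults.conf, e.g.:
//   spark.io.compression.codec   snappy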
Hi Jeff,
Thanks a lot for the info. But I was looking for books on Apache Zeppelin, not
Apache Kafka :) and there's probably no book published yet for Zeppelin.
On Thursday, February 16, 2017 3:18 PM, Jeffrey Groves wrote:
Hi Reza:
My suggestion isn’t a book, but a video course available on Oreilly.com.
The course is:
Introduction to Apache Kafka
A Quick Primer for Developers and Administrators
By Gwen Shapira
and it provided pretty much everything I needed to jumpstart my understanding
of Kafka and its use. It