Try to debug your code in an IDE. You should look at your array s, since
the error complains about an array index.
Thanks,
Wd
> On Nov 16, 2016, at 10:44 PM, Muhammad Rezaul Karim wrote:
>
> Hi All,
>
> I have the following Scala code (taken from
> https://zeppelin.apache.org/docs/0.6.2/quickstart/tuto
Hello Muhammad.
Please check your bank-full.csv file first. You can also filter on item
length in your Scala code, for example:
val bank = bankText.map(s => s.split(";")).filter(s => s.size > 5).filter(s => s(0) != "\"age\"")
Hope this helps.
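For reference, a minimal sketch of how that filter fits into the tutorial
code (the file path is a placeholder; the length check guards against short
or malformed rows that would otherwise throw ArrayIndexOutOfBoundsException
when s(5) is accessed):

val bankText = sc.textFile("/path/to/bank-full.csv") // placeholder path
case class Bank(age: Int, job: String, marital: String, education: String, balance: Int)
val bank = bankText.map(s => s.split(";"))
  .filter(s => s.size > 5)          // drop rows with fewer than 6 fields
  .filter(s => s(0) != "\"age\"")   // drop the header row
  .map(s => Bank(s(0).toInt,
    s(1).replaceAll("\"", ""),
    s(2).replaceAll("\"", ""),
    s(3).replaceAll("\"", ""),
    s(5).replaceAll("\"", "").toInt))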
2016-11-17 21:26 GMT+09:00 Dayong :
> Try to debug you
Hi Shim,
Now it works perfectly. Thank you so much. Actually, I am from a Java
background and am learning Scala.
Thanks and Regards,
-
Md. Rezaul Karim
PhD Researcher, Insight Centre for Data Analytics
National University of Ireland Galway
E-mail: rezaul.ka...@i
Hi All,
I am a new user of Zeppelin and have learned that Apache Zeppelin uses Spark
as the backend interpreter.
To date, I have run some code written in Scala in the Zeppelin notebook.
However, I am pretty familiar with writing Spark applications using Java.
Now my question: is it possi
Are you able to run the same code in SPARK_HOME/bin/spark-shell?
Thanks,
moon
On Mon, Nov 14, 2016 at 2:21 PM Nirav Patel wrote:
> I have the following map:
>
> final val idxMapArr = idxMap.collectAsMap
>
> Which is used in one of the Spark transformations here:
>
> def labelStr(predictions: Wr
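(A common pattern for this situation, sketched with hypothetical names since
the original snippet is truncated above: broadcast the collected map so each
executor gets one read-only copy instead of serializing it into every task
closure.)

val idxMapArr = idxMap.collectAsMap
val bcIdxMap = sc.broadcast(idxMapArr)  // shipped once per executor
val labeled = predictions.map(p => bcIdxMap.value.getOrElse(p, "unknown")) // hypothetical use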
Yes, it will. I guess there are some implementations, too.
On Thu, Nov 17, 2016 at 10:41 PM, Muhammad Rezaul Karim <reza_cse...@yahoo.com> wrote:
> Hi All,
>
> I am a new user of Zeppelin and have learned that Apache Zeppelin uses
> Spark as the backend interpreter.
>
> To date, I have run som
Although "export PATH=$PATH..." is not really necessary in
zeppelin-env.sh, i think your configuration looks okay.
Have you tried remove /home/asif/zeppelin-0.6.2-bin-all/metastore_db ?
Thanks,
moon
On Wed, Nov 16, 2016 at 6:48 PM Muhammad Rezaul Karim wrote:
Hi Moon,
I have set those variab
Hi,
Do you have the same problem on SPARK_HOME/bin/spark-shell?
Are you using a standalone Spark cluster, or YARN?
Thanks,
moon
On Sun, Nov 13, 2016 at 8:19 PM York Huang wrote:
> I ran into the "No space left on device" error in Zeppelin Spark when I
> tried to run the following.
> cache table
Hi,
Thanks a lot. Yes, I have removed the metastore_db directory too.
On Thursday, November 17, 2016 5:38 PM, moon soo Lee wrote:
Although "export PATH=$PATH..." is not really necessary in zeppelin-env.sh, i
think your configuration looks okay.
Have you tried remove /home/asif/zep
Are you able to run the same code in SPARK_HOME/bin/spark-shell?
Thanks,
moon
On Thu, Nov 17, 2016 at 9:47 AM Muhammad Rezaul Karim wrote:
> Hi,
>
> Thanks a lot. Yes, I have removed the metastore_db directory too.
>
> On Thursday, November 17, 2016 5:38 PM, moon soo Lee wrote:
>
> A
Hi,
I can run the same code in SPARK_HOME/bin/spark-shell. However, it does not
allow me to execute the SQL command.
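For what it's worth, plain spark-shell has no %sql interpreter; the
equivalent there is calling sqlContext directly. A minimal sketch, assuming
the bank DataFrame and temp table name from the tutorial:

bank.registerTempTable("bank")  // assumes the tutorial's bank DataFrame
sqlContext.sql("select count(*) from bank").show()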
On Thursday, November 17, 2016 6:01 PM, moon soo Lee wrote:
Are you able to run the same code in SPARK_HOME/bin/spark-shell?
Thanks,
moon
On Thu, Nov 17, 2016 at 9:47
Hi All,
I have a bucket that I’m working with, and I want to pull ORC files from
there and use them in my Spark/Scala magic. The only thing is that these
files are KMS encrypted. When I try to get a KMS-encrypted file, however, it
shows me an AWS Access Denied error, although there is no possible way that co
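A hedged sketch of the usual checklist here, assuming the s3a connector
(Hadoop 2.8+) and hypothetical bucket/path names: reading SSE-KMS objects
requires kms:Decrypt permission on the KMS key in addition to s3:GetObject,
which is a common source of Access Denied errors.

// Assumption: s3a connector; exact property names vary by Hadoop version.
sc.hadoopConfiguration.set("fs.s3a.server-side-encryption-algorithm", "SSE-KMS")
val df = sqlContext.read.orc("s3a://my-bucket/path/to/orc/") // hypothetical path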
I have to try, but I think it will probably happen with spark-shell as well.
I have found an alternative way to pass an Array to a UDF as a parameter.
Thanks
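A minimal sketch of that closure-based alternative, with hypothetical names
since the original snippet is truncated: the array is captured by the UDF
function itself rather than passed in as a Column, so it is serialized once
with the function instead of per row.

import org.apache.spark.sql.functions.{col, udf}
val labels = Array("negative", "positive")  // hypothetical label array
val labelStr = udf((idx: Int) =>
  if (idx >= 0 && idx < labels.length) labels(idx) else "unknown")
val labeled = predictions.withColumn("label", labelStr(col("predictionIdx"))) // hypothetical columns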
On Thu, Nov 17, 2016 at 9:24 AM, moon soo Lee wrote:
> Are you able to run the same code in SPARK_HOME/bin/spark-shell ?
>
> Thanks,
> moon
>
> On M
Recently I started getting the following error upon execution of Spark SQL.
validInputDocs.createOrReplaceTempView("valInput")
%sql select count(*) from valInput // fails with a ClassNotFoundException
But validInputDocs.show works just fine.
Any interpreter settings that may have affecte
Never mind. I had a logically incorrect transformation.
val validInputDocs = inputDocsDs.filter(doc => {
  // Predicate should be !"test".equals(doc.label); I copied the incorrect
  // one from a SQL statement I wrote earlier :)
  (doc.labelIdx != -1 && doc.label != "test ")
})
Fixing the above seems to resol
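Spelled out, the corrected predicate the author describes (comparing against
"test" with equals, dropping the stray trailing space):

val validInputDocs = inputDocsDs.filter(doc =>
  doc.labelIdx != -1 && !"test".equals(doc.label))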
Good to hear it helps.
On Fri, Nov 18, 2016 at 1:52 AM, Muhammad Rezaul Karim wrote:
> Hi Shim,
>
> Now it works perfectly. Thank you so much. Actually, I am from a Java
> background and am learning Scala.
>
>
> Thanks and Regards,
> -
> Md. Rezaul Karim
> PhD Researcher