Some problems using Zeppelin with Spark

2016-10-07 Thread mingda li
Dear all, Hi, we are using Zeppelin with Spark and AsterixDB now. Spark gets data from AsterixDB, and Zeppelin lets users operate on that data in Spark. We can run an example fine in the Spark shell, but once we move to Zeppelin, the same example fails. There is some problem with org.SLF
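The truncated error is most likely the usual SLF4J multiple-bindings warning, which shows up when the AsterixDB connector jars bring their own SLF4J binding alongside Zeppelin's. A minimal sketch of how one might look for duplicate bindings on the classpath (the directory names below are assumptions, not taken from the thread):

    # Look for SLF4J binding jars that could conflict; the paths are illustrative.
    find /path/to/zeppelin /path/to/asterix-connector \
        \( -name 'slf4j-log4j12*.jar' -o -name 'logback-classic*.jar' \) 2>/dev/null
    # More than one binding on the interpreter classpath triggers the SLF4J warning;
    # keeping a single binding (usually Zeppelin's) resolves it.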

Re: Some problems using Zeppelin with Spark

2016-10-07 Thread mingda li
and [ERROR] mvn -rf :zeppelin-spark-dependencies_2.10 clash@SCAI01:~/zeppelinFinal/zeppelin/incubator-zeppelin$ On Fri, Oct 7, 2016 at 1:20 PM, DuyHai Doan wrote: > Try adding -U before -DskipTests > > On Fri, Oct 7, 2016 at 10:06 PM, mingda li wrote: > >> Dear all, &
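For reference, the suggestion in the reply amounts to forcing a dependency re-download and resuming the build from the failed module. A sketch of the combined command (the module name is taken from the error above; the clean/package goals are assumptions):

    cd ~/zeppelinFinal/zeppelin/incubator-zeppelin
    # -U forces Maven to re-check remote repositories for updated artifacts;
    # -rf resumes the reactor from the module that failed.
    mvn clean package -U -DskipTests -rf :zeppelin-spark-dependencies_2.10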

Running the binary release fails due to SLF4J bindings

2016-10-07 Thread mingda li
Dear all, I have tried to run Zeppelin on Spark using the zeppelin-0.6.1-bin-all binary. But when I run any query, it just shows an error without a message. When I check the log file, I find the following: Does anyone meet such a problem with SLF4J bindings? Java HotSpot(TM) 64-Bit Server VM warning: ign
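When the front end only shows an empty error, the Spark interpreter log usually carries the full stack trace. A minimal way to watch it while re-running the paragraph (the log file name pattern varies by user and host, so the glob below is an assumption):

    cd zeppelin-0.6.1-bin-all
    # Interpreter logs are typically named zeppelin-interpreter-<interpreter>-<user>-<host>.log
    tail -f logs/zeppelin-interpreter-spark-*.log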

About the org.slf4j version

2016-10-07 Thread mingda li
Hi, I find our cluster uses Hadoop 1.0.4, which uses org.slf4j 1.4.3. This is not the same as Zeppelin's 1.7.10. Could I build Zeppelin with the 1.4.3 version for my cluster? Or will this affect how Zeppelin works? I have tried to use Zeppelin but failed, and I think this is caused by org.slf4j's
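If one wanted to try the experiment, a Maven property override is the usual route; whether Zeppelin's root pom actually exposes an slf4j.version property is an assumption here, and the reply below indicates Hadoop 1.x is not supported in any case:

    # Sketch only: override the SLF4J version at build time, assuming the root pom
    # references a ${slf4j.version} property. Hadoop 1.x support is doubtful (see reply).
    mvn clean package -DskipTests -Dslf4j.version=1.4.3 -Dhadoop.version=1.0.4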

Re: About the org.slf4j version

2016-10-07 Thread mingda li
IIRC, Zeppelin doesn't support Hadoop 1, as Hadoop 1.x and 2.x are not > compatible. > > On Sat, Oct 8, 2016 at 10:20 AM, mingda li wrote: > >> Hi, >> I find our cluster uses Hadoop 1.0.4, which uses org.slf4j 1.4.3. This is >> not the same as Zeppelin's 1.7.10. Could

Error about PySpark

2017-01-31 Thread mingda li
Dear all, We are using Zeppelin, and I have added export PYTHONPATH=/home/clash/sparks/spark-1.6.1-bin-hadoop12/python to zeppelin-env.sh. But each time I want to use PySpark, for example with the program: %pyspark from pyspark import SparkContext logFile = "hiv.data" logData = sc.textFile(l
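For stock Spark 1.6.x, PYTHONPATH normally needs the bundled py4j zip as well as the python directory, and pointing SPARK_HOME at the distribution lets Zeppelin derive both. A sketch of zeppelin-env.sh entries under that assumption (the exact py4j zip name depends on the Spark build):

    # zeppelin-env.sh (sketch)
    export SPARK_HOME=/home/clash/sparks/spark-1.6.1-bin-hadoop12
    # Only needed if SPARK_HOME alone is not picked up; adjust the py4j zip name
    # to whatever actually sits under $SPARK_HOME/python/lib.
    export PYTHONPATH=$SPARK_HOME/python:$SPARK_HOME/python/lib/py4j-0.9-src.zip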

Re: Error about PySpark

2017-02-02 Thread mingda li
/sparks/spark-1.6.1-bin-hadoop12* > > now you can restart Zeppelin and run your Python command. > > *. Could you give the absolute path for logFile, like the following: > logFile = "/Users/user/hiv.data" > > > 2017-02-01 11:48 GMT+09:00 mingda li : > >> Dear all, &g

Re: Error about PySpark

2017-02-02 Thread mingda li
And I tried ./bin/pyspark to run the same program with the mllib package; that works well in plain Spark. So do I need to set something for Zeppelin, like PYSPARK_PYTHON or PYTHONPATH? Bests, Mingda On Thu, Feb 2, 2017 at 12:07 PM, mingda li wrote: > Thanks. But when I changed the env

Re: Error about PySpark

2017-02-02 Thread mingda li
numpy > > > Best Regard, > Jeff Zhang > > > From: mingda li > Reply-To: "users@zeppelin.apache.org" > Date: Friday, February 3, 2017 at 6:03 AM > To: "users@zeppelin.apache.org" > Subject: Re: Error about PySpark > > And I tried the ./bin

Re: Error about PySpark

2017-02-03 Thread mingda li
017 at 5:29 PM, Jeff Zhang wrote: > Do you have multiple Pythons installed? From the error message, it is > clear that it complains that numpy is not installed. > > > mingda li wrote on Fri, Feb 3, 2017 at 9:16 AM: > >> I have numpy on the cluster. Otherwise PySpark can't work a
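A quick way to see which interpreter PySpark is actually using and whether numpy is visible to it; PYSPARK_PYTHON is a standard Spark setting, but the concrete path below is only an example:

    # Check which python is first on PATH and whether it can import numpy.
    which -a python
    python -c 'import numpy; print(numpy.__version__)'
    # If Zeppelin picks up a different interpreter than ./bin/pyspark does,
    # pin it explicitly (path is an example) in zeppelin-env.sh:
    export PYSPARK_PYTHON=/usr/bin/python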

How to let Zeppelin have a map visualization function

2017-03-20 Thread mingda li
Dear all, Now, I want to let Zeppelin 0.7.0 and 0.6.0 support map visualization. Is that function supported by the official version? Or do we need to do something like here: https://github.com/apache/zeppelin/pull/152 https://github.com/apache/zeppelin/pull/765 Thanks, Mingda

Re: How to let Zeppelin have a map visualization function

2017-03-21 Thread mingda li
://zeppelin.apache.org/helium_packages.html > [3] http://zeppelin.apache.org/docs/snapshot/development/writingzeppelinvisualization.html > > On Mon, Mar 20, 2017 at 11:06 PM mingda li wrote: > >> Dear all, >> >> Now, I want to let Zeppelin 0.7.0 and 0.6.0 support t

Re: How to let Zeppelin have a map visualization function

2017-03-21 Thread mingda li
Dear moon, Is https://github.com/apache/zeppelin/pull/765/files the right one? On Tue, Mar 21, 2017 at 9:04 AM, mingda li wrote: > Dear moon, > Thanks for your explanation. But I don't find map visualization in the > pluggable list. > What is the map visualization without

Re: How to let Zeppelin have a map visualization function

2017-03-21 Thread mingda li
y dealing with is making map visualization a Helium plugin package and letting > individual users accept the license when enabling the package. > > Thanks, > moon > > > On Tue, Mar 21, 2017 at 9:05 AM mingda li wrote: > >> Dear moon, >> Thanks for your explanation. But I d

Re: How to let Zeppelin have a map visualization function

2017-03-22 Thread mingda li
il.derra...@intellifylearning.com> wrote: > This is fairly easy to follow along and get working. > https://gist.github.com/granturing/a09aed4a302a7367be92 > > On Tue, Mar 21, 2017 at 2:15 PM, mingda li wrote: > >> Oh, I see. Thanks. >> Could someone help to publish the

Does Spark need to be running for Zeppelin

2017-03-22 Thread mingda li
Hi, I recently met a strange problem. My Spark cluster is not running, but when I restart Zeppelin and run a Spark program on it, it still runs. And even after I start Spark and run Zeppelin, I can't see Zeppelin's application running in Spark. Does anyone have an idea about that? Thanks, Mingda

Re: Does Spark need to be running for Zeppelin

2017-03-22 Thread mingda li
If I set Spark's home in Zeppelin's conf, will it start Spark by itself? On Wed, Mar 22, 2017 at 1:45 PM, mingda li wrote: > Hi, > > I recently met a strange problem. My Spark cluster is not running, but when I > restart Zeppelin and run a Spark program on it, it still runs. > And even aft
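What is likely happening: with no SPARK_HOME and the default local[*] master, Zeppelin's Spark interpreter starts its own embedded Spark, so nothing appears on the external cluster's UI. To make it use the running cluster, a sketch of the relevant settings (host and port are placeholders):

    # zeppelin-env.sh (sketch); values are placeholders.
    export SPARK_HOME=/path/to/spark
    export MASTER=spark://your-master-host:7077   # or yarn-client for YARN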

Re: Does Spark need to be running for Zeppelin

2017-03-22 Thread mingda li
at 5:30 PM, Jianfeng (Jeff) Zhang < jzh...@hortonworks.com> wrote: > >>> I can't see the application of Zeppelin running in Spark. > > What do you mean? Do you mean you don't see the YARN app? Maybe you didn't > run in yarn-client mode > > > > Best Regard,
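If the interpreter really is running in yarn-client mode, the application should show up in YARN rather than on the standalone master UI; a quick check from the cluster using the standard YARN CLI:

    # List running YARN applications; a Zeppelin Spark interpreter in yarn-client
    # mode should appear here while a paragraph is running.
    yarn application -list -appStates RUNNING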