Zeppelin execute job (all paragraphs) with parameters via REST API

2018-04-19 Thread Spico Florin
Hello! I have a Zeppelin note that has many paragraphs. One paragraph should receive/set up some parameters that will then be used by the other paragraphs. I would like to submit a job via the Zeppelin REST API with these parameters set in the request body. I know that in zeppeli…
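
A minimal sketch of how this might be driven over the REST API, assuming a placeholder note ID (2ABCDEFGH) and paragraph ID, and assuming your Zeppelin version accepts a "params" object in the run body (check the REST API docs for your release):

    # Run the parameter-setting paragraph with values supplied in the request body
    # (note ID and paragraph ID below are placeholders).
    curl -X POST http://localhost:8080/api/notebook/job/2ABCDEFGH/20180419-000000_123456789 \
         -H "Content-Type: application/json" \
         -d '{"params": {"startDate": "2018-04-01", "limit": "100"}}'

    # Then run the remaining paragraphs of the note asynchronously.
    curl -X POST http://localhost:8080/api/notebook/job/2ABCDEFGH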

Manually import notes via copy notebook folder

2018-04-27 Thread Spico Florin
Hello! I would like to import notes into Zeppelin by manually overwriting the notebook folder. The files are copied into the notebook folder, but I cannot see them in the Zeppelin UI. Is there any other place where Zeppelin stores information about the notebooks? Besides the REST API, to import the…
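
For reference, a sketch of the manual copy, assuming local notebook storage; the paths are illustrative and the real target directory is whatever zeppelin.notebook.dir points to:

    # Check where Zeppelin expects notes to live (zeppelin.notebook.dir,
    # relative to ZEPPELIN_HOME unless absolute).
    grep -A1 zeppelin.notebook.dir $ZEPPELIN_HOME/conf/zeppelin-site.xml

    # Copy the exported note directory (one folder per note, containing note.json)
    # into that location; the note ID below is a placeholder.
    cp -r /tmp/exported-notes/2DXYZABC $ZEPPELIN_HOME/notebook/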

Re: Manually import notes via copy notebook folder

2018-04-27 Thread Spico Florin

Re: Manually import notes via copy notebook folder

2018-04-27 Thread Spico Florin
Hello! Thank you all. It worked. My problem was that the notebook files were not copied to the proper location. Best regards, Florin On Fri, Apr 27, 2018 at 4:46 PM, Mohit Jaggi wrote: > Restart Z. And wait a min or two before checking. > > On Fri, Apr 27, 2018 at 6:45 AM, Spi…
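
The suggested fix, as a shell sketch (paths assume a standard binary install; adjust to your environment):

    # Restart Zeppelin so it re-scans the notebook directory,
    # then give it a minute or two before checking the UI.
    $ZEPPELIN_HOME/bin/zeppelin-daemon.sh restart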

How to track a zeppelin job for multiple job submission request

2018-05-01 Thread Spico Florin
Hello! I have a Zeppelin notebook that I would like to expose as a REST service to multiple users. A user can request the results from the REST service backed by Zeppelin multiple times. I would like the calls to the service to be asynchronous and to use the async API https://zeppelin.apache.or…
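
One way to sketch the submit-and-poll pattern with curl; the note ID is a placeholder, Zeppelin tracks status per note/paragraph rather than per submission, and the exact response fields should be checked against the REST API docs for your version:

    # Submit the whole note asynchronously; the call returns immediately.
    curl -X POST http://localhost:8080/api/notebook/job/2ABCDEFGH

    # Poll the job status; the response lists each paragraph with its status
    # (e.g. READY / PENDING / RUNNING / FINISHED / ERROR).
    curl -X GET http://localhost:8080/api/notebook/job/2ABCDEFGH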

Re: [ANNOUNCE] Apache Zeppelin 0.8.0 released

2018-06-29 Thread Spico Florin
Hi! I tried to get the Docker image for version 0.8.0, but it seems it is not in the official Docker Hub repository: https://hub.docker.com/r/apache/zeppelin/tags/ there is no such version as 0.8.0. Also, the commands docker pull apache/zeppelin:0.8.0 or docker run -p 8080:8080 --rm --nam…
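
A quick way to see which tags are actually published before pulling; the Docker Hub v2 API URL below is an assumption about its current layout:

    # List the tags currently published for apache/zeppelin.
    curl -s https://hub.docker.com/v2/repositories/apache/zeppelin/tags/ | grep -o '"name":"[^"]*"'

    # Pull and run once the desired tag shows up.
    docker pull apache/zeppelin:0.8.0
    docker run -p 8080:8080 --rm --name zeppelin apache/zeppelin:0.8.0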

Run/install tensorframes on zeppelin pyspark

2018-08-08 Thread Spico Florin
Hi! I would like to use tensorframes in my pyspark notebook. I have performed the following: 1. In the Spark interpreter, added a new repository http://dl.bintray.com/spark-packages/maven 2. In the Spark interpreter, added the dependency databricks:tensorframes:0.2.9-s_2.11 3. pip install tensorfram…
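
A small check that can be run in a %pyspark paragraph to see which Python the driver process is using and whether tensorframes is importable from it; the import name tensorframes is the usual one for the databricks package, but verify it for your version:

    %pyspark
    import sys
    print(sys.executable)           # which Python interpreter the driver is using
    try:
        import tensorframes as tfs  # assumed import name for databricks:tensorframes
        print("tensorframes imported OK")
    except ImportError as e:
        print("tensorframes not visible to this Python:", e)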

Re: Run/install tensorframes on zeppelin pyspark

2018-08-10 Thread Spico Florin
If yes, how? I look forward to your answers. Regards, Florin On Thu, Aug 9, 2018 at 3:52 AM, Jeff Zhang wrote: > Make sure you use the correct python which has tensorframes installed. Use > PYSPARK_PYTHON > to configure the python…
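
A sketch of the suggested fix in conf/zeppelin-env.sh; the path to the Python environment is a placeholder for wherever tensorframes was pip-installed:

    # conf/zeppelin-env.sh
    # Point PySpark at the Python that has tensorframes installed.
    export PYSPARK_PYTHON=/opt/conda/envs/tf/bin/python
    export PYSPARK_DRIVER_PYTHON=/opt/conda/envs/tf/bin/python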

Available and custom roles

2018-10-26 Thread Spico Florin
Hello! I would like to know what the available roles in Zeppelin are (besides admin, which has *). How can I create/define my own roles based on the actions that a user is allowed to perform? In shiro.ini the examples are too generic, having role1, role2 with all actions allowed (*). Can you please define the fin…
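
For context, a sketch of how custom role names can be introduced in conf/shiro.ini; the user names, passwords, and the analyst role are made-up examples, and a role only has effect where it is referenced in the [urls] section or in notebook permissions:

    [users]
    # user = password, role1, role2
    admin = admin_password, admin
    florin = some_password, analyst

    [roles]
    # role names are arbitrary labels; '*' grants all permissions
    admin = *
    analyst = *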

Re: Available and custom roles

2018-10-26 Thread Spico Florin
the below urls that you want to hide.
>> # anon means the access is anonymous.
>> # authc means Form based Auth Security
>> # To enforce security, comment the line below and uncomment the next one
>> /api/version = anon
>> /api/openid/* = anon
>> /api/interp…
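
Continuing from the quoted [urls] section, a hedged example of restricting admin-only endpoints to a role; the exact set of endpoints to protect is up to your deployment:

    [urls]
    /api/version = anon
    # Only users carrying the 'admin' role may touch interpreter, configuration
    # and credential settings; everything else just requires authentication.
    /api/interpreter/** = authc, roles[admin]
    /api/configurations/** = authc, roles[admin]
    /api/credential/** = authc, roles[admin]
    /** = authc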

Manage Flink job third party libraries with Zeppelin on a Flink cluster

2018-11-27 Thread Spico Florin
Hello! I'm using Zeppelin 0.7.3 with Flink 1.4.2 in cluster mode. My Flink job depends on third-party libraries (Flink CEP, Jackson JSON, etc.), and when I run the notebook I get a ClassNotFoundException on the Flink task side, even though I have configured the Flink interpreter dependencies…
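
One workaround sketch for the cluster-mode case: interpreter dependencies are only added on the Zeppelin/interpreter side, so the jars also have to be visible to the Flink TaskManagers, for example by dropping them into Flink's lib directory on every node and restarting the cluster. Paths and jar versions below are placeholders:

    # On each Flink node: make the third-party jars part of the cluster classpath.
    cp flink-cep_2.11-1.4.2.jar jackson-databind-2.8.11.jar /opt/flink-1.4.2/lib/

    # Restart the cluster so the TaskManagers pick up the new jars.
    /opt/flink-1.4.2/bin/stop-cluster.sh
    /opt/flink-1.4.2/bin/start-cluster.sh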

Re: Store CSV file with Notebook?

2018-11-27 Thread Spico Florin
Hi! I have created a volume for the Docker container and put the data in that volume. I'm using Docker Compose, and the Compose entry for Zeppelin looks like this:

    interactive-analytics:
      build: interactive-analytics
      container_name: "zeppelin-analytics"
      environment:
        - KAFKA…
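
Adding a volumes section to that service is one way to keep the CSV files and notebooks outside the container; the host paths and the /zeppelin/... container paths below are assumptions based on the official image layout, so adjust them to your image:

      volumes:
        # host directory with the CSV files, mounted into the container
        - ./data:/zeppelin/data
        # keep notebooks on the host as well, so they survive rebuilds
        - ./notebook:/zeppelin/notebook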