How about an export transform feature that can take notebook content and convert it into a form that can be submitted to Livy?
That could be useful for something like export to code (.py, .scala) or export to PDF?

On Thu, Jun 16, 2016 at 10:35 PM -0700, "Jeff Zhang" <zjf...@gmail.com> wrote:

Yeah, I don't think Zeppelin has the capability to create an artifact for you. But if Zeppelin had such a feature, you could of course reuse the code from the interactive session and submit it as a batch session to Livy.

On Fri, Jun 17, 2016 at 12:33 PM, kishore <vkishore...@gmail.com> wrote:

Our thought is to use Zeppelin to create algorithms through interactive data analysis and eventually deploy them in a production environment. Once an algorithm is created and published, we would build a job artifact and post it to a REST API (using either Spark Job Server or Livy Server) so that it gets deployed to the production Spark cluster and runs on a schedule. Since Zeppelin may not provide artifact creation out of the box, we thought of extending it with the capability to publish to a repository. Let me know if you have any suggestions on how a notebook created during interactive analysis can eventually be deployed to a production environment.

Thanks
Kishore

--
View this message in context: http://apache-zeppelin-users-incubating-mailing-list.75479.x6.nabble.com/Zeppelin-Integration-with-Livy-Server-tp3272p3292.html
Sent from the Apache Zeppelin Users (incubating) mailing list archive at Nabble.com.

--
Best Regards

Jeff Zhang
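A minimal sketch of what such an export-and-submit flow might look like, assuming Zeppelin's note.json paragraph layout and Livy's POST /batches endpoint; the Livy host, file paths, and helper names below are illustrative placeholders, not features of either project:

# Sketch: export %pyspark paragraphs from a Zeppelin note.json into a .py
# script and submit it to Livy as a batch session. Host, paths, and names
# are assumptions for illustration only.
import json
import requests

LIVY_URL = "http://livy-host:8998"  # assumed Livy endpoint

def export_note_to_py(note_json_path, out_path):
    """Concatenate the code of %pyspark paragraphs into a single script."""
    with open(note_json_path) as f:
        note = json.load(f)
    blocks = []
    for para in note.get("paragraphs", []):
        text = para.get("text") or ""
        if text.startswith("%pyspark"):
            # drop the interpreter directive, keep the code body
            blocks.append(text.split("\n", 1)[1] if "\n" in text else "")
    with open(out_path, "w") as f:
        f.write("\n\n".join(blocks))

def submit_batch(script_uri):
    """Submit the exported script as a Livy batch (POST /batches).

    script_uri must be readable by the cluster, e.g. an hdfs:// path
    the exported file was copied to beforehand.
    """
    resp = requests.post(
        f"{LIVY_URL}/batches",
        json={"file": script_uri, "name": "zeppelin-exported-job"},
    )
    resp.raise_for_status()
    return resp.json()  # contains the batch id and state

# Example usage:
# export_note_to_py("note.json", "algorithm.py")
# ...copy algorithm.py to HDFS...
# print(submit_batch("hdfs:///jobs/algorithm.py"))

Scheduling the resulting batch (e.g. from a cron job or workflow engine that calls submit_batch periodically) would cover the "run it in a scheduled interval" part of the proposal.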