Hi,

You can get a reference to the ResourcePool by running

val resourcePool = z.getInterpreterContext().getResourcePool() // [1]

Then you can invoke its methods to put resources [2] into it.

Hope this works.

Thanks,
moon

[1] https://github.com/apache/zeppelin/blob/master/zeppelin-interpreter/src/main/java/org/apache/zeppelin/interpreter/InterpreterContext.java#L168
[2] https://github.com/apache/zeppelin/blob/master/zeppelin-interpreter/src/main/java/org/apache/zeppelin/resource/ResourcePool.java#L68
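For example, a rough sketch of what that could look like in a single %spark paragraph, addressing the "only the last evaluated object lands in the pool" problem from the thread below. This assumes the ResourcePool exposes put(String, Object) and get(String), as the put method linked in [2] suggests; the resource names used here are arbitrary placeholders:

%spark
// Get the resource pool for the current interpreter context (z is the ZeppelinContext).
val resourcePool = z.getInterpreterContext().getResourcePool()

// Put each object the application needs into the pool explicitly, instead of
// relying on the "last evaluated object in the paragraph" behaviour.
// The names below are arbitrary placeholders.
resourcePool.put("mySparkContext", sc)
resourcePool.put("mySparkSession", spark)
resourcePool.put("mySqlContext", sqlContext)

// Optional sanity check: read one of them back.
resourcePool.get("mySparkContext")

As far as I can tell from the resources section of the writingzeppelinapplication docs, the leading-colon entries in application.json (":org.apache.spark.SparkContext", ...) match resources by class name, so the names passed to put() should not matter for the launch button to appear.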
On Tue, Apr 18, 2017 at 8:11 PM fish fish <fishfish...@gmail.com> wrote:

> Hi,
>
> Just wondering, are there any updates on this thread? Thanks!
>
> Best,
> Chen
>
> 2017-04-18 16:10 GMT+08:00 fish fish <fishfish...@gmail.com>:
>
>> Thank you Lee! I can run the Clock example. However, in
>> https://github.com/apache/zeppelin/blob/master/spark/src/main/java/org/apache/zeppelin/spark/SparkInterpreter.java#L1287
>> SparkInterpreter only puts the last evaluated variable into the resource
>> pool. So if my Helium app needs multiple resources, like
>> [[":org.apache.spark.sql.SparkSession", ":org.apache.spark.SparkContext",
>> ":org.apache.spark.sql.SQLContext"]], how should I run them in one
>> paragraph? I ran the following code in one paragraph, but it seems only
>> the last one, sqlContext, ends up in the resource pool.
>>
>> %spark
>> sc
>> spark
>> sqlContext
>>
>> Best,
>> Chen
>>
>> 2017-04-17 13:27 GMT+08:00 moon soo Lee <m...@apache.org>:
>>
>>> Hi,
>>>
>>> The Helium application launch button will appear when the ResourcePool
>>> has the resources [1] required by an application.
>>>
>>> The Spark interpreter puts the last evaluated object into the
>>> ResourcePool. For example, if you run "new java.util.Date()" in a
>>> paragraph, a Date object will be created in the ResourcePool, and a
>>> button will be displayed in the paragraph listing all applications that
>>> consume java.util.Date type objects.
>>>
>>> If you build Zeppelin with the -Pexamples flag, you can run "new
>>> java.util.Date()" in the Spark interpreter and launch the example
>>> application.
>>>
>>> Let me know if this helps.
>>>
>>> Thanks,
>>> moon
>>>
>>> [1]
>>> http://zeppelin.apache.org/docs/0.8.0-SNAPSHOT/development/writingzeppelinapplication.html#resources
>>>
>>> On Mon, Apr 17, 2017 at 12:55 PM fish fish <fishfish...@gmail.com>
>>> wrote:
>>>
>>>> Hi Lee,
>>>>
>>>> Thank you for your reply. Actually, I have already tried dev mode and
>>>> succeeded. My question is how to launch a Helium application not in
>>>> development mode but in deploy mode, like 'SPELL' described in
>>>> https://zeppelin.apache.org/docs/snapshot/development/writingzeppelinspell.html#1-enabling.
>>>> Could you please kindly guide me further? Thanks!
>>>>
>>>> Best,
>>>> Chen
>>>>
>>>> 2017-04-15 17:13 GMT+08:00 moon soo Lee <m...@apache.org>:
>>>>
>>>>> Hi,
>>>>>
>>>>> In the actual implementation, dev mode became '%dev run' instead of
>>>>> the '%helium run' from the proposal.
>>>>>
>>>>> Please check
>>>>> http://zeppelin.apache.org/docs/0.8.0-SNAPSHOT/development/writingzeppelinapplication.html#development-mode
>>>>>
>>>>> To use '%dev run', you need to build Zeppelin with the -Phelium-dev
>>>>> flag.
>>>>>
>>>>> Hope this helps!
>>>>>
>>>>> Best,
>>>>> moon
>>>>>
>>>>> On Fri, Apr 14, 2017 at 8:16 PM fish fish <fishfish...@gmail.com>
>>>>> wrote:
>>>>>
>>>>>> Hi Group,
>>>>>>
>>>>>> I have written a Helium application and deployed it in local mode. I
>>>>>> have also enabled it on the 'Helium' page. However, I don't know how
>>>>>> to launch the enabled application in a notebook.
>>>>>> I tried typing "%helium run" as shown in this video:
>>>>>> https://cwiki.apache.org/confluence/display/ZEPPELIN/Helium+proposal
>>>>>> but an error occurs indicating there is no helium interpreter. Could
>>>>>> anyone kindly tell me the right way to launch a Helium application?
>>>>>> Thanks!
>>>>>>
>>>>>> Zeppelin version: 0.8.0-SNAPSHOT
>>>>>>
>>>>>> application.json:
>>>>>>
>>>>>> {
>>>>>>   "type" : "APPLICATION",
>>>>>>   "name" : "test_app",
>>>>>>   "description" : "test Helium App",
>>>>>>   "license" : "Apache-2.0",
>>>>>>   "artifact" : "./examples/zeppelin-mytest-0.0.1-SNAPSHOT.jar",
>>>>>>   "className" : "com.test.Test",
>>>>>>   "resources" : [[":org.apache.spark.sql.SparkSession",
>>>>>>     ":org.apache.spark.SparkContext", ":org.apache.spark.sql.SQLContext"]],
>>>>>>   "icon" : "<i class='fa fa-search-minus'></i>"
>>>>>> }
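For context on the consuming side, here is a very rough sketch (in Scala) of what an application class matching the "resources" declaration above could look like. This is not the actual com.test.Test; it assumes the Application / ApplicationContext / ResourceSet API described in the writingzeppelinapplication docs, and the exact signatures may differ:

import org.apache.spark.SparkContext
import org.apache.spark.sql.{SQLContext, SparkSession}
import org.apache.zeppelin.helium.{Application, ApplicationContext}
import org.apache.zeppelin.resource.ResourceSet

import scala.collection.JavaConverters._

// Hypothetical application class, for illustration only.
class Test(context: ApplicationContext) extends Application(context) {

  // run() receives the resources that matched the "resources" declaration
  // in application.json, wrapped in a ResourceSet (a java.util.List of Resource).
  override def run(args: ResourceSet): Unit = {
    // Match on the underlying object type rather than assuming any ordering
    // of the resources in the set.
    args.asScala.foreach { resource =>
      resource.get() match {
        case session: SparkSession => // use the SparkSession here
        case sc: SparkContext      => // use the SparkContext here
        case sql: SQLContext       => // use the SQLContext here
        case _                     => // ignore anything else
      }
    }
  }

  override def unload(): Unit = {
    // release anything acquired in run()
  }
}

Retrieval by index (args.get(0).get()) also appears in the docs' example; matching by type here just avoids depending on the order of the declared resources.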