Hi Vinay,

I imported your note, and it works properly on my local box. I suspect it is
due to a configuration issue. Are you using spark 2.0-preview rather than
the official spark 2.0?
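
If it helps to double check, printing the version from a Spark paragraph is a
quick way to rule out a mismatched build. A minimal sketch, assuming the note
is bound to the default %spark interpreter and SPARK_HOME is set in
zeppelin-env.sh:

    %spark
    // Version of the SparkContext the interpreter actually started.
    // A value like 2.0.0-preview here would point to the mismatch.
    println(sc.version)
    // Scala version of the interpreter process, for completeness.
    println(scala.util.Properties.versionString)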


Best Regards,
Jeff Zhang





On 8/16/16, 12:52 AM, "Vinay Shukla" <vinayshu...@gmail.com> wrote:

>With RC2,
>
>the Zeppelin Tutorial passes and Import Notebook passes. I then created a
>simple notebook with Spark 2.0 that reads the Spark sample people.json.
>
>A simple Spark SQL query ("%sql select * from people") failed with the
>exception below. The simple notebook that reproduces the error is here
><https://github.com/vinayshukla/zeppelin-notebooks/blob/master/Spark2.0Note.json>.
>If others see this issue we should dig deeper.
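>
>For reference, the note boils down to roughly these two paragraphs (a
>sketch; the people.json path is the standard sample under SPARK_HOME):
>
>%spark
>// Read the bundled sample and expose it to the %sql interpreter.
>// Path assumes SPARK_HOME points at the Spark 2.0 install being tested.
>val people = spark.read.json(sys.env("SPARK_HOME") + "/examples/src/main/resources/people.json")
>people.createOrReplaceTempView("people")
>
>%sql
>select * from people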
>
>java.lang.NoSuchMethodException: org.apache.spark.io.LZ4CompressionCodec.<init>(org.apache.spark.SparkConf)
>at java.lang.Class.getConstructor0(Class.java:3082)
>at java.lang.Class.getConstructor(Class.java:1825)
>at org.apache.spark.io.CompressionCodec$.createCodec(CompressionCodec.scala:72)
>at org.apache.spark.io.CompressionCodec$.createCodec(CompressionCodec.scala:66)
>at org.apache.spark.broadcast.TorrentBroadcast.org$apache$spark$broadcast$TorrentBroadcast$$setConf(TorrentBroadcast.scala:74)
>at org.apache.spark.broadcast.TorrentBroadcast.<init>(TorrentBroadcast.scala:81)
>at org.apache.spark.broadcast.TorrentBroadcastFactory.newBroadcast(TorrentBroadcastFactory.scala:34)
>at org.apache.spark.broadcast.BroadcastManager.newBroadcast(BroadcastManager.scala:56)
>at org.apache.spark.SparkContext.broadcast(SparkContext.scala:1370)
>at org.apache.spark.sql.execution.datasources.json.JsonFileFormat.buildReader(JsonFileFormat.scala:102)
>at org.apache.spark.sql.execution.datasources.FileFormat$class.buildReaderWithPartitionValues(fileSourceInterfaces.scala:260)
>at org.apache.spark.sql.execution.datasources.TextBasedFileFormat.buildReaderWithPartitionValues(fileSourceInterfaces.scala:304)
>at org.apache.spark.sql.execution.datasources.FileSourceStrategy$.apply(FileSourceStrategy.scala:112)
>at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$1.apply(QueryPlanner.scala:60)
>at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$1.apply(QueryPlanner.scala:60)
>at scala.collection.Iterator$$anon$12.nextCur(Iterator.scala:434)
>at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:440)
>at org.apache.spark.sql.catalyst.planning.QueryPlanner.plan(QueryPlanner.scala:61)
>at org.apache.spark.sql.execution.SparkPlanner.plan(SparkPlanner.scala:47)
>at org.apache.spark.sql.execution.SparkPlanner$$anonfun$plan$1$$anonfun$apply$1.applyOrElse(SparkPlanner.scala:51)
>at org.apache.spark.sql.execution.SparkPlanner$$anonfun$plan$1$$anonfun$apply$1.applyOrElse(SparkPlanner.scala:48)
>at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$transformUp$1.apply(TreeNode.scala:301)
>at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$transformUp$1.apply(TreeNode.scala:301)
>at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:69)
>at org.apache.spark.sql.catalyst.trees.TreeNode.transformUp(TreeNode.scala:300)
>at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$4.apply(TreeNode.scala:298)
>at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$4.apply(TreeNode.scala:298)
>at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$5.apply(TreeNode.scala:321)
>at org.apache.spark.sql.catalyst.trees.TreeNode.mapProductIterator(TreeNode.scala:179)
>at org.apache.spark.sql.catalyst.trees.TreeNode.transformChildren(TreeNode.scala:319)
>at org.apache.spark.sql.catalyst.trees.TreeNode.transformUp(TreeNode.scala:298)
>at org.apache.spark.sql.execution.SparkPlanner$$anonfun$plan$1.apply(SparkPlanner.scala:48)
>at org.apache.spark.sql.execution.SparkPlanner$$anonfun$plan$1.apply(SparkPlanner.scala:48)
>at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
>at org.apache.spark.sql.execution.QueryExecution.sparkPlan$lzycompute(QueryExecution.scala:78)
>at org.apache.spark.sql.execution.QueryExecution.sparkPlan(QueryExecution.scala:76)
>at org.apache.spark.sql.execution.QueryExecution.executedPlan$lzycompute(QueryExecution.scala:83)
>at org.apache.spark.sql.execution.QueryExecution.executedPlan(QueryExecution.scala:83)
>at org.apache.spark.sql.Dataset.withTypedCallback(Dataset.scala:2558)
>at org.apache.spark.sql.Dataset.head(Dataset.scala:1924)
>at org.apache.spark.sql.Dataset.take(Dataset.scala:2139)
>at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>at java.lang.reflect.Method.invoke(Method.java:498)
>at org.apache.zeppelin.spark.ZeppelinContext.showDF(ZeppelinContext.java:214)
>at org.apache.zeppelin.spark.SparkSqlInterpreter.interpret(SparkSqlInterpreter.java:129)
>at org.apache.zeppelin.interpreter.LazyOpenInterpreter.interpret(LazyOpenInterpreter.java:94)
>at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:341)
>at org.apache.zeppelin.scheduler.Job.run(Job.java:176)
>at org.apache.zeppelin.scheduler.FIFOScheduler$1.run(FIFOScheduler.java:139)
>at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
>at java.util.concurrent.FutureTask.run(FutureTask.java:266)
>at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
>at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
>at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>at java.lang.Thread.run(Thread.java:745)
>
>
>Thank you,
>Vinay
>
>On Mon, Aug 15, 2016 at 7:55 AM, rohit choudhary <rconl...@gmail.com>
>wrote:
>
>> +1.
>>
>> Ran sample notebooks. Cent.
>>
>> Thanks,
>> Rohit.
>>
>> > On Mon, Aug 15, 2016 at 4:23 PM, madhuka udantha <madhukaudan...@gmail.com> wrote:
>>
>> > +1
>> >
>> > Built on Windows 8.
>> >
>> > On Mon, Aug 15, 2016 at 4:03 PM, mina lee <mina...@apache.org> wrote:
>> >
>> > > +1 (binding)
>> > >
>> > > On Sun, Aug 14, 2016 at 9:58 PM, Felix Cheung <felixcheun...@hotmail.com> wrote:
>> > >
>> > > > +1
>> > > >
>> > > > Tested out binaries and netinstall, with spark and a few other
>> > > > interpreters.
>> > > >
>> > > > Thanks Mina!
>> > > >
>> > > >
>> > > > _____________________________
>> > > > From: Alexander Bezzubov <b...@apache.org>
>> > > > Sent: Sunday, August 14, 2016 12:05 AM
>> > > > Subject: Re: [VOTE] Apache Zeppelin release 0.6.1 (rc2)
>> > > > To: <dev@zeppelin.apache.org>
>> > > >
>> > > >
>> > > > +1 (binding), for this rapid release put together so well by Mina Lee again!
>> > > >
>> > > > Verified:
>> > > > - build from sources, SparkInterpreter over apache spark 2.0 and PythonInterpreter
>> > > > - https://dist.apache.org is super slow for me :\ so cannot help with the 517mb binaries
>> > > >
>> > > > --
>> > > > Alex
>> > > >
>> > > > On Sun, Aug 14, 2016 at 1:54 AM, Victor Manuel Garcia <victor.gar...@beeva.com> wrote:
>> > > >
>> > > > > +1
>> > > > >
>> > > > > 2016-08-13 18:45 GMT+02:00 Prabhjyot Singh <prabhjyotsi...@apache.org>:
>> > > > >
>> > > > > > +1
>> > > > > >
>> > > > > > On 13 Aug 2016 9:25 p.m., "Sourav Mazumder" <sourav.mazumde...@gmail.com> wrote:
>> > > > > >
>> > > > > > > + 1
>> > > > > > >
>> > > > > > > Regards,
>> > > > > > > Sourav
>> > > > > > >
>> > > > > > > > On 13 August 2016, at 2:17 AM, Hyung Sung Shim <hss...@nflabs.com> wrote:
>> > > > > > > >
>> > > > > > > > +1
>> > > > > > > >
>> > > > > > > > On Saturday, August 13, 2016, Khalid Huseynov <khalid...@nflabs.com> wrote:
>> > > > > > > >
>> > > > > > > >> +1
>> > > > > > > >>
>> > > > > > > >> On Aug 13, 2016 11:10 AM, "Ahyoung Ryu" <ahyoungry...@gmail.com> wrote:
>> > > > > > > >>
>> > > > > > > >>> +1
>> > > > > > > >>>
>> > > > > > > >>> On Sat, Aug 13, 2016 at 2:41 PM, Anthony Corbacho <anthonycorba...@apache.org> wrote:
>> > > > > > > >>>
>> > > > > > > >>>> +1
>> > > > > > > >>>>
>> > > > > > > >>>> On Saturday, 13 August 2016, moon soo Lee <m...@apache.org> wrote:
>> > > > > > > >>>>
>> > > > > > > >>>>> +1
>> > > > > > > >>>>>
>> > > > > > > >>>>> Verified
>> > > > > > > >>>>> - No unexpected binaries in source package
>> > > > > > > >>>>> - LICENSE, NOTICE exists
>> > > > > > > >>>>> - Build from source
>> > > > > > > >>>>> - Signature
>> > > > > > > >>>>> - Run binary packages (netinst, all)
>> > > > > > > >>>>> - Run SparkInterpreter with spark-2.0 and scala-2.11
>> > > > > > > >>>>>
>> > > > > > > >>>>> Best,
>> > > > > > > >>>>> moon
>> > > > > > > >>>>>
>> > > > > > > >>>>> On Fri, Aug 12, 2016 at 6:37 AM Jianfeng (Jeff) Zhang <jzh...@hortonworks.com> wrote:
>> > > > > > > >>>>>
>> > > > > > > >>>>>>
>> > > > > > > >>>>>> +1
>> > > > > > > >>>>>>
>> > > > > > > >>>>>> Built it with the spark-2.0 profile, and ran the tutorial note successfully.
>> > > > > > > >>>>>> AM & Executor logs are available in yarn-client mode.
>> > > > > > > >>>>>>
>> > > > > > > >>>>>>
>> > > > > > > >>>>>>
>> > > > > > > >>>>>> Best Regards,
>> > > > > > > >>>>>> Jeff Zhang
>> > > > > > > >>>>>>
>> > > > > > > >>>>>>
>> > > > > > > >>>>>>
>> > > > > > > >>>>>>
>> > > > > > > >>>>>>
>> > > > > > > >>>>>> On 8/12/16, 7:36 PM, "mina lee" <mina...@apache.org> wrote:
>> > > > > > > >>>>>>
>> > > > > > > >>>>>>> Hi folks,
>> > > > > > > >>>>>>>
>> > > > > > > >>>>>>> I propose the following RC to be released for the
>> Apache
>> > > > > Zeppelin
>> > > > > > > >>>> 0.6.1
>> > > > > > > >>>>>>> release.
>> > > > > > > >>>>>>>
>> > > > > > > >>>>>>> The commit id is c928f9a46ecacebc868d6dc10a95c02f9018a18e which corresponds
>> > > > > > > >>>>>>> to the tag v0.6.1-rc2:
>> > > > > > > >>>>>>> https://git-wip-us.apache.org/repos/asf?p=zeppelin.git;a=commit;h=c928f9a46ecacebc868d6dc10a95c02f9018a18e
>> > > > > > > >>>>>>>
>> > > > > > > >>>>>>> The release archives (tgz), signature, and checksums are here
>> > > > > > > >>>>>>> https://dist.apache.org/repos/dist/dev/zeppelin/zeppelin-0.6.1-rc2/
>> > > > > > > >>>>>>>
>> > > > > > > >>>>>>> The release candidate consists of the following source distribution archive
>> > > > > > > >>>>>>> zeppelin-0.6.1.tgz
>> > > > > > > >>>>>>>
>> > > > > > > >>>>>>> In addition, the following supplementary binary distributions are provided
>> > > > > > > >>>>>>> for user convenience at the same location
>> > > > > > > >>>>>>> zeppelin-0.6.1-bin-all.tgz
>> > > > > > > >>>>>>> zeppelin-0.6.1-netinst-all.tgz
>> > > > > > > >>>>>>>
>> > > > > > > >>>>>>> The maven artifacts are here
>> > > > > > > >>>>>>> https://repository.apache.org/content/repositories/orgapachezeppelin-1016/org/apache/zeppelin/
>> > > > > > > >>>>>>>
>> > > > > > > >>>>>>> You can find the KEYS file here:
>> > > > > > > >>>>>>> https://people.apache.org/keys/committer/minalee.asc
>> > > > > > > >>>>>>>
>> > > > > > > >>>>>>> Release notes available at
>> > > > > > > >>>>>>> https://issues.apache.org/jira/secure/ReleaseNote.jspa?version=12336543&styleName=Html&projectId=12316221
>> > > > > > > >>>>>>>
>> > > > > > > >>>>>>> Vote will be open for the next 72 hours (closes at 5am 15/Aug PDT).
>> > > > > > > >>>>>>>
>> > > > > > > >>>>>>> [ ] +1 approve
>> > > > > > > >>>>>>> [ ] 0 no opinion
>> > > > > > > >>>>>>> [ ] -1 disapprove (and reason why)
>> > > > > > > >>
>> > > > > > >
>> > > > > >
>> > > > >
>> > > > >
>> > > > >
>> > > > > --
>> > > > > Victor Manuel Garcia Martinez
>> > > > > Technical Architect
>> > > > > <https://www.linkedin.com/profile/view?id=AAMAABFzLAsBCDe_qh9oo-ENueO999zvw0zkhXQ&trk=hp-identity-headline>
>> > > > >
>> > > > > +34 672104297 | victor.gar...@beeva.com | victormanuel.garcia.marti...@bbva.com
>> > > > >
>> > > > >
>> > > > >
>> > > > > <http://www.beeva.com/>
>> > > > >
>> > > >
>> > > >
>> > > >
>> > >
>> >
>> >
>> >
>> > --
>> > Cheers,
>> > Madhuka Udantha
>> > http://madhukaudantha.blogspot.com
>> >
>>
