FWIW - I synchronized access to the transformer and the problem went
away, so this looks like some kind of concurrent-access issue when
dealing with UDFs.
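
For reference, the workaround is just to serialize calls to transform()
on the shared model. Here's a rough sketch of what I mean (class and
variable names are illustrative, not from my actual code):

  import org.apache.spark.ml.tuning.CrossValidatorModel
  import org.apache.spark.sql.DataFrame

  // Only let one thread call transform() at a time. In Spark 1.6 /
  // Scala 2.10, HashingTF.transform builds a UDF whose TypeTag is
  // resolved via runtime reflection, which apparently isn't
  // thread-safe in 2.10, hence the CyclicReference / assertion
  // failures below.
  class SynchronizedScorer(model: CrossValidatorModel) {
    private val lock = new Object
    def score(docs: DataFrame): DataFrame = lock.synchronized {
      model.transform(docs)
    }
  }

In my case the calls come from multiple request-handling threads, so a
coarse lock like this was enough; keeping a separate model instance per
thread would presumably avoid the contention too.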

On Tue, Mar 29, 2016 at 9:19 AM, Timothy Potter <thelabd...@gmail.com> wrote:
> It's a local Spark master, no cluster. I'm not sure what you mean
> about assembly or package? All of the Spark dependencies are on my
> classpath, and this sometimes works.
>
> On Mon, Mar 28, 2016 at 11:45 PM, Jacek Laskowski <ja...@japila.pl> wrote:
>> Hi,
>>
>> How do you run the pipeline? Do you build with assembly or just
>> package? Is this running locally, or on Spark Standalone or another
>> cluster manager? What's the build configuration?
>>
>> Pozdrawiam,
>> Jacek Laskowski
>> ----
>> https://medium.com/@jaceklaskowski/
>> Mastering Apache Spark http://bit.ly/mastering-apache-spark
>> Follow me at https://twitter.com/jaceklaskowski
>>
>>
>> On Mon, Mar 28, 2016 at 7:11 PM, Timothy Potter <thelabd...@gmail.com> wrote:
>>> I'm seeing the following error when trying to generate a prediction
>>> from a very simple ML-pipeline-based model. I've verified that the
>>> raw data sent to the tokenizer is valid (not null). It seems like
>>> some sort of weird classpath or class-loading issue. Any help you
>>> can provide in troubleshooting this further would be appreciated.
>>>
>>>  Error in machine-learning, docId=20news-18828/alt.atheism/51176
>>> scala.reflect.internal.Symbols$CyclicReference: illegal cyclic reference involving package <root>
>>>     at scala.reflect.internal.Symbols$TypeSymbol.tpe(Symbols.scala:2768) ~[scala-reflect-2.10.5.jar:?]
>>>     at scala.reflect.internal.Mirrors$Roots$RootPackage$.<init>(Mirrors.scala:268) ~[scala-reflect-2.10.5.jar:?]
>>>     at scala.reflect.internal.Mirrors$Roots.RootPackage$lzycompute(Mirrors.scala:267) ~[scala-reflect-2.10.5.jar:?]
>>>     at scala.reflect.internal.Mirrors$Roots.RootPackage(Mirrors.scala:267) ~[scala-reflect-2.10.5.jar:?]
>>>     at scala.reflect.runtime.JavaMirrors$JavaMirror.scala$reflect$runtime$JavaMirrors$$makeScalaPackage(JavaMirrors.scala:902) ~[scala-reflect-2.10.5.jar:?]
>>>     at scala.reflect.runtime.JavaMirrors$class.missingHook(JavaMirrors.scala:1299) ~[scala-reflect-2.10.5.jar:?]
>>>     at scala.reflect.runtime.JavaUniverse.missingHook(JavaUniverse.scala:12) ~[scala-reflect-2.10.5.jar:?]
>>>     at scala.reflect.internal.Mirrors$RootsBase.universeMissingHook(Mirrors.scala:77) ~[scala-reflect-2.10.5.jar:?]
>>>     at scala.reflect.internal.Mirrors$RootsBase.missingHook(Mirrors.scala:79) ~[scala-reflect-2.10.5.jar:?]
>>>     at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:48) ~[scala-reflect-2.10.5.jar:?]
>>>     at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:40) ~[scala-reflect-2.10.5.jar:?]
>>>     at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:40) ~[scala-reflect-2.10.5.jar:?]
>>>     at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:40) ~[scala-reflect-2.10.5.jar:?]
>>>     at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:40) ~[scala-reflect-2.10.5.jar:?]
>>>     at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:40) ~[scala-reflect-2.10.5.jar:?]
>>>     at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:61) ~[scala-reflect-2.10.5.jar:?]
>>>     at scala.reflect.internal.Mirrors$RootsBase.staticModuleOrClass(Mirrors.scala:72) ~[scala-reflect-2.10.5.jar:?]
>>>     at scala.reflect.internal.Mirrors$RootsBase.staticClass(Mirrors.scala:119) ~[scala-reflect-2.10.5.jar:?]
>>>     at scala.reflect.internal.Mirrors$RootsBase.staticClass(Mirrors.scala:21) ~[scala-reflect-2.10.5.jar:?]
>>>     at org.apache.spark.ml.feature.HashingTF$$typecreator1$1.apply(HashingTF.scala:66) ~[spark-mllib_2.10-1.6.1.jar:1.6.1]
>>>     at scala.reflect.api.TypeTags$WeakTypeTagImpl.tpe$lzycompute(TypeTags.scala:231) ~[scala-reflect-2.10.5.jar:?]
>>>     at scala.reflect.api.TypeTags$WeakTypeTagImpl.tpe(TypeTags.scala:231) ~[scala-reflect-2.10.5.jar:?]
>>>     at org.apache.spark.sql.catalyst.ScalaReflection$class.localTypeOf(ScalaReflection.scala:654) ~[spark-catalyst_2.10-1.6.1.jar:1.6.1]
>>>     at org.apache.spark.sql.catalyst.ScalaReflection$.localTypeOf(ScalaReflection.scala:30) ~[spark-catalyst_2.10-1.6.1.jar:1.6.1]
>>>     at org.apache.spark.sql.catalyst.ScalaReflection$class.schemaFor(ScalaReflection.scala:642) ~[spark-catalyst_2.10-1.6.1.jar:1.6.1]
>>>     at org.apache.spark.sql.catalyst.ScalaReflection$.schemaFor(ScalaReflection.scala:30) ~[spark-catalyst_2.10-1.6.1.jar:1.6.1]
>>>     at org.apache.spark.sql.functions$.udf(functions.scala:2576) ~[spark-sql_2.10-1.6.1.jar:1.6.1]
>>>     at org.apache.spark.ml.feature.HashingTF.transform(HashingTF.scala:66) ~[spark-mllib_2.10-1.6.1.jar:1.6.1]
>>>     at org.apache.spark.ml.PipelineModel$$anonfun$transform$1.apply(Pipeline.scala:297) ~[spark-mllib_2.10-1.6.1.jar:1.6.1]
>>>     at org.apache.spark.ml.PipelineModel$$anonfun$transform$1.apply(Pipeline.scala:297) ~[spark-mllib_2.10-1.6.1.jar:1.6.1]
>>>     at scala.collection.IndexedSeqOptimized$class.foldl(IndexedSeqOptimized.scala:51) ~[scala-library-2.10.5.jar:?]
>>>     at scala.collection.IndexedSeqOptimized$class.foldLeft(IndexedSeqOptimized.scala:60) ~[scala-library-2.10.5.jar:?]
>>>     at scala.collection.mutable.ArrayOps$ofRef.foldLeft(ArrayOps.scala:108) ~[scala-library-2.10.5.jar:?]
>>>     at org.apache.spark.ml.PipelineModel.transform(Pipeline.scala:297) ~[spark-mllib_2.10-1.6.1.jar:1.6.1]
>>>     at org.apache.spark.ml.tuning.CrossValidatorModel.transform(CrossValidator.scala:338) ~[spark-mllib_2.10-1.6.1.jar:1.6.1]
>>>
>>>
>>> I've also seen similar errors such as:
>>>
>>> java.lang.AssertionError: assertion failed: List(package linalg, package linalg)
>>>     at scala.reflect.internal.Symbols$Symbol.suchThat(Symbols.scala:1678) ~[scala-reflect-2.10.5.jar:?]
>>>     at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:44) ~[scala-reflect-2.10.5.jar:?]
>>>     at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:40) ~[scala-reflect-2.10.5.jar:?]
>>>     at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:61) ~[scala-reflect-2.10.5.jar:?]
>>>     at scala.reflect.internal.Mirrors$RootsBase.staticModuleOrClass(Mirrors.scala:72) ~[scala-reflect-2.10.5.jar:?]
>>>     at scala.reflect.internal.Mirrors$RootsBase.staticClass(Mirrors.scala:119) ~[scala-reflect-2.10.5.jar:?]
>>>     at scala.reflect.internal.Mirrors$RootsBase.staticClass(Mirrors.scala:21) ~[scala-reflect-2.10.5.jar:?]
>>>     at org.apache.spark.ml.feature.HashingTF$$typecreator1$1.apply(HashingTF.scala:66) ~[spark-mllib_2.10-1.6.1.jar:1.6.1]
>>>     at scala.reflect.api.TypeTags$WeakTypeTagImpl.tpe$lzycompute(TypeTags.scala:231) ~[scala-reflect-2.10.5.jar:?]
>>>     at scala.reflect.api.TypeTags$WeakTypeTagImpl.tpe(TypeTags.scala:231) ~[scala-reflect-2.10.5.jar:?]
>>>     at org.apache.spark.sql.catalyst.ScalaReflection$class.localTypeOf(ScalaReflection.scala:654) ~[spark-catalyst_2.10-1.6.1.jar:1.6.1]
>>>     at org.apache.spark.sql.catalyst.ScalaReflection$.localTypeOf(ScalaReflection.scala:30) ~[spark-catalyst_2.10-1.6.1.jar:1.6.1]
>>>     at org.apache.spark.sql.catalyst.ScalaReflection$class.schemaFor(ScalaReflection.scala:642) ~[spark-catalyst_2.10-1.6.1.jar:1.6.1]
>>>     at org.apache.spark.sql.catalyst.ScalaReflection$.schemaFor(ScalaReflection.scala:30) ~[spark-catalyst_2.10-1.6.1.jar:1.6.1]
>>>     at org.apache.spark.sql.functions$.udf(functions.scala:2576) ~[spark-sql_2.10-1.6.1.jar:1.6.1]
>>>     at org.apache.spark.ml.feature.HashingTF.transform(HashingTF.scala:66) ~[spark-mllib_2.10-1.6.1.jar:1.6.1]
>>>

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
