It looks to me like you have incompatible versions of Scala on your
classpath.
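One quick way to check for that kind of conflict (a sketch; the jar names below are made up for illustration) is to split the classpath and look for more than one scala-library version, or a mismatch between the `_2.x` Scala suffix on the Spark artifacts and the scala-library jar actually present:

```shell
# Simulated test classpath: spark-sql built for Scala 2.10 sitting next to a
# 2.11 scala-library jar -- the kind of mix that produces NoSuchMethodError.
CLASSPATH="lib/spark-sql_2.10-1.3.0.jar:lib/scala-library-2.10.4.jar:lib/scala-library-2.11.2.jar"

# List the scala-library jars on the classpath, one per line.
echo "$CLASSPATH" | tr ':' '\n' | grep 'scala-library'

# Count them; more than one usually means a version conflict.
count=$(echo "$CLASSPATH" | tr ':' '\n' | grep -c 'scala-library')
echo "scala-library jars found: $count"
```

With a Gradle build, `gradle dependencies` shows the resolved versions per configuration; forcing a single scala-library version (or using Spark artifacts whose Scala suffix matches it) is the usual fix.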

On Thu, Apr 2, 2015 at 4:28 PM, Okehee Goh <oke...@gmail.com> wrote:

> yes, below is the stacktrace.
> Thanks,
> Okehee
>
> java.lang.NoSuchMethodError: scala.reflect.NameTransformer$.LOCAL_SUFFIX_STRING()Ljava/lang/String;
>       at scala.reflect.internal.StdNames$CommonNames.<init>(StdNames.scala:97)
>       at scala.reflect.internal.StdNames$Keywords.<init>(StdNames.scala:203)
>       at scala.reflect.internal.StdNames$TermNames.<init>(StdNames.scala:288)
>       at scala.reflect.internal.StdNames$nme$.<init>(StdNames.scala:1045)
>       at scala.reflect.internal.SymbolTable.nme$lzycompute(SymbolTable.scala:16)
>       at scala.reflect.internal.SymbolTable.nme(SymbolTable.scala:16)
>       at scala.reflect.internal.StdNames$class.$init$(StdNames.scala:1041)
>       at scala.reflect.internal.SymbolTable.<init>(SymbolTable.scala:16)
>       at scala.reflect.runtime.JavaUniverse.<init>(JavaUniverse.scala:16)
>       at scala.reflect.runtime.package$.universe$lzycompute(package.scala:17)
>       at scala.reflect.runtime.package$.universe(package.scala:17)
>       at org.apache.spark.sql.types.NativeType.<init>(dataTypes.scala:337)
>       at org.apache.spark.sql.types.StringType.<init>(dataTypes.scala:351)
>       at org.apache.spark.sql.types.StringType$.<init>(dataTypes.scala:367)
>       at org.apache.spark.sql.types.StringType$.<clinit>(dataTypes.scala)
>       at org.apache.spark.sql.types.DataTypes.<clinit>(DataTypes.java:30)
>       at com.quixey.dataengine.dataprocess.parser.ToTableRecord.generateTableSchemaForSchemaRDD(ToTableRecord.java:282)
>       at com.quixey.dataengine.dataprocess.parser.ToUDMTest.generateTableSchemaTest(ToUDMTest.java:132)
>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>       at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>       at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>       at java.lang.reflect.Method.invoke(Method.java:483)
>       at org.testng.internal.MethodInvocationHelper.invokeMethod(MethodInvocationHelper.java:85)
>       at org.testng.internal.Invoker.invokeMethod(Invoker.java:696)
>       at org.testng.internal.Invoker.invokeTestMethod(Invoker.java:882)
>       at org.testng.internal.Invoker.invokeTestMethods(Invoker.java:1189)
>       at org.testng.internal.TestMethodWorker.invokeTestMethods(TestMethodWorker.java:124)
>       at org.testng.internal.TestMethodWorker.run(TestMethodWorker.java:108)
>       at org.testng.TestRunner.privateRun(TestRunner.java:767)
>       at org.testng.TestRunner.run(TestRunner.java:617)
>       at org.testng.SuiteRunner.runTest(SuiteRunner.java:348)
>       at org.testng.SuiteRunner.runSequentially(SuiteRunner.java:343)
>       at org.testng.SuiteRunner.privateRun(SuiteRunner.java:305)
>       at org.testng.SuiteRunner.run(SuiteRunner.java:254)
>       at org.testng.SuiteRunnerWorker.runSuite(SuiteRunnerWorker.java:52)
>       at org.testng.SuiteRunnerWorker.run(SuiteRunnerWorker.java:86)
>       at org.testng.TestNG.runSuitesSequentially(TestNG.java:1224)
>       at org.testng.TestNG.runSuitesLocally(TestNG.java:1149)
>       at org.testng.TestNG.run(TestNG.java:1057)
>       at org.gradle.api.internal.tasks.testing.testng.TestNGTestClassProcessor.stop(TestNGTestClassProcessor.java:115)
>       at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.stop(SuiteTestClassProcessor.java:57)
>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>       at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>       at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>       at java.lang.reflect.Method.invoke(Method.java:483)
>       at org.gradle.messaging.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:35)
>       at org.gradle.messaging.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
>       at org.gradle.messaging.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:32)
>       at org.gradle.messaging.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:93)
>       at com.sun.proxy.$Proxy2.stop(Unknown Source)
>       at org.gradle.api.internal.tasks.testing.worker.TestWorker.stop(TestWorker.java:115)
>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>       at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>       at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>       at java.lang.reflect.Method.invoke(Method.java:483)
>       at org.gradle.messaging.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:35)
>       at org.gradle.messaging.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
>       at org.gradle.messaging.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:355)
>       at org.gradle.internal.concurrent.DefaultExecutorFactory$StoppableExecutorImpl$1.run(DefaultExecutorFactory.java:64)
>       at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>       at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>       at java.lang.Thread.run(Thread.java:745)
>
>
> On Thu, Apr 2, 2015 at 2:51 PM, Michael Armbrust <mich...@databricks.com>
> wrote:
>
>> Do you have a full stack trace?
>>
>> On Thu, Apr 2, 2015 at 11:45 AM, ogoh <oke...@gmail.com> wrote:
>>
>>>
>>> Hello,
>>> My ETL uses Spark SQL to generate Parquet files, which are served through
>>> the Thrift server using Hive QL.
>>> It defines the schema programmatically, since the schema is only known at
>>> runtime.
>>> With spark 1.2.1, it worked fine (followed
>>>
>>> https://spark.apache.org/docs/latest/sql-programming-guide.html#programmatically-specifying-the-schema
>>> ).
>>>
>>> I am trying to migrate to Spark 1.3.0, but the API changes are confusing.
>>> I am not sure whether the example at
>>>
>>> https://spark.apache.org/docs/latest/sql-programming-guide.html#programmatically-specifying-the-schema
>>> is still valid on Spark 1.3.0.
>>> For example, DataType.StringType is not there anymore.
>>> Instead, I found DataTypes.StringType etc., so I migrated as below, and it
>>> builds fine.
>>> But at runtime, it throws an exception.
>>>
>>> I appreciate any help.
>>> Thanks,
>>> Okehee
>>>
>>> == Exception thrown
>>> java.lang.reflect.InvocationTargetException
>>> java.lang.NoSuchMethodError:
>>> scala.reflect.NameTransformer$.LOCAL_SUFFIX_STRING()Ljava/lang/String;
>>>
>>> ==== my code's snippet
>>> import org.apache.spark.sql.types.DataTypes;
>>> DataTypes.createStructField(property, DataTypes.IntegerType, true)
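For reference, a minimal sketch of the Spark 1.3 programmatic-schema construction the snippet above is part of (the field names here are made up for illustration, and this assumes the matching spark-sql 1.3.0 artifact and a single consistent scala-library version on the classpath):

```java
import java.util.Arrays;
import java.util.List;

import org.apache.spark.sql.types.DataTypes;
import org.apache.spark.sql.types.StructField;
import org.apache.spark.sql.types.StructType;

public class SchemaSketch {
    public static void main(String[] args) {
        // In 1.3 the fields are built with the DataTypes factory methods
        // (the 1.2-era DataType.StringType became DataTypes.StringType).
        List<StructField> fields = Arrays.asList(
                DataTypes.createStructField("name", DataTypes.StringType, true),
                DataTypes.createStructField("count", DataTypes.IntegerType, true));

        // Assemble the fields into the StructType passed to createDataFrame.
        StructType schema = DataTypes.createStructType(fields);
        System.out.println(schema);
    }
}
```

Note that touching `DataTypes` triggers static initialization that goes through scala-reflect (visible in the stack trace above), which is why a Scala version mismatch surfaces here even from pure Java code.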
>>>
>>>
>>>
>>> --
>>> View this message in context:
>>> http://apache-spark-user-list.1001560.n3.nabble.com/Generating-a-schema-in-Spark-1-3-failed-while-using-DataTypes-tp22362.html
>>> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>>>
>>> ---------------------------------------------------------------------
>>> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
>>> For additional commands, e-mail: user-h...@spark.apache.org
>>>
>>>
>>
>
