spark"
Date: 02/12/2016 05:50 AM
Subject: Re: Spark 2.x Pyspark Spark SQL createDataframe Error
Hello Vinayak,
As I understand it, Spark creates a Derby metastore database in the current
working directory, in the metastore_db subdirectory, whenever you first use
an SQL context. This database can only be booted by one process at a time,
because embedded Derby takes an exclusive lock on it, so a second session
started from the same directory fails with the lock error in your trace.
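For what it's worth, you can see this from a PySpark script along these lines
(just a sketch against the Spark 2.x SparkSession API; the directory check is
only illustrative):

    import os
    from pyspark.sql import SparkSession

    # Building a Hive-enabled session and running the first SQL statement is
    # what creates the embedded Derby metastore.
    spark = SparkSession.builder.enableHiveSupport().getOrCreate()
    spark.sql("SHOW DATABASES").show()

    # Afterwards, metastore_db/ and derby.log sit in the directory the shell
    # was started from, and Derby holds a lock (db.lck) on metastore_db, so a
    # second process launched from the same directory cannot boot it.
    print([f for f in os.listdir(".") if f.startswith(("metastore_db", "derby"))])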
> at java.security.AccessController.doPrivileged(Native Method)
> at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.getJBMSLockOnDB(Unknown Source)
> at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.boot(Unknown Source)
> at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.boot(Unknown Source)
at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
Regards,
Vinayak Joshi
From: Vinayak Joshi5/India/IBM@IBMIN
To: "user.spark"
Date: 01/12/2016 10:53 PM
Subject: Spark 2.x Pyspark Spark SQL createDataframe Error
With a local Spark instance built with Hive support (-Pyarn -Phadoop-2.6
-Dhadoop.version=2.6.0 -Phive -Phive-thriftserver), the following
script/sequence works in PySpark without any error against 1.6.x, but fails
with 2.x.
people = sc.parallelize(["Michael,30", "Andy,12", "Justin,19"])
peopl
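The sequence presumably continues with the usual split / Row / createDataFrame
steps, roughly as below (a sketch only; the variable and column names are mine,
not from the original script, and it assumes the stock pyspark shell where sc
and sqlContext are predefined):

    from pyspark.sql import Row

    people = sc.parallelize(["Michael,30", "Andy,12", "Justin,19"])

    # Split each CSV string and build Row objects with a name and an integer age.
    rows = people.map(lambda l: l.split(",")) \
                 .map(lambda p: Row(name=p[0], age=int(p[1])))

    # Against 1.6.x this succeeds; against a 2.x build with Hive support,
    # createDataFrame triggers the Hive/Derby metastore initialization that
    # raises the error discussed above.
    df = sqlContext.createDataFrame(rows)
    df.show()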