Yes... I know... but
The cluster is not administered by me
On Wed., Aug. 9, 2017 at 13:46, Gourav Sengupta wrote:
Hi,
Just out of sheer curiosity - why are you using Spark 1.6? Since then, Spark
has made significant advancements and improvements; why not take advantage
of that?
Regards,
Gourav
On Wed, Aug 9, 2017 at 10:41 AM, toletum wrote:
Thanks Matteo
I fixed it
Regards,
JCS
On Wed., Aug. 9, 2017 at 11:22, Matteo Cossu wrote:
Hello,
try to use these options when starting Spark:
--conf "spark.driver.userClassPathFirst=true" --conf
"spark.executor.userClassPathFirst=true"
This way you can be sure that both the Spark driver and the executors will
use the classpath you define.
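For example, a full launch command could look like this (just a sketch; the
master, class, and jar names below are placeholders, not taken from this thread):

spark-submit \
  --class com.example.MyApp \
  --master yarn \
  --conf "spark.driver.userClassPathFirst=true" \
  --conf "spark.executor.userClassPathFirst=true" \
  my-app.jar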
Best Regards,
Matteo Cossu
On 5 August
Hi everybody,
I'm trying to connect Spark to Hive.
Hive uses a Derby server for the metastore_db.
$SPARK_HOME/conf/hive-site.xml:

<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:derby://derby:1527/metastore_db;create=true</value>
  <description>JDBC connect string for a JDBC metastore</description>
</property>
<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>org.ap
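For reference, a minimal Spark 1.6 sketch for checking the metastore connection,
assuming Spark was built with Hive support, hive-site.xml is picked up from
$SPARK_HOME/conf, and the Derby network server at derby:1527 is reachable
(the application name below is a placeholder):

// Minimal sketch for Spark 1.6; assumes hive-site.xml is on the classpath
// and Spark was built with Hive support.
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

val conf = new SparkConf().setAppName("HiveMetastoreCheck") // placeholder name
val sc = new SparkContext(conf)
val hiveContext = new HiveContext(sc)

// Lists the databases registered in the Derby-backed metastore.
hiveContext.sql("SHOW DATABASES").show()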