Could you run spark-shell from the $SPARK_HOME directory? You can try running
your command from $SPARK_HOME, or point to README.md with its full path.
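A minimal sketch of the second suggestion, building the full path to README.md before passing it to spark-shell (the /opt/spark fallback is only a placeholder; adjust it to your installation):

```shell
# Build the absolute path to README.md under SPARK_HOME.
# /opt/spark is a placeholder default, not a confirmed install location.
README_PATH="${SPARK_HOME:-/opt/spark}/README.md"
echo "$README_PATH"
# Then, inside spark-shell, load the file by full path instead of a relative one:
#   sc.textFile("file://" + "<the path printed above>")
```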
Peter Zhang
--
Google
Sent with Airmail
On January 19, 2016 at 11:26:14, Oleg Ruchovets (oruchov...@gmail.com) wrote:
It looks like Spark is not working correctly.
Thanks,
I will try.
Peter
On January 19, 2016 at 12:44:46, Jeff Zhang (zjf...@gmail.com) wrote:
Please make sure you export the environment variable HADOOP_CONF_DIR, pointing
to the directory that contains core-site.xml.
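For example, a minimal sketch of the export, assuming the Hadoop configuration lives in /etc/hadoop/conf (a common but not universal location; adjust to your install):

```shell
# Point Spark/Hive at the directory holding core-site.xml.
# /etc/hadoop/conf is an assumed path; use wherever your core-site.xml actually lives.
export HADOOP_CONF_DIR=/etc/hadoop/conf
```

With core-site.xml visible, fs.defaultFS is picked up and unqualified paths resolve against HDFS rather than the local file: scheme.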
On Mon, Jan 18, 2016 at 8:23 PM, Peter Zhang wrote:
Hi all,
file:/user/hive/warehouse/src
16/01/19 12:11:51 ERROR DDLTask: org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:file:/user/hive/warehouse/src is not a directory or unable to create one)
How can I use HDFS instead of the local file system (file:)?
Which parameter should I set?
Thanks a lot.
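One way to check which filesystem is being resolved, and to make sure the warehouse directory exists on HDFS, is sketched below. This assumes the hdfs CLI is on PATH and HADOOP_CONF_DIR already points at your cluster configuration; it is illustrative, not a confirmed fix:

```shell
# Should print an hdfs:// URI; if it prints file:///, core-site.xml is not being read.
hdfs getconf -confKey fs.defaultFS

# Create the default Hive warehouse directory on HDFS if it is missing.
hdfs dfs -mkdir -p /user/hive/warehouse
```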
Peter Zhang
Hi Eran,
You are missing an import. Adding
import org.apache.spark.sql.types._
should work. Please try it.
Peter Zhang
On December 20, 2015 at 21:43:42, Eran Witkon (eranwit...@gmail.com) wrote:
Hi,
I am using spark-shell version 1.5.2.
scala