Re: need someone to help clear some questions.

2014-03-07 Thread Mayur Rustagi
groups.google.com/forum/#!forum/shark-users

Mayur Rustagi
Ph: +1 (760) 203 3257
http://www.sigmoidanalytics.com
@mayur_rustagi

On Thu, Mar 6, 2014 at 8:08 PM, qingyang li wrote:
> Hi, Yana, do you know if there is a mailing list for shark like spark's?

Re: need someone to help clear some questions.

2014-03-06 Thread qingyang li
Hi, Yana, do you know if there is a mailing list for shark like spark's?

2014-03-06 23:39 GMT+08:00 Yana Kadiyska:
> Hi qingyang,
> 1. You do not need to install shark on every node.
> 2. Not really sure... it's just a warning, so I'd see if it works despite it.
> 3. You need to provide the actual

Re: need someone to help clear some questions.

2014-03-06 Thread qingyang li
Many thanks for the guidance.

2014-03-06 23:39 GMT+08:00 Yana Kadiyska:
> Hi qingyang,
> 1. You do not need to install shark on every node.
> 2. Not really sure... it's just a warning, so I'd see if it works despite it.
> 3. You need to provide the actual hdfs path, e.g.
> hdfs://namenode/user2/vols.c

Re: need someone to help clear some questions.

2014-03-06 Thread Yana Kadiyska
Hi qingyang,

1. You do not need to install shark on every node.
2. Not really sure... it's just a warning, so I'd see if it works despite it.
3. You need to provide the actual hdfs path, e.g. hdfs://namenode/user2/vols.csv; see this thread: https://groups.google.com/forum/#!topic/tachyon-users/3Da4zcHK
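Point 3 can be illustrated with a small shell sketch: the error comes from passing a bare path where a fully-qualified hdfs:// URI is expected, so the fix is to prefix the path with the NameNode host. The host name "namenode" below is a placeholder, not a real address:

```shell
# Build a fully-qualified HDFS URI from a NameNode host and a relative path.
# "namenode" is a placeholder; substitute your cluster's actual NameNode host.
NAMENODE=namenode
REL_PATH=user2/vols.csv
FULL_URI="hdfs://${NAMENODE}/${REL_PATH}"
echo "$FULL_URI"   # hdfs://namenode/user2/vols.csv
```

Presumably it is this full URI, rather than /user2/vols.csv alone, that should be handed to Shark (e.g. in a table LOCATION), as discussed in the linked tachyon-users thread.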

Re: need someone to help clear some questions.

2014-03-06 Thread qingyang li
Just an addition for #3, I have this configuration in shark-env.sh:

export HADOOP_HOME=/usr/lib/hadoop
export HADOOP_CONF_DIR=/etc/hadoop/conf
export HIVE_HOME=/usr/lib/hive/
#export HIVE_CONF_DIR=/etc/hive/conf
export MASTER=spark://bigdata001:7077

2014-03-06 16:20 GMT+08:00 qingyang

need someone to help clear some questions.

2014-03-06 Thread qingyang li
Hi, Spark community,

I have set up a 3-node cluster using Spark 0.9 and Shark 0.9. My questions are:

1. Is it necessary to install shark on every node, since it is a client that uses the spark service?
2. When I run shark-withinfo, I get this warning: WARN shark.SharkEnv: Hive Hadoop shims detecte