RE: Fwd: DF creation

2016-03-18 Thread Diwakar Dhanuskodi
import sqlContext.implicits._ before using df(). Sent from Samsung Mobile.
Original message -- From: satyajit vegesna | Date: 19/03/2016 06:00 (GMT+05:30) | To: u...@spark.apache.org, dev@spark.apache.org | Cc: | Subject: Fwd: DF creation
Hi, I am trying to create separate val
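For context, a minimal sketch of the advice above, assuming a Spark 1.6-era SQLContext; the Person case class, sample rows, and local master are illustrative, not taken from the thread:

    // Minimal sketch, assuming Spark 1.6-style SQLContext; names/values are illustrative.
    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.SQLContext

    case class Person(name: String, age: Int)   // hypothetical schema for the example

    object DFCreationExample {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("df-creation").setMaster("local[*]"))
        val sqlContext = new SQLContext(sc)

        // The implicits must be in scope *before* toDF()-style conversions are used.
        import sqlContext.implicits._

        val df = sc.parallelize(Seq(Person("a", 1), Person("b", 2))).toDF()
        df.show()
      }
    }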

RE: spark on yarn wastes one box (or 1 GB on each box) for am container

2016-02-09 Thread Diwakar Dhanuskodi
Are you using YARN to run Spark jobs only? Are you configuring Spark properties as spark-submit parameters? If so, did you try with --num-executors x*53 (where x is the number of nodes), --executor-memory 1g and --driver-memory 1g? You might see YARN allocating
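For reference, a hedged sketch of how those parameters might be passed; the executor count, class name, and jar below are illustrative assumptions, not from the thread. In yarn-client mode the ApplicationMaster (AM) container the subject line refers to is allocated separately from the executors, and its size is controlled by spark.yarn.am.memory:

    # Illustrative spark-submit invocation (all values are assumptions, not from the thread).
    # The AM container on one node gets its own allocation in client mode,
    # governed by spark.yarn.am.memory (default 512m).
    spark-submit \
      --master yarn \
      --deploy-mode client \
      --num-executors 12 \
      --executor-memory 1g \
      --driver-memory 1g \
      --conf spark.yarn.am.memory=512m \
      --class com.example.MyApp \
      my-app.jar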