Re: [Spark] spark client for Hadoop 2.x

2022-04-06 Thread Morven Huang
…we use Hadoop 2.7.7 in our infrastructure currently. 1) Does Spark have a plan to publish the Spark client dependencies for Hadoop 2.x? 2) Are the new Spark clients capable of connecting to the Hadoop 2.x cluster? (According to a simple test, Spark client 3.2.1 had…

[Spark] spark client for Hadoop 2.x

2022-04-06 Thread Amin Borjian
From Spark version 3.1.0 onwards, the clients provided for Spark are built with Hadoop 3 and placed in the Maven repository. Unfortunately we use Hadoop 2.7.7 in our infrastructure currently. 1) Does Spark have a plan to publish the Spark client dependencies for Hadoop 2.x? 2) Are the new Spark clients capable of connecting to the Hadoop 2.x cluster?…
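
For readers hitting the same issue, one unofficial workaround is to exclude the Hadoop 3 shaded clients that the Spark 3.x artifacts pull in and pin the Hadoop 2 client yourself. A minimal sbt sketch, assuming the Spark 3.2.1 and Hadoop 2.7.7 versions mentioned in the thread; compatibility is not guaranteed by the Spark project:

    // build.sbt -- sketch only; the exclusions assume Spark 3.2.x, which
    // depends on the shaded hadoop-client-api/hadoop-client-runtime jars.
    libraryDependencies ++= Seq(
      ("org.apache.spark" %% "spark-core" % "3.2.1")
        .exclude("org.apache.hadoop", "hadoop-client-api")
        .exclude("org.apache.hadoop", "hadoop-client-runtime"),
      "org.apache.hadoop" % "hadoop-client" % "2.7.7"
    )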

Re: K8s-Spark client mode : Executor image not able to download application jar from driver

2019-04-28 Thread Nikhil Chinnapa
Thanks for explaining in such detail and pointing to the source code. Yes, it's helpful and cleared up a lot of confusion.

Re: K8s-Spark client mode : Executor image not able to download application jar from driver

2019-04-28 Thread Stavros Kontopoulos
Yes, here is why the initial effort didn't work, explained a bit better. As I mentioned earlier, SparkContext will add your jars/files (declared with the related conf properties) to the FileServer. If the jar is local to the container's filesystem (has the scheme local:), it will just be resolved to: file + absolute…

Re: K8s-Spark client mode : Executor image not able to download application jar from driver

2019-04-27 Thread Nikhil Chinnapa
Hi Stavros, Thanks a lot for pointing me in the right direction. I got stuck in some release, so I didn't get time earlier. The mistake was "LINUX_APP_RESOURCE": I was using "local" when it should have been "file". I reached the above thanks to your email. What I understood: Driver image: $SPARK_HOME/bin and…
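
The scheme distinction this thread settles on, restated as a hedged SparkLauncher sketch (the master URL, paths, and class name below are placeholders, not values from the thread): file:// points at a jar on the driver's machine, which executors then download from the driver's file server; local:// asserts the jar is already present at that path inside every container image.

    import org.apache.spark.launcher.SparkLauncher

    val launcher = new SparkLauncher()
      .setMaster("k8s://https://kubernetes.default:443") // placeholder
      .setDeployMode("client")
      // "file://" = jar on the driver's filesystem; executors fetch it
      // from the driver's file server. "local://" would instead mean the
      // jar is already baked into the executor image at this path.
      .setAppResource("file:///opt/app/app.jar")          // placeholder
      .setMainClass("com.example.Main")                   // placeholder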

Re: K8s-Spark client mode : Executor image not able to download application jar from driver

2019-04-19 Thread Stavros Kontopoulos
Hi Nikhil, the application jar by default is added to spark.jars, so it is fetched by executors when tasks are launched (behind the scenes, SparkContext will…

K8s-Spark client mode : Executor image not able to download application jar from driver

2019-04-16 Thread Nikhil Chinnapa
Environment: Spark 2.4.0, Kubernetes 1.14. Query: Does the application jar need to be part of both the driver and executor images? Invocation point (from Java code): sparkLaunch = new SparkLauncher().setMaster(LINUX_MASTER)…

Re: Interest in adding ability to request GPU's to the spark client?

2018-07-23 Thread Susan X. Huynh
…On Wed, May 16, 2018 at 2:58, Daniel Galvez wrote: "Hi all, is anyone here interested in adding the ability to request GPUs to Spark's client (i.e., spark-submit)? As of now, YARN 3.0's re…"

Re: Interest in adding ability to request GPU's to the spark client?

2018-07-12 Thread Mich Talebzadeh
…Spark's client (i.e., spark-submit)? As of now, YARN 3.0's resource manager server has the ability to schedule GPUs as resources via cgroups, but the Spark client lacks the ability to request these. The ability to guarantee GPU resources would…

Re: Interest in adding ability to request GPU's to the spark client?

2018-07-12 Thread Maximiliano Felice
…Spark's client (i.e., spark-submit)? As of now, YARN 3.0's resource manager server has the ability to schedule GPUs as resources via cgroups, but the Spark client lacks the ability to request these. The ability to guarantee GPU resources would be practically useful for my…

Interest in adding ability to request GPU's to the spark client?

2018-05-15 Thread Daniel Galvez
Hi all, is anyone here interested in adding the ability to request GPUs to Spark's client (i.e., spark-submit)? As of now, YARN 3.0's resource manager server has the ability to schedule GPUs as resources via cgroups, but the Spark client lacks the ability to request these. The ability to…
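
As a historical footnote, Spark 3.0 later added exactly this kind of request via the generic resource scheduling properties (SPARK-24615). A minimal sketch, assuming a cluster whose resource manager has GPU scheduling enabled; the amounts and discovery script path are assumptions:

    import org.apache.spark.{SparkConf, SparkContext}

    // Request one GPU per executor and per task (Spark 3.0+ syntax).
    val conf = new SparkConf()
      .setAppName("gpu-job")
      .set("spark.executor.resource.gpu.amount", "1")
      .set("spark.task.resource.gpu.amount", "1")
      // Script that tells Spark which GPU addresses an executor owns.
      .set("spark.executor.resource.gpu.discoveryScript", "/opt/spark/getGpus.sh")
    val sc = new SparkContext(conf)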

Can I have two different receivers for my Spark client program?

2016-11-30 Thread kant kodali
Hi all, I am wondering if it makes sense to have two receivers inside my Spark client program? The use case is as follows: 1) We have to support a feed from Kafka, so this will be direct receiver #1. We need to perform batch inserts from the Kafka feed to Cassandra. 2) A gRPC receiver where we…
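
Running two sources in one StreamingContext is a supported pattern. A minimal sketch with a hypothetical custom gRPC receiver (the class, host, and port are illustrative; the Kafka side would typically use the direct stream, which is not a receiver, so only the gRPC side needs one):

    import org.apache.spark.SparkConf
    import org.apache.spark.storage.StorageLevel
    import org.apache.spark.streaming.{Seconds, StreamingContext}
    import org.apache.spark.streaming.receiver.Receiver

    // Hypothetical receiver pushing records from a gRPC stream to Spark.
    class GrpcReceiver(host: String, port: Int)
        extends Receiver[String](StorageLevel.MEMORY_AND_DISK_2) {
      def onStart(): Unit = new Thread("grpc-receiver") {
        override def run(): Unit =
          while (!isStopped()) {
            // read one message from the gRPC stream, then hand it to Spark:
            // store(message)
          }
      }.start()
      def onStop(): Unit = () // close the gRPC channel here
    }

    val ssc = new StreamingContext(new SparkConf().setAppName("two-sources"), Seconds(5))
    // Source #1 would be KafkaUtils.createDirectStream(...); source #2:
    val grpcStream = ssc.receiverStream(new GrpcReceiver("localhost", 50051))
    grpcStream.foreachRDD(rdd => /* e.g. write the batch to Cassandra */ ())
    ssc.start()
    ssc.awaitTermination()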

Re: How to Disable or do minimal Logging for apache spark client Driver program?

2016-10-07 Thread kant kodali
…standalone, YARN or Mesos) just from the command line; however, that will require using Spark's launcher interface and bundling your application in a jar. On Thu, Oct 6, 2016 at 9:27 AM, kant kodali wrote: "How to disable or do minimal logging f…"

Re: How to Disable or do minimal Logging for apache spark client Driver program?

2016-10-07 Thread Jakob Odersky
…referring to Spark local mode? It is possible to also run Spark applications in "distributed mode" (i.e. standalone, YARN or Mesos) just from the command line; however, that will require using Spark's launcher interface and bundling your application in a jar.

Re: How to Disable or do minimal Logging for apache spark client Driver program?

2016-10-06 Thread kant kodali
…at 9:27 AM, kant kodali wrote: "How to disable or do minimal logging for the Apache Spark client driver program? I couldn't find this information in the docs. By driver program I mean the Java program where I initialize the Spark context. It produces a lot of INFO messages but I wou…"

Re: How to Disable or do minimal Logging for apache spark client Driver program?

2016-10-06 Thread Mahendra Kutare
de" (i.e. standalone, yarn or > mesos) just from the command line, however that will require > using spark's launcher interface and bundling your application in > a jar. > > On Thu, Oct 6, 2016 at 9:27 AM, kant kodali wrote: > > How to Disable or do minimal Logging f

Re: How to Disable or do minimal Logging for apache spark client Driver program?

2016-10-06 Thread Jakob Odersky
…applications in "distributed mode" (i.e. standalone, YARN or Mesos) just from the command line; however, that will require using Spark's launcher interface and bundling your application in a jar. On Thu, Oct 6, 2016 at 9:27 AM, kant kodali wrote: "How to disable or do minimal logging…"

How to Disable or do minimal Logging for apache spark client Driver program?

2016-10-06 Thread kant kodali
How to disable or do minimal logging for the Apache Spark client driver program? I couldn't find this information in the docs. By driver program I mean the Java program where I initialize the Spark context. It produces a lot of INFO messages, but I would like to know only when there is an error or an exception…
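
For reference, the two usual fixes, sketched below: raise the log4j threshold for the noisy packages before the context starts, or use SparkContext.setLogLevel (available since Spark 1.4):

    import org.apache.log4j.{Level, Logger}
    import org.apache.spark.{SparkConf, SparkContext}

    // Silence Spark/Hadoop INFO chatter before the context is created...
    Logger.getLogger("org.apache.spark").setLevel(Level.ERROR)
    Logger.getLogger("org.apache.hadoop").setLevel(Level.ERROR)

    val sc = new SparkContext(new SparkConf().setAppName("quiet-driver"))
    // ...or flip it at runtime after creation:
    sc.setLogLevel("ERROR")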

Re: How to set up a Spark Client node?

2015-06-15 Thread ayan guha
…workers in the UI, which typically starts on the master URL at port 8080. Once you do that, you follow Akhil's instructions above to get a SQLContext, set the master property properly, and run your app. HTH. On Mon, Jun 15, 2015 at 7:02 PM, Akhil Das wrote: "I'm assuming by spark-client you mean the…"

Re: How to set up a Spark Client node?

2015-06-15 Thread Akhil Das
I'm assuming by spark-client you mean the Spark driver program. In that case you can pick any machine (say Node 7), create your driver program on it, and use spark-submit to submit it to the cluster; or, if you create the SparkContext within your driver program (specifying all the properties)…
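
A minimal sketch of the second option (creating the SparkContext directly on the client node); the master URL and jar path are placeholders, not values from the thread:

    import org.apache.spark.{SparkConf, SparkContext}

    // Runs on the client machine (Node 7 above), which becomes the driver.
    val conf = new SparkConf()
      .setAppName("client-node-app")
      .setMaster("spark://master-host:7077") // placeholder master URL
      .setJars(Seq("/path/to/app.jar"))      // placeholder jar shipped to executors
    val sc = new SparkContext(conf)
    println(sc.parallelize(1 to 100).sum())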

How to set up a Spark Client node?

2015-06-13 Thread MrAsanjar .
Are there any instructions on how to set up a Spark client in cluster mode? I am not sure if I am doing it right. Thanks in advance.

Re: Spark Client

2015-06-03 Thread pavan kumar Kolamuri
…// Do some computations
sc.parallelize(1 to 1).take(10).foreach(println)
// Now return the exit status
System.exit(exitStatus) // some number
Now, make your workflow manager trigger *sbt run* on the project ins…

Re: Spark Client

2015-06-03 Thread Oleg Zhurakousky
…Jun 3, 2015 at 2:18 PM, pavan kumar Kolamuri <pavan.kolam...@gmail.com> wrote: "Hi Akhil, sorry, I may not be conveying the question properly. Actually we are looking to launch a Spark job from a long-running workflow manager, which invokes…"

Re: Spark Client

2015-06-03 Thread Richard Marscher
…wrote: "Hi Akhil, sorry, I may not be conveying the question properly. Actually we are looking to launch a Spark job from a long-running workflow manager, which invokes the Spark client via SparkSubmit. Unfortunately the client, upon successful complet…"

Re: Spark Client

2015-06-03 Thread Akhil Das
…may not be conveying the question properly. Actually we are looking to launch a Spark job from a long-running workflow manager, which invokes the Spark client via SparkSubmit. Unfortunately the client, upon successful completion of the application, exits with a System.exit(0) or Syst…

Re: Spark Client

2015-06-03 Thread pavan kumar Kolamuri
Hi Akhil, sorry, I may not be conveying the question properly. Actually we are looking to launch a Spark job from a long-running workflow manager, which invokes the Spark client via SparkSubmit. Unfortunately the client, upon successful completion of the application, exits with a System.exit(0) or…

Re: Spark Client

2015-06-03 Thread Akhil Das
…I don't want it to exit with System.exit. Is there any other Spark client that is API-friendly, other than SparkSubmit, which doesn't exit with System.exit? Please correct me if I am missing something. Thanks in advance. -- Regards, Pavan Kumar Kolamuri

Spark Client

2015-06-02 Thread pavan kumar Kolamuri
Hi guys, I am new to Spark. I am using SparkSubmit to submit Spark jobs. But for my use case I don't want it to exit with System.exit. Is there any other Spark client that is API-friendly, other than SparkSubmit, which doesn't exit with System.exit? Please correct me if I…
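
For later readers: Spark 1.6 (released after this thread) added an in-process alternative, SparkLauncher.startApplication, which returns a handle you can poll instead of depending on an exit code. A sketch with placeholder paths and class names:

    import org.apache.spark.launcher.SparkLauncher

    val handle = new SparkLauncher()
      .setAppResource("/path/to/app.jar") // placeholder
      .setMainClass("com.example.Main")   // placeholder
      .setMaster("local[*]")
      .startApplication()

    // Wait for a terminal state instead of relying on System.exit.
    while (!handle.getState.isFinal) Thread.sleep(500)
    println(s"Job finished in state: ${handle.getState}")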

Re: How to send user variables from Spark client to custom InputFormat or RecordReader ?

2015-02-22 Thread hnahak
Instead of setting it in SparkConf, set it with SparkContext.hadoopConfiguration.set(key, value) and extract the same key from the JobContext. --Harihar
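
A minimal sketch of this approach, reusing the "developer" key from the original question; the RecordReader fragment is shown as comments since the surrounding class is user code:

    import org.apache.spark.{SparkConf, SparkContext}

    val sc = new SparkContext(new SparkConf().setAppName("conf-passing"))
    // Put the variable into the Hadoop configuration, not SparkConf:
    sc.hadoopConfiguration.set("developer", "MyName")

    // Inside the custom RecordReader (a sketch):
    // override def initialize(split: InputSplit, ctx: TaskAttemptContext): Unit = {
    //   val developer = ctx.getConfiguration.get("developer")
    //   ...
    // }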

Re: How to send user variables from Spark client to custom InputFormat or RecordReader ?

2015-02-22 Thread haihar nahak
…Spark's HadoopRDD and create a JobConf (with whatever variables you want), and then grab them out of the JobConf in your RecordReader. On Sun, Feb 22, 2015 at 4:28 PM, hnahak wrote: "Hi, I have written a custom InputFormat and RecordReader for Spark, I need…"

Re: How to send user variables from Spark client to custom InputFormat or RecordReader ?

2015-02-22 Thread Tom Vacek
…I have written a custom InputFormat and RecordReader for Spark, and I need to use user variables from the Spark client program. I added them in SparkConf: val sparkConf = new SparkConf().setAppName(args(0)).set("developer", "MyName") *and in Input…*

How to send user variables from Spark client to custom InputFormat or RecordReader ?

2015-02-22 Thread hnahak
Hi, I have written a custom InputFormat and RecordReader for Spark, and I need to use user variables from the Spark client program. I added them in SparkConf: val sparkConf = new SparkConf().setAppName(args(0)).set("developer", "MyName") *and in InputFormat class*…

Re: Hadoop 2.X Spark Client Jar 0.9.0 problem

2014-04-04 Thread Erik Freed
…Thanks, Rahul Singhal. From: Erik Freed, Reply-To: user@spark.apache.org, Date: Friday 4 April 2014 7:58 PM, To: user@spark.apache.org, Subject: Hadoop 2.X Spark Client Jar 0.9.0 problem. "Hi All, I am not sure if this i…"

Re: Hadoop 2.X Spark Client Jar 0.9.0 problem

2014-04-04 Thread Amit Tewari
I believe you have to set the following: SPARK_HADOOP_VERSION=2.2.0 (or whatever your version is) and SPARK_YARN=true, then type sbt/sbt assembly. If you are using Maven to compile: mvn -Pyarn -Dhadoop.version=2.2.0 -Dyarn.version=2.2.0 -DskipTests clean package. Hope this helps. -A. On Fri, Apr 4, 2014 a…

Re: Hadoop 2.X Spark Client Jar 0.9.0 problem

2014-04-04 Thread Rahul Singhal
…"user@spark.apache.org" Date: Friday 4 April 2014 7:58 PM To: "user@spark.apache.org" Subject: Hadoop 2.X Spark Client Jar 0.9.0 problem. "Hi All, I am not sure if this is a 0.9.0 problem to be fixed…"

Hadoop 2.X Spark Client Jar 0.9.0 problem

2014-04-04 Thread Erik Freed
Hi All, I am not sure if this is a 0.9.0 problem to be fixed in 0.9.1, so perhaps already being addressed, but I am having a devil of a time with a Spark 0.9.0 client jar for Hadoop 2.X. If I go to the site and download: - Download binaries for Hadoop 2 (HDP2, CDH5): find an Apache mirror…