I have a Python script that is used to submit Spark jobs using the
spark-submit tool. I want to execute the command and write the output both
to STDOUT and a logfile in real time. I'm using Python 2.7 on an Ubuntu
server.
This is what I have so far in my SubmitJob.py script:
#!/usr/bin/python
# Sub
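
The script itself is cut off above. As a minimal sketch of one way to do this in Python 2.7 (not the original SubmitJob.py): run spark-submit through subprocess.Popen and copy each output line to both STDOUT and a logfile as it arrives. The spark-submit arguments and the logfile name below are placeholders borrowed from elsewhere in this thread.

#!/usr/bin/python
# Minimal sketch: stream spark-submit output to STDOUT and a logfile in real time.
# The command-line arguments and logfile path below are placeholders.
import subprocess
import sys

cmd = ["spark-submit", "--master", "spark://10.1.40.18:7077",
       "--class", "com.test.Ping", "spark-jobs.jar"]

log = open("submit_job.log", "a", 0)   # unbuffered so lines hit disk immediately
proc = subprocess.Popen(cmd, stdout=subprocess.PIPE,
                        stderr=subprocess.STDOUT, bufsize=1)

# Read line by line as the job runs and write to both destinations.
for line in iter(proc.stdout.readline, ""):
    sys.stdout.write(line)
    sys.stdout.flush()
    log.write(line)

proc.stdout.close()
exit_code = proc.wait()
log.close()
sys.exit(exit_code)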
No, we are using standard Spark with DataStax Cassandra. I'm able to see some
JSON when I hit http://10.1.40.16:7080/json/v1/applications
but I get the following error when I hit
http://10.1.40.16:7080/api/v1/applications
HTTP ERROR 503
Problem accessing /api/v1/applications. Reason:
Service Unavailable
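
For what it's worth, a small Python 2.7 sketch of querying Spark's monitoring REST API is below. Per the Spark monitoring documentation, the /api/v1 endpoints are served by a running application's UI (port 4040 by default) and by the history server (port 18080 by default), so they may simply not be exposed on the master web UI port used above; the host and port in the sketch are assumptions, and this is not a confirmed explanation of the 503.

# Sketch only: query Spark's monitoring REST API with urllib2 (Python 2.7).
# Host/port are assumptions; adjust to your cluster.
import json
import urllib2

# A running application's UI serves /api/v1 on port 4040 by default;
# the history server serves it on port 18080 by default.
url = "http://10.1.40.16:4040/api/v1/applications"

response = urllib2.urlopen(url, timeout=10)
apps = json.loads(response.read())
for app in apps:
    print app["id"], app["name"]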
Hi Prateek,
Were you able to figure out why this is happening? I'm seeing the same error on
my Spark standalone cluster.
Any pointers anyone?
On Fri, Dec 11, 2015 at 2:05 PM, prateek arora wrote:
>
>
> Hi
>
> I am trying to access Spark using the REST API but got the error below:
>
> Command :
>
> curl ht
I tried adding a shutdown hook to my code but it didn't help. Still the same issue.
On Fri, Nov 20, 2015 at 7:08 PM, Ted Yu wrote:
> Which Spark release are you using ?
>
> Can you pastebin the stack trace of the process running on your machine ?
>
> Thanks
>
> On Nov 20
l was shutdown when spark stopped
>
> I hope this helps.
>
> Stephane
>
>
> On Fri, Nov 20, 2015 at 7:46 PM, Vikram Kone wrote:
>
>> Hi,
>> I'm seeing a strange problem. I have a spark cluster in standalone mode.
>> I submit spark jobs fr
Spark 1.4.1
On Friday, November 20, 2015, Ted Yu wrote:
> Which Spark release are you using ?
>
> Can you pastebin the stack trace of the process running on your machine ?
>
> Thanks
>
> On Nov 20, 2015, at 6:46 PM, Vikram Kone wrote:
>
> Hi,
> I'm se
Hi,
I'm seeing a strange problem. I have a Spark cluster in standalone mode. I
submit Spark jobs from a remote node as follows from the terminal:
spark-submit --master spark://10.1.40.18:7077 --class com.test.Ping
spark-jobs.jar
While the app is running, when I press Ctrl-C on the console terminal
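
The tail of that message is cut off above. Since a shutdown hook was suggested (and tried) earlier in this listing, here is a minimal sketch of what that idea looks like in a PySpark driver: trap SIGINT/SIGTERM and stop the SparkContext before exiting. The job in this thread is a Scala jar (com.test.Ping), so this is only an analogue, not the code under discussion, and per the thread it is not a confirmed fix.

# Sketch of the shutdown-hook idea in PySpark (the job above is a Scala jar,
# so this is an analogue, not the code from the thread).
import signal
import sys

from pyspark import SparkConf, SparkContext

conf = SparkConf().setAppName("Ping").setMaster("spark://10.1.40.18:7077")
sc = SparkContext(conf=conf)

def shutdown(signum, frame):
    # Stop the SparkContext so the master deregisters the application,
    # then exit the driver process.
    sc.stop()
    sys.exit(1)

# Trap Ctrl-C (SIGINT) and kill (SIGTERM) in the driver.
signal.signal(signal.SIGINT, shutdown)
signal.signal(signal.SIGTERM, shutdown)

# ... job logic would go here ...
print sc.parallelize(range(10)).count()

sc.stop()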
Hi Feng,
Does Airflow allow remote submission of Spark jobs via spark-submit?
On Wed, Nov 18, 2015 at 6:01 PM, Fengdong Yu wrote:
> Hi,
>
> we use 'Airflow' as our job workflow scheduler.
>
>
>
>
> On Nov 19, 2015, at 9:47 AM, Vikram Kone wrote:
>
> Hi
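
Fengdong's reply is cut off above, but since Airflow can run arbitrary shell commands, one common pattern is a BashOperator task that calls spark-submit against the remote standalone master. A minimal sketch follows; it assumes spark-submit is on the Airflow worker's PATH, reuses the master URL, class and jar from elsewhere in this thread as placeholders, and the exact import paths may differ across Airflow versions.

# Sketch: scheduling a remote spark-submit from Airflow with a BashOperator.
# Assumes spark-submit is on the Airflow worker's PATH; master URL, jar and
# class are placeholders taken from elsewhere in this thread.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash_operator import BashOperator

dag = DAG(
    dag_id="spark_ping",
    start_date=datetime(2015, 11, 19),
    schedule_interval="@daily",
)

submit = BashOperator(
    task_id="spark_submit_ping",
    bash_command=(
        "spark-submit "
        "--master spark://10.1.40.18:7077 "
        "--class com.test.Ping "
        "/path/to/spark-jobs.jar"
    ),
    dag=dag,
)

The DAG file just needs to live in the Airflow dags folder; the Airflow scheduler then handles the recurring submission.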
>> On Fri, Aug 7, 2015 at 17:51, Hien Luu wrote:
>>
>>> Looks like Oozie can satisfy most of your requirements.
>>>
>>>
>>>
>>> On Fri, Aug 7, 2015 at 8:43 AM, Vikram Kone wrote:
>>>
>>>> Hi,
>>>> I
95. You can leverage
> the SLA feature to kill a job if it ran longer than expected.
>
> BTW, we just solved the scalability issue by supporting multiple
> executors. Within a week or two, the code for that should be merged in the
> main trunk.
>
> Hien
>
> On Tue, Oct 6, 2015
Does Azkaban support scheduling long-running jobs like Spark Streaming jobs?
Will Azkaban kill a job if it's running for a long time?
On Friday, August 7, 2015, Vikram Kone wrote:
> Hien,
> Is Azkaban being phased out at LinkedIn as rumored? If so, what's LinkedIn
> going
We are using Monit to kick off Spark Streaming jobs and it seems to work fine.
On Monday, September 28, 2015, Chen Song wrote:
> I am also interested specifically in monitoring and alerting on Spark
> streaming jobs. It will be helpful to get some general guidelines or advice
> on this, from people w
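
As one concrete shape for the monitoring/alerting piece, below is a sketch of a small Python check that Monit or cron could run: it polls the standalone master's JSON status endpoint and exits non-zero if the application is no longer listed as active. The endpoint path and the activeapps/name field names are assumptions that may differ by Spark version and setup (the /json/v1/applications URL mentioned earlier in this listing is another candidate), and the application name is a placeholder.

#!/usr/bin/python
# Sketch: liveness check for a long-running (e.g. streaming) app, suitable for
# Monit or cron. Polls the standalone master's /json endpoint and exits non-zero
# if the app is not listed as active. Host/port and the "activeapps"/"name"
# field names are assumptions and may differ across Spark versions.
import json
import sys
import urllib2

MASTER_JSON = "http://10.1.40.16:7080/json"
APP_NAME = "my-streaming-app"   # placeholder application name

try:
    status = json.loads(urllib2.urlopen(MASTER_JSON, timeout=10).read())
except Exception as e:
    print "could not reach master:", e
    sys.exit(2)

active = [a.get("name") for a in status.get("activeapps", [])]
if APP_NAME in active:
    print "%s is running" % APP_NAME
    sys.exit(0)
else:
    print "%s is NOT running" % APP_NAME
    sys.exit(1)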
Hi,
We are planning to install Spark in standalone mode on a Cassandra cluster.
The problem is, since Cassandra has a no-SPOF architecture, i.e. any node can
become the master for the cluster, it creates a problem for the Spark master
since it's not a peer-to-peer architecture where any node can become the
nts.
This became a bit of a brain dump on the topic. I hope that it is
useful. Don't hesitate to get back if I can help.
Regards,
Lars Albertsson
On Fri, Aug 7, 2015 at 5:43 PM, Vikram Kone wrote:
> Hi,
> I'm looking for open source workflow tools/engines that allow us to sche
nd it way better than Oozie. I haven't tried Chronos, and it seemed
>>>> quite involved to set up. Haven't tried Luigi either.
>>>>
>>>> Spark job server is good but as you say lacks some stuff like
>>>> scheduling and DAG type workflows (independent of
workflows (independent of spark-defined job flows).
>>
>>
>> On Fri, Aug 7, 2015 at 7:00 PM, Jörn Franke wrote:
>>
>>> Check also Falcon in combination with Oozie.
>>>
>>> On Fri, Aug 7, 2015 at 17:51, Hien Luu wrote:
ore than
full feature set
On Friday, August 7, 2015, Hien Luu wrote:
> Looks like Oozie can satisfy most of your requirements.
>
>
>
> On Fri, Aug 7, 2015 at 8:43 AM, Vikram Kone wrote:
>
>> Hi,
>> I'm looking for open source workflow tools/engines that al
Hi,
I'm looking for open source workflow tools/engines that allow us to
schedule Spark jobs on a DataStax Cassandra cluster. Since there are tons
of alternatives out there like Oozie, Azkaban, Luigi, Chronos etc., I
wanted to check with people here to see what they are using today.
Some of the r
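
Since Luigi comes up a few times in this thread, here is a minimal sketch of scheduling a Spark job from it: a plain luigi.Task whose run() shells out to spark-submit and writes a marker file on success. The jar, class and master URL are placeholders reused from earlier messages; luigi.contrib also ships Spark-specific task classes, but the sketch below sticks to the core API.

# Sketch: scheduling a Spark job from Luigi with a plain Task that shells out
# to spark-submit. Jar, class and master URL are placeholders reused from
# earlier messages in this thread.
import subprocess

import luigi


class SubmitSparkJob(luigi.Task):
    master = luigi.Parameter(default="spark://10.1.40.18:7077")
    app_jar = luigi.Parameter(default="spark-jobs.jar")
    main_class = luigi.Parameter(default="com.test.Ping")

    def output(self):
        # Marker file so Luigi knows the task already ran.
        return luigi.LocalTarget("spark_job_done.marker")

    def run(self):
        subprocess.check_call([
            "spark-submit",
            "--master", self.master,
            "--class", self.main_class,
            self.app_jar,
        ])
        with self.output().open("w") as marker:
            marker.write("done\n")


if __name__ == "__main__":
    luigi.run()

Saved as, say, schedule_spark_job.py (a hypothetical name), it could be triggered with: python schedule_spark_job.py SubmitSparkJob --local-scheduler, with cron or the central Luigi scheduler handling the actual scheduling.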