Have a look at creating a scheduler allocation file with fair scheduling.

<?xml version="1.0"?>
<allocations>
    <pool name="default">
        <schedulingMode>FAIR</schedulingMode>
        <weight>1</weight>
        <minShare>2</minShare>
    </pool>
    <pool name="my_pool">
        <schedulingMode>FAIR</schedulingMode>
        <weight>1</weight>
        <minShare>2</minShare>
    </pool>
</allocations>

Set the following:

val settingsMap = Map(
      ("spark.scheduler.allocation.file", schedulerAllocationFile),
      ("spark.scheduler.mode", "FAIR"),
      ("spark.streaming.concurrentJobs", "5"))
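As a rough sketch, those settings can be applied on the SparkConf used to build the StreamingContext, and the jobs can then be routed to a pool from the allocation file. The path, app name, batch interval, and variable names below are placeholders, not from the original code; also note spark.streaming.concurrentJobs is an undocumented/experimental setting:

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

// Placeholder path: point this at the allocation XML shown above.
val schedulerAllocationFile = "/path/to/fairscheduler.xml"

val conf = new SparkConf()
  .setAppName("parallel-batches")                                 // placeholder name
  .set("spark.scheduler.allocation.file", schedulerAllocationFile)
  .set("spark.scheduler.mode", "FAIR")
  // Experimental/undocumented: how many streaming jobs may run concurrently.
  .set("spark.streaming.concurrentJobs", "5")

val ssc = new StreamingContext(conf, Seconds(1))                  // placeholder batch interval

// Route jobs submitted from this thread to a pool; the name must match a
// <pool name="..."> entry in the allocation file.
ssc.sparkContext.setLocalProperty("spark.scheduler.pool", "my_pool")
```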


Thanks,

From: prateek arora [mailto:prateek.arora...@gmail.com]
Sent: Thursday, 10 December 2015 8:07 AM
To: Ted Yu
Cc: user
Subject: Re: can i process multiple batch in parallel in spark streaming

Hi, thanks.

In my scenario the batches are independent, so is it safe to use this in a
production environment?

Regards
Prateek

On Wed, Dec 9, 2015 at 11:39 AM, Ted Yu <yuzhih...@gmail.com> wrote:
Have you seen this thread ?

http://search-hadoop.com/m/q3RTtgSGrobJ3Je

On Wed, Dec 9, 2015 at 11:12 AM, prateek arora <prateek.arora...@gmail.com> wrote:
Hi

When I run my Spark streaming application, the following information shows on
the application's streaming UI.
I am using Spark 1.5.0.


Batch Time            Input Size   Scheduling Delay   Processing Time   Status
2015/12/09 11:00:42   107 events   -                  -                 queued
2015/12/09 11:00:41   103 events   -                  -                 queued
2015/12/09 11:00:40   107 events   -                  -                 queued
2015/12/09 11:00:39   105 events   -                  -                 queued
2015/12/09 11:00:38   109 events   -                  -                 queued
2015/12/09 11:00:37   106 events   -                  -                 queued
2015/12/09 11:00:36   109 events   -                  -                 queued
2015/12/09 11:00:35   113 events   -                  -                 queued
2015/12/09 11:00:34   109 events   -                  -                 queued
2015/12/09 11:00:33   107 events   -                  -                 queued
2015/12/09 11:00:32   99 events    42 s               -                 processing



It seems batches are pushed into a queue and processed in FIFO order. Is it
possible to have all my active batches start processing in parallel?

Regards
Prateek



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/can-i-process-multiple-batch-in-parallel-in-spark-streaming-tp25653.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org


