You could still implement your own logic, for instance with Akka actors: once some threshold is reached, the actor can launch a Spark batch job using the same SparkContext... It's only an idea, I have no real experience with it.
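Something like this rough, untested sketch, assuming lines are handed to the actor by your socket reader (the ThresholdBatcher and Record names and the threshold of 1000 are just placeholders I made up):

  import akka.actor.{Actor, ActorSystem, Props}
  import org.apache.spark.{SparkConf, SparkContext}

  // Hypothetical message type; in practice your socket reader would send these.
  case class Record(line: String)

  // Buffers incoming lines and runs a batch job on the shared SparkContext
  // once `threshold` records have accumulated.
  class ThresholdBatcher(sc: SparkContext, threshold: Int) extends Actor {
    private var buffer = Vector.empty[String]

    def receive: Receive = {
      case Record(line) =>
        buffer :+= line
        if (buffer.size >= threshold) {
          val batch = sc.parallelize(buffer)   // a fixed-size "micro-batch"
          println(s"processing batch of ${batch.count()} records")
          buffer = Vector.empty
        }
    }
  }

  object ThresholdBatcherApp {
    def main(args: Array[String]): Unit = {
      val sc = new SparkContext(
        new SparkConf().setAppName("threshold-batches").setMaster("local[*]"))
      val system = ActorSystem("batching")
      val batcher = system.actorOf(
        Props(new ThresholdBatcher(sc, threshold = 1000)), "batcher")
      // your socket reader would then do: batcher ! Record(line)
    }
  }

You lose the scheduling and fault-tolerance guarantees of Spark Streaming, but you get batches of a fixed record count instead of a fixed time window.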
On 20 Oct 2016 at 1:31 PM, "Paulo Candido" <pcandid...@gmail.com> wrote:

> In this case, don't I have any alternative for getting micro-batches of the
> same length? Another class, or some configuration? I'm using a socket.
>
> Thank you for your attention.
>
> On Thu, 20 Oct 2016 at 09:24, 王贺 (Gabriel) <gabriel.wang...@gmail.com>
> wrote:
>
>> The interval is measured in time, so the micro-batches will have the same
>> time length, not the same data size.
>>
>> Yours sincerely,
>> Gabriel (王贺)
>> Mobile: +86 18621263813
>>
>> On Thu, Oct 20, 2016 at 6:38 PM, pcandido <pcandid...@gmail.com> wrote:
>>
>> Hello folks,
>>
>> I'm using Spark Streaming. My question is simple:
>> the documentation says that micro-batches arrive at intervals, and the
>> intervals are in real time (minutes, seconds). I want micro-batches of the
>> same length. Can I configure Spark Streaming to return a micro-batch once
>> it reaches a given length?
>>
>> Thanks.
>
> --
> Paulo Cândido
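For comparison, the interval discussed above is a time duration, not a record count; a minimal Spark Streaming setup looks roughly like this (the host, port and 5-second interval are placeholders):

  import org.apache.spark.SparkConf
  import org.apache.spark.streaming.{Seconds, StreamingContext}

  object TimeIntervalExample {
    def main(args: Array[String]): Unit = {
      val conf = new SparkConf().setAppName("interval-example").setMaster("local[2]")
      // The batch interval is a duration (here 5 seconds); each micro-batch
      // contains whatever arrived during that window, so batch sizes vary.
      val ssc = new StreamingContext(conf, Seconds(5))

      val lines = ssc.socketTextStream("localhost", 9999)
      lines.foreachRDD { rdd =>
        println(s"batch with ${rdd.count()} records")
      }

      ssc.start()
      ssc.awaitTermination()
    }
  }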