Hi Team,
Is it OK to spawn multiple Spark jobs from within a main Spark job? My
main Spark job's driver, which was launched on the YARN cluster, will do
some preprocessing and, based on that, needs to launch multiple Spark
jobs on the YARN cluster. I am not sure if this is the right pattern.
Please share your thoughts.
S
> In any case, a Spark application normally consists of multiple jobs, so
> that in itself is not a problem.
>
> Thanks David.
>
> *From:* Naveen [mailto:hadoopst...@gmail.com]
> *Sent:* Wednesday, December 21, 2016 9:18 AM
> *To:* dev@spark.apache.org
> launching the jobs?
> You can use SparkLauncher in a normal app and just listen for state
> transitions
>
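> A minimal sketch of that approach (untested; the jar path, main class
> and master below are placeholders):
>
>   import org.apache.spark.launcher.{SparkAppHandle, SparkLauncher}
>
>   val handle = new SparkLauncher()
>     .setAppResource("/path/to/your-app.jar")  // placeholder jar
>     .setMainClass("com.example.YourSparkJob") // placeholder main class
>     .setMaster("yarn")
>     .setDeployMode("cluster")
>     .startApplication(new SparkAppHandle.Listener {
>       def stateChanged(h: SparkAppHandle): Unit =
>         println("state -> " + h.getState) // e.g. CONNECTED, RUNNING, FINISHED
>       def infoChanged(h: SparkAppHandle): Unit = ()
>     })
>
> The launcher app never needs its own SparkContext; it just monitors the
> child applications through the handles.
>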
> On Wed, 21 Dec 2016, 11:44 Naveen, wrote:
>
>> Hi Team,
>>
>> Thanks for your responses.
>> Let me give more details in a picture of how I am trying
Will these spawned SparkContexts get different nodes / executors from the
resource manager?
On Wed, Dec 21, 2016 at 6:43 PM, Naveen wrote:
> Hi Sebastian,
>
> Yes, for fetching the details from Hive and HBase, I would want to use
> Spark's HiveContext etc.
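> A rough sketch of what I mean (untested; the table name is made up):
>
>   val hiveContext = new org.apache.spark.sql.hive.HiveContext(sc)
>   val rows = hiveContext.sql("SELECT key, value FROM my_hive_table LIMIT 10")
>   rows.show()
>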
> However, based on your point,
Thanks Liang, Vadim and everyone for your inputs!
With this clarity, I've tried client mode for both the main and the
sub-Spark jobs. Every main Spark job and its corresponding threaded Spark
jobs show up in the YARN applications list, and the jobs execute properly.
I now need to test with data parallelism -- how can we leverage Spark's
map-reduce model to fit distributed training? The model of execution here
is more iterative in nature.
Please let me know.
Thanks, Naveen
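For the iterative-training question above, a rough, untested sketch of the
usual pattern: the driver loops, and each iteration runs one distributed
map-reduce pass over cached data. The input path, CSV parsing, iteration
count and step size below are all made up for illustration:

import org.apache.spark.{SparkConf, SparkContext}

object IterativeTrainingSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("iterative-training"))

    // Hypothetical input: CSV lines of features followed by a label.
    val points = sc.textFile("hdfs:///tmp/points.csv").map { line =>
      val cols = line.split(',').map(_.toDouble)
      (cols.init, cols.last) // (features, label)
    }.cache() // cache so each iteration reads from memory, not HDFS

    val dims = points.first()._1.length
    var w = Array.fill(dims)(0.0)

    for (_ <- 1 to 20) { // fixed iteration count, for illustration only
      val wb = sc.broadcast(w) // ship the current weights to the executors
      // One distributed pass: map computes per-point gradients, reduce sums them.
      val grad = points.map { case (x, y) =>
        val pred = x.zip(wb.value).map { case (xi, wi) => xi * wi }.sum
        x.map(_ * (pred - y))
      }.reduce((a, b) => a.zip(b).map { case (u, v) => u + v })
      w = w.zip(grad).map { case (wi, gi) => wi - 0.01 * gi } // gradient step
      wb.unpersist()
    }

    println("weights: " + w.mkString(", "))
    sc.stop()
  }
}

For real training you would likely use MLlib, which implements essentially
this loop for you, rather than hand-rolling it.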
On Tue, May 8, 2018 at 8:53 AM, Shivaram Venkataraman <
shiva...@eecs.berkeley.edu> wrote:
Hi All,
I am starting to use Spark. I am having trouble getting the latest code
from git.
I am using IntelliJ as suggested in the link below:
https://cwiki.apache.org/confluence/display/SPARK/Contributing+to+Spark#ContributingtoSpark-StarterTasks
The link below isn't working either:
http://sp
> The correct docs link is:
> https://spark.apache.org/docs/1.2.0/building-spark.html
>
> Where did you get that bad link from?
>
> Nick
>
> On Thu Dec 25 2014 at 12:00:53 AM Naveen Madhire wrote:
>
>> Hi All,
>>
>> I am starting to use Spark.
Thanks for the help
-Naveen
-windows-7
Now it is working fine.
Thanks all.
On Sun, Dec 28, 2014 at 6:10 PM, Naveen Madhire wrote:
> Hi All,
>
> I am getting the below error while running a simple spark application from
> Eclipse.
>
> I am using Eclipse, Maven, Java.
>
> I've spark running lo
2.0
Thanks
Naveen
> the test failure. This would have been logged earlier. You would need to
> say how you ran tests too. The tests for 1.2.0 pass for me on several
> common permutations.
> On Dec 29, 2014 3:22 AM, "Naveen Madhire" wrote:
>
>> Hi,
>>
>> I am following the below li
Hi All,
I am trying to run a sample Spark program using Scala SBT. Below is the
program:
import org.apache.spark.SparkContext

object SimpleApp {
  def main(args: Array[String]) {
    val logFile = "E:/ApacheSpark/usb/usb/spark/bin/README.md" // Should be some file on your system
    val sc = new SparkContext("local", "Simple App", "E:/ApacheSpark/") // third arg: Spark home (path truncated in the original)
    println("Lines in README: " + sc.textFile(logFile).count())
    sc.stop()
  }
}