ess:
akka.tcp://sparkExecutor@Precision-T3610:54047
Caused by: akka.remote.transport.Transport$InvalidAssociationException: The
remote system terminated the association because it is shutting down.
]
Am I missing some config? Can someone please help. Thanks in advance
--
Regards
Pavan Ku
>>
>>> Create an sbt project like:
>>>
>>> // Create your context
>>> val sconf = new
>>> SparkConf().setAppName("Sigmoid").setMaster("spark://sigmoid:7077")
>>> val sc = new SparkContext(sconf)
>>>
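For readers following the thread, a minimal build.sbt for such a project might look like the sketch below. The name and version numbers are illustrative, not from the thread; match the Spark and Scala versions to your cluster:

```scala
// build.sbt -- minimal sbt project wrapping the snippet above.
// All values here are illustrative; adjust to your environment.
name := "Sigmoid"

version := "0.1"

scalaVersion := "2.10.4"

// Use "provided" scope when submitting with spark-submit (the cluster
// supplies the Spark jars). If you intend to launch with `sbt run`
// instead, drop the "provided" qualifier so Spark is on the classpath.
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.1" % "provided"
```

With this in place, `sbt package` produces a jar you can pass to spark-submit, or (without the "provided" scope) `sbt run` starts the application directly against the master set in the SparkConf.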
one application. Create an sbt project and do sbt run?
>
> Thanks
> Best Regards
>
> On Wed, Jun 3, 2015 at 11:36 AM, pavan kumar Kolamuri <
> pavan.kolam...@gmail.com> wrote:
>
>> Hi guys, I am new to Spark. I am using spark-submit to submit Spark
>> jobs. But for m
am missing
something.
Thanks in advance
--
Regards
Pavan Kumar Kolamuri
Hi everyone,
I saved a 2 GB PDF file into MongoDB using GridFS. Now I want to process that
GridFS collection data using Java Spark MapReduce. Previously I
successfully processed normal MongoDB collections (not GridFS) with Apache
Spark using the Mongo-Hadoop connector. Now I'm unable to handle input
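One detail worth noting: GridFS stores the file's binary data as ordinary documents in a chunks collection (`fs.chunks` by default, with metadata in `fs.files`), so one approach is to read that chunks collection exactly like any other collection through the Mongo-Hadoop connector. A hedged sketch in the Scala API used earlier in the thread (the URI, database name, and the per-file count at the end are illustrative assumptions, not from the original messages):

```scala
import org.apache.hadoop.conf.Configuration
import org.apache.spark.{SparkConf, SparkContext}
import org.bson.BSONObject
import com.mongodb.hadoop.MongoInputFormat

// Sketch only: reads the GridFS chunk collection (fs.chunks) as a
// plain MongoDB collection via the Mongo-Hadoop connector.
object GridFsChunks {
  def main(args: Array[String]): Unit = {
    val sconf = new SparkConf().setAppName("GridFsChunks")
    val sc = new SparkContext(sconf)

    val hconf = new Configuration()
    // The URI below is illustrative -- point it at your own database.
    // fs.chunks documents carry files_id, n (chunk index), and data fields.
    hconf.set("mongo.input.uri", "mongodb://localhost:27017/mydb.fs.chunks")

    val chunks = sc.newAPIHadoopRDD(
      hconf,
      classOf[MongoInputFormat],
      classOf[Object],
      classOf[BSONObject])

    // Example aggregation: count how many chunks each stored file has.
    val chunksPerFile = chunks
      .map { case (_, doc) => (doc.get("files_id").toString, 1) }
      .reduceByKey(_ + _)

    chunksPerFile.collect().foreach(println)
    sc.stop()
  }
}
```

Reassembling a 2 GB file's chunks in order (sorting by `files_id` and `n`) is then a normal RDD operation, which sidesteps needing a GridFS-specific input format.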