Ok,
but what do I need for the program to run?
In Python, sparkcontext = SparkContext(conf) only works when you have
Spark installed locally.
AFAIK there is no *pyspark* package for Python that you can install by doing
pip install pyspark.
You actually need to install Spark to get it running (e.g.:
h
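
For reference, a minimal sketch of the setup in question, assuming Spark is
installed locally and its bundled pyspark is on the PYTHONPATH; the app name
and master URL below are placeholders:

    from pyspark import SparkConf, SparkContext

    # pyspark ships with the Spark distribution (under $SPARK_HOME/python),
    # so this presumes a local Spark install rather than a pip package.
    conf = SparkConf().setAppName("my-app").setMaster("local[*]")
    sc = SparkContext(conf=conf)

    # Quick smoke test: sum the numbers 0..9 on the local "cluster".
    print(sc.parallelize(range(10)).sum())
    sc.stop()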
>> ...doing an HTTP request to integrate with your Spark application.
>>
>> To get you started, this might be a useful reference:
>>
>> http://blog.michaelhamrah.com/2013/06/scala-web-apis-up-and-running-with-spray-and-akka/
>>
>> On Sun, Feb 28, 2016
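
The post above is Scala (spray/akka); a rough Python analogue of the same
idea would be a small HTTP endpoint in front of a SparkContext. A sketch
using Flask as the HTTP layer (Flask is an assumption here, not what the
post uses) with a placeholder word-count job:

    from flask import Flask, jsonify
    from pyspark import SparkConf, SparkContext

    app = Flask(__name__)
    conf = SparkConf().setAppName("spark-web-api").setMaster("local[*]")
    sc = SparkContext(conf=conf)

    @app.route("/wordcount/<path:filename>")
    def wordcount(filename):
        # Placeholder job: count whitespace-separated words in a text file.
        n = sc.textFile(filename).flatMap(lambda line: line.split()).count()
        return jsonify({"file": filename, "words": n})

    if __name__ == "__main__":
        app.run(port=8080)  # e.g. curl localhost:8080/wordcount/some/file.txt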
Hi Alex,
thanks for the link. Will check it.
Does someone know of a more streamlined approach?

On Mon, Feb 29, 2016 at 10:28 AM, Alex Dzhagriev wrote:
> Hi Moshir,
>
> I think you can use the REST API provided with Spark:
> https://github.com/apache/spark/blob/master/core/src/main/scala/org/
> ...if it suits your needs, it has a bunch of
> integrations. Thus, the source for the jobs could be Kafka, Flume or Akka
> (a streaming sketch follows below).
>
> Cheers, Alex.
>
> On Mon, Feb 29, 2016 at 2:48 PM, moshir mikael wrote:
>
>> Hi Alex,
>> thanks for the link. Will check it.
>> Does someone know of a more streamlined approach?
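
For reference, a minimal sketch of the Kafka option mentioned above, using
the Spark 1.x Python streaming API; the broker address and topic name are
placeholders, and it assumes the spark-streaming-kafka artifact is on the
classpath (e.g. via --packages):

    from pyspark import SparkConf, SparkContext
    from pyspark.streaming import StreamingContext
    from pyspark.streaming.kafka import KafkaUtils

    conf = SparkConf().setAppName("kafka-source").setMaster("local[2]")
    sc = SparkContext(conf=conf)
    ssc = StreamingContext(sc, batchDuration=5)  # 5-second micro-batches

    # Placeholder broker and topic; elements arrive as (key, value) pairs.
    stream = KafkaUtils.createDirectStream(
        ssc, ["my-topic"], {"metadata.broker.list": "localhost:9092"})
    stream.map(lambda kv: kv[1]).pprint()  # print message values per batch

    ssc.start()
    ssc.awaitTermination()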