> ... the path variables for my laptop are like:
> SPARK_HOME="C:\SPARK-1.3.0\BIN", JAVA_HOME="C:\PROGRAM FILES\JAVA\JDK1.7.0_79",
> HADOOP_HOME="D:\WINUTILS", M2_HOME="D:\MAVEN\BIN"
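
A minimal sketch of setting the same variables from the driver script itself,
before pyspark is first imported. The values are the ones quoted above, except
SPARK_HOME, which is pointed at the install root rather than its BIN subfolder
since that is what the sys.path bootstrap later in the thread expects; treat
every path here as a machine-specific assumption.

    import os

    # Values quoted from the message above -- adjust for your own machine.
    # SPARK_HOME points at the install root, not the BIN subfolder, because
    # pyspark's Python sources live under %SPARK_HOME%\python.
    os.environ["SPARK_HOME"] = r"C:\SPARK-1.3.0"
    os.environ["JAVA_HOME"] = r"C:\PROGRAM FILES\JAVA\JDK1.7.0_79"
    # On Windows, winutils.exe is expected under %HADOOP_HOME%\bin.
    os.environ["HADOOP_HOME"] = r"D:\WINUTILS"
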
> ... although I agree it's great :-).
>
> -sujit
>
> On Wed, Jul 8, 2015 at 10:36 AM, Davies Liu wrote:
> [...]

> I can't share the code, but the basic approach is covered in this blog
> post - scroll down to the section "Writing a Spark Application".
>
> https://d
>
> -sujit
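
The approach such posts describe boils down to doing the shell's setup work
yourself: put Spark's Python sources and the bundled py4j on sys.path, then
construct a SparkContext directly. A minimal sketch, assuming SPARK_HOME is
set and a Spark 1.3.x layout (the py4j version in the zip filename differs
per release, and the master and app name are just illustrative):

    import os
    import sys

    # Make pyspark importable without going through the bin/pyspark shell.
    spark_home = os.environ["SPARK_HOME"]
    sys.path.insert(0, os.path.join(spark_home, "python"))
    # The bundled py4j zip; the version in the filename varies by release.
    sys.path.insert(0, os.path.join(spark_home, "python", "lib",
                                    "py4j-0.8.2.1-src.zip"))

    from pyspark import SparkConf, SparkContext

    conf = SparkConf().setMaster("local[2]").setAppName("standalone-pyspark")
    sc = SparkContext(conf=conf)
    print(sc.parallelize(range(100)).sum())
    sc.stop()
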
> On Wed, Jul 8, 2015 at 7:46 AM, Julian wrote:
>>
>> Hey.
>>
>> Is there a resource for getting set up and running with PySpark without
>> using the PySpark shell?
>>
>> I can reverse engineer (by following the tracebacks and reading the
>> shell source) what the relevant Java imports needed are, but I would
>> assume someone has attempted this before and just published something I
>> can either follow or install? If not, I have something that pretty much
>> works and can publish it, but I'm not a heavy Spark user, so there may
>> be some things I've left out that I haven't hit because of how little
>> of pyspark I'm playing with.
>>
>> Thanks,
>> Julian
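
For what it's worth, the "relevant Java imports" turn out to be minimal: once
sys.path is arranged as above, bin/pyspark essentially just creates a
SparkContext (and a SQLContext) before dropping into the REPL. Roughly, as a
paraphrase of that era's python/pyspark/shell.py rather than a verbatim copy:

    # Roughly what the pyspark shell does at startup (paraphrased, not a
    # verbatim copy of python/pyspark/shell.py).
    from pyspark.context import SparkContext
    from pyspark.sql import SQLContext

    sc = SparkContext(appName="PySparkShell")
    sqlContext = SQLContext(sc)  # the real shell prefers HiveContext if available
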
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/PySpark-without-PySpark-tp23719.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.