While _jvm works fine at the driver end:
>>> SparkContext._active_spark_context._jvm.java.lang.Integer.valueOf("123".strip())
123
The program is trivial; I just wonder what the right way is to reach
the JVM in Python. Any help would be appreciated.
Thanks
--
Yizhi Liu
Senior Software Engineer / Data Mining
www.mvad.com, Shanghai, China
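[PySpark runs worker code in plain Python processes that have no py4j gateway, so SparkContext._active_spark_context (and therefore _jvm) is unavailable inside rdd.map. A minimal sketch of keeping the per-record work in pure Python instead, which covers this Integer.valueOf case; the app name is illustrative:

from pyspark import SparkContext

sc = SparkContext(appName="pure-python-map")  # hypothetical app name

rdd = sc.parallelize([" 123 ", "456 "])
# Python's built-in int() does the job of java.lang.Integer.valueOf,
# so the workers never need to touch the JVM.
parsed = rdd.map(lambda s: int(s.strip())).collect()
print(parsed)  # [123, 456]
]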
Hi Ted,
Thank you for the reply. The sc works at the driver, but how can I reach the
JVM in rdd.map?
2015-09-29 11:26 GMT+08:00 Ted Yu :
>>>> sc._jvm.java.lang.Integer.valueOf("12")
> 12
>
> FYI
>
> On Mon, Sep 28, 2015 at 8:08 PM, YiZhi Liu wrote:
>>
>
> To pass a value to workers, you can use a broadcast variable.
>
> Cheers
>
> On Mon, Sep 28, 2015 at 10:31 PM, YiZhi Liu wrote:
>>
>> Hi Ted,
>>
>> Thank you for the reply. The sc works at the driver, but how can I reach the
>> JVM in rdd.map?
>>
>> 2015-09-29 11:26 GMT+08:00 Ted Yu :
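[A minimal PySpark sketch of the broadcast-variable approach suggested above: resolve any JVM-side value once on the driver, where _jvm works, then broadcast the plain Python result to the workers. Values and the app name are illustrative:

from pyspark import SparkContext

sc = SparkContext(appName="broadcast-sketch")  # hypothetical app name

# _jvm is only usable on the driver; py4j hands the boxed
# java.lang.Integer back as a plain Python int.
offset = sc._jvm.java.lang.Integer.valueOf("100".strip())
offset_bc = sc.broadcast(offset)

rdd = sc.parallelize([1, 2, 3])
# Worker code reads only the broadcast value -- no _jvm access here.
print(rdd.map(lambda x: x + offset_bc.value).collect())  # [101, 102, 103]
]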
OWLQN directly? Instead,
it uses breeze.optimize.LBFGS and re-implements most of the procedures
in mllib.optimization.{LBFGS,OWLQN}.
Thank you.
Best,
--
Yizhi Liu
Senior Software Engineer / Data Mining
www.mvad.com, Shanghai, China
--
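[For reference, the RDD-based spark.mllib API in question looks like this from Python; a minimal sketch with toy data. On the Scala side this entry point goes through mllib.optimization.LBFGS, which is the code path the question contrasts with spark.ml's use of breeze:

from pyspark import SparkContext
from pyspark.mllib.classification import LogisticRegressionWithLBFGS
from pyspark.mllib.regression import LabeledPoint

sc = SparkContext(appName="mllib-lbfgs-sketch")  # hypothetical app name

# The old API trains directly on an RDD of LabeledPoint.
data = sc.parallelize([
    LabeledPoint(0.0, [0.0, 1.1]),
    LabeledPoint(1.0, [2.0, 1.0]),
    LabeledPoint(1.0, [2.0, 1.3]),
])
model = LogisticRegressionWithLBFGS.train(data, iterations=10)
print(model.weights)
]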
any problem. Thank you!
2015-10-08 1:15 GMT+08:00 Joseph Bradley :
> Hi YiZhi Liu,
>
> The spark.ml classes are part of the higher-level "Pipelines" API, which
> works with DataFrames. When creating this API, we decided to separate it
> from the old API to avoid confusion
> working code now, so it's time to
> try to refactor that code to share more.)
>
>
> Sincerely,
>
> DB Tsai
> --
> Blog: https://www.dbtsai.com
> PGP Key ID: 0xAF08DF8D
>
> On Mon, Oct 12, 2015 at 1:24 AM,
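[A minimal sketch of the DataFrame-based spark.ml API Joseph describes, written against Spark 2.x's pyspark.ml, so the imports differ slightly from the 1.5-era code discussed in this thread:

from pyspark.sql import SparkSession
from pyspark.ml.classification import LogisticRegression
from pyspark.ml.linalg import Vectors

spark = SparkSession.builder.appName("ml-pipeline-sketch").getOrCreate()

# spark.ml estimators consume DataFrames: each row holds a label
# and a feature vector, not an RDD of LabeledPoint.
train = spark.createDataFrame([
    (0.0, Vectors.dense(0.0, 1.1)),
    (1.0, Vectors.dense(2.0, 1.0)),
    (1.0, Vectors.dense(2.0, 1.3)),
], ["label", "features"])

# elasticNetParam mixes L1 and L2 regularization, which is why the
# Scala implementation reaches for breeze's LBFGS/OWLQN internally.
lr = LogisticRegression(maxIter=10, regParam=0.01, elasticNetParam=0.5)
model = lr.fit(train)
print(model.coefficients)
]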
Perhaps I introduced more conflicts, but I couldn't figure out which
one caused this failure.
Interestingly, when I ran mvn test in my project, which tests the Spark
job in local mode, everything worked fine.
So what is the right way to give user jars precedence over Spark jars?
--
Yizhi Liu
Senior Software Engineer / Data Mining
www.mvad.com, Shanghai, China
> --conf spark.driver.userClassPathFirst=true --conf
> spark.executor.userClassPathFirst=true
>
> Cheers
>
> On Mon, Oct 19, 2015 at 5:07 AM, YiZhi Liu wrote:
>>
>> I'm trying to read a Thrift object from a SequenceFile, using
>> elephant-bird's ThriftWritable. My code looks like:
>>
>> val rawData = sc.sequenceFile[B
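[A minimal sketch of setting the suggested properties programmatically instead of on the spark-submit command line; both are documented Spark configuration keys, and the app name is illustrative:

from pyspark import SparkConf, SparkContext

# Ask Spark to put the application's jars ahead of its own on the classpath.
conf = (SparkConf()
        .setAppName("user-classpath-first")  # hypothetical app name
        .set("spark.executor.userClassPathFirst", "true")
        .set("spark.driver.userClassPathFirst", "true"))  # driver-side counterpart

sc = SparkContext(conf=conf)
]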