I love working with the Python community & I've heard similar requests in
the past few months, so it's good to have a solid reason to try and add this
functionality :)
Just to be clear though, I'm not a Spark committer, so when I work on stuff,
getting it in is very much dependent on me finding a committer
Thank you so much for this! Great to see that you listen to the community.
On Thu, Nov 24, 2016 at 12:10 PM, Holden Karau wrote:
https://issues.apache.org/jira/browse/SPARK-18576
On Thu, Nov 24, 2016 at 2:05 AM, Holden Karau wrote:
Cool - thanks. I'll circle back with the JIRA number once I've got it
created - it will probably take a while before it lands in a Spark release
(since 2.1 has already branched), but better debugging information for
Python users is certainly important/useful.
On Thu, Nov 24, 2016 at 2:03 AM, Ofer Elias wrote:
Since we can't work with log4j in PySpark executors, we built our own
logging infrastructure (based on Logstash/Elastic/Kibana).
It would help to have the TID in the logs, so we can drill down accordingly.
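As a rough sketch of what that looks like outside of Spark: the standard Python `logging` module can inject a per-task identifier into every record via `LoggerAdapter`, so a downstream collector (Logstash/Elastic in our case) can index and drill down by task. The `tid` value here is a hypothetical placeholder for whatever identifier Spark ends up exposing:

```python
import logging

# Formatter that expects a "tid" attribute on every record.
formatter = logging.Formatter("%(asctime)s tid=%(tid)s %(levelname)s %(message)s")

handler = logging.StreamHandler()
handler.setFormatter(formatter)

base = logging.getLogger("executor")
base.addHandler(handler)
base.setLevel(logging.INFO)

def task_logger(tid):
    # LoggerAdapter merges the extra dict into each record,
    # so "tid" becomes a field on every log line.
    return logging.LoggerAdapter(base, {"tid": tid})

log = task_logger(42)  # 42 is a made-up task id for illustration
log.info("processing partition")
```

This is just the plumbing; the missing piece is a real task id to pass in, which is exactly what exposing TaskContext in PySpark would provide.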
On Thu, Nov 24, 2016 at 11:48 AM, Holden Karau wrote:
Hi,
The TaskContext isn't currently exposed in PySpark, but I've been meaning to
look at exposing at least some of TaskContext for parity in PySpark. Is
there a particular use case you want this for? It would help with
crafting the JIRA :)
Cheers,
Holden :)