Another question on pyspark code -- how come there is no logging at all?
Does python logging have an unreasonable overhead, or is it impossible to
configure or something?

I'm really surprised nobody has ever wanted to be able to turn on some
debug or trace logging in pyspark by just configuring a logging level.

For me, I wanted this while debugging during development -- I'd work on
some part of the code and drop in a bunch of print statements.  Then I'd
rip those out when I thought I was ready to submit a patch.  But then I'd
realize I forgot some case, then more debugging -- oh, gotta add those
print statements back in again ...
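The pattern I'm imagining is the standard one: a module-level logger whose debug calls are essentially free unless a level is configured, so they can stay in the code permanently instead of being added and ripped out. A minimal sketch (the logger name "pyspark.shuffle" and the merge function are just hypothetical examples, not actual pyspark code):

```python
import logging

log = logging.getLogger("pyspark.shuffle")  # hypothetical module name

def merge_partitions(parts):
    # This is a no-op unless DEBUG is enabled for this logger,
    # so it can be left in place in committed code.
    log.debug("merging %d partitions", len(parts))
    return [x for p in parts for x in p]

# While debugging, one configuration line turns the trace output on:
logging.basicConfig(level=logging.DEBUG)
merge_partitions([[1, 2], [3]])
```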

Does somebody just need to set up the configuration properly, or is there
a bigger reason to avoid logging in python?

thanks,
Imran
