It's not only the linter, but also autocompletion and help.
As an aside, some functions in the module are declared statically while others
are not, and the reason for the difference is not clear.
On 2019/04/17 11:35:53, Sean Owen wrote:
> I use IntelliJ and have never seen an issue parsing the pyspark
> functions... you're just saying
+1, I'm good with changing it too.
On Thu, 18 Apr 2019, 01:18 Reynold Xin, wrote:
> Are you talking about the ones that are defined in a dictionary? If yes,
> that was actually not that great in hindsight (makes it harder to read &
> change), so I'm OK changing it.
>
> E.g.
>
> _functions = {
>
Are you talking about the ones that are defined in a dictionary? If yes, that
was actually not that great in hindsight (makes it harder to read & change), so
I'm OK changing it.
E.g.
_functions = {
'lit': _lit_doc,
'col': 'Returns a :class:`Column` based on the given column name.',
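For readers following along: the truncated snippet above is the dict-driven pattern under discussion. A minimal, self-contained sketch of it (not the actual pyspark source; the function bodies here are stand-ins for the real JVM calls) might look like:

```python
def _create_function(name, doc):
    """Build a module-level function from a (name, docstring) pair."""
    def _(col):
        # In pyspark this would invoke the JVM-side function of the same
        # name; here we return a tagged tuple purely for illustration.
        return (name, col)
    _.__name__ = name
    _.__doc__ = doc
    return _

_functions = {
    'lit': 'Creates a :class:`Column` of literal value.',
    'col': 'Returns a :class:`Column` based on the given column name.',
}

# Functions are created in a loop at import time, so static tools never
# see a `def lit(...)` or `def col(...)` anywhere in the module.
for _name, _doc in _functions.items():
    globals()[_name] = _create_function(_name, _doc)
```

This is why linters and IDE autocompletion struggle: `col` and `lit` exist only after the module has been executed.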
I use IntelliJ and have never seen an issue parsing the pyspark
functions... You're just saying the linter has an optional inspection
that flags it? Just disable that?
I don't think we want to complicate the Spark code just for this. They
are declared at runtime for a reason.
On Wed, Apr 17, 2019 at
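(If the goal is just to silence the warning, and assuming pylint is the linter in question, a config fragment along these lines tells it not to type-check members of the dynamically built module; the exact option depends on which linter and inspection is actually firing.)

```ini
# .pylintrc -- suppress no-member errors for dynamically generated modules
[TYPECHECK]
ignored-modules=pyspark.sql.functions
```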
Hi,
I'm aware of various workarounds to make this work smoothly in various IDEs,
but wouldn't it be better to solve the root cause?
I've read the code and don't see anything that requires this level of dynamic
code; the translation is 99% trivial.
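To illustrate what that "99% trivial" translation could look like: each entry of the dict becomes an explicit def with its docstring inline. This is a sketch under the same assumptions as above (stand-in bodies, not the real pyspark module), but the static form is what linters and autocompletion can actually see.

```python
def col(name):
    """Returns a :class:`Column` based on the given column name."""
    return ('col', name)  # stand-in for the real JVM-side call

def lit(value):
    """Creates a :class:`Column` of literal value."""
    return ('lit', value)  # stand-in for the real JVM-side call
```

Runtime behavior is unchanged; the only difference is that the names now exist in the source text itself.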
On 2019/04/16 12:16:41, 880f0464 <880f0...@protonmai
Hi.
That's a problem with Spark as such, and it can generally be addressed on an
IDE-by-IDE basis - see for example https://stackoverflow.com/q/40163106 for some hints.
Sent with ProtonMail Secure Email.
‐‐‐ Original Message ‐‐‐
On Tuesday, April 16, 2019 2:10 PM, educhana wrote:
> Hi,
>
>