It's not only the linter, but also autocompletion and help. 

As an aside, some functions in the module are declared statically, and it is 
not clear why they differ from the runtime-declared ones. 
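To illustrate what's being discussed: the pattern below is a minimal, simplified sketch of runtime function declaration of the kind PySpark uses (the names `_create_function` and `_functions`, and the placeholder bodies, are illustrative only, not PySpark's actual implementation). Because the names are injected into the module namespace in a loop, a static linter never sees their definitions, while a statically declared function raises no such complaint.

```python
# Sketch of the runtime-declaration pattern (illustrative, not PySpark's code):
# functions are generated in a loop and injected into the module namespace,
# so static analysis tools cannot see their definitions.

def _create_function(name):
    """Build a wrapper function for the given name."""
    def _(col):
        return f"{name}({col})"  # placeholder body for illustration
    _.__name__ = name
    return _

_functions = ["lower", "upper", "sqrt"]

for _name in _functions:
    # Inject the generated function into the module's global namespace.
    globals()[_name] = _create_function(_name)

# By contrast, a statically declared function is visible to linters and IDEs:
def lit(value):
    return f"lit({value})"
```

After this runs, `lower("name")` works at runtime, but a linter checking the module statically may flag `lower` as undefined, which is exactly the IDE problem described in this thread.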

On 2019/04/17 11:35:53, Sean Owen <sro...@gmail.com> wrote: 
> I use IntelliJ and have never seen an issue parsing the pyspark
> functions... you're just saying the linter has an optional inspection
> to flag it? just disable that?
> I don't think we want to complicate the Spark code just for this. They
> are declared at runtime for a reason.
> 
> On Wed, Apr 17, 2019 at 6:27 AM educh...@gmail.com <educh...@gmail.com> wrote:
> >
> > Hi,
> >
> > I'm aware of various workarounds to make this work smoothly in various 
> > IDEs, but wouldn't it be better to solve the root cause?
> >
> > I've seen the code and don't see anything that requires this level of 
> > dynamic code; the translation is 99% trivial.
> >
> > On 2019/04/16 12:16:41, 880f0464 <880f0...@protonmail.com.INVALID> wrote:
> > > Hi.
> > >
> > > That's a problem with Spark as such and in general can be addressed on 
> > > IDE to IDE basis - see for example https://stackoverflow.com/q/40163106 
> > > for some hints.
> > >
> > >
> > > Sent with ProtonMail Secure Email.
> > >
> > > ‐‐‐‐‐‐‐ Original Message ‐‐‐‐‐‐‐
> > > On Tuesday, April 16, 2019 2:10 PM, educhana <educh...@gmail.com> wrote:
> > >
> > > > Hi,
> > > >
> > > > Currently, using pyspark.sql.functions from an IDE like PyCharm 
> > > > causes
> > > > the linters to complain due to the functions being declared at runtime.
> > > >
> > > > Would a PR fixing this be welcome? Are there any problems/difficulties 
> > > > I'm
> > > > unaware of?
> > > >
> > > >
> > > > ----------------------------------------------------------------------
> > > >
> > > > Sent from: http://apache-spark-developers-list.1001551.n3.nabble.com/
> > > >
> > > > ----------------------------------------------------------------------
> > > >
> > > > To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
> > >
> > >
> > >
> > >
> > >
> >
> >
> 
> 
> 

