Re: Re: How to get all input tables of a SPARK SQL 'select' statement

2019-01-25 Thread Ramandeep Singh Nanda
Hi,

You don't have to run the SQL statement. You can parse it; that gives you the logical plan.

val logicalPlan = ss.sessionState.sqlParser.parsePlan(sqlText = query)
println(logicalPlan.prettyJson)

[ {
  "class" : "org.apache.spark.sql.catalyst.plans.logical.Project",
  "num-children" : 1,
  "
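Building on the snippet above, a minimal sketch of extracting the input table names from the unanalyzed plan: before analysis, every source table appears as an UnresolvedRelation node, so a collect over the plan tree yields them. This assumes spark-sql on the classpath; the query string and object name are illustrative, not from the thread.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.catalyst.analysis.UnresolvedRelation

object ListInputTables {
  def main(args: Array[String]): Unit = {
    // A local session is enough; nothing is executed, only parsed.
    val ss = SparkSession.builder().master("local[1]").appName("parse-only").getOrCreate()

    val query =
      "SELECT a.id, b.name FROM db1.orders a JOIN db2.customers b ON a.id = b.id"
    val logicalPlan = ss.sessionState.sqlParser.parsePlan(query)

    // Unanalyzed plans keep source tables as UnresolvedRelation nodes,
    // so collecting them lists every table the query reads from.
    val tables = logicalPlan.collect {
      case r: UnresolvedRelation => r.tableName
    }.distinct

    tables.foreach(println)
    ss.stop()
  }
}
```

Note that this is the parsed (unresolved) plan, so no catalog lookup happens and the tables do not need to exist.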

Re: SPIP: DataFrame-based Property Graphs, Cypher Queries, and Algorithms

2019-01-25 Thread HJSC
Hi, I am a software engineer who works with historical FAA and pilot data, and I think Cypher would be a good addition to the Spark ecosystem. I support this proposal and am looking forward to giving it a try. Regards, HJ -- Sent from: http://apache-spark-user-list.1001560.n3.nabble.com/

Re: [PySpark] Revisiting PySpark type annotations

2019-01-25 Thread Reynold Xin
If we can make the annotations compatible with Python 2, why don't we add type annotations to make life easier for users of Python 3 (with typing)? On Fri, Jan 25, 2019 at 7:53 AM Maciej Szymkiewicz wrote: > > Hello everyone, > > I'd like to revisit the topic of adding PySpark type annotations in 3.

[PySpark] Revisiting PySpark type annotations

2019-01-25 Thread Maciej Szymkiewicz
Hello everyone, I'd like to revisit the topic of adding PySpark type annotations in 3.0. It has been discussed before ( http://apache-spark-developers-list.1001551.n3.nabble.com/Python-friendly-API-for-Spark-3-0-td25016.html and http://apache-spark-developers-list.1001551.n3.nabble.com/PYTHON-PySp

Reply: Re: How to get all input tables of a SPARK SQL 'select' statement

2019-01-25 Thread luby
Hi, All, I tried the suggested approach and it works, but it requires running the SQL statement first. I just want to parse the SQL statement without running it, so I can do this on my laptop without connecting to our production environment. I tried to write a tool which uses the SqlBase.g4 b
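For the use case described above (parsing offline, without any cluster connection), one option worth noting is that instead of building a parser from SqlBase.g4 by hand, Spark ships a standalone Catalyst parser object that needs only the spark-catalyst jar on the classpath and no SparkSession at all. A hedged sketch; the query and object name are illustrative:

```scala
import org.apache.spark.sql.catalyst.analysis.UnresolvedRelation
import org.apache.spark.sql.catalyst.parser.CatalystSqlParser

object OfflineTableExtractor {
  def main(args: Array[String]): Unit = {
    // CatalystSqlParser is an object, not a session-bound parser:
    // no SparkSession, no catalog access, nothing is executed.
    val query = "SELECT s.id FROM src s LEFT JOIN dim d ON s.k = d.k"
    val plan = CatalystSqlParser.parsePlan(query)

    // Referenced tables surface as UnresolvedRelation nodes in the
    // unanalyzed plan, so they can be collected without resolving them.
    val tables = plan.collect { case r: UnresolvedRelation => r.tableName }.distinct
    println(tables.mkString(", "))
  }
}
```

Because the plan is never analyzed, the tables do not have to exist anywhere, which makes this suitable for running on a laptop against production SQL text.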