Hi Jincheng,

>
> Yes, we can add use case examples to both the Google doc and the FLIP. I
> have already added a simple usage example to the Google doc; here I want
> to know which kind of examples you want. :)
>

Do you have use cases where the Python Table API can be applied without UDF
support?

(And where the same could not be accomplished with just SQL.)
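
For instance, something along these lines (just a rough sketch to make the
question concrete; it assumes the Python interfaces proposed in the doc
mirror the Java Table API, so the exact names are hypothetical) has no
Python code on the execution path:

    # hypothetical sketch: assumes FLIP-38's Python API mirrors the Java
    # Table API (TableEnvironment, Table, ...); names are illustrative only
    # t_env: a TableEnvironment created via the proposed Python API
    t_env.scan("Orders") \
        .filter("amount > 10") \
        .group_by("user") \
        .select("user, amount.sum as total") \
        .insert_into("OrderTotals")

    # which seems expressible in plain SQL along the lines of:
    #   INSERT INTO OrderTotals
    #   SELECT user, SUM(amount) AS total
    #   FROM Orders WHERE amount > 10 GROUP BY user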


> The very short answer to UDF support is yes. As you said, we need UDF
> support in the Python Table API, including UDF, UDTF, and UDAF. This needs
> to be discussed after the basic Python Table API is supported. UDF support
> involves the management of the Python environment and the communication
> between the Java and Python runtimes, and UDAF in Flink also involves the
> use of state, so this is a topic that is worth discussing in depth in a
> separate thread.
>

The current proposal for job submission touches something that Beam
portability already had to solve.

If we think that the Python Table API will only be useful with UDF support
(question above), then it may be better to discuss the first step with the
final goal in mind. If we find that Beam can be used for the UDF part, then
approach 1 vs. approach 2 in the doc (for the client-side language
boundary) may look different.


>
> I think that no matter how Flink and Beam work together at the UDF level,
> it will not affect the current Python API (interface). We can first
> support the Python API in Flink and then start on the UDX (UDF/UDTF/UDAF)
> support.
>
>
I agree that the client-side API should not be affected.


> And many thanks for your valuable comments in the Google doc! I will get
> back to you in the Google doc. :)
>
>
> Regards,
> Jincheng
>
> Thomas Weise <t...@apache.org> wrote on Thu, Apr 4, 2019, at 8:03 AM:
>
> > Thanks for putting this proposal together.
> >
> > It would be nice if you could share a few use case examples (maybe add
> > them as a section to the FLIP?).
> >
> > The reason I ask: The table API is immensely useful, but it isn't clear
> > to me what value other language bindings provide without UDF support.
> > With FLIP-38 it will be possible to write a program in Python, but not
> > execute Python functions. Without UDF support, isn't it possible to
> > achieve roughly the same with plain SQL? In which situation would I use
> > the Python API?
> >
> > There was related discussion regarding UDF support in [1]. If the
> > assumption is that such support will be added later, then I would like to
> > circle back to the question why this cannot be built on top of Beam? It
> > would be nice to clarify the bigger goal before embarking for the first
> > milestone.
> >
> > I'm going to comment on other things in the doc.
> >
> > [1]
> > https://lists.apache.org/thread.html/f6f8116b4b38b0b2d70ed45b990d6bb1bcb33611fde6fdf32ec0e840@%3Cdev.flink.apache.org%3E
> >
> > Thomas
> >
> >
> > On Wed, Apr 3, 2019 at 12:35 PM Shuyi Chen <suez1...@gmail.com> wrote:
> >
> > > Thanks a lot for driving the FLIP, Jincheng. The approach looks good.
> > > Adding multi-language support sounds like a promising direction to
> > > expand the footprint of Flink. Do we have a plan for adding Golang
> > > support? As many backend engineers nowadays are familiar with Go, but
> > > probably not as much with Java, adding Golang support would
> > > significantly reduce their friction to use Flink. Also, do we have a
> > > design for multi-language UDF support, and what is the timeline for
> > > adding DataStream API support? We would like to help and contribute as
> > > well, as we have a similar need internally at our company. Thanks a
> > > lot.
> > >
> > > Shuyi
> > >
> > > On Tue, Apr 2, 2019 at 1:03 AM jincheng sun <sunjincheng...@gmail.com>
> > > wrote:
> > >
> > > > Hi All,
> > > > As Xianda brought up in the previous email, there are a large number
> > > > of data analysis users who want Flink to support Python. At the
> > > > Flink API level, we have the DataStream API, DataSet API, and Table
> > > > API & SQL, and the Table API will become a first-class citizen. The
> > > > Table API is declarative and can be automatically optimized, which
> > > > is mentioned in the Flink mid-term roadmap by Stephan. So we are
> > > > first considering supporting Python at the Table API level to cater
> > > > to the current large number of analytics users. To further promote
> > > > Python support at the Flink Table API level, Dian, Wei, and I
> > > > discussed offline a bit and came up with an initial feature outline
> > > > as follows:
> > > >
> > > > - Python Table API Interface
> > > >   Introduce a set of Python Table API interfaces, including
> > > >   interface definitions such as Table, TableEnvironment,
> > > >   TableConfig, etc.
> > > >
> > > > - Implementation Architecture
> > > >   We will offer two alternative architecture options, one for pure
> > > >   Python language support and one for an extended multi-language
> > > >   design.
> > > >
> > > > - Job Submission
> > > >   Provide a way to submit (local/remote) Python Table API jobs.
> > > >
> > > > - Python Shell
> > > >   The Python Shell provides an interactive way for users to write
> > > >   and execute Flink Python Table API jobs.
> > > >
> > > >
> > > > The design document for FLIP-38 can be found here:
> > > > https://docs.google.com/document/d/1ybYt-0xWRMa1Yf5VsuqGRtOfJBz4p74ZmDxZYg3j_h8/edit?usp=sharing
> > > >
> > > > I am looking forward to your comments and feedback.
> > > >
> > > > Best,
> > > > Jincheng
> > > >
> > >
> >
>
