Hi Forward,

Thanks for updating the documentation. The design doc looks good to me now.
I think you can convert it into a FLIP wiki page and then start a vote.

Best,
Jark

Jark Wu <imj...@gmail.com> wrote on Fri, 20 Dec 2019 at 23:07:

> Hi Forward,
>
> Thanks for updating the documentation. I think it is
>
> On Mon, 2 Dec 2019 at 14:21, Forward Xu <forwardxu...@gmail.com> wrote:
>
>> Hi Jark,
>> Thank you very much, I will improve this document as soon as possible.
>> My confluence username is ForwardXu.
>>
>> Best,
>> Forward
>>
>> Jark Wu <imj...@gmail.com> wrote on Mon, 2 Dec 2019 at 11:56 AM:
>>
>> > Hi Forward,
>> >
>> > Sorry for the late reply.
>> > As I said before, it would be better to include the JSON functions API
>> > for the Table API.
>> > For example, what the "json_exist", "json_value", "json_query", etc.
>> > functions would look like in the Table API.
>> > And it would be better to follow the FLIP template [1], which includes
>> > "Public Interface" (a brief list of public interfaces) and "Proposed
>> > Changes" (the detailed proposal).
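[Editor's note: for illustration only. The sketch below is not a proposed Flink or Table API; it is a minimal Python rendering of the SQL:2016 semantics behind JSON_EXISTS, JSON_VALUE, and JSON_QUERY, supporting only simple "$.key" paths, so readers can see what behavior the Table API equivalents would need to expose.]

```python
import json

def _resolve(json_str, path):
    """Parse the JSON string and follow a simplified '$.key.key' path."""
    data = json.loads(json_str)
    if path == "$":
        return data
    assert path.startswith("$.")
    for key in path[2:].split("."):
        if not isinstance(data, dict) or key not in data:
            raise KeyError(path)
        data = data[key]
    return data

def json_exists(json_str, path):
    """JSON_EXISTS: does the path locate any value?"""
    try:
        _resolve(json_str, path)
        return True
    except (KeyError, ValueError):
        return False

def json_value(json_str, path):
    """JSON_VALUE: extract a scalar as a character string (None on error)."""
    try:
        value = _resolve(json_str, path)
    except (KeyError, ValueError):
        return None
    if isinstance(value, (dict, list)):
        return None  # JSON_VALUE returns only scalars by default
    return str(value)

def json_query(json_str, path):
    """JSON_QUERY: extract a JSON fragment (object/array) as a string."""
    try:
        value = _resolve(json_str, path)
    except (KeyError, ValueError):
        return None
    if not isinstance(value, (dict, list)):
        return None  # JSON_QUERY targets objects/arrays by default
    return json.dumps(value)

doc = '{"name": "flink", "version": {"major": 1, "minor": 10}}'
print(json_exists(doc, "$.name"))           # True
print(json_value(doc, "$.version.major"))   # 1
print(json_query(doc, "$.version"))         # {"major": 1, "minor": 10}
```

Note that both input and output are plain character strings, matching the "no JSON type needed" point made later in this thread.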
>> >
>> > Once the design doc looks good, you can update it to a FLIP and start a
>> > vote.
>> >
>> > Btw, what's your username in confluence?
>> >
>> >
>> > Best,
>> > Jark
>> >
>> >
>> > [1]: https://cwiki.apache.org/confluence/display/FLINK/FLIP+Template
>> >
>> >
>> > On Mon, 2 Dec 2019 at 11:44, Forward Xu <forwardxu...@gmail.com> wrote:
>> >
>> > > Hi Jingsong Lee :
>> > >
>> > > Thank you very much, I need to apply for FLIP permission. Do I need to
>> > > create a FLIP for this?
>> > >
>> > > Best,
>> > > Forward
>> > >
>> > > Jingsong Li <jingsongl...@gmail.com> wrote on Mon, 2 Dec 2019 at 11:40 AM:
>> > >
>> > > > Hi Forward:
>> > > >
>> > > > Document looks good to me.
>> > > > I think you can just start doing this.
>> > > > They all work very independently, so I don't think there's any
>> > > > obvious blocking.
>> > > >
>> > > > Best,
>> > > > Jingsong Lee
>> > > >
>> > > > On Sat, Nov 30, 2019 at 10:59 AM Forward Xu <forwardxu...@gmail.com>
>> > > > wrote:
>> > > >
>> > > > > Hi everyone, it's been a long time since I started this
>> > > > > discussion. Do you have anything to add or improve?
>> > > > > Best,
>> > > > > Forward
>> > > > >
>> > > > > Forward Xu <forwardxu...@gmail.com> wrote on Sun, 22 Sep 2019 at 6:30 PM:
>> > > > >
>> > > > > > Hi Jark,
>> > > > > > Thank you very much for your reply. I have updated the Google
>> > > > > > doc and replied to some of your questions.
>> > > > > > In addition, I want to apply for FLIP permissions for this
>> > > > > > purpose.
>> > > > > >
>> > > > > > Best,
>> > > > > > Forward
>> > > > > >
>> > > > > > Jark Wu <imj...@gmail.com> wrote on Fri, 20 Sep 2019 at 9:53 PM:
>> > > > > >
>> > > > > >> Hi Forward,
>> > > > > >>
>> > > > > >> Sorry for the late reply. I have gone through the design doc
>> > > > > >> and I think it is very nice.
>> > > > > >>
>> > > > > >> Here are my thoughts and suggestions:
>> > > > > >>
>> > > > > >> 0) I think supporting JSON functions in SQL is not complicated,
>> > > > > >> because Calcite already supports the parser part and the
>> > > > > >> runtime part.
>> > > > > >>     We only need to integrate it in Flink and add good coverage
>> > > > > >> tests.
>> > > > > >> 1) However, I think we should also design the corresponding
>> > > > > >> JSON functions API for the Table API, which is very important.
>> > > > > >>     I don't have a clear idea about how to support all the JSON
>> > > > > >> function syntax in the Table API, and this may need more
>> > > > > >> discussion.
>> > > > > >> 2) IMO, it deserves a FLIP (especially for the Table API part).
>> > > > > >> You can follow the FLIP process [1] to start a FLIP proposal.
>> > > > > >> 3) I think we only need to implement it in the Blink planner,
>> > > > > >> as we are going to deprecate the old planner.
>> > > > > >>    So could you update the implementation section in the doc,
>> > > > > >> because the implementation in the Blink planner should be
>> > > > > >> different.
>> > > > > >> 4) It would be better to have an implementation plan to
>> > > > > >> prioritize the sub-tasks.
>> > > > > >>     From my point of view, JSON_VALUE is the most important and
>> > > > > >> JSON_TABLE gets the least priority.
>> > > > > >>
>> > > > > >> I also left some comments in the google doc.
>> > > > > >>
>> > > > > >> Hi @JingsongLee <lzljs3620...@aliyun.com> ,
>> > > > > >>
>> > > > > >> I think we don't need to wait for FLIP-51, as we don't have
>> > > > > >> clear progress on FLIP-51.
>> > > > > >> And as far as I know, it will add a few PlannerExpressions,
>> > > > > >> which can be refactored easily during FLIP-51.
>> > > > > >>
>> > > > > >>
>> > > > > >> Cheers,
>> > > > > >> Jark
>> > > > > >>
>> > > > > >> [1]:
>> > > > > >> https://cwiki.apache.org/confluence/display/FLINK/Flink+Improvement+Proposals
>> > > > > >>
>> > > > > >>
>> > > > > >>
>> > > > > >> On Thu, 5 Sep 2019 at 19:29, vino yang <yanghua1...@gmail.com>
>> > > > > >> wrote:
>> > > > > >>
>> > > > > >> > +1 to have JSON functions in Flink SQL
>> > > > > >> >
>> > > > > >> > JingsongLee <lzljs3620...@aliyun.com.invalid> wrote on Thu,
>> > > > > >> > 5 Sep 2019 at 4:46 PM:
>> > > > > >> >
>> > > > > >> > > +1
>> > > > > >> > > Nice document. I think it will be easier to do after the
>> > > > > >> > > expression reworking [1].
>> > > > > >> > > By the way, which planner do you want to start with?
>> > > > > >> > >
>> > > > > >> > > [1]
>> > > > > >> > > https://cwiki.apache.org/confluence/display/FLINK/FLIP-51%3A+Rework+of+the+Expression+Design
>> > > > > >> > >
>> > > > > >> > > Best,
>> > > > > >> > > Jingsong Lee
>> > > > > >> > >
>> > > > > >> > >
>> > > > > >> > >
>> > > > > >> > > ------------------------------------------------------------------
>> > > > > >> > > From:TANG Wen-hui <winifred.wenhui.t...@gmail.com>
>> > > > > >> > > Send Time: Thu, 5 Sep 2019 14:36
>> > > > > >> > > To:dev <dev@flink.apache.org>
>> > > > > >> > > Subject: Re: Re: [DISCUSS] Support JSON functions in Flink
>> > > > > >> > > SQL
>> > > > > >> > >
>> > > > > >> > > +1
>> > > > > >> > > I have done similar work before.
>> > > > > >> > > Looking forward to discussing this feature.
>> > > > > >> > >
>> > > > > >> > > Best
>> > > > > >> > > wenhui
>> > > > > >> > >
>> > > > > >> > >
>> > > > > >> > >
>> > > > > >> > > winifred.wenhui.t...@gmail.com
>> > > > > >> > >
>> > > > > >> > > From: Kurt Young
>> > > > > >> > > Date: 2019-09-05 14:00
>> > > > > >> > > To: dev
>> > > > > >> > > CC: Anyang Hu
>> > > > > >> > > Subject: Re: [DISCUSS] Support JSON functions in Flink SQL
>> > > > > >> > > +1 to add JSON support to Flink. We also see lots of
>> > > > > >> > > requirements for JSON related functions in our internal
>> > > > > >> > > platform. Since these are already in the SQL standard, I
>> > > > > >> > > think it's a good time to add them to Flink.
>> > > > > >> > >
>> > > > > >> > > Best,
>> > > > > >> > > Kurt
>> > > > > >> > >
>> > > > > >> > >
>> > > > > >> > > On Thu, Sep 5, 2019 at 10:37 AM Qi Luo <luoqi...@gmail.com>
>> > > > > >> > > wrote:
>> > > > > >> > >
>> > > > > >> > > > We also see strong demands from our SQL users for
>> > > > > >> > > > JSON/Date related functions.
>> > > > > >> > > >
>> > > > > >> > > > Also +Anyang Hu <huanyang1...@gmail.com>
>> > > > > >> > > >
>> > > > > >> > > > On Wed, Sep 4, 2019 at 9:51 PM Jark Wu <imj...@gmail.com>
>> > > > > >> > > > wrote:
>> > > > > >> > > >
>> > > > > >> > > > > Hi Forward,
>> > > > > >> > > > >
>> > > > > >> > > > > Thanks for bringing up this discussion and preparing
>> > > > > >> > > > > the nice design.
>> > > > > >> > > > > I think it would be nice to have the JSON functions in
>> > > > > >> > > > > the next release.
>> > > > > >> > > > > We have received some requirements for this feature.
>> > > > > >> > > > >
>> > > > > >> > > > > I can help to shepherd this JSON functions effort and
>> > > > > >> > > > > will leave comments in the design doc in the next few
>> > > > > >> > > > > days.
>> > > > > >> > > > >
>> > > > > >> > > > > Hi Danny,
>> > > > > >> > > > >
>> > > > > >> > > > > The newly introduced JSON functions are from SQL:2016,
>> > > > > >> > > > > not from MySQL, so no JSON type is needed.
>> > > > > >> > > > > According to SQL:2016, JSON data can be represented as
>> > > > > >> > > > > a "character string", which is also the current
>> > > > > >> > > > > implementation in Calcite [1].
>> > > > > >> > > > >
>> > > > > >> > > > > Best,
>> > > > > >> > > > > Jark
>> > > > > >> > > > >
>> > > > > >> > > > >
>> > > > > >> > > > > [1]:
>> > > > > >> > > > > https://calcite.apache.org/docs/reference.html#json-functions
>> > > > > >> > > > >
>> > > > > >> > > > >
>> > > > > >> > > > > On Wed, 4 Sep 2019 at 21:22, Xu Forward <forwardxu...@gmail.com>
>> > > > > >> > > > > wrote:
>> > > > > >> > > > >
>> > > > > >> > > > > > Hi Danny Chan, thank you very much for your reply;
>> > > > > >> > > > > > your help will let me further improve this
>> > > > > >> > > > > > discussion.
>> > > > > >> > > > > > Best
>> > > > > >> > > > > > forward
>> > > > > >> > > > > >
>> > > > > >> > > > > > Danny Chan <yuzhao....@gmail.com> wrote on Wed, 4 Sep
>> > > > > >> > > > > > 2019 at 8:50 PM:
>> > > > > >> > > > > >
>> > > > > >> > > > > > > Thanks Xu Forward for bringing up this topic. I
>> > > > > >> > > > > > > think the JSON functions are very useful,
>> > > > > >> > > > > > > especially for MySQL users.
>> > > > > >> > > > > > >
>> > > > > >> > > > > > > I saw that you have done some work within Apache
>> > > > > >> > > > > > > Calcite; that's a good start. But here is one
>> > > > > >> > > > > > > concern from me: Flink doesn't support a JSON type
>> > > > > >> > > > > > > internally, so how to represent a JSON object in
>> > > > > >> > > > > > > Flink may be a key point we need to resolve. In
>> > > > > >> > > > > > > Calcite, we use the ANY type to represent JSON,
>> > > > > >> > > > > > > but I don't think that is the right way to go;
>> > > > > >> > > > > > > maybe we can have a discussion here.
>> > > > > >> > > > > > >
>> > > > > >> > > > > > > Best,
>> > > > > >> > > > > > > Danny Chan
>> > > > > >> > > > > > > Xu Forward <forwardxu...@gmail.com> wrote on
>> > > > > >> > > > > > > 4 Sep 2019 at 8:34 PM (+0800):
>> > > > > >> > > > > > > > Hi everybody,
>> > > > > >> > > > > > > >
>> > > > > >> > > > > > > > I'd like to kick off a discussion on supporting
>> > > > > >> > > > > > > > JSON functions in Flink SQL.
>> > > > > >> > > > > > > >
>> > > > > >> > > > > > > > The entire plan is divided into two steps:
>> > > > > >> > > > > > > > 1. Implement the SQL:2016 JSON functions in
>> > > > > >> > > > > > > > Flink SQL [1].
>> > > > > >> > > > > > > > 2. Implement useful JSON functions that are not
>> > > > > >> > > > > > > > in SQL:2016, such as MySQL's JSON_TYPE,
>> > > > > >> > > > > > > > JSON_LENGTH, etc.
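[Editor's note: the sketch below is an illustration of step 2 above, not an existing Flink API. It renders the documented MySQL semantics of JSON_TYPE and JSON_LENGTH in Python so the intended behavior is concrete.]

```python
import json

def json_type(json_str):
    """Return the JSON type name of the top-level value, MySQL-style."""
    value = json.loads(json_str)
    if isinstance(value, bool):      # bool must be checked before int
        return "BOOLEAN"
    if isinstance(value, dict):
        return "OBJECT"
    if isinstance(value, list):
        return "ARRAY"
    if value is None:
        return "NULL"
    if isinstance(value, int):
        return "INTEGER"
    if isinstance(value, float):
        return "DOUBLE"
    return "STRING"

def json_length(json_str):
    """Number of elements in an array/object; 1 for a scalar, MySQL-style."""
    value = json.loads(json_str)
    if isinstance(value, (dict, list)):
        return len(value)
    return 1

print(json_type('{"a": 1}'))     # OBJECT
print(json_length('[1, 2, 3]'))  # 3
```

As in step 1, the input JSON is carried as an ordinary character string, so these functions need no dedicated JSON type.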
>> > > > > >> > > > > > > >
>> > > > > >> > > > > > > > Would love to hear your thoughts.
>> > > > > >> > > > > > > >
>> > > > > >> > > > > > > > [1]
>> > > > > >> > > > > > > > https://docs.google.com/document/d/1JfaFYIFOAY8P2pFhOYNCQ9RTzwF4l85_bnTvImOLKMk/edit#heading=h.76mb88ca6yjp
>> > > > > >> > > > > > > >
>> > > > > >> > > > > > > > Best,
>> > > > > >> > > > > > > > ForwardXu
>> > > > > >> > > > > > >
>> > > > > >> > > > > >
>> > > > > >> > > > >
>> > > > > >> > > >
>> > > > > >> > >
>> > > > > >> >
>> > > > > >>
>> > > > > >
>> > > > >
>> > > >
>> > > >
>> > > > --
>> > > > Best, Jingsong Lee
>> > > >
>> > >
>> >
>>
>
