Hi everyone,
I updated FLIP-28 according to the feedback that I received (online and
offline).
The biggest change is that a user now needs to add two dependencies (api
and planner) if a table program should be runnable in an IDE (as
Aljoscha suggested). This allows for a clear separation of
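As a sketch, the two dependencies mentioned above could look like the following in a user's pom.xml. The artifact names and version are assumptions based on this discussion, not final coordinates:

```xml
<!-- Hypothetical sketch: the "api" and "planner" dependencies a table
     program would need to be runnable in an IDE. Artifact IDs and the
     version are illustrative only. -->
<dependency>
  <groupId>org.apache.flink</groupId>
  <artifactId>flink-table-api-java</artifactId>
  <version>1.8-SNAPSHOT</version>
</dependency>
<dependency>
  <groupId>org.apache.flink</groupId>
  <artifactId>flink-table-planner</artifactId>
  <version>1.8-SNAPSHOT</version>
</dependency>
```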
Hi Aljoscha,
thanks for your feedback. I also don't like the fact that an API depends
on runtime. I will try to come up with a better design while
implementing a PoC. The general goal should be to make table programs
still runnable in an IDE. So maybe there is a better way of doing it.
Regar
Hi,
this is a very nice effort!
There is one thing that we should change, though. In the batch API we have a
clear separation between API and runtime, and using the API (depending on
flink-batch) does not "expose" the runtime classes that are in flink-runtime.
For the streaming API, we made th
Thanks for the feedback, everyone!
I created a FLIP for these efforts:
https://cwiki.apache.org/confluence/display/FLINK/FLIP-28%3A+Long-term+goal+of+making+flink-table+Scala-free
I will open an umbrella Jira ticket for FLIP-28 with concrete subtasks
shortly.
Thanks,
Timo
On 29.11.18 at 12
Thanks Timo,
That makes sense to me. And I left the comment about code generation in doc.
Looking forward to participate in it!
Best,
Jark
On Thu, 29 Nov 2018 at 16:42, Timo Walther wrote:
> @Kurt: Yes, I don't think that forks of Flink will have a hard time
> keeping up with the porting
@Kurt: Yes, I don't think that forks of Flink will have a hard time
keeping up with the porting. That is also why I called this a `long-term
goal`, because I don't see big resources for the porting to happen more
quickly. But at least new features, API, and runtime will profit from the
Scala to Java conversion.
Hi Timo,
Thanks for the great work!
Moving flink-table to Java is a long-awaited thing but will involve much
effort. I agree that we should make it a long-term goal.
I have read the google doc and +1 for the proposal. Here I have some
questions:
1. Where should the flink-table-common mod
Hi Timo and Vino,
I agree that flink-table is very active and there is no guarantee of
avoiding conflicts if you decide to develop based on the community
version. I think this is the risk we can foresee in the first place. But a
massive language replacement is something you cannot imagine
Hi Kurt,
I understand your concerns. However, there is no concrete roadmap for
Flink 2.0 and (as Vino said) the flink-table is developed very actively.
Major refactorings happened in the past and will also happen with or
without the Scala migration. A good example is the proper catalog support
w
Hi Kurt,
Currently, there is still a long way to go until Flink 2.0. Considering
that flink-table is one of the most active modules in the current Flink
project, each version has a number of changes and features added. I think
that refactoring sooner will reduce subsequent complexity and workload.
Hi Timo,
Thanks for writing up the document. I'm +1 for reorganizing the module
structure and making flink-table Scala-free. But I have
a little concern about the timing. Would it be more appropriate to get this
done when Flink decides to bump to the next big version, like 2.x?
It's true you can keep all the class'
Hi Hequn,
thanks for your feedback. Yes, migrating the test cases is another issue
that is not represented in the document but should naturally go along
with the migration.
I agree that we should migrate the main API classes quickly within this
1.8 release after the module split has been per
Hi hequn,
I am very glad to hear that you are interested in this work.
As we all know, this process involves a lot of work.
Currently, the migration work has begun. I started with the
Kafka connector's dependency on flink-table and moved the
related dependencies to flink-table-common.
This work is tracked
Hi Timo,
Thanks for the effort and writing up this document. I like the idea to make
flink-table scala free, so +1 for the proposal!
It's good to make Java the first-class citizen. For a long time, we have
neglected Java, so many Table features are missing from the Java test
cases, such as this o
Hi everyone,
thanks for the great feedback so far. I updated the document with the
input I got so far
@Fabian: I moved the porting of flink-table-runtime classes up in the list.
@Xiaowei: Could you elaborate what "interface only" means to you? Do you
mean a module containing pure Java `inter
Hi Timo,
Thanks for writing this down +1 from my side :)
> I'm wondering whether we can have a rule in the interim, while Java and
> Scala coexist, that dependencies can only be one-way. I found that in the
> current code base there are cases where a Scala class extends Java and vice
> versa. Th
Hi Timo,
Thanks for the effort and the Google writeup. During our external catalog
rework, we found much confusion between Java and Scala, and this Scala-free
roadmap should greatly mitigate that.
I'm wondering whether we can have a rule in the interim, while Java and Scala
coexist, that depen
Hi Timo,
Thanks for initiating this great discussion.
Currently, using SQL / the Table API requires including many dependencies. In
particular, it should not be necessary to introduce the specific
implementation dependencies which users do not care about. So I am glad to
see your proposal, and hope when we consid
Hi Timo, thanks for driving this! I think that this is a nice thing to do.
While we are doing this, can we also keep in mind that we want to
eventually have a TableAPI interface only module which users can take
dependency on, but without including any implementation details?
Xiaowei
On Thu, Nov 2
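The "interface only" module idea above could look roughly like the following. This is a hypothetical sketch, not the actual Flink API: the interface is what such a module would expose, and the toy implementation stands in for what a separate planner/runtime module would provide.

```java
// Hypothetical sketch of an "interface only" Table API module: users
// depend only on pure Java interfaces; implementation classes live in
// other modules. All names here are illustrative, not actual Flink API.
interface Table {
    Table select(String fields);
    String explain(); // a plan description, supplied by an implementation
}

// Toy implementation, standing in for what a planner module would provide.
class SketchTable implements Table {
    private final String plan;

    SketchTable(String plan) {
        this.plan = plan;
    }

    @Override
    public Table select(String fields) {
        // Each API call just extends the textual "plan" in this sketch.
        return new SketchTable(plan + " -> select(" + fields + ")");
    }

    @Override
    public String explain() {
        return plan;
    }
}

public class InterfaceOnlySketch {
    public static void main(String[] args) {
        Table t = new SketchTable("source").select("a, b");
        System.out.println(t.explain()); // source -> select(a, b)
    }
}
```

The point of the split is that user code compiles against `Table` alone, so the implementing module can be swapped or rewritten without touching the API dependency.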
Hi Timo,
Thanks for writing up this document.
I like the new structure and agree to prioritize the porting of the
flink-table-common classes.
Since flink-table-runtime is (or should be) independent of the API and
planner modules, we could start porting these classes once the code is
split into the
Hi everyone,
I would like to continue this discussion thread and convert the outcome
into a FLIP such that users and contributors know what to expect in the
upcoming releases.
I created a design document [1] that clarifies our motivation why we
want to do this, how a Maven module structure c
Hi Piotr,
thanks for bumping this thread and thanks for Xingcan for the comments.
I think the first step would be to separate the flink-table module into
multiple sub modules. These could be:
- flink-table-api: All API facing classes. Can be later divided further
into Java/Scala Table API/SQL
-
Hi all,
I also think about this problem these days and here are my thoughts.
1) We must admit that it’s really a tough task to interoperate with Java and
Scala. E.g., they have different collection types (Scala collections vs.
java.util.*) and in Java, it's hard to implement a method which tak
Bumping the topic.
If we want to do this, the sooner we decide, the less code we will have to
rewrite. I have some objections/counter-proposals to Fabian's proposal of
doing it module-wise, one module at a time.
First, I do not see a problem with having Java/Scala code even within one module,
Hi,
In general, I think this is a good effort. However, it won't be easy and I
think we have to plan this well.
I don't like the idea of having the whole code base fragmented into Java
and Scala code for too long.
I think we should do this one step at a time and focus on migrating one
module at a
I think that is a noble and honorable goal and we should strive for it.
This, however, must be an iterative process given the sheer size of the
code base. I like the approach to define common Java modules which are used
by more specific Scala modules and slowly moving classes from Scala to
Java. Th
Hi,
I do not have experience with how Scala and Java interact with each other,
so I cannot fully validate your proposal, but generally speaking +1 from me.
Does it also mean that we should slowly migrate `flink-table-core` to Java?
How would you envision it? It would be nice to be able to