Stephan Ewen created FLINK-4298:
---
Summary: Clean up Storm Compatibility Dependencies
Key: FLINK-4298
URL: https://issues.apache.org/jira/browse/FLINK-4298
Project: Flink
Issue Type: Bug
... allows to easily reuse and execute Storm Topologies on Flink (which is the
most important feature we need to have).

I hope to get some more feedback from the community on whether the
Storm compatibility should be more "stormy" or more "flinky". Both
approaches make sense to me.

A few minor comments:

* FileSpout vs FiniteFileSpout
  -> FileSpout ... makes sense from a Storm point of view (there is no such
  thing as a finite spout). Thus, this example shows how a regular Storm spout
  can be improved using the FiniteSpout interface -- I would keep it as is
  (even if it seems to be unnecessarily complicated -- imagine that you don't
  have the code of FileSpout).

* You changed examples to use finite spouts -- from a testing point of view
  this makes sense.

* ... local copy "unprocessedBolts" when creating a Flink program, to allow
  re-submitting the same topology object twice (or altering it after
  submission). If you don't make the copy, submitting/translating the topology
  into a Flink job alters the object (which should not matter).

* Why did you change the dop from 4 to 1 in WordCountTopology? We should
  test in a parallel fashion...

* Too many reformatting changes ;) You touched many classes without any
  actual code changes.
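For readers following along, here is a minimal, hypothetical sketch of the
pattern discussed above: a plain Storm file spout that also satisfies a
FiniteSpout-style contract. The compatibility layer's interface is assumed
here to expose a single boolean reachedEnd() method; the class and field names
are illustrative, not the classes from the PR.

  import backtype.storm.spout.SpoutOutputCollector;
  import backtype.storm.task.TopologyContext;
  import backtype.storm.topology.OutputFieldsDeclarer;
  import backtype.storm.topology.base.BaseRichSpout;
  import backtype.storm.tuple.Fields;
  import backtype.storm.tuple.Values;

  import java.io.BufferedReader;
  import java.io.FileReader;
  import java.io.IOException;
  import java.util.Map;

  // Illustrative only -- not the FileSpout/FiniteFileSpout from the PR.
  public class SimpleFiniteFileSpout extends BaseRichSpout /* implements FiniteSpout */ {

      private final String path;
      private transient BufferedReader reader;
      private transient SpoutOutputCollector collector;
      private String nextLine;

      public SimpleFiniteFileSpout(String path) {
          this.path = path;
      }

      @Override
      public void open(Map conf, TopologyContext context, SpoutOutputCollector collector) {
          this.collector = collector;
          try {
              reader = new BufferedReader(new FileReader(path));
              nextLine = reader.readLine();
          } catch (IOException e) {
              throw new RuntimeException("Cannot open " + path, e);
          }
      }

      @Override
      public void nextTuple() {
          if (nextLine == null) {
              return; // a plain Storm spout simply idles here forever
          }
          collector.emit(new Values(nextLine));
          try {
              nextLine = reader.readLine();
          } catch (IOException e) {
              throw new RuntimeException(e);
          }
      }

      // Assumed FiniteSpout contract: lets the Flink wrapper shut the source
      // down once the file is exhausted instead of spinning indefinitely.
      public boolean reachedEnd() {
          return nextLine == null;
      }

      @Override
      public void declareOutputFields(OutputFieldsDeclarer declarer) {
          declarer.declare(new Fields("line"));
      }
  }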
I would be against adding anything Storm-specific in the core (streaming is
core as well) Flink APIs. If we add stuff there we have to stick to it and I
don’t see a lot of use for reusing single Bolts/Spouts.
I’m very excited about the work on Storm compatibility in general, though. :D
---- Forwarded Message
Subject: Re: Storm Compatibility
Date: Fri, 13 Nov 2015 12:15:19 +0100
From:
Stephan Ewen created FLINK-2801:
---
Summary: Rework Storm Compatibility Tests
Key: FLINK-2801
URL: https://issues.apache.org/jira/browse/FLINK-2801
Project: Flink
Issue Type: Bug
Hi
@mjsax @StephanEwen @rmetzger
Can you give me some suggestions about my idea of how to transfer user-defined
class code to the task?
Thank you very much!
Regards
Fengbin Fang
#
Dear all,
I am working on task hooks for storm-compatibility. Storm supports adding
hooks through the Storm configuration using the "topology.auto.task.hooks"
config. Users can use user-defined hook class names as the value of
"topology.auto.task.hooks" in the configuration map. These hooks ...
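As background on the mechanism Fengbin refers to, here is a minimal sketch of
how such a hook is registered on the Storm side (assuming the 0.9.x
backtype.storm API that the compatibility layer targeted; the hook class and
method body are made up for illustration):

  import backtype.storm.Config;
  import backtype.storm.hooks.BaseTaskHook;
  import backtype.storm.hooks.info.BoltExecuteInfo;

  import java.util.Collections;

  // Illustrative hook: reports every tuple a bolt executes.
  public class LoggingTaskHook extends BaseTaskHook {

      @Override
      public void boltExecute(BoltExecuteInfo info) {
          System.out.println("bolt executed: " + info.tuple);
      }

      // "topology.auto.task.hooks" takes a list of fully qualified class
      // names; Storm instantiates one hook instance per task.
      public static Config configWithHook() {
          Config conf = new Config();
          conf.put(Config.TOPOLOGY_AUTO_TASK_HOOKS,
                  Collections.singletonList(LoggingTaskHook.class.getName()));
          return conf;
      }
  }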
Stephan Ewen created FLINK-2586:
---
Summary: Unstable Storm Compatibility Tests
Key: FLINK-2586
URL: https://issues.apache.org/jira/browse/FLINK-2586
Project: Flink
Issue Type: Bug
fangfengbin created FLINK-2571:
--
Summary: Add task hooks support in Storm-compatibility
Key: FLINK-2571
URL: https://issues.apache.org/jira/browse/FLINK-2571
Project: Flink
Issue Type: New
Hi Matthias,
Thank you for the help.
I'll try.
===
... not to break them...
-Matthias
On 08/20/2015 01:59 PM, Aljoscha Krettek wrote:
> Hi,
> I'm afraid submitting the correct program also doesn't work right now. When
> I try to execute it I get this:
> bin/flink run --jarfile
> /Users/aljoscha/Dev/work/flink/flink-contrib/flink-storm-compatibility/flink-storm-compatibility-examples/target/flink-storm-compatibility-examples-0.1...
... mvn -DskipTests package to build the new jar file.
If you have further problems, just let us know.
-Matthias
On 08/20/2015 01:19 PM, huangwei (G) wrote:
> Hi,
> I got some new problems about the storm compatibility currently.
> These occurred when I ran the “storm-wordcount” in the storm compatibility ...
... "bin/flink".
Please get back to us if it still doesn't work or if you have further
questions.
Greetings,
Aljoscha
On Thu, 20 Aug 2015 at 13:19 huangwei (G) wrote:
> Hi,
> I got some new problems about the storm compatibility currently.
> These occurred when I ran the “storm-wordcount” ...
Hi,
I got some new problems about the storm compatibility currently.
These occurred when I ran the “storm-wordcount” in the storm compatibility on
a flink-0.10-SNAPSHOT which I built from the latest Flink project.
First, I start a local flink:
$ cd bin
$ ./start-local.sh
Then I ran the ...
fangfengbin created FLINK-2525:
--
Summary: Add configuration support in Storm-compatibility
Key: FLINK-2525
URL: https://issues.apache.org/jira/browse/FLINK-2525
Project: Flink
Issue Type: New
Matthias J. Sax created FLINK-2337:
--
Summary: Multiple SLF4J bindings using Storm compatibility layer
Key: FLINK-2337
URL: https://issues.apache.org/jira/browse/FLINK-2337
Project: Flink
Matthias J. Sax created FLINK-2306:
--
Summary: Add support for named streams in Storm compatibility layer
Key: FLINK-2306
URL: https://issues.apache.org/jira/browse/FLINK-2306
Project: Flink
Matthias J. Sax created FLINK-2305:
--
Summary: Add documentation about Storm compatibility layer
Key: FLINK-2305
URL: https://issues.apache.org/jira/browse/FLINK-2305
Project: Flink
Issue
Matthias J. Sax created FLINK-2304:
--
Summary: Add named attribute access to Storm compatibility layer
Key: FLINK-2304
URL: https://issues.apache.org/jira/browse/FLINK-2304
Project: Flink
... does? Keep a map from the field names to indexes somewhere (make this
accessible from the tuple) and then you can just use a simple Flink tuple.
I think this is what's happening in Storm: they get the index from the
context, which knows the declared output fields.
Gyula
Matthias J. Sax wrote (on Mon, 29 Jun 2015, 18:08):
> Hi,
> I started to work on a missing feature for the Storm compatibility
> layer: named attribute access ...
Hi,
I started to work on a missing feature for the Storm compatibility
layer: named attribute access
In Storm, each attribute of an input tuple can be accessed via index or
by name. Currently, only index access is supported. In order to support
this feature in Flink (embedded Bolt in Flink ...
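To make Gyula's suggestion concrete, here is a small sketch against the plain
Storm 0.9.x API (the helper class and method names are made up): the
name-to-index map is built once from the context that knows the declared
output fields, after which purely positional access is enough.

  import backtype.storm.task.TopologyContext;
  import backtype.storm.tuple.Fields;
  import backtype.storm.tuple.Tuple;

  import java.util.HashMap;
  import java.util.Map;

  public class NamedAccessExample {

      // Build the name -> index map once, from the context that knows the
      // declared output fields of the producing component/stream.
      public static Map<String, Integer> buildFieldIndex(TopologyContext context,
                                                         String componentId,
                                                         String streamId) {
          Fields fields = context.getComponentOutputFields(componentId, streamId);
          Map<String, Integer> index = new HashMap<>();
          for (String field : fields) {
              index.put(field, fields.fieldIndex(field));
          }
          return index;
      }

      public static Object getByName(Tuple tuple, Map<String, Integer> index, String field) {
          // equivalent to tuple.getValueByField(field), but only needs the
          // positional index, which a plain Flink tuple could provide as well
          return tuple.getValue(index.get(field));
      }
  }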
Péter Szabó created FLINK-2243:
--
Summary: Add finite spout functionality to Storm compatibility
layer
Key: FLINK-2243
URL: https://issues.apache.org/jira/browse/FLINK-2243
Project: Flink
Issue
... "rebalance()", and "global()". They all state
"@return The DataStream with shuffle partitioning set."
(Looks like a copy&paste error to me.)
I am also wondering if RebalancePartitioner has a bug. It seems that it never
evaluates its member "forward". Thus, local forward ("DataStream.forward()")
would not work correctly.
Please correct me if I got something mixed up.
-Matthias
On 06/10/2015 02:42 PM, Márton Balassi wrote:
> Hey,
> As the storm-compatibility-core build goes fine this is a dependency issue
> with storm-compatibility-examples. As a first try replace: ...
Hey,
As the storm-compatibility-core build goes fine this is a dependency issue
with storm-compatibility-examples. As a first try replace:

  <dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-streaming-core</artifactId>
    <version>${project.version}</version>
    <scope>test</scope>
    <classifier>tests</classifier>
  </dependency>

with

  <dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-streaming-core</artifactId>
    <version>${project.version}</version>
    <scope>test</scope>
  </dependency>
Travis caches Maven dependencies and sometimes fails to update them.
Try to clear your Travis cache via "Settings" (upper right) -> "Caches".
Cheers, Fabian
2015-06-10 14:22 GMT+02:00 Matthias J. Sax :
> Hi,
>
> the current PR of storm compatibility layer builds
This seems like a version mismatch. For example,
DataStream.distribute() was changed to DataStream.rebalance()
recently. Maybe your build is getting some outdated jars from the Travis
cache.
On Wed, Jun 10, 2015 at 2:22 PM, Matthias J. Sax
wrote:
> Hi,
>
> the current PR of storm comp
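For reference, a minimal sketch of the rename mentioned above, written against
the streaming DataStream API (the job name and host/port values are
placeholders):

  import org.apache.flink.streaming.api.datastream.DataStream;
  import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

  public class RebalanceExample {
      public static void main(String[] args) throws Exception {
          StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
          DataStream<String> text = env.socketTextStream("localhost", 9999);
          // formerly text.distribute() in older snapshots; renamed to rebalance()
          DataStream<String> rebalanced = text.rebalance();
          rebalanced.print();
          env.execute("rebalance example");
      }
  }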
Hi,
the current PR of storm compatibility layer builds successfully on my
laptop (mvn clean install). However, on travis I get strange error
messages in the IT-Cases:
https://travis-ci.org/mjsax/flink/builds/66137928
For example:
> Caused by: java.lang.AbstractMethodError ...
It looks like there is now a PR available for the storm
compatibility: https://github.com/apache/flink/pull/764
It seems we are not the only new stream processing system with
compatibility to Storm: http://dl.acm.org/citation.cfm?id=2742788
On Tue, Jun 2, 2015 at 11:09 AM, Szabó
... and make comments in the afternoon.
Peter
2015-06-01 21:46 GMT+02:00 Robert Metzger:
> Great to see that you two are working together on the storm compatibility
> layer. ...
Great to see that you two are working together on the storm compatibility
layer.
Please let the other Flink committers know when Matthias' PR is in a state
that we can review it again (= when you think it's ready).
Given the feedback from Peter and the long list of missing features and the
current ...
mbalassi/flink/tree/storm-backup
- storm (last clean state of my work on the flink-storm-compatibility pull
request, including code cleanup & refactor and one or two simple examples):
https://github.com/mbalassi/flink/tree/storm
Peter
2015-05-29 10:36 GMT+02:00 Matthias J. Sax :
> Hi Peter
...lity of the layer. Can you please share your example with me, so I can see
what the problem is and fix it?
I am pretty sure that the fix will be merged later on, too. There are many
other limitations in the layer. Right now, it is still in beta state. ;)
-Matthias
On 05/27/2015 03:48 PM, Szabó Péter wrote:
> Hey everyone,
> I experimented with the Storm compatibility layer ...
Hey everyone,
I experimented with the Storm compatibility layer Matthias wrote, and ran
some Storm examples on Flink. I found that Storm's SimpleJoin example does
not work. I suppose it is because of the multiple input streams. I'm
willing to add another example instead.
Right now, I
... project that we do not consider part of the core Flink functionality but
that provides useful tools around it. In general, code placed here has to meet
fewer requirements in terms of covering all corner cases if it provides a nice
solution for a set of well-defined problems.
As of today it has two small utilities, the TweetInputFormat (by Mustafa
Elbehery) and the collect functionality for the DataStream (by Gabor Gevay).
The pull request for the Storm compatibility layer (by Matthias J. Sax) [1]
raises the issue as it is way more code to maintain and is more complex in
general than how the community would like to handle ...
... would help a lot when merging.

On Thu, Apr 2, 2015 at 9:19 PM, Fabian Hueske wrote:
> Hi Matthias,
> this is really cool! I especially like that you can use Storm code within a
> Flink streaming program :-)
> One thing that might be good to do rather soon is to collect all your
> commits and put them on top of a fresh forked Flink master branch.
> When merging we cannot change the history and try to put only fast-forward
> commits on top of the master branch.
> As time goes on it becomes more likely that you run into merge issues when
> cherry-picking the commits.

2015-04-02 21:09 GMT+02:00 Robert Metzger:
> Hey Henry,
> you can check out the files here:
> https://github.com/mjsax/flink/tree/flink-storm-compatibility/flink-staging/flink-streaming/flink-storm-compatibility
> ... so yes, they are located in the flink-streaming directory ... which is a
> good place for now.
> Once we move flink-streaming out of staging, we might want to keep the
> storm ...
Hi Matthias,
Where do you put the code for the Storm compatibility? Under the streams
module directory?
- Henry
On Thu, Apr 2, 2015 at 10:31 AM, Matthias J. Sax wrote:
> Hi @all,
> I started to work on a compatibility layer to run Storm Topologies on
> Flink. I just pushed a first beta ...
Hey Matthias,
a Storm compatibility layer sounds really great!
I'll soon take a closer look into the code, but the features you're listing
sound really amazing! Since the code already has test cases included, I'm
open to merging a first stable version and then continuing the development ...
Hi @all,
I started to work on a compatibility layer to run Storm Topologies on
Flink. I just pushed a first beta:
https://github.com/mjsax/flink/tree/flink-storm-compatibility
Please check it out, and let me know how you like it. In this first
version, I tried to code without changing too many