Re: Function input type validation

2015-11-08 Thread Aljoscha Krettek
I see Gyula’s point. In the case of the TupleTypeInfo subclass it only works because the equals method of TupleTypeInfo is used, IMHO. Stupid implementation mistakes should be caught by the Java type checker. I don’t think it would allow passing a Map to the map method if the type of the DataS

Re: Function input type validation

2015-11-08 Thread Timo Walther
The reason for input validation is to check whether the Function is fully compatible. Actually, only the return types are necessary, but it prevents stupid implementation mistakes and undesired behavior. E.g. if you implement a "class MyMapper extends MapFunction<String>{}" and use it for "env.fromEl
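Timo's point can be illustrated without Flink's real classes. Flink discovers a function's declared input type via reflection; the sketch below uses a hypothetical, simplified MapFunction stand-in (not Flink's actual API) to show how a declared type parameter survives erasure in a concrete subclass and can be checked against the upstream element type before the job runs:

```java
import java.lang.reflect.ParameterizedType;
import java.lang.reflect.Type;

public class TypeCheckDemo {
    // Hypothetical, simplified stand-in for Flink's MapFunction interface.
    public interface MapFunction<IN, OUT> {
        OUT map(IN value);
    }

    // The generic arguments are fixed in the class declaration,
    // so they are recoverable via reflection despite erasure.
    public static class MyMapper implements MapFunction<String, Integer> {
        public Integer map(String value) { return value.length(); }
    }

    // Recover the declared input type the way a type extractor could:
    // inspect the generic interface of the concrete mapper class.
    public static Type declaredInputType(Class<?> mapperClass) {
        ParameterizedType iface =
            (ParameterizedType) mapperClass.getGenericInterfaces()[0];
        return iface.getActualTypeArguments()[0];
    }

    public static void main(String[] args) {
        Type in = declaredInputType(MyMapper.class);
        System.out.println(in.getTypeName()); // prints "java.lang.String"
        // A validator can now compare this against the upstream element type
        // and reject an incompatible function before the job executes.
        System.out.println(in.equals(String.class)); // prints "true"
    }
}
```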

Re: Function input type validation

2015-11-08 Thread Chesnay Schepler
On 08.11.2015 21:28, Gyula Fóra wrote: Let's say I want to implement my own TupleTypeInfo that handles null values, and I pass this type info in the returns call of an operation. This will most likely fail when the next operation validates the input, although I think it shouldn't. So I just tried t

Function input type validation

2015-11-08 Thread Gyula Fóra
Hey All, I am wondering why Function input types are validated. This might become an issue if the user wants to write his own TypeInfo for a type that Flink also handles natively. Let's say I want to implement my own TupleTypeInfo that handles null values, and I pass this type
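To make Gyula's concern concrete: if input validation boils down to an equals() check on the type information (as Aljoscha notes in his reply), a user-defined subclass stays interchangeable with the base type info only as long as equals() ignores the runtime class. The following is a self-contained sketch with hypothetical stand-ins, not Flink's real TupleTypeInfo API:

```java
import java.util.Arrays;

public class TypeInfoEqualsDemo {
    // Hypothetical, simplified stand-in for a tuple type info.
    public static class TupleTypeInfo {
        final Class<?>[] fieldTypes;
        public TupleTypeInfo(Class<?>... fieldTypes) { this.fieldTypes = fieldTypes; }

        @Override
        public boolean equals(Object o) {
            // Compares field types only, not the runtime class.
            return o instanceof TupleTypeInfo
                && Arrays.equals(fieldTypes, ((TupleTypeInfo) o).fieldTypes);
        }
        @Override
        public int hashCode() { return Arrays.hashCode(fieldTypes); }
    }

    // A user-defined subclass that (conceptually) tolerates null fields.
    public static class NullableTupleTypeInfo extends TupleTypeInfo {
        public NullableTupleTypeInfo(Class<?>... fieldTypes) { super(fieldTypes); }
    }

    // Input validation modeled as an equals() check.
    public static boolean validateInput(TupleTypeInfo upstream, TupleTypeInfo expected) {
        return upstream.equals(expected);
    }

    public static void main(String[] args) {
        TupleTypeInfo declared = new TupleTypeInfo(String.class, Long.class);
        TupleTypeInfo custom = new NullableTupleTypeInfo(String.class, Long.class);
        // Passes only because equals() does not distinguish the subclass:
        System.out.println(validateInput(custom, declared)); // prints "true"
    }
}
```

A validator that instead compared runtime classes (or used a stricter equals in the subclass) would reject the custom type info, which is exactly the failure mode Gyula anticipates.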

Long cannot be cast to org.apache.flink.types.CopyableValue

2015-11-08 Thread Vasiliki Kalavri
Hello squirrels, I'm writing a few graph algorithms to test the performance of different iteration models and I am quite stuck on an error. While my SSSP example works fine, I get the following in my connected components job (local execution inside Eclipse): Exception in thread "main" org.apa
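The general mechanism behind this kind of ClassCastException is an unchecked generic cast whose failure only surfaces at the use-site, far from the mistake, because generic parameters are erased at runtime. The sketch below is plain Java illustrating that mechanism, not a diagnosis of the connected components job itself:

```java
import java.util.ArrayList;
import java.util.List;

public class ErasureCastDemo {
    // A wrongly parameterized collection compiles with only a warning and
    // fails later, when an element is used as the wrong type.
    @SuppressWarnings("unchecked")
    public static boolean castFailsAtUseSite() {
        List<Long> longs = new ArrayList<>();
        longs.add(42L);
        List<String> wrong = (List<String>) (List<?>) longs; // compiles fine
        try {
            String s = wrong.get(0); // compiler-inserted checkcast fires here
            return false;            // never reached
        } catch (ClassCastException e) {
            // The message names the runtime classes, e.g.
            // "java.lang.Long cannot be cast to java.lang.String"
            return true;
        }
    }

    public static void main(String[] args) {
        System.out.println(castFailsAtUseSite()); // prints "true"
    }
}
```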

Flink deployment fabric script

2015-11-08 Thread Le Quoc Do
Hi Flinkers, I'm starting to work with Flink and I would like to contribute to Flink. However, I'm a very new Flinker, so the first thing I could contribute is a one-click-style deployment script to deploy Flink, Spark and Hadoop YARN on cluster and cloud computing environments (OpenStack based Cloud

Re: Web interface to submit jobs

2015-11-08 Thread Sachin Goel
Hi Flavio, yes, multiple classes are reported properly. And having a Parameter interface sounds nice, but IMO it'll make the UI too cluttered. What can be done is, say, to search for a method named `getDescription` and display that to the user on the frontend. Right now, we only do that for Program i
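The `getDescription` idea can be sketched with plain reflection. The names below are assumptions based on the thread (Flink's hook for this has been a `ProgramDescription` interface exposing `getDescription()`); the idea is that the web frontend looks for a no-arg `getDescription()` on the submitted program's main class and shows its return value:

```java
import java.lang.reflect.Method;

public class DescriptionLookupDemo {
    // Hypothetical user program exposing a description.
    public static class MyJob {
        public String getDescription() {
            return "Counts words in the input data set.";
        }
    }

    // What the frontend could do: find a no-arg getDescription() on the
    // program's main class, instantiate the class, and invoke the method.
    public static String describe(Class<?> programClass) {
        try {
            Method m = programClass.getMethod("getDescription");
            Object instance = programClass.getDeclaredConstructor().newInstance();
            return (String) m.invoke(instance);
        } catch (ReflectiveOperationException e) {
            // No description available: fall back to a placeholder.
            return "(no description provided)";
        }
    }

    public static void main(String[] args) {
        System.out.println(describe(MyJob.class));   // the job's description
        System.out.println(describe(String.class));  // "(no description provided)"
    }
}
```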