Congratulations, Theo!
Regards,
Chiwan Park
On 03/22/2017 03:06 AM, Ted Yu wrote:
Congratulations !
On Tue, Mar 21, 2017 at 11:00 AM, Matthias J. Sax wrote:
Congrats!
On 3/21/17 8:59 AM, Greg Hogan wrote:
Welcome, Theo, and great to have
Hi Robert,
Thanks for clarifying! I’ve filed this [1].
Regards,
Chiwan Park
[1]: https://issues.apache.org/jira/browse/FLINK-4223
> On Jul 14, 2016, at 9:56 PM, Robert Metzger wrote:
>
> Hi Chiwan,
>
> I think that's something we need to address. Probably the scal
Chiwan Park created FLINK-4223:
--
Summary: Rearrange scaladoc and javadoc for Scala API
Key: FLINK-4223
URL: https://issues.apache.org/jira/browse/FLINK-4223
Project: Flink
Issue Type
Hi all,
I just noticed that some scaladocs (Gelly Scala API, Streaming Scala API, and
FlinkML) are missing from the scaladoc page but appear in the javadoc page, even though
the APIs are for Scala. Is this intentional?
I think we have to move some of the documentation to the scaladoc.
Regards,
Chiwan Park
Hi all,
+1 for shepherd
I would like to add myself as a shepherd for FlinkML.
Regards,
Chiwan Park
> On Jun 3, 2016, at 3:29 AM, Henry Saputra wrote:
>
> +1 for shepherd
>
> I would prefer using that term rather than maintainer. It is being used in
> Incubator PMC to help th
/ml/nn/KNNITSuite.scala#L45
Regards,
Chiwan Park
> On May 31, 2016, at 7:09 PM, Stephan Ewen wrote:
>
> Hi Chiwan!
>
> I think the Execution environment is not shared, because what the
> TestEnvironment sets is a Context Environment Factory. E
test cases.
[1]: https://issues.apache.org/jira/browse/FLINK-3994
[2]:
https://github.com/apache/flink/blob/master/flink-test-utils/src/test/scala/org/apache/flink/test/util/FlinkTestBase.scala#L56
Regards,
Chiwan Park
> On May 31, 2016, at 6:05 PM, Maximilian Michels wrote:
>
> Thank
Chiwan Park created FLINK-3994:
--
Summary: Instable KNNITSuite
Key: FLINK-3994
URL: https://issues.apache.org/jira/browse/FLINK-3994
Project: Flink
Issue Type: Bug
Components: Machine
]: https://travis-ci.org/chiwanpark/flink/builds/134104491
Regards,
Chiwan Park
> On May 31, 2016, at 5:43 PM, Chiwan Park wrote:
>
> It seems related to the KNN test case which was merged yesterday. I’ll look
> into the ML test.
>
> Regards,
> Chiwan Park
>
>> On
It seems related to the KNN test case which was merged yesterday. I’ll look
into the ML test.
Regards,
Chiwan Park
> On May 31, 2016, at 5:38 PM, Ufuk Celebi wrote:
>
> Currently, an ML test is reliably failing and occasionally some HA
> tests. Is someone looking into the ML test?
Thanks for the great work! :-)
Regards,
Chiwan Park
> On May 31, 2016, at 7:47 AM, Flavio Pompermaier wrote:
>
> Awesome work guys!
> And even more thanks for the detailed report...This troubleshooting summary
> will be undoubtedly useful for all our maven projects!
>
> B
Thanks for the great suggestion.
+1 for this proposal.
Regards,
Chiwan Park
> On May 13, 2016, at 1:44 AM, Nick Dimiduk wrote:
>
> For what it's worth, this is very close to how HBase attempts to manage the
> community load. We break out components (in Jira), with a list of
Please create a JIRA issue for this and send the PR with the JIRA issue number.
Regards,
Chiwan Park
> On May 12, 2016, at 7:15 PM, Flavio Pompermaier wrote:
>
> Do I need to open also a Jira or just the PR?
>
> On Thu, May 12, 2016 at 12:03 PM, Stephan Ewen wrote:
>
>>
AFAIK, FLINK-3701 is about Flink 1.1-SNAPSHOT, not Flink 1.0. We can go forward.
Regards,
Chiwan Park
> On Apr 20, 2016, at 9:33 PM, Trevor Grant wrote:
>
> -1
>
> Not a PMC so my down vote doesn't mean anything but...
>
> https://github.com/apache/
Yes, I know Janino is a pure Java project. I meant that if we add Scala code to
flink-core, we would have to add a Scala dependency to flink-core, and that could be
confusing.
Regards,
Chiwan Park
> On Apr 18, 2016, at 2:49 PM, Márton Balassi wrote:
>
> Chiwan, just to clarify Janino is a Java pr
I prefer to avoid Scala dependencies in flink-core. If flink-core included
Scala dependencies, a Scala version suffix (_2.10 or _2.11) would have to be added, and
I think users could be confused by that.
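To illustrate the suffix issue (an sbt-style sketch, not from this thread; artifact versions are placeholders):
```
// Hypothetical sbt dependency declarations, shown only to illustrate the suffix issue.
// A Scala-dependent artifact carries the Scala version in its name ...
libraryDependencies += "org.apache.flink" % "flink-scala_2.11" % "1.0.0"
// ... or lets the build tool append the suffix via %%.
libraryDependencies += "org.apache.flink" %% "flink-scala" % "1.0.0"
// A pure-Java flink-core has no suffix today; adding Scala code to it would change that.
libraryDependencies += "org.apache.flink" % "flink-core" % "1.0.0"
```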
Regards,
Chiwan Park
> On Apr 17, 2016, at 3:49 PM, Márton Balassi wrote:
>
> Hi Gábor,
>
Note that you should use the `createTypeInformation[T]` method in the
`org.apache.flink.api.scala` package object to create a `TypeInformation` for
Scala-specific types such as case classes or tuples.
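A minimal sketch of what that looks like (the `Point` case class is only an illustration):
```
import org.apache.flink.api.common.typeinfo.TypeInformation
import org.apache.flink.api.scala._ // brings the createTypeInformation macro into scope

// Hypothetical case class, used only for illustration.
case class Point(x: Double, y: Double)

object TypeInfoExample {
  def main(args: Array[String]): Unit = {
    // createTypeInformation handles Scala-specific types such as case classes and tuples.
    val pointInfo: TypeInformation[Point] = createTypeInformation[Point]
    val tupleInfo: TypeInformation[(Int, String)] = createTypeInformation[(Int, String)]
    println(pointInfo)
    println(tupleInfo)
  }
}
```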
Regards,
Chiwan Park
> On Apr 5, 2016, at 1:53 AM, Stephan Ewen wrote:
>
> Hi!
>
>
map(_ / 10).filter(_ > 8)
(subSolution, convergence)
}
result.print()
}
```
Regards,
Chiwan Park
[1]:
https://ci.apache.org/projects/flink/flink-docs-release-1.0/api/java/org/apache/flink/api/java/operators/IterativeDataSet.html#closeWith%28org.apache.flink.api.java.D
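For context, a minimal, self-contained sketch of the kind of iteration the truncated snippet above appears to show, using `iterateWithTermination` from the Scala API (the counterpart of the `closeWith` variant linked in [1]); the input values and iteration bound are assumptions:
```
import org.apache.flink.api.scala._

object IterationSketch {
  def main(args: Array[String]): Unit = {
    val env = ExecutionEnvironment.getExecutionEnvironment
    val initial = env.fromElements(1000, 2000, 3000)

    // Run at most 100 iterations; stop earlier once the convergence DataSet becomes empty.
    val result = initial.iterateWithTermination(100) { iteration =>
      val subSolution = iteration.map(_ / 10)
      val convergence = subSolution.filter(_ > 8) // termination criterion
      (subSolution, convergence)
    }

    result.print()
  }
}
```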
+1
Regards,
Chiwan Park
> On Mar 23, 2016, at 11:24 PM, Robert Metzger wrote:
>
> +1
>
> I just went through the master and release-1.0 branch, and most important
> fixes are already in the release-1.0 branch.
> I would also move this commit into the release branch
Chiwan Park created FLINK-3645:
--
Summary: HDFSCopyUtilitiesTest fails in a Hadoop cluster
Key: FLINK-3645
URL: https://issues.apache.org/jira/browse/FLINK-3645
Project: Flink
Issue Type: Bug
Hi Vijay,
Yes, you are right. Flink services (JM & TM) are stopped (not killed)
immediately after the job execution.
Regards,
Chiwan Park
> On Mar 18, 2016, at 7:57 AM, Vijay Srinivasaraghavan
> wrote:
>
> If I start a flink job on YARN with below option, does Flink (JM
AFAIK, you should run `tools/change-scala-version.sh 2.11` before running `mvn
clean install -DskipTests -Dscala-2.11`.
Regards,
Chiwan Park
> On Mar 4, 2016, at 7:20 PM, Stephan Ewen wrote:
>
> Sorry, the flag is "-Dscala-2.11"
>
> On Fri, Mar 4, 2016 at 11:1
://mahout.apache.org/users/basics/algorithms.html
[3]: https://github.com/ariskk/distributedWekaSpark
Regards,
Chiwan Park
> On Feb 12, 2016, at 7:04 PM, Fabian Hueske wrote:
>
> Hi Theo,
>
> thanks for starting this discussion. You are certainly right that the
> development of F
Hi Dongwon,
Yes, the things to do are to pick an issue (by assigning it to yourself or
commenting on it), make your changes, and send a pull request for it.
Welcome! :)
Regards,
Chiwan Park
> On Feb 6, 2016, at 3:31 PM, Dongwon Kim wrote:
>
> Hi Fabian, Matthias, Robert!
Chiwan Park created FLINK-3330:
--
Summary: Add SparseVector support to BLAS library in FlinkML
Key: FLINK-3330
URL: https://issues.apache.org/jira/browse/FLINK-3330
Project: Flink
Issue Type
gxiang Li accepted the PMC's offer to become
>> a committer of the Apache Flink project.
>>
>> Please join me in welcoming Chengxiang Li!
>>
>> Best, Fabian
>>
Regards,
Chiwan Park
here the
>>>> label is also a vector. After we discussed this issue, I created a new
>>>> class named LabeledSequenceVector with the labels as a Vector. In my use
>>>> case, I want to train a POS-Tagger system, so the "vector" is a vector of
Regards,
Chiwan Park
>>>>
>>>>
>>>> Approach 3:
>>>> - Mark Combinable annotation deprecated
>>>> - Mark combine() method in RichGroupReduceFunction as deprecated
>>>> - Effect:
>>>> - There'll be a couple of deprecation warnings.
>>>> - We face the same problem with silent failures as in Approach 1.
>>>> - We have to check if RichGroupReduceFunction's override combine or
>>> not
>>>> (can be done with reflection). If the method is not overridden we do not
>>>> execute it (unless there is a Combinable annotation) and we are fine. If
>>> it
>>>> is overridden and no Combinable annotation has been defined, we have the
>>>> same problem with silent failures as before.
>>>> - After we remove the deprecated annotation and method, we have the
>>>> same effect as with Approach 1.
>>>>
>>>>
>>>>
>>>> There are more alternatives, but these are the most viable, IMO.
>>>>
>>>>
>>>>
>>>> I think, if we want to remove the combinable annotation, we should do it
>>>> now.
>>>>
>>>> Given the three options, I would go for Approach 1. Yes, it breaks a lot of
>>> code
>>>> and yes there is the possibility of computing incorrect results.
>>> Approach 2
>>>> is safer but would mean another API breaking change in the future.
>>> Approach
>>>> 3 comes with fewer breaking changes but has the same problem of silent
>>>> failures.
>>>>
>>>> IMO, the breaking API changes of Approach 1 are even desirable because
>>> they
>>>> will make users aware that this feature changed.
>>>>
>>>>
>>>>
>>>> What do you think?
>>>>
>>>>
>>>>
>>>> Cheers, Fabian
>>>>
Regards,
Chiwan Park
>>>>>>> - branches and tags can be deleted (not sure if this applies as well
>>>>>>> for the master branch)
>>>>>>> - "the 'protected' portions of git to primarily focus on refs/tags/rel
>>>>>>> - thus any tags under rel, will have their entire commit history."
>>>>>>>
>>>>>>> I am not 100% sure which exact parts of the repository are protected
>>>>>>> now as I am not very much into the details of Git.
>>>>>>> However, I believe we need to create new tags under rel for our
>>>>>>> previous releases to protect them.
>>>>>>>
>>>>>>> In addition, I would like to propose to ask Infra to add protection
>>>>>>> for the master branch. I can only recall very few situations where
>>>>>>> changes had to be reverted. I am much more in favor of a reverting
>>>>>>> commit now and then compared to a branch that can be arbitrarily
>>>>> changed.
>>>>>>>
>>>>>>> What do you think about this?
>>>>>>>
>>>>>>> Best, Fabian
>>>>>>>
Regards,
Chiwan Park
you import the project directly?
>
> Regards
> Ram
> -----Original Message-----
> From: Chiwan Park [mailto:chiwanp...@apache.org]
> Sent: Tuesday, January 12, 2016 4:54 PM
> To: dev@flink.apache.org
> Subject: Re: Naive question
>
> Because I tested with Scala I
nk-docs-release-0.10/internals/ide_setup.html
>
> On Tue, Jan 12, 2016 at 12:04 PM, Chiwan Park wrote:
>
>> Hi Ram,
>>
>> Because there are some Scala IDE (Eclipse) plugins needed, I recommend
>> avoiding the `mvn eclipse:eclipse` command. Could you try just running `
el.com> wrote:
>
>> Thanks to all. I tried with Scala Eclipse IDE with all these
>> 'change-scala-version.sh'. But in vain.
>>
>> So I switched over to Intellij and thing work fine over there. I am
>> new to Intellij so will try using it.
at scala.collection.AbstractTraversable.map(Traversable.scala:104)
>> at
>> org.apache.flink.runtime.minicluster.FlinkMiniCluster.start(FlinkMiniCluster.scala:228)
>> at
>> org.apache.flink.runtime.minicluster.FlinkMiniCluster.start(FlinkMiniCluster.scala:219)
>> at
>> org.apache.flink.streaming.api.environment.LocalStreamEnvironment.execute(LocalStreamEnvironment.java:104)
>> at
>> org.apache.flink.streaming.examples.wordcount.WordCount.main(WordCount
>> .java:80)
>>
>> I know this is a naïve question but I would like to get some help in
>> order to over come this issue. I tried various options like setting
>> scala-2.10 as the compiler for the project (then it shows completely
>> different error) and many of the projects don't even compile. But with
>> 2.11 version I get the above stack trace. Any help here is welcome.
>>
>> Regards
>> Ram
>>
Regards,
Chiwan Park
document). The goal of this task is to have the same
>> functionality
>>> as
>>>>> currently, but with Calcite in the translation process. This is a
>>>> blocking
>>>>> task that we hope to complete soon. Afterwards, we can independently
>>> work
>>>>> on different aspects such as extending the Table API, adding a SQL
>>>>> interface (basically just a parser), integration with external data
>>>>> sources, better code generation, optimization rules, streaming
>> support
>>>> for
>>>>> the Table API, StreamSQL, etc..
>>>>>
>>>>> Timo and I plan to work on a WIP branch to implement Task 1 and merge
>>> it
>>>> to
>>>>> the master branch once the task is completed. Of course, everybody is
>>>>> welcome to contribute to this effort. Please let us know such that we
>>> can
>>>>> coordinate our efforts.
>>>>>
>>>>> Thanks,
>>>>> Fabian
Regards,
Chiwan Park
i.environment.LocalStreamEnvironment.execute(LocalStreamEnvironment.java:104)
>> at
>> org.apache.flink.streaming.examples.wordcount.WordCount.main(WordCount.java:80)
>>
>> I know this is a naïve question but I would like to get some help in order
>> to over come this issue. I tried various options like setting scala-2.10 as
>> the compiler for the project (then it shows completely different error) and
>> many of the projects don't even compile. But with 2.11 version I get the
>> above stack trace. Any help here is welcome.
>>
>> Regards
>> Ram
>>
Regards,
Chiwan Park
basically just a parser), integration with external data
>>>> sources, better code generation, optimization rules, streaming support
>>> for
>>>> the Table API, StreamSQL, etc..
>>>>
>>>> Timo and I plan to work on a WIP branch to implement Task 1 and merge
>> it
>>> to
>>>> the master branch once the task is completed. Of course, everybody is
>>>> welcome to contribute to this effort. Please let us know such that we
>> can
>>>> coordinate our efforts.
>>>>
>>>> Thanks,
>>>> Fabian
>>>>
>>>
>>>
>>
Regards,
Chiwan Park
idelines and
> talking to rmetzger via Github I figured I'd continue the discussion
> through this outlet.
>
> Any guidance would be much appreciated!
>
> --
>
> Enjoy life!
>
> -Adam
Regards,
Chiwan Park
ector is such a core part of the library
> any changes would involve a number of adjustments downstream.
>
> Perhaps having different optimizers etc. for Vectors and double labels
> makes sense, but I haven't put much thought into this.
>
>
> On Tue, Jan 5, 2016 at 12
Regards,
Chiwan Park
feature.]
> 918 [very short PR]
> 861 [followed by 710 after a complete rebase] [major work for Histograms
> and Decision Trees]
> 757 [major work for K-Means clustering and initialization schemes]
>
> If I have come across as rude, I apologize.
>
> Happy reviewing and thanks for bearing with me. :)
>
> Cheers!
> Sachin
>
> -- Sachin Goel
> Computer Science, IIT Delhi
> m. +91-9871457685
Regards,
Chiwan Park
> Henry
>>> Fabian
>>>
>>> * non-binding
>>>
>>> -1 votes: none
>>>
>>> I'll upload the release artifacts and release the Maven artifacts.
>>> Once the changes are effective, the community may announce the
>>> release.
>>>
Regards,
Chiwan Park
The staging repository for this release can be found at:
>>>> https://repository.apache.org/content/repositories/orgapacheflink-1055
>>>>
>>>> -
>>>>
>>>> The vote is open for the next 48 hours and passes if a majority of at
>> least
>>>> three +1 PMC votes are cast.
>>>>
>>>> The vote ends on Thursday November 12, 2015.
>>>>
>>>> [ ] +1 Release this package as Apache Flink 0.10.0
>>>> [ ] -1 Do not release this package because ...
>>>>
>>>> ===
>>>>
>>>> The following commits have been added on top of release-0.10.0-rc7:
>>>>
>>>> c0fe305 [FLINK-2992] Remove use of SerializationUtils
>>>> c098377 [hotfix] Check for null in StreamSource.cancel()
>>>>
>>
>>
Regards,
Chiwan Park
Hi Martin,
I had the same problem. From my investigation, the current custom Jekyll plugin for
Flink is not compatible with Jekyll 3.x. If you remove Jekyll 3.x and install
Jekyll 2.x, you can build the docs. I’m using Jekyll 2.5.3 to build the docs.
Regards,
Chiwan Park
On November 6, 2015 at 4:58
Vector type. You can import the class with renaming, like the following:
```
import org.apache.flink.ml.math.{Vector => FlinkVector}
```
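For instance (a hedged sketch; the `DenseVector` factory call is assumed from FlinkML's math package), the rename keeps Scala's own Vector usable in the same file:
```
import org.apache.flink.ml.math.{Vector => FlinkVector, DenseVector}

object VectorAliasExample {
  // FlinkML's vector under the renamed alias ...
  val features: FlinkVector = DenseVector(1.0, 2.0, 3.0)
  // ... while Scala's standard-library Vector keeps its plain name.
  val indices: Vector[Int] = Vector(1, 2, 3)
}
```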
I hope that this answer helps you. :)
Regards,
Chiwan Park
On November 3, 2015 at 6:11:03 AM, Daniel Blazevski
(daniel.blazev...@gmail.com) wrote:
Hello,
I
ng the version suffix to Scala 2.10 artifacts also. But I’m not
sure that removing the version suffix from Java-only artifacts would be good.
As I said above, it seems difficult for newcomers.
Regards,
Chiwan Park
On November 2, 2015 at 8:19:15 PM, Fabian Hueske (fhue...@gmail.com) wrote:
That wo
Chiwan Park created FLINK-2950:
--
Summary: Markdown presentation problem in SVM documentation
Key: FLINK-2950
URL: https://issues.apache.org/jira/browse/FLINK-2950
Project: Flink
Issue Type: Bug
Chiwan Park created FLINK-2947:
--
Summary: Coloured Scala Shell
Key: FLINK-2947
URL: https://issues.apache.org/jira/browse/FLINK-2947
Project: Flink
Issue Type: Improvement
Components
Chiwan Park created FLINK-2841:
--
Summary: Broken roadmap link in FlinkML contribution guide
Key: FLINK-2841
URL: https://issues.apache.org/jira/browse/FLINK-2841
Project: Flink
Issue Type: Bug
ople
>>>> can access it easily and get an overview what is already available (this
>>>> might also avoid duplicate development). It might also be a good point
>>>> to show common patterns. In order to collect as much as possible, the
>>>> contributing requirement (with respect to testing etc) could be lower
>>>> than for Flink itself.
>>>>
>>>> For example, I recently started a small flink-clojure module with a
>>>> simple word-count example to answer a question on SO. Including this in
>>>> Flink would not be appropriate. However, for a flink-external repro it
>>>> might be nice to have.
>>>>
>>>> What do you think about it?
>>>>
>>>>
>>>> -Matthias
>>>>
>>>
>>
>
Regards,
Chiwan Park
ned you can help find some starter tasks
> that I would appreciate. I tried to search some on my own I even created a
> PR for FLINK-2156, but I couldn't find any bigger one.
>
> Looking forward for any response.
>
> Regards
> Dawid
Regards,
Chiwan Park
@Fabian, could you cover FLINK-2712 in your pull request? I think that would
be better than a separate pull request.
Regards,
Chiwan Park
> On Sep 28, 2015, at 4:51 PM, Fabian Hueske wrote:
>
> Thanks everybody for the discussion.
> I'll prepare a pull request to update the &
Chiwan Park created FLINK-2768:
--
Summary: Wrong Java version requirements in "Quickstart: Scala
API" page
Key: FLINK-2768
URL: https://issues.apache.org/jira/browse/FLINK-2768
Proj
Chiwan Park created FLINK-2767:
--
Summary: Add support Scala 2.11 to Scala shell
Key: FLINK-2767
URL: https://issues.apache.org/jira/browse/FLINK-2767
Project: Flink
Issue Type: Improvement
(2), (3) and (4).
Regards,
Chiwan Park
> On Sep 24, 2015, at 2:23 AM, Henry Saputra wrote:
>
> Thanks again, Fabian for starting the discussions.
>
> For (1) and (2) I think it is good idea and will help people to
> understand and follow the author thought process.
> Follow
I just created a JIRA issue [1].
Regards,
Chiwan Park
[1] https://issues.apache.org/jira/browse/FLINK-2712
> On Sep 20, 2015, at 1:33 AM, Chiwan Park wrote:
>
> Okay, I’ll create a JIRA issue and send a pull request for it. :)
>
> Regards,
> Chiwan Park
>
>>
Chiwan Park created FLINK-2712:
--
Summary: Add some description about tests to "How to Contribute"
documentation
Key: FLINK-2712
URL: https://issues.apache.org/jira/browse/FLINK-2712
Proj
Okay, I’ll create a JIRA issue and send a pull request for it. :)
Regards,
Chiwan Park
> On Sep 19, 2015, at 7:35 PM, Ufuk Celebi wrote:
>
> Thanks Stephan for pointing this out. I agree with you. +1
>
> @Chiwan: Good idea with the Wiki. Actually maybe even better t
Hi Stephan,
Thanks for the nice guide! I think we can upload this to the wiki or the “How to
Contribute” documentation.
This guide would be helpful for newcomers.
Regards,
Chiwan Park
> On Sep 17, 2015, at 9:33 PM, Stephan Ewen wrote:
>
> Hi all!
>
> The build time of Flink with all t
Chiwan Park created FLINK-2690:
--
Summary: CsvInputFormat cannot find the field of derived POJO class
Key: FLINK-2690
URL: https://issues.apache.org/jira/browse/FLINK-2690
Project: Flink
Issue
I just created the JIRA issue [1].
Regards,
Chiwan Park
[1] https://issues.apache.org/jira/browse/FLINK-2619
> On Sep 4, 2015, at 6:43 PM, Chiwan Park wrote:
>
> I also observed the same situation. Although I added a failing test case in
> ExecutionGraphRestartTest, `mvn clean ver
Chiwan Park created FLINK-2619:
--
Summary: Some Scala Tests not being executed by Maven
Key: FLINK-2619
URL: https://issues.apache.org/jira/browse/FLINK-2619
Project: Flink
Issue Type: Bug
I also observed the same situation. Although I added a failing test case in
ExecutionGraphRestartTest, `mvn clean verify` doesn’t fail. I will create an
issue covering this.
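For reference, a sketch of the kind of deliberately failing test meant here, in ScalaTest style (the suite name and message are illustrative, not the actual change to ExecutionGraphRestartTest):
```
import org.scalatest.{FlatSpec, Matchers}

// If `mvn clean verify` stays green with this suite on the test classpath,
// the Scala tests are not being executed by the build.
class DeliberatelyFailingSuite extends FlatSpec with Matchers {
  "this canary test" should "fail when it is actually executed" in {
    fail("Scala tests are running; this failure is expected.")
  }
}
```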
Regards,
Chiwan Park
> On Aug 29, 2015, at 10:13 PM, Stephan Ewen wrote:
>
> Hi!
>
> I found quite a few
Welcome Matthias! :)
Regards,
Chiwan Park
> On Sep 2, 2015, at 8:30 PM, Kostas Tzoumas wrote:
>
> The Project Management Committee (PMC) of Apache Flink has asked Matthias
> Sax to become a committer, and we are pleased to announce that he has
> accepted.
>
> Matthias
- TransitiveClosure
- WebLogAnalysis
- WordCount
- WordCountPOJO
Regards,
Chiwan Park
> On Aug 31, 2015, at 1:24 PM, Henry Saputra wrote:
>
> +1
>
> LICENSE file looks good
> NOTICE file looks good
> Signature files look good
> Hash files look good
> Source compile and
Robert's suggestion looks good. +1
Sent from my iPhone
> On Aug 26, 2015, at 9:55 PM, Aljoscha Krettek wrote:
>
> +1 seems to be a viable solution
>
>> On Wed, 26 Aug 2015 at 14:51 Stephan Ewen wrote:
>>
>> That sounds like a very good compromise.
>>
>> +1
>>
>>> On Wed, Aug 26, 2015 at 2:
Thank you for sharing!
Regards,
Chiwan Park
> On Aug 23, 2015, at 10:36 PM, Kostas Tzoumas wrote:
>
> Hi folks,
>
> I have a color scheme for Flink that people can use for presentations, blog
> posts, etc, based on the Flink logo colors:
>
> https://www.dropbo
Congrats Chesnay!
Regards,
Chiwan Park
> On Aug 20, 2015, at 7:39 PM, Gyula Fóra wrote:
>
> Welcome! :)
>
> On Thu, Aug 20, 2015 at 12:34 PM Matthias J. Sax <
> mj...@informatik.hu-berlin.de> wrote:
>
>> Congrats! The squirrel "army" is growing fas
I have created a JIRA issue [1].
Regards,
Chiwan Park
[1] https://issues.apache.org/jira/browse/FLINK-2539
> On Aug 18, 2015, at 5:28 PM, Till Rohrmann wrote:
>
> Good initiative Chiwan. +1 for a more unified code style.
>
> On Tue, Aug 18, 2015 at 10:25 AM, Chiwan Park wr
Chiwan Park created FLINK-2539:
--
Summary: More unified code style for Scala code
Key: FLINK-2539
URL: https://issues.apache.org/jira/browse/FLINK-2539
Project: Flink
Issue Type: Improvement
Okay, I’ll create a JIRA issue covering this topic.
Regards,
Chiwan Park
> On Aug 17, 2015, at 1:17 AM, Stephan Ewen wrote:
>
> +1 for formatting templates for Eclipse and IntelliJ.
>
> On Sun, Aug 16, 2015 at 6:06 PM, Sachin Goel
> wrote:
>
>> We shou
scalastyle-maven-plugin to 0.7.0, adding some
rules such as NoWhitespaceBeforeLeftBracketChecker,
EnsureSingleSpaceAfterTokenChecker, IndentationChecker, and MagicNumberChecker,
and updating the documentation in the wiki.
I would like to discuss the code style for Scala. What do you think about this?
Regards,
Chiwan
Currently, the site looks okay.
Regards,
Chiwan Park
> On Aug 14, 2015, at 6:05 PM, Gábor Gévay wrote:
>
> Hello,
>
> I would like to submit an abstract to Flink Forward, but the webpage
> of the conference (flink-forward.org) seems to be down. It prints
> "Err
Oh, I confused the Streaming API with the Batch API. :) Stephan’s comment will help you.
Regards,
Chiwan Park
> On Jul 27, 2015, at 4:22 PM, Stephan Ewen wrote:
>
> Your program gives this exception: java.lang.UnsupportedClassVersionError:
>
> This usually means that a JVM tries to loa
Hi, the print() method runs the program immediately. After that, there is no
sink left in the program, so you should remove the call to the execute() method
after calling print().
There is a more detailed description [1][2] in the Flink documentation. I hope that
this helps.
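A minimal sketch of the pattern (names and data are illustrative):
```
import org.apache.flink.api.scala._

object PrintExample {
  def main(args: Array[String]): Unit = {
    val env = ExecutionEnvironment.getExecutionEnvironment
    val counts = env.fromElements("a", "b", "a")
      .map(word => (word, 1))
      .groupBy(0)
      .sum(1)

    // print() triggers execution itself and acts as the sink,
    // so no separate env.execute() call should follow it.
    counts.print()
  }
}
```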
Regards,
Chiwan Park
[1]
https
AFAIK, there is no JIRA issue related to this problem.
Regards,
Chiwan Park
> On Jul 12, 2015, at 2:37 AM, Henry Saputra wrote:
>
> Ah I saw Matthias already report this. Is there a JIRA filed for this?
>
> If not I could create one.
>
> - Henry
>
> On Sat, Jul
Oh, I misunderstood the problem. In Firefox, the problem occurs. [1]
Regards,
Chiwan Park
[1] http://imgur.com/js5nZQ1
> On Jul 10, 2015, at 9:24 PM, Vasiliki Kalavri
> wrote:
>
> Hi,
>
> I have the same rendering problem as Matthias in Chrome. Looks OK in Safari.
> I
I think the problem is with your network or DNS settings.
On my computer, the documentation renders properly. I attached a
screenshot. [1]
Regards,
Chiwan Park
[1] http://imgur.com/4mSohDQ
> On Jul 10, 2015, at 9:00 PM, Matthias J. Sax
> wrote:
>
> Hi,
>
> I just
sentence or missed the discussion in the documentation, please
notify me.
Regards,
Chiwan Park
> On Jul 6, 2015, at 8:07 PM, Alexander Alexandrov
> wrote:
>
>> Because we are using Scala in our runtime, all modules are Scala
> dependent module.
>
> If all modules will need the
Great! Nice start. :)
The logo is shown now.
Regards,
Chiwan Park
> On Jul 7, 2015, at 5:06 PM, Maximilian Michels wrote:
>
> Cool. Nice work, Matthias, and thanks for starting it off.
>
> On Mon, Jul 6, 2015 at 11:45 PM, Matthias J. Sax <
> mj...@informatik.hu-berli
>> We end up with a situation like this
>>
>> - flink-pure-java
>> `- flink-some-scala-A
>> `- flink-some-scala-B
>> - flink-some-scala-B_2.11
>>
>> We end up having both versions of *flink-some-scala-B* in our project.
>>
>>
Hi All,
I created a PR for this issue. [1] Please check it and comment on the PR.
Regards,
Chiwan Park
[1] https://github.com/apache/flink/pull/885
> On Jul 2, 2015, at 5:59 PM, Chiwan Park wrote:
>
> @Alexander I’m happy to hear that you want to help me. If you help me, I
Thanks Till :)
I reimplemented my implementation using PredictDataSetOperation.
Regards,
Chiwan Park
> On Jun 29, 2015, at 7:41 PM, Till Rohrmann wrote:
>
> Hi Chiwan,
>
> at the moment the single element PredictOperation only supports
> non-distributed models. This mea
any advice about this to me, I would really appreciate it.
Regards,
Chiwan Park
> On Jun 29, 2015, at 4:43 PM, Till Rohrmann wrote:
>
> Hi Chiwan,
>
> when you use the single element predict operation, you always have to
> implement the `getModel` method. There you have acces
We should assign FLINK-2066 to Nuno. :)
Regards,
Chiwan Park
> On Jun 29, 2015, at 1:21 PM, Márton Balassi wrote:
>
> Hey,
>
> Thanks for picking up the issue. This value can be specified as
> "execution-retries.delay" in the flink-conf.yaml. Hence you can check t
but in the case of a single element there is no method to access the parameter map.
But in k-nearest-neighbors classification, we need to know k in the predict method
to select the top k values.
How can I solve this problem?
Regards,
Chiwan Park
[1]
https://github.com/apache/flink/commit
you decide on an issue to contribute to, please assign it to yourself. If you don’t
have permission to assign, just comment on the issue. Then other people will give
you permission and assign the issue to you.
Regards,
Chiwan Park
[1] https://issues.apache.org/jira/
[2] https://issues.apache.org/jira/browse
.
Regards,
Chiwan Park
> On Jun 25, 2015, at 10:03 AM, Matthias J. Sax
> wrote:
>
> Hi,
>
> I worked on rewriting flink-test according to
> https://issues.apache.org/jira/browse/FLINK-2275
>
> In "org.apache.flink.test.javaApiOperators.SortPartitionITCase" I hi
, you can contribute easily and safely with the “How to Contribute”
guide [1] on the web page.
But if you have questions, just post a mail to the dev mailing list. We will
reply to your mail.
I hope your time spent will be enjoyable.
Regards,
Chiwan Park
[1] http://flink.apache.org/how-to-contribute
Great! We should post the announcement mail to the user mailing list :)
Regards,
Chiwan Park
> On Jun 24, 2015, at 9:22 PM, Stephan Ewen wrote:
>
> Great that this release is out, finally :-)
>
> On Wed, Jun 24, 2015 at 2:19 PM, Maximilian Michels wrote:
>
>> I'
-core” and some Scala code which is used widely.
Regards,
Chiwan Park
> On Jun 21, 2015, at 8:48 AM, Robert Metzger wrote:
>
> I like option 1 the most ("move to flink-core"), however, it would scatter
> the type extractor / type information classes accross multiple project
flink/flink-docs-master/apis/programming_guide.html#parallel-execution
[3]
http://ci.apache.org/projects/flink/flink-docs-master/apis/programming_guide.html#iteration-operators
Regards,
Chiwan Park
> On Jun 19, 2015, at 9:15 PM, Maximilian Michels wrote:
>
> Dear Flink community,
>
document.
Regards,
Chiwan Park
> On Jun 19, 2015, at 12:53 AM, Aljoscha Krettek wrote:
>
> I'm also for simplification but let's hear what those who put the build-jar
> profile there have to say about it.?
>
> On Thu, 18 Jun 2015 at 17:25 Ufuk Celebi wrote:
>
>>
Hi. Which Flink version is running in your cluster?
I copied your code and packaged it with the Flink quickstart archetype. There is
another error because you didn’t add any data sink to the program.
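For reference, a hedged sketch of a program with an explicit data sink (the output path is a placeholder):
```
import org.apache.flink.api.scala._

object SinkExample {
  def main(args: Array[String]): Unit = {
    val env = ExecutionEnvironment.getExecutionEnvironment
    val doubled = env.fromElements(1, 2, 3).map(_ * 2)

    // Without a sink such as writeAsText() or print(), env.execute() has nothing to run.
    doubled.writeAsText("/tmp/flink-output") // placeholder path
    env.execute("Job with a data sink")
  }
}
```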
Regards,
Chiwan Park
> On Jun 16, 2015, at 4:17 PM, Thomas Peel wrote:
>
>
>
> Hi
+1 for generalisation.
@Ronny: Could you create a JIRA issue related to this?
Regards,
Chiwan Park
> On Jun 13, 2015, at 9:07 PM, Felix Neutatz wrote:
>
> Hi Ronny,
>
> I agree with you and I would go even further and generalize it overall. So
> that the movieID could be o
.
https://www.jetbrains.com/idea/help/reformat-code-dialog.html
http://imgur.com/muEVEZT
Regards,
Chiwan Park
> On Jun 9, 2015, at 8:39 PM, Matthias J. Sax
> wrote:
>
> On side comment:
>
> Eclipse allows to auto format on save and apply the formating rules to
> changed l
I attached the jps and jstack logs for the hanging
TaskManagerFailsWithSlotSharingITCase to JIRA FLINK-2183.
Regards,
Chiwan Park
> On Jun 10, 2015, at 12:28 AM, Aljoscha Krettek wrote:
>
> I discovered something that might be a feature, rather than a bug. When you
> submit an example u
Hi. I have a problem running the `mvn clean verify` command.
TaskManagerFailsWithSlotSharingITCase hangs on Oracle JDK 7 (1.7.0_80), but on
Oracle JDK 8 the test case doesn’t hang.
I’ve investigated this problem but I cannot find the bug.
Regards,
Chiwan Park
> On Jun 9, 2015, at 2:11
Hi. I’m very excited about preparing a new major release. :)
I just picked two tests. I will report the status as soon as possible.
Regards,
Chiwan Park
> On Jun 9, 2015, at 1:52 AM, Maximilian Michels wrote:
>
> Hi everyone!
>
> As previously discussed, the Flink developer co