Hi Flink-Devs,
I have created an OutputFormat for Elasticsearch, which connects through
the transport client.
While running the job from eclipse-IDE, it works fine.
But, while running the job from the command line or the Flink web interface, I am
getting different errors.
If I deploy the jar for the first time
could this be related? https://github.com/elastic/elasticsearch/issues/13052
On 02.11.2015 10:02, santosh_rajaguru wrote:
java.lang.NoSuchFieldError: JRE_IS_64BIT
Hello Chesnay,
I have checked the Elasticsearch bundle. It is bundled with the Lucene jars
that ship with Elasticsearch 1.7.1, which contains version 4.10.4 of the
Lucene jars.
--
View this message in context:
http://apache-flink-mailing-list-archive.1008284.n3.nabble.com/job-failed-while-init
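A `NoSuchFieldError` like `JRE_IS_64BIT` usually means two different versions of the same class (here, Lucene's `Constants`) are on the classpath. A small generic diagnostic for such clashes — a sketch, not part of the job in question — prints where a class was actually loaded from:

```java
import java.security.CodeSource;

public class WhereLoaded {

    /**
     * Returns the location (jar file or class directory) a class was loaded
     * from, or "bootstrap" for classes loaded by the bootstrap class loader.
     */
    static String locationOf(Class<?> clazz) {
        CodeSource src = clazz.getProtectionDomain().getCodeSource();
        return src == null ? "bootstrap" : src.getLocation().toString();
    }

    public static void main(String[] args) throws Exception {
        // For the Lucene clash one would pass e.g. "org.apache.lucene.util.Constants";
        // java.lang.String serves as a stand-in here.
        System.out.println(locationOf(Class.forName("java.lang.String")));
    }
}
```

Running this inside the failing job (with the suspect class name) reveals which jar wins on the cluster classpath.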
Hi Henry,
Note that we use a special Maven profile for aggregating all the java
docs and scala docs (-Paggregate-scaladoc). This makes the Scala
classes available in the JavaDoc. We also have extra Scala docs.
There were two issues recently for the Java Docs. The first one was
with Java 8 which c
Hi Santosh,
how do you generate your job jar? The second error with the
`NoClassDefFoundError` usually happens when not all required runtime
classes are shipped to the cluster (either not included in the fat jar or
not explicitly added in case of a non-fat jar).
Cheers,
Till
On Mon, Nov 2, 2015
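Till's point — runtime classes missing from the fat jar — can be probed before the job actually fails. A minimal sketch (the Elasticsearch class name below is only an example of a dependency one might check for):

```java
public class ClasspathCheck {

    /** Returns true if the named class can be resolved at runtime. */
    static boolean isPresent(String className) {
        try {
            Class.forName(className, false, ClasspathCheck.class.getClassLoader());
            return true;
        } catch (ClassNotFoundException | NoClassDefFoundError e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // Probe a dependency the job needs before running the actual logic.
        String probe = "org.elasticsearch.client.transport.TransportClient";
        System.out.println(probe + (isPresent(probe) ? " found" : " MISSING from the job jar"));
    }
}
```

Calling this at the start of the job's `main()` gives a clear log line instead of a `NoClassDefFoundError` deep inside the runtime.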
Sebastian Kruse created FLINK-2952:
--
Summary: Runtimes > 24 hours are reported incorrectly in the web
frontend
Key: FLINK-2952
URL: https://issues.apache.org/jira/browse/FLINK-2952
Project: Flink
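Runtimes over 24 hours typically break when a time-of-day formatter (which wraps at 24 hours) is used for a duration. A sketch of the kind of fix involved — computing the fields manually — is shown below; this is an illustration, not the actual FLINK-2952 patch:

```java
public class DurationFormat {

    /** Formats a duration in milliseconds as H:MM:SS without wrapping at 24 hours. */
    static String format(long millis) {
        long totalSeconds = millis / 1000;
        return String.format("%d:%02d:%02d",
                totalSeconds / 3600,          // hours, may exceed 24
                (totalSeconds % 3600) / 60,   // minutes
                totalSeconds % 60);           // seconds
    }

    public static void main(String[] args) {
        // 25 hours, 1 minute, 1 second
        System.out.println(format(25 * 3600_000L + 61_000L)); // prints 25:01:01
    }
}
```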
OK, let me try to summarize the discussion (and please correct me if I got
something wrong).
1) Flink deploys Scala 2.11 snapshot artifacts. Therefore, we have to
release 2.11 artifacts for the 0.10.0 release version as well.
2) Everybody agrees to appropriately tag all artifacts that have a
(tra
I would really love to see it extended.
Do you have time for opening a pull request for extending the page a bit
Matthias?
On Wed, Oct 28, 2015 at 5:01 AM, Martin Liesenberg <
martin.liesenb...@gmail.com> wrote:
> It might also be useful to link to the training materials which can be
> found here
Hi Till,
While going through the verbose logs, I figured out that I didn't
include one dependent jar file for Elasticsearch while creating the
Elasticsearch SDK plugin. Thanks for the help, Till and Chesnay.
Thanks and Regards,
Santosh
I'm for leaving it as-is and renaming all artifacts which depend on
Scala for the release following 0.10.
On Mon, Nov 2, 2015 at 11:32 AM, Fabian Hueske wrote:
> OK, let me try to summarize the discussion (and please correct me if I got
> something wrong).
>
> 1) Flink deploys Scala 2.11 snapshot
OK, I'll try to summarize the discussion so far (please correct me if I got
something wrong):
Everybody is in favor of adding a stricter code style based on the Google
Java code style.
Main points of discussion are:
1) Line length
2) JavaDocs
3) Tabs vs. Spaces
-- Line length
Issue:
Google code s
That would mean to have "flink-java_2.10" and "flink-java_2.11" artifacts
(and others that depend on flink-java and have no other Scala dependency)
in the 0.10.0 release and only "flink-java" in the next 1.0 release.
Do we want that?
2015-11-02 11:37 GMT+01:00 Maximilian Michels :
> I'm for leav
If we choose selective Scala version suffixes for artifacts, we have to tell
newcomers which artifacts carry the suffix. Some artifacts such as
"flink-java" and "flink-streaming-java" are easily recognized. But IMO, knowing
whether artifacts such as "flink-ml", "flink-clients", "flink-table
> That would mean to have "flink-java_2.10" and "flink-java_2.11" artifacts
> (and others that depend on flink-java and have no other Scala dependency)
> in the 0.10.0 release and only "flink-java" in the next 1.0 release.
My suggestion was to keep the unsuffixed artifacts on Scala 2.10 and add a
suffix
Ah OK. Sorry, I misunderstood your intention.
2015-11-02 14:07 GMT+01:00 Maximilian Michels :
> > That would mean to have "flink-java_2.10" and "flink-java_2.11" artifacts
> > (and others that depend on flink-java and have no other Scala dependency)
> > in the 0.10.0 release and only "flink-java"
Fabian Hueske created FLINK-2953:
Summary: Chained sortPartition() calls produce incorrect results
in Scala DataSet API
Key: FLINK-2953
URL: https://issues.apache.org/jira/browse/FLINK-2953
Project: F
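For context on FLINK-2953: chained sortPartition() calls are supposed to establish a multi-key order (primary key first, then secondary). In plain Java terms — an analogy only, not Flink code — the expected semantics look like:

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

public class MultiKeySort {

    /** Sorts rows by field 0 first and field 1 second, like chained sort keys. */
    static List<int[]> sort(List<int[]> rows) {
        List<int[]> out = new ArrayList<>(rows);
        out.sort(Comparator.<int[]>comparingInt(r -> r[0]).thenComparingInt(r -> r[1]));
        return out;
    }
}
```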
Will do.
-Matthias
On 11/02/2015 11:34 AM, Robert Metzger wrote:
> I would really love to see it extended.
> Do you have time for opening a pull request for extending the page a bit
> Matthias?
>
> On Wed, Oct 28, 2015 at 5:01 AM, Martin Liesenberg <
> martin.liesenb...@gmail.com> wrote:
>
>> I
-1
A user reported a bug in the Scala DataSet API: FLINK-2953
Should be easy to solve. I will provide a fix soon.
2015-10-30 15:51 GMT+01:00 Maximilian Michels :
> We can continue testing now:
>
> https://docs.google.com/document/d/1keGYj2zj_AOOKH1bC43Xc4MDz0eLhTErIoxevuRtcus/edit
>
> On Fri, Oc
Thanks for the summary Fabian.
Maybe we should have kind of a vote about this. No classical Apache vote
though, but voting for different options (so only +1), and the option
with the highest score wins? (Not sure if this is possible to do...)
About spaces vs. tabs:
> "AFAIR, nobody said to have a
-1
We include OpenJDK's JMH, and this is GPL v2 licensed. This could be a problem.
> On 02 Nov 2015, at 14:27, Fabian Hueske wrote:
>
> -1
> A user reported a bug in the Scala DataSet API: FLINK-2953
>
> Should be easy to solve. I will provide a fix soon.
>
> 2015-10-30 15:51 GMT+01:00 Maximilian M
What is the status here: https://github.com/apache/flink-web/pull/12
Please give feedback.
On 10/28/2015 11:02 AM, Maximilian Michels wrote:
> Thanks Matthias! I made a comment. Please open a pull request.
>
> On Tue, Oct 27, 2015 at 10:37 PM, Matthias J. Sax wrote:
>> Just updated this. Improv
I also have an open PR that adds some new dependencies, most notably NiFi,
Storm, netty-router and Elasticsearch, and removes AWS.
> On 02 Nov 2015, at 14:39, Aljoscha Krettek wrote:
>
> -1
> We include OpenJDK's JMH, and this is GPL v2 licensed. This could be a problem.
>> On 02 Nov 2015, at 14:27,
Alright. Good finds. I'll cancel the release candidate and will create
a new one once the fixes are in.
On Mon, Nov 2, 2015 at 3:02 PM, Aljoscha Krettek wrote:
> I also have an open PR that adds some new dependencies, most notably NiFi,
> Storm, netty-router and Elasticsearch, and removes AWS.
>
You added Storm dependency? Why? Which version? How does it affect
storm-compatibility?
We decided not to include Storm as a dependency of Flink when I started
with it. Thus, fat jars must include the Storm classes if storm-compatibility is used.
On 11/02/2015 03:07 PM, Maximilian Michels wrote:
> Alright. Goo
flink-storm has a dependency on storm-core. Therefore I thought it should be
added to the LICENSE file (not the one in root but the one we ship with the
binary distribution).
> On 02 Nov 2015, at 16:00, Matthias J. Sax wrote:
>
> You added Storm dependency? Why? Which version? How does it affect
This is the PR: https://github.com/apache/flink/pull/1316 if anyone is
interested or knows something about how we have to declare licenses.
> On 02 Nov 2015, at 16:10, Aljoscha Krettek wrote:
>
> flink-storm has a dependency on storm-core. Therefore I thought it should be
> added to the LICENSE
Aljoscha didn't add any dependencies. He added license notes for new
dependencies (since 0.9)
On Mon, Nov 2, 2015 at 4:10 PM, Aljoscha Krettek
wrote:
> flink-storm has a dependency on storm-core. Therefore I thought it should
> be added to the LICENSE file (not the one in root but the one we ship
I think your comment was misleading. It sounded like you added the actual
dependency to the POM and not just to the LICENSE file.
> On 02 Nov 2015, at 16:26, Aljoscha Krettek wrote:
>
> This is the PR: https://github.com/apache/flink/pull/1316 if anyone is
> interested or knows something about
Hi,
I also discovered that basically we would need to provide custom LICENSE/NOTICE
files for our released binaries for different hadoop/scala/… versions
because they come with different dependencies (that we also include due to
shading).
For example, this is the dependency tree for flink-shaded
Hi,
I just opened a PR that updates the list of slides on the "Materials"
page. I browsed the Slideshare profiles I am aware of and added everything I
thought might be interesting, while not being too repetitive.
Additionally, I linked data Artisans' Slideshare profile and the Flink
training material, F
Sorry for the back-and-forth, guys. I updated my PR to completely remove the
LICENSE/NOTICE files that were specific to the binary release. Now we just
copy over the LICENSE/NOTICE files from the source release. This is also how
Hadoop does it, by the way.
> On 02 Nov 2015, at 17:51, Aljoscha K
I think by now virtually everyone would prefer spaces if it came for free, so it
is a matter of making an educated decision about the cost/benefit tradeoffs.
What are the benefits of spaces in the style, other than people liking the
looks of space-formatted code better (aesthetics, schmaesthetics ;-) )?
I am a bit sceptical about the slides on the website, because they just go
out of date. Nobody remembers to update the links to slides in the
materials section, so some more permanent links would be great.
Linking to training is good, maybe to a few slideshare accounts of
committers that these peo
+1 for Max' suggestion to fix that for 1.0 (next release).
Hot fixing of this thing so short before a release is a bit risky in my
opinion. It is easy to make errors (overlooking something, error not
visible because of cached older dependencies, ...), it happened more than
once during version upgr
+1 for that.
2015-11-02 20:52 GMT+01:00 Stephan Ewen :
> +1 for Max' suggestion to fix that for 1.0 (next release).
>
> Hot fixing of this thing so short before a release is a bit risky in my
> opinion. It is easy to make errors (overlooking something, error not
> visible because of cached older
Hello,
I am working on the exact knn algorithm in Flink, and I'd like to make the
structure more modular.
I am working off of the initial work of @chiwanpark, and when I print out
a variable to the screen, I get something like:
```
training.values = Vector(DenseVector(-0.206, -0.276), DenseVect
This vote is cancelled in favor of a new RC.
On Mon, Nov 2, 2015 at 7:11 PM, Aljoscha Krettek wrote:
> Sorry for the back-and-forth, guys. I updated my PR to completely remove the
> LICENSE/NOTICE files that were specific to the binary release. Now we just
> copy over the LICENSE/NOTICE files f
+1 for the approach without the full LICENSE/NOTICE for binary releases. As
long as the source releases are correct there should be no problem
On Mon, Nov 2, 2015 at 7:11 PM, Aljoscha Krettek
wrote:
> Sorry for the back-and-forth guys. I updated my PR to completely remove
> the LICENSE/NOTICE fi
Hello,
I recently made a pull request for an exact knn algorithm, and have been
considering starting on the approximate knn algorithm, and I had an issue with
updating the master branch of Flink.
I am curious to know what best practices are in terms of keeping up to date
with the master branch to av
P.S. minor typo, a slightly better chart than
(1) pull master branch + build ---> (2) create new branch ---> (3) make
pull request ---> (4) update master ---> (5) create new branch ---> repeat
is
(1) clone master branch + build ---> (2) create new branch ---> (3) make
pull request ---> (4)
Jian Jiang created FLINK-2954:
-
Summary: Not able to pass custom environment variables in cluster
to processes that spawning TaskManager
Key: FLINK-2954
URL: https://issues.apache.org/jira/browse/FLINK-2954
I have seen something like that before in IntelliJ as well. I think under
some circumstances the IntelliJ project settings and caches can become
corrupt. I tried the following:
- Use the option "Restart and clear caches"
- Use maven->reimport project
That did the trick for me. Hope it works
I've observed this sometimes too. Command line build succeeds, however
IntelliJ build fails.
The method suggested by Stephan doesn't work for me usually. However,
recreating the project from scratch takes at most 30-40 seconds, so I do
just that. :)
-- Sachin Goel
Computer Science, IIT Delhi
m. +9
I don't think that outdated slides are too much of a problem, because we
also include the date of the talks. So people can judge for themselves
how old the talk (and the information) is.
If we keep an eye on it, and update this section in about 3-month
intervals (or maybe even 6 months) to remove old stuff
Fair enough. If you believe we can handle the "garbage collection" of old
material well, I am good with your suggestion.
On Mon, Nov 2, 2015 at 3:16 PM, Matthias J. Sax wrote:
> I don't think that outdated slides are too much a problem because we
> also include the date of the talks. So people c
Thanks for the responses.
In IntelliJ, I tried:
File-->Invalidate Caches / Restart --> Just Restart
and was able to build the updated master branch, and the old FLINK-1745 branch.
Cheers,
Dan
On Mon, Nov 2, 2015 at 6:04 PM, Sachin Goel
wrote:
> I've observed this sometimes too. Command line buil
Good to hear!
On Mon, Nov 2, 2015 at 3:38 PM, Daniel Blazevski wrote:
> Thanks for the responses.
>
> In IntelliJ, I tried:
> File-->Invalidate Caches / Restart --> Just Restart
> and was able to build the updated master branch, and the old FLINK-1745
> branch.
>
> Cheers,
> Dan
>
>
> On Mon, Nov 2
You are using a fairly old version of Flink (0.8.1).
We have fixed quite a few classloading issues since then, so upgrading to a
newer version might help as well.
On Mon, Nov 2, 2015 at 2:37 AM, santosh_rajaguru wrote:
> Hi Till,
>
> While going through the verbose of the logs, i figured out that
Wow, very nice results :-)
This input format alone is probably a very useful contribution, so I would
open a contribution there once you manage to get a few tests running.
I know little about neo4j, is there a way to read cypher query results in
parallel? (most systems do not expose such an inter
When creating the original version of Spargel I was pretty much thinking in
GSA terms, more than in Pregel terms. There are some fundamental
differences between Spargel and Pregel. Spargel is in between GAS and
Pregel in some way, that is how I have always thought about it.
The main reason for the
Chengxiang Li created FLINK-2955:
Summary: Add operations introduction in Table API page.
Key: FLINK-2955
URL: https://issues.apache.org/jira/browse/FLINK-2955
Project: Flink
Issue Type: New
Wow, good catch. Thanks for the explanation, Max.
Yeah, we should run JavaDoc generation on Travis to hopefully catch
this issue early.
- Henry
On Mon, Nov 2, 2015 at 1:21 AM, Maximilian Michels wrote:
> Hi Henry,
>
> Note that we use a special Maven profile for aggregating all the java
> docs a
+1 to remove binary LICENSE/NOTICE
It should be OK since Apache officially only does source releases. Binary
releases are just for convenience.
Let's keep reducing the complexity of releases.
- Henry
On Mon, Nov 2, 2015 at 1:56 PM, Robert Metzger wrote:
> +1 for the approach without the full LICENSE/NOTI
Hi Daniel,
I think you are confused about the names of the classes. Vector in your mail is not
org.apache.flink.ml.math.Vector, but scala.collection.immutable.Vector, which is
an immutable collection with random access.
So if you want to create a method which receives these values, you should
clarify V
We should not forget to verify that dependencies have a compatible license,
though. This happened for example with JMH.
> On 03 Nov 2015, at 06:31, Henry Saputra wrote:
>
> +1 to remove binary LICENSE/NOTICE
>
> It should be ok since Apache officially just do source release. Binary
> release i
I just found a user running into the issue fixed here:
https://github.com/apache/flink/pull/1252/files
Since it is already in the master (accepted, tested), what do you think about
cherry-picking it into the release?
On Mon, Nov 2, 2015 at 10:56 PM, Aljoscha Krettek
wrote:
> We should not forget
Chengxiang Li created FLINK-2956:
Summary: Migrate integration tests for Table API
Key: FLINK-2956
URL: https://issues.apache.org/jira/browse/FLINK-2956
Project: Flink
Issue Type: Sub-task
Actually GAS was not known when we did the iterations work (and Spargel),
but the intuition that led to Spargel is similar to the intuition that
led to GAS.
On Mon, Nov 2, 2015 at 4:35 PM, Stephan Ewen wrote:
> When creating the original version of Spargel I was pretty much thinking
> in GSA t