Github user greghogan commented on the issue:
https://github.com/apache/flink/pull/2305
Are there strong reasons to use `apply` for `DataStream` and `with` for
`DataSet`? Could we deprecate `apply` so that users who have switched to
`with` will not lose API compatibility with 2.0?
Github user greghogan commented on the issue:
https://github.com/apache/flink/pull/2387
@uce this looks very nice, and navigating libraries is much improved with
the extra level(s?) of navigation.
---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well.
GitHub user greghogan opened a pull request:
https://github.com/apache/flink/pull/2408
[FLINK-4452] TaskManager network buffer gauges
Adds gauges for the number of total and available TaskManager network
memory segments.
You can merge this pull request into a Git repository by
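For context, a gauge-style metric simply reports a current value on demand. A minimal, self-contained sketch of the idea behind these network buffer gauges, using simplified stand-in classes rather than Flink's actual metrics API (`Gauge` and the buffer pool here are assumptions for illustration):

```java
import java.util.concurrent.ArrayBlockingQueue;

public class NetworkBufferGauges {
    /** Simplified stand-in for a Gauge<T> metric interface. */
    interface Gauge<T> { T getValue(); }

    /** Stand-in buffer pool: a fixed number of segments, some handed out. */
    static class BufferPool {
        final int totalSegments;
        final ArrayBlockingQueue<Object> available;
        BufferPool(int total) {
            totalSegments = total;
            available = new ArrayBlockingQueue<>(total);
            for (int i = 0; i < total; i++) available.add(new Object());
        }
    }

    /** Gauge reporting the total number of segments (fixed at startup). */
    static Gauge<Integer> totalGauge(BufferPool pool) {
        return () -> pool.totalSegments;
    }

    /** Gauge reporting how many segments are currently available. */
    static Gauge<Integer> availableGauge(BufferPool pool) {
        return () -> pool.available.size();
    }

    public static void main(String[] args) throws Exception {
        BufferPool pool = new BufferPool(4);
        pool.available.take(); // simulate one segment in use
        if (totalGauge(pool).getValue() != 4) throw new AssertionError();
        if (availableGauge(pool).getValue() != 3) throw new AssertionError();
        System.out.println("total=" + totalGauge(pool).getValue()
                + " available=" + availableGauge(pool).getValue());
    }
}
```

Because the gauge closes over the pool, each metrics report reads the live value rather than a snapshot.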
Github user greghogan commented on a diff in the pull request:
https://github.com/apache/flink/pull/2002#discussion_r76065257
--- Diff:
flink-core/src/main/java/org/apache/flink/api/common/io/compression/InflaterInputStreamFactory.java
---
@@ -23,13 +23,12 @@
import
Github user greghogan commented on the issue:
https://github.com/apache/flink/pull/2294
The alternatives are much more hacky.
GitHub user greghogan opened a pull request:
https://github.com/apache/flink/pull/2464
[FLINK-4522] [docs] Gelly link broken in homepage
The Gelly documentation was recently split into multiple pages in FLINK-4104
but was missing a redirect. This commit updates the Gelly redirect to
GitHub user greghogan opened a pull request:
https://github.com/apache/flink/pull/2465
[FLINK-4447] [docs] Include NettyConfig options on Configurations page
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/greghogan/flink
Github user greghogan commented on a diff in the pull request:
https://github.com/apache/flink/pull/2465#discussion_r77363499
--- Diff: docs/setup/config.md ---
@@ -169,58 +169,111 @@ Default value is the `akka.ask.timeout`.
These parameters configure the default HDFS used by
Github user greghogan commented on a diff in the pull request:
https://github.com/apache/flink/pull/2465#discussion_r77363711
--- Diff: docs/setup/config.md ---
@@ -169,58 +169,111 @@ Default value is the `akka.ask.timeout`.
These parameters configure the default HDFS used by
Github user greghogan commented on a diff in the pull request:
https://github.com/apache/flink/pull/2465#discussion_r77363847
--- Diff: docs/setup/config.md ---
@@ -169,58 +169,111 @@ Default value is the `akka.ask.timeout`.
These parameters configure the default HDFS used by
Github user greghogan commented on a diff in the pull request:
https://github.com/apache/flink/pull/2465#discussion_r77364407
--- Diff: docs/setup/config.md ---
@@ -169,58 +169,111 @@ Default value is the `akka.ask.timeout`.
These parameters configure the default HDFS used by
Github user greghogan commented on the issue:
https://github.com/apache/flink/pull/2294
@fhueske what is your analysis?
Github user greghogan commented on the issue:
https://github.com/apache/flink/pull/2066
Is there a JIRA ticket for this PR?
Github user greghogan commented on the issue:
https://github.com/apache/flink/pull/2060
Hi @rekhajoshm have you had a chance to look at the last two comments?
GitHub user greghogan opened a pull request:
https://github.com/apache/flink/pull/2469
[FLINK-4572] [gelly] Convert to negative in LongValueToIntValue
The Gelly drivers expect that scale 32 edges, represented by the lower 32
bits of long values, can be converted to int values
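The "convert to negative" behavior follows from Java's narrowing conversion: casting a long to int keeps only the low 32 bits, so values at or above 2^31 come out negative. A minimal illustration (the method name is hypothetical, not Gelly's actual translator class):

```java
public class UnsignedDownCast {
    /** Reinterpret the low 32 bits of a long as a (possibly negative) int. */
    static int toIntUnsigned(long value) {
        return (int) value; // Java narrowing conversion keeps the low 32 bits
    }

    public static void main(String[] args) {
        // 2^31 does not fit a signed int; its low 32 bits read as Integer.MIN_VALUE
        if (toIntUnsigned(1L << 31) != Integer.MIN_VALUE) throw new AssertionError();
        // All 32 low bits set reads as -1
        if (toIntUnsigned(0xFFFFFFFFL) != -1) throw new AssertionError();
        // Values below 2^31 are unchanged
        if (toIntUnsigned(42L) != 42) throw new AssertionError();
        System.out.println("ok");
    }
}
```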
Github user greghogan commented on the issue:
https://github.com/apache/flink/pull/2470
Looks good to me.
Should we also create a ticket to make AUTO the default as this checks for
credentials in multiple places?
http://docs.aws.amazon.com/java-sdk/latest/developer
GitHub user greghogan opened a pull request:
https://github.com/apache/flink/pull/2474
[FLINK-4257] [gelly] Handle delegating algorithm change of class
Replaces Delegate with NoOpOperator.
You can merge this pull request into a Git repository by running:
$ git pull https
GitHub user greghogan opened a pull request:
https://github.com/apache/flink/pull/2475
[FLINK-4571] [gelly] Configurable little parallelism in Gelly drivers
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/greghogan/flink
Github user greghogan commented on a diff in the pull request:
https://github.com/apache/flink/pull/2337#discussion_r77714875
--- Diff:
flink-core/src/main/java/org/apache/flink/api/common/typeinfo/TypeInformation.java
---
@@ -122,14 +123,25 @@
public abstract Class
Github user greghogan commented on a diff in the pull request:
https://github.com/apache/flink/pull/2337#discussion_r77827675
--- Diff:
flink-core/src/main/java/org/apache/flink/api/common/typeinfo/TypeInfoFactory.java
---
@@ -0,0 +1,55 @@
+/*
+ * Licensed to the Apache
Github user greghogan commented on a diff in the pull request:
https://github.com/apache/flink/pull/2337#discussion_r77842627
--- Diff:
flink-core/src/main/java/org/apache/flink/api/java/typeutils/TypeExtractor.java
---
@@ -792,12 +832,40 @@ else if (t instanceof Class
Github user greghogan commented on the issue:
https://github.com/apache/flink/pull/2469
Do you have a second use case in mind for adding this function to
`MathUtils`? My thought would be to keep this separate to avoid confusion
between signed and unsigned downcasts.
GitHub user greghogan opened a pull request:
https://github.com/apache/flink/pull/2481
[FLINK-4594] [core] Validate lower bound in MathUtils.checkedDownCast
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/greghogan/flink
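For reference, a sketch of what a lower-bound-validating down-cast looks like. This is a hypothetical re-implementation, not Flink's actual `MathUtils` code: round-tripping through the cast (`(long) (int) value != value`) rejects both out-of-range directions, whereas comparing only against `Integer.MAX_VALUE` would miss values below `Integer.MIN_VALUE`:

```java
public class MathUtilsSketch {
    /** Down-cast a long to int, rejecting values outside the int range. */
    static int checkedDownCast(long value) {
        int downCast = (int) value;
        if ((long) downCast != value) {
            // Fires for values above Integer.MAX_VALUE AND below Integer.MIN_VALUE
            throw new IllegalArgumentException("Cannot downcast " + value + " to int.");
        }
        return downCast;
    }

    public static void main(String[] args) {
        if (checkedDownCast(-7L) != -7) throw new AssertionError();
        boolean threw = false;
        try {
            checkedDownCast(Integer.MIN_VALUE - 1L); // below the lower bound
        } catch (IllegalArgumentException e) {
            threw = true;
        }
        if (!threw) throw new AssertionError("lower bound not validated");
        System.out.println("ok");
    }
}
```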
Github user greghogan commented on the issue:
https://github.com/apache/flink/pull/2469
I split the long to int translator into both signed and unsigned
translators so the conversion would not be ambiguous. The test will fail
without the fix in FLINK-4594.
Github user greghogan commented on a diff in the pull request:
https://github.com/apache/flink/pull/2060#discussion_r78199216
--- Diff:
flink-core/src/main/java/org/apache/flink/api/common/io/GenericCsvInputFormat.java
---
@@ -314,6 +320,25 @@ protected void setFieldsGeneric
Github user greghogan commented on a diff in the pull request:
https://github.com/apache/flink/pull/2060#discussion_r78199262
--- Diff:
flink-core/src/main/java/org/apache/flink/api/common/io/GenericCsvInputFormat.java
---
@@ -314,6 +320,25 @@ protected void setFieldsGeneric
Github user greghogan commented on a diff in the pull request:
https://github.com/apache/flink/pull/2060#discussion_r78199365
--- Diff:
flink-core/src/main/java/org/apache/flink/api/common/io/GenericCsvInputFormat.java
---
@@ -314,6 +320,25 @@ protected void setFieldsGeneric
Github user greghogan commented on a diff in the pull request:
https://github.com/apache/flink/pull/2060#discussion_r78199594
--- Diff:
flink-core/src/main/java/org/apache/flink/api/common/io/GenericCsvInputFormat.java
---
@@ -314,6 +320,25 @@ protected void setFieldsGeneric
Github user greghogan commented on a diff in the pull request:
https://github.com/apache/flink/pull/2060#discussion_r78199868
--- Diff:
flink-core/src/main/java/org/apache/flink/types/parser/FieldParser.java ---
@@ -75,8 +76,30 @@
/** Invalid Boolean value
Github user greghogan commented on a diff in the pull request:
https://github.com/apache/flink/pull/2060#discussion_r78203358
--- Diff:
flink-core/src/main/java/org/apache/flink/types/parser/FieldParser.java ---
@@ -75,8 +76,30 @@
/** Invalid Boolean value
Github user greghogan commented on a diff in the pull request:
https://github.com/apache/flink/pull/2490#discussion_r78286377
--- Diff:
flink-java/src/main/java/org/apache/flink/api/java/operators/CrossOperator.java
---
@@ -129,14 +129,11 @@ private String getDefaultName
Github user greghogan commented on the issue:
https://github.com/apache/flink/pull/2490
This isn't about improving performance; it moves the null checks before first
access and removes the duplicate `DataSet` references.
Github user greghogan commented on the issue:
https://github.com/apache/flink/pull/2060
Apologies for the long delay. I'd like to attempt to summarize this ticket
and pull request to validate my understanding.
Previously StringParser was using the system encodin
Github user greghogan commented on the issue:
https://github.com/apache/flink/pull/2515
Thanks for reporting and fixing! Merging ...
Github user greghogan commented on the issue:
https://github.com/apache/flink/pull/2490
Merging ...
Github user greghogan commented on the issue:
https://github.com/apache/flink/pull/2458
Would this be useful to prevent building Flink with Maven 3.3.x?
https://maven.apache.org/enforcer/enforcer-rules/versionRanges.html
https://maven.apache.org/enforcer/enforcer-rules
Github user greghogan commented on a diff in the pull request:
https://github.com/apache/flink/pull/2519#discussion_r79655828
--- Diff: docs/quickstart/run_example_quickstart.md ---
@@ -277,8 +277,8 @@ The number in front of each line tells you on which
parallel instance of the
Github user greghogan commented on a diff in the pull request:
https://github.com/apache/flink/pull/2519#discussion_r79654882
--- Diff: docs/quickstart/run_example_quickstart.md ---
@@ -125,21 +125,21 @@ public class WikipediaAnalysis {
}
{% endhighlight %}
-I
Github user greghogan commented on a diff in the pull request:
https://github.com/apache/flink/pull/2519#discussion_r79655043
--- Diff: docs/quickstart/run_example_quickstart.md ---
@@ -125,21 +125,21 @@ public class WikipediaAnalysis {
}
{% endhighlight %}
-I
Github user greghogan commented on the issue:
https://github.com/apache/flink/pull/2525
Oh no! I just committed.
GitHub user greghogan opened a pull request:
https://github.com/apache/flink/pull/2528
[FLINK-4643] [gelly] Average Clustering Coefficient
Questions:
- Can we generalize "average" to operate on a common interface (i.e.
"ScorableResult")? Here, th
Github user greghogan commented on the issue:
https://github.com/apache/flink/pull/2525
Thank you for the contribution @alpinegizmo.
Github user greghogan commented on the issue:
https://github.com/apache/flink/pull/2527
Would a graph translator simplify the conversion from Long to String? You
can do `graph.run(new TranslateEdgeValues<...>(new StringToLong()))` and write a
simple `public class String
Github user greghogan commented on the issue:
https://github.com/apache/flink/pull/2527
I'd also like to add a `ToNullValue` translator that would accept any type
and convert to `NullValue`.
Github user greghogan commented on the issue:
https://github.com/apache/flink/pull/2527
I'm not following why specifying the `TypeInformation` is now required with
the change to using `Either`. Is the type system failing to handle this
properly?
Github user greghogan commented on a diff in the pull request:
https://github.com/apache/flink/pull/2527#discussion_r80109825
--- Diff:
flink-libraries/flink-gelly/src/main/java/org/apache/flink/graph/library/Summarization.java
---
@@ -226,11 +247,15 @@ public void
GitHub user greghogan opened a pull request:
https://github.com/apache/flink/pull/2536
[FLINK-4664] [gelly] Add translator to NullValue
This translator is appropriate for translating vertex and edge values to
NullValue when the values are not used in an algorithm.
You can merge
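The idea of such a translator can be sketched in plain Java. The `TranslateFunction` and `NullValue` types below are simplified stand-ins for Gelly's classes, not the actual API:

```java
public class ToNullValueSketch {
    /** Stand-in for Gelly's NullValue singleton type. */
    static final class NullValue {
        static final NullValue INSTANCE = new NullValue();
        private NullValue() {}
    }

    /** Stand-in for Gelly's translate-function interface. */
    interface TranslateFunction<T, O> { O translate(T value, O reuse); }

    /** Accepts any input type, discards the value, and yields NullValue. */
    static class ToNullValue<T> implements TranslateFunction<T, NullValue> {
        @Override
        public NullValue translate(T value, NullValue reuse) {
            return NullValue.INSTANCE; // input value is intentionally ignored
        }
    }

    public static void main(String[] args) {
        ToNullValue<Long> t = new ToNullValue<>();
        if (t.translate(42L, null) != NullValue.INSTANCE) throw new AssertionError();
        System.out.println("ok");
    }
}
```

This is useful when an algorithm never reads vertex or edge values: translating them to `NullValue` first avoids serializing payload the algorithm would ignore.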
Github user greghogan commented on the issue:
https://github.com/apache/flink/pull/2539
I don't think this fixes a bug since I'm not aware of filesystems unable to
handle a leading dash.
Github user greghogan commented on the issue:
https://github.com/apache/flink/pull/2536
I added a second commit to move the translators into their own subpackage,
as well as additional tests.
Github user greghogan commented on the issue:
https://github.com/apache/flink/pull/2539
These are temporary files automatically deleted by the client. To handle
paths with leading dashes one can prefix with a directory (`vi ./-44.txt`) or
place after a double dash (`vi -- -44.txt`).
Github user greghogan commented on the issue:
https://github.com/apache/flink/pull/3431
@vasia, I was only thinking of this being used in gelly examples. The
documentation for the use of the example drivers will be updated but I am
anticipating that users will enclose new and
Github user greghogan commented on the issue:
https://github.com/apache/flink/pull/3402
@StephanEwen, I updated the test to include the original test plus a new
test with object reuse enabled.
@vasia, would you also be able to review this change?
Github user greghogan commented on the issue:
https://github.com/apache/flink/pull/3402
@StephanEwen my last comment was ambiguous, I had originally modified a
test and then with yesterday's commit reverted that change and added as a new
test.
Will merge.
Github user greghogan commented on the issue:
https://github.com/apache/flink/pull/2029
@StephanEwen should I create an alternate PR?
Github user greghogan commented on the issue:
https://github.com/apache/flink/pull/3433
@vasia reusable parameters will make it much easier to add drivers and
inputs.
GitHub user greghogan opened a pull request:
https://github.com/apache/flink/pull/3515
[FLINK-5890] [gelly] GatherSumApply broken when object reuse enabled
The initial fix for this ticket is not working on larger data sets.
Reduce supports returning the left input, right
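The underlying hazard with object reuse is that the runtime refills the same record object for each incoming element, so holding a reference to an input across calls sees the value change underneath. A minimal plain-Java illustration of that general hazard (this is not the actual GatherSumApply or ReduceDriver code):

```java
import java.util.ArrayList;
import java.util.List;

public class ObjectReuseHazard {
    static final class Rec { long v; }

    public static void main(String[] args) {
        long[] input = {1, 2, 3};

        // Driver with object reuse enabled: ONE record is refilled per element.
        Rec reuse = new Rec();
        List<Rec> unsafe = new ArrayList<>(); // stores references to the reused object
        List<Long> safe = new ArrayList<>();  // stores copies of the value
        for (long x : input) {
            reuse.v = x;        // "deserialize" the next element into the same object
            unsafe.add(reuse);  // this reference is clobbered on the next iteration
            safe.add(reuse.v);  // copying the primitive value is safe
        }

        // Every entry in the unsafe list now shows the LAST element's value.
        if (unsafe.get(0).v != 3) throw new AssertionError();
        if (safe.get(0) != 1) throw new AssertionError();
        System.out.println("ok");
    }
}
```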
Github user greghogan commented on the issue:
https://github.com/apache/flink/pull/3515
@StephanEwen déjà vu FLINK-2883 / FLINK-3340.
I'm also looking to run the FLINK-4949 IT tests with object reuse both
enabled and disabled which would have highlighted this issue a
GitHub user greghogan opened a pull request:
https://github.com/apache/flink/pull/3516
[FLINK-6009] [java api] Deprecate DataSetUtils#checksumHashCode
This is likely only used by Gelly and we have a more featureful
implementation allowing for multiple outputs and setting the job
Github user greghogan commented on the issue:
https://github.com/apache/flink/pull/3511
Has the FLIP been
[posted](https://cwiki.apache.org/confluence/display/FLINK/Flink+Improvement+Proposals)
and officially discussed on the mailing list?
Github user greghogan commented on the issue:
https://github.com/apache/flink/pull/3392
@lincoln-lil @StephanEwen should this PR wait until 2.0? The modified
example shows a breaking change for valid usage.
Github user greghogan commented on the issue:
https://github.com/apache/flink/pull/3392
This change also affects running in a cluster environment where the UDF
requests multiple iterators but accepts that the same iterator is returned, as
in the modified test.
Since this
GitHub user greghogan opened a pull request:
https://github.com/apache/flink/pull/3563
[FLINK-2814] [optimizer] DualInputPlanNode cannot be cast to
SingleInputPlanNode
WorksetIterationNode#instantiate loops over all solution and work set
candidates. Since the solution set
Github user greghogan commented on the issue:
https://github.com/apache/flink/pull/2029
I created #3563 which combines this PR and my suggestion.
Github user greghogan commented on the issue:
https://github.com/apache/flink/pull/3567
@aljoscha do you think we can use a single checkstyle or will this need to
be customized per module? Is this enforced by IntelliJ?
Github user greghogan commented on the issue:
https://github.com/apache/flink/pull/3567
@aljoscha should we host the new checkstyle under tools/maven/ alongside
the existing checkstyle? There is already a ticket (FLINK-6137) to add a custom
checkstyle to flink-cep and I don't se
Github user greghogan commented on a diff in the pull request:
https://github.com/apache/flink/pull/3595#discussion_r107423467
--- Diff:
flink-core/src/test/java/org/apache/flink/core/memory/ByteArrayOutputStreamWithPosTest.java
---
@@ -0,0 +1,31 @@
+/*
+ * Licensed to
Github user greghogan commented on the issue:
https://github.com/apache/flink/pull/3515
This is not an issue of spilling memory; rather, at least three elements are
required to trigger two reduces, and the error condition depends on which value
is returned.
@StephanEwen is this a +1
Github user greghogan commented on a diff in the pull request:
https://github.com/apache/flink/pull/3595#discussion_r107697744
--- Diff:
flink-core/src/test/java/org/apache/flink/core/memory/ByteArrayOutputStreamWithPosTest.java
---
@@ -0,0 +1,31 @@
+/*
+ * Licensed to
Github user greghogan commented on a diff in the pull request:
https://github.com/apache/flink/pull/3595#discussion_r107696959
--- Diff:
flink-core/src/test/java/org/apache/flink/core/memory/ByteArrayOutputStreamWithPosTest.java
---
@@ -0,0 +1,31 @@
+/*
+ * Licensed to
Github user greghogan commented on the issue:
https://github.com/apache/flink/pull/3605
Thanks @rmetzger!
Github user greghogan commented on the issue:
https://github.com/apache/flink/pull/3604
@rmetzger, how would users know which Hadoop dependencies to include in
`flink-dist-hadoop.jar`? Would they be copying multiple component jars into
`lib/`?
Github user greghogan commented on the issue:
https://github.com/apache/flink/pull/3571
Merging ...
Github user greghogan commented on a diff in the pull request:
https://github.com/apache/flink/pull/3567#discussion_r10860
--- Diff: tools/maven/strict-checkstyle.xml ---
@@ -0,0 +1,550 @@
+
+
+http://www.puppycrawl.com/dtds/configuration_1_3.dtd
Github user greghogan commented on a diff in the pull request:
https://github.com/apache/flink/pull/3567#discussion_r108035621
--- Diff: tools/maven/strict-checkstyle.xml ---
@@ -0,0 +1,550 @@
+
+
+http://www.puppycrawl.com/dtds/configuration_1_3.dtd
Github user greghogan commented on the issue:
https://github.com/apache/flink/pull/3567
@aljoscha, I likewise have no great preference for import order. I do think
it is important for the checkstyle to match IntelliJ's code style, either the
default or a provided Flink
Github user greghogan commented on the issue:
https://github.com/apache/flink/pull/3604
@rmetzger thanks for the clarification. Sounds good!
Github user greghogan commented on the issue:
https://github.com/apache/flink/pull/3615
Is there a Jira ticket for this PR?
Github user greghogan commented on the issue:
https://github.com/apache/flink/pull/3567
Google's style seems sensible since IntelliJ's style cannot be modeled in
Checkstyle, the import listing is folded by default, and developers will need
to load other non-default con
Github user greghogan commented on a diff in the pull request:
https://github.com/apache/flink/pull/3433#discussion_r108069363
--- Diff:
flink-libraries/flink-gelly-examples/src/main/java/org/apache/flink/graph/drivers/parameter/Parameter.java
---
@@ -0,0 +1,55
Github user greghogan commented on the issue:
https://github.com/apache/flink/pull/3615
Multiple PRs may be added to one ticket, but the commit header must reference
the Jira ID. I think the comments here are sufficient, so there is no need to
close and open a new PR.
GitHub user greghogan opened a pull request:
https://github.com/apache/flink/pull/3626
[FLINK-5912] [gelly] Inputs for CSV and graph generators
Create Input classes for reading graphs from CSV as well as for each of the
graph generators.
Inputs are tested in driver
GitHub user greghogan opened a pull request:
https://github.com/apache/flink/pull/3632
[FLINK-6176] [scripts] Add JARs to CLASSPATH deterministically
Sorts files read from Flink's lib directory and places the distribution JAR
to the end of the CLASSPATH.
You can merge this
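The ordering described here (the actual change is a shell script) can be sketched in Java: sort the lib JARs so the classpath is deterministic across filesystems, then append the distribution JAR last. The file names below are hypothetical:

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

public class ClasspathOrder {
    /**
     * Build a classpath string: lib JARs in sorted (deterministic) order,
     * with the distribution JAR placed at the end.
     */
    static String buildClasspath(List<String> jars, String distJar) {
        List<String> sorted = new ArrayList<>(jars);
        Collections.sort(sorted); // directory listing order is not guaranteed
        sorted.add(distJar);      // distribution JAR goes to the end
        return String.join(":", sorted);
    }

    public static void main(String[] args) {
        String cp = buildClasspath(
                List.of("lib/b.jar", "lib/a.jar"), "lib/flink-dist.jar");
        if (!cp.equals("lib/a.jar:lib/b.jar:lib/flink-dist.jar")) {
            throw new AssertionError(cp);
        }
        System.out.println(cp);
    }
}
```

Putting the dist JAR last means classes shipped in `lib/` shadow the ones bundled in the distribution, and the overall order no longer depends on how the filesystem happens to list the directory.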
GitHub user greghogan opened a pull request:
https://github.com/apache/flink/pull/3635
[FLINK-5913] [gelly] Example drivers
Replace existing and create new algorithm Driver implementations for each
of the library methods.
You can merge this pull request into a Git repository by
Github user greghogan commented on the issue:
https://github.com/apache/flink/pull/3563
@StephanEwen thanks for the reminder. Do you think this should also be
merged to 1.2 or 1.1?
Github user greghogan commented on a diff in the pull request:
https://github.com/apache/flink/pull/3595#discussion_r108437565
--- Diff:
flink-core/src/main/java/org/apache/flink/core/memory/ByteArrayInputStreamWithPos.java
---
@@ -118,7 +118,8 @@ public int getPosition
Github user greghogan commented on a diff in the pull request:
https://github.com/apache/flink/pull/3595#discussion_r108437640
--- Diff:
flink-core/src/main/java/org/apache/flink/core/memory/ByteArrayOutputStreamWithPos.java
---
@@ -110,7 +110,7 @@ public int getPosition
Github user greghogan commented on a diff in the pull request:
https://github.com/apache/flink/pull/3595#discussion_r108438856
--- Diff:
flink-core/src/test/java/org/apache/flink/core/memory/ByteArrayInputStreamWithPosTest.java
---
@@ -0,0 +1,50 @@
+/*
+ * Licensed to the
Github user greghogan commented on the issue:
https://github.com/apache/flink/pull/3632
@StephanEwen will look at refactoring this.
Github user greghogan commented on the issue:
https://github.com/apache/flink/pull/3629
@sunjincheng121 tests are failing for scalastyle line length violations.
Github user greghogan commented on the issue:
https://github.com/apache/flink/pull/3632
@EronWright this will not fix conflicts but will make debugging easier
since the classpath will be consistent across filesystems and Flink
installations. We currently have no order to the
Github user greghogan commented on the issue:
https://github.com/apache/flink/pull/3595
When expanding with `write` the new capacity (up to `position` of course)
is filled with data. Expanding with `setPosition` can leave holes in the data
and may mask a bug. Since these classes are
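The "holes" point can be illustrated with a toy positioned stream. This is a simplified stand-in, not Flink's `ByteArrayOutputStreamWithPos`: expanding through `write` fills every byte up to the position, while jumping the position forward leaves unwritten zero bytes behind:

```java
import java.util.Arrays;

public class PositionedStreamSketch {
    /** Minimal sketch of a byte-array output stream with a movable position. */
    static class Out {
        byte[] buf = new byte[4];
        int pos;

        void write(int b) {
            if (pos == buf.length) buf = Arrays.copyOf(buf, buf.length * 2);
            buf[pos++] = (byte) b; // expansion via write fills every byte up to pos
        }

        void setPosition(int p) {
            if (p > buf.length) buf = Arrays.copyOf(buf, p);
            pos = p;               // expansion via setPosition leaves a hole of zeros
        }
    }

    public static void main(String[] args) {
        Out out = new Out();
        out.write(1);
        out.write(2);
        out.setPosition(6);        // bytes 2..5 were never written
        out.write(7);
        if (out.buf[3] != 0) throw new AssertionError(); // the "hole"
        if (out.buf[6] != 7) throw new AssertionError();
        // prints [1, 2, 0, 0, 0, 0, 7]
        System.out.println(Arrays.toString(Arrays.copyOf(out.buf, out.pos)));
    }
}
```

A reader that later consumes bytes 2..5 gets zeros that look like valid data, which is exactly how a position-based expansion can mask a bug.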
Github user greghogan commented on a diff in the pull request:
https://github.com/apache/flink/pull/3616#discussion_r108948674
--- Diff:
flink-streaming-java/src/main/java/org/apache/flink/streaming/api/transformations/StreamTransformation.java
---
@@ -202,7 +203,17 @@ public int
Github user greghogan commented on the issue:
https://github.com/apache/flink/pull/3382
Failing test is unrelated. Merging this after offline discussion with
@vasia ...
Github user greghogan commented on the issue:
https://github.com/apache/flink/pull/3595
@wenlong88 you are right that the position can currently be moved beyond
written data. This still feels a bit like feature creep as we've added to the
PR. Do you need to expand the array is
Github user greghogan commented on the issue:
https://github.com/apache/flink/pull/2029
@rehevkor5 thanks for identifying this bug and submitting this PR. My
apologies for taking so long to look into this issue. With #3563 accepted I
think you can go ahead and close this PR.
Github user greghogan closed the pull request at:
https://github.com/apache/flink/pull/3382
Github user greghogan commented on the issue:
https://github.com/apache/flink/pull/2708
@mxm I pushed a new commit that is working in YARN with recursive
directories. The issue looks to have been that YARN was copying files
recursively but the Java classpath can only contain simple
Github user greghogan commented on the issue:
https://github.com/apache/flink/pull/2564
Try switching to `ExecutionEnvironment.createCollectionsEnvironment()`.
Github user greghogan commented on the issue:
https://github.com/apache/flink/pull/2782
1) This needs a Jira ticket.
2) There are more cases in `FlumeSink` where `client` is accessed without
checking for `null`.
Github user greghogan commented on a diff in the pull request:
https://github.com/apache/flink/pull/2457#discussion_r87440964
--- Diff:
flink-runtime/src/main/java/org/apache/flink/runtime/jobgraph/jsonplan/JsonPlanGenerator.java
---
@@ -52,10 +52,10 @@ public static String