Robert Metzger created FLINK-4110:
-
Summary: Provide testing skeleton in quickstarts
Key: FLINK-4110
URL: https://issues.apache.org/jira/browse/FLINK-4110
Project: Flink
Issue Type: Improvement
Hi Anton,
you mean keeping working-memory facts in Flink state and, with each
event, throwing them into a stateless session?
Can you elaborate a bit on your use case? What would be the state?
Would it be, for example, some event aggregations which would be filtered
by the rules?
maciek
On 22/06/2016
Is there a ticket for supporting user-defined counters?
Thanks,
Steve
On Sat, Jun 18, 2016 at 8:18 AM, Steve Cosenza wrote:
> Perfect!
>
> -Steve
>
>
> On Saturday, June 18, 2016, Chesnay Schepler wrote:
>
>> it would be scoped and reported just like any other Counter.
>>
>> Regards,
>> Chesnay
Hi All,
Just an update on this:
Setting the codec using the DataFileWriter setCodec method:
writer.setCodec(CodecFactory.snappyCodec());
Please help me with this issue.
Regards,
Vinay
On Jun 22, 2016 10:47 PM, "Vinay Patil" wrote:
> Hi ,
>
> I am writing the data in S3 in avro data file, but when
Jark Wu created FLINK-4109:
--
Summary: Change the name of ternary condition operator 'eval' to
'?'
Key: FLINK-4109
URL: https://issues.apache.org/jira/browse/FLINK-4109
Project: Flink
Issue Type
Martin Scholl created FLINK-4108:
Summary: NPE in Row.productArity
Key: FLINK-4108
URL: https://issues.apache.org/jira/browse/FLINK-4108
Project: Flink
Issue Type: Bug
Components: B
Thanks Maciek
On Wed, Jun 22, 2016 at 9:00 PM, Maciek Próchniak wrote:
> This is straightforward; it also shouldn't be problematic performance-wise.
> OTOH, if you want to use stateful sessions, things get more
> complicated - because if you want to play well with Flink you'd have to
> integ
Greg Hogan created FLINK-4106:
-
Summary: Restructure Gelly docs
Key: FLINK-4106
URL: https://issues.apache.org/jira/browse/FLINK-4106
Project: Flink
Issue Type: Improvement
Components:
Greg Hogan created FLINK-4107:
-
Summary: Restructure Gelly docs
Key: FLINK-4107
URL: https://issues.apache.org/jira/browse/FLINK-4107
Project: Flink
Issue Type: Improvement
Components:
Greg Hogan created FLINK-4105:
-
Summary: Restructure Gelly docs
Key: FLINK-4105
URL: https://issues.apache.org/jira/browse/FLINK-4105
Project: Flink
Issue Type: Improvement
Components:
Greg Hogan created FLINK-4104:
-
Summary: Restructure Gelly docs
Key: FLINK-4104
URL: https://issues.apache.org/jira/browse/FLINK-4104
Project: Flink
Issue Type: Improvement
Components:
Hi,
I've been using Drools in a few projects. Embedding Drools in Flink is
certainly possible; however, there are a few points you have to consider.
- are you going to use Drools in a stateful or a stateless way? The blog
post you mention uses Flink in a stateless way - that is, each event is
processed
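The stateless per-event pattern described above - matching each incoming event against a fixed rule set, with no facts retained between events - can be sketched without Drools. Everything below (the Rule record, the sample rules, the class name StatelessRules) is a hypothetical illustration of that pattern, not Drools API:

```java
import java.util.List;
import java.util.function.Predicate;

// Minimal stand-in for a rule engine's stateless session: each event is
// matched against every rule independently; no working memory survives
// between calls to fire().
public class StatelessRules {
    // Hypothetical rule type: a name plus a condition over one event value.
    record Rule(String name, Predicate<Integer> condition) {}

    // Evaluate one event against all rules; return the names that fired.
    static List<String> fire(List<Rule> rules, int event) {
        return rules.stream()
                .filter(r -> r.condition().test(event))
                .map(Rule::name)
                .toList();
    }

    public static void main(String[] args) {
        List<Rule> rules = List.of(
                new Rule("non-negative", v -> v >= 0),
                new Rule("large", v -> v > 100));
        // Each event is evaluated in isolation - the "stateless session".
        System.out.println(fire(rules, 5));    // [non-negative]
        System.out.println(fire(rules, 250));  // [non-negative, large]
    }
}
```

A stateful session would differ in exactly the way discussed above: facts inserted for one event would remain visible when the next event arrives, which is what then has to be reconciled with Flink's own state handling.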
Hi,
I am writing data to S3 in an Avro data file, but when I deploy it on the
cluster, I am facing the following issue:
*java.lang.UnsatisfiedLinkError:
org.xerial.snappy.SnappyNative.maxCompressedLength(I)*
This occurs when I try to add the record (GenericRecord) as:
writer.append(record);
Th
Suneel Marthi created FLINK-4103:
Summary: Modify CsvTableSource to implement StreamTableSource
Key: FLINK-4103
URL: https://issues.apache.org/jira/browse/FLINK-4103
Project: Flink
Issue Type
Greg Hogan created FLINK-4102:
-
Summary: Test failure with checkpoint barriers
Key: FLINK-4102
URL: https://issues.apache.org/jira/browse/FLINK-4102
Project: Flink
Issue Type: Bug
Compo
Akshay Shingote created FLINK-4101:
--
Summary: Calculating average in Flink DataStream on window time
Key: FLINK-4101
URL: https://issues.apache.org/jira/browse/FLINK-4101
Project: Flink
Issu
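FLINK-4101 concerns averaging over a time window. The accumulator logic a windowed average reduces to can be sketched in plain Java - no Flink APIs here; the WindowAverage class and its sum/count fields are illustrative assumptions, standing in for what a fold/apply on a Flink time window would maintain:

```java
// Sketch of the sum/count accumulator behind a windowed average.
// In Flink this state would live inside a window function and be emitted
// when the window fires; here it is plain Java so the arithmetic is visible.
public class WindowAverage {
    private long count;
    private double sum;

    // Add one element of the current window to the accumulator.
    public void add(double value) {
        sum += value;
        count++;
    }

    // Emit the average when the window fires; an empty window yields NaN.
    public double result() {
        return count == 0 ? Double.NaN : sum / count;
    }

    public static void main(String[] args) {
        WindowAverage acc = new WindowAverage();
        for (double v : new double[] {1.0, 2.0, 3.0, 4.0}) {
            acc.add(v);
        }
        System.out.println(acc.result()); // 2.5
    }
}
```

Keeping only the running sum and count (rather than buffering all window elements) is the design point: it makes the per-window state constant-size regardless of window length.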
Hi guys,
I'm trying to add a UDF in the Flink Table API, say, in the DataSet Table API.
My example code is as follows:
---
object WordCountTable {
  case class WC(word: String, num: Int)
  def main(args: Array[String]): Unit = {
    // set up execution environment
    val env = ExecutionEnvironment.getExecutionEnvironment
The code was in the link of stackoverflow:
http://stackoverflow.com/questions/37841342/flink-with-kafka-consumer-doesnt-work
2016-06-22 11:28 GMT+02:00 Aljoscha Krettek:
> Once again, if you could please provide some example code that would allow
> us to figure out what the problem might be. From just looking at the logs
> it is not obvious; at the end the job seems to be running.
Chesnay Schepler created FLINK-4100:
---
Summary: RocksDBStateBackend#close() can throw NPE
Key: FLINK-4100
URL: https://issues.apache.org/jira/browse/FLINK-4100
Project: Flink
Issue Type: Bug
Once again, if you could please provide some example code that would allow
us to figure out what the problem might be. From just looking at the logs
it is not obvious; at the end the job seems to be running.
On Wed, 22 Jun 2016 at 11:06, Adrian Portabales wrote:
> Hi Robert, thanks for the answer
Hi Robert, thanks for the answer.
How do you test if data is being read from Kafka or not?
- Tested with the Kafka consumer script.
What are the taskmanager logs saying?
- http://pastebin.com/AA2MwW5W
I think the problem is at line 89.
Thanks!
2016-06-21 18:17 GMT+02:00 Robert Metzger:
> Hi A
Ufuk Celebi created FLINK-4099:
--
Summary: CliFrontendYarnAddressConfigurationTest fails
Key: FLINK-4099
URL: https://issues.apache.org/jira/browse/FLINK-4099
Project: Flink
Issue Type: Bug