Hi Esa,
just a reminder: don’t miss the dot and the underscore.
Best,
Xingcan
> On 22 Feb 2018, at 3:59 PM, Esa Heikkinen
> wrote:
>
> Hi
>
> Actually, I also have the line “import org.apache.flink.streaming.api.scala” in my
> code, but this line seems to be highlighted weaker in the IDEA window
It should be ok. This is the list of all my imports. The first part of it is
highlighted more faintly. I don’t know why.
import org.apache.flink.streaming.api.windowing.time.Time
import org.apache.flink.api.java.utils.ParameterTool
import org.apache.flink.streaming.api.scala.StreamExecutionEnvironm
Hi Esa,
which Scala version do you use?
Flink supports Scala 2.11 (Scala 2.10 support was dropped in Flink
1.4.0).
Fabian
2018-02-22 9:28 GMT+01:00 Esa Heikkinen :
>
> It should be ok. This is the list of all my imports. The first part of it
> is highlighted more faintly. I don’t know why
Hi Krzysztof,
you could compute the stats in two stages:
1) compute a daily window. You should use a ReduceFunction or
AggregateFunction here if possible, to perform the computation eagerly.
2) compute a sliding window of 90 days with a 1-day hop (or 90 rows with a
1-row hop).
That will crunch d
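A rough sketch of that two-stage approach in the Scala DataStream API (the Rating/DailyStats types, field names, and the incoming `ratings` stream are hypothetical, not from the original code):

```scala
import org.apache.flink.streaming.api.scala._
import org.apache.flink.streaming.api.windowing.assigners.{SlidingEventTimeWindows, TumblingEventTimeWindows}
import org.apache.flink.streaming.api.windowing.time.Time

case class Rating(companyId: Long, score: Double)
case class DailyStats(companyId: Long, sum: Double, count: Long)

// Assuming `ratings: DataStream[Rating]` with event-time timestamps assigned.

// Stage 1: eager daily pre-aggregation with a ReduceFunction-style reduce.
val daily: DataStream[DailyStats] = ratings
  .map(r => DailyStats(r.companyId, r.score, 1L))
  .keyBy(_.companyId)
  .window(TumblingEventTimeWindows.of(Time.days(1)))
  .reduce((a, b) => DailyStats(a.companyId, a.sum + b.sum, a.count + b.count))

// Stage 2: 90-day sliding window with a 1-day hop over the daily aggregates.
val ninetyDay: DataStream[DailyStats] = daily
  .keyBy(_.companyId)
  .window(SlidingEventTimeWindows.of(Time.days(90), Time.days(1)))
  .reduce((a, b) => DailyStats(a.companyId, a.sum + b.sum, a.count + b.count))
```

The point of stage 1 is that the sliding window then only stores 90 pre-aggregated daily values per key instead of every raw event.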
Hi
How do I check the versions?
In pom.xml there are these lines:
UTF-8
1.4.0
1.7.7
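For reference, in a Flink quickstart-style pom.xml those values normally appear as Maven properties; the tags were stripped in the archive, and the mapping below (which property each value belongs to) is an assumption based on the standard quickstart archetype:

```xml
<properties>
  <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
  <flink.version>1.4.0</flink.version>
  <slf4j.version>1.7.7</slf4j.version>
  <scala.binary.version>2.11</scala.binary.version>
</properties>
```

So here the Flink version would be 1.4.0, and the Scala binary version is whatever the `scala.binary.version` property (or the `_2.11` suffix on the Flink artifact IDs) says.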
Hi Fabian and Esa,
I ran the code myself and also noticed the strange behavior. It seems that only
if I explicitly import the function, i.e.,
org.apache.flink.streaming.api.scala.asScalaStream, does it work. In other
words, the underscore import becomes useless. I also checked other package
object
I have created a Java application using the Flink API and Table API. I can
provide the source code if needed.
The application works perfectly locally. But when I tried to upload the
created jar to AWS Lambda and execute it, the following error was
thrown:
*reference.conf: 804: Could not res
Hi Esa and Fabian,
sorry for my inaccurate conclusion before, but I think the reason is clear now.
org.apache.flink.streaming.api.scala._ and org.apache.flink.api.scala._
should not be imported simultaneously, due to a conflict. Just remove either of
them.
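For a pure streaming Scala program, keeping only the streaming package object is usually enough. A minimal sketch (the example job itself is illustrative, not from the thread):

```scala
// Import only the streaming Scala API; do NOT also import
// org.apache.flink.api.scala._, or the implicit TypeInformation /
// stream-conversion machinery from the two package objects becomes
// ambiguous and neither underscore import takes effect.
import org.apache.flink.streaming.api.scala._

object ImportSketch {
  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment
    val words: DataStream[String] = env.fromElements("a", "b", "c")
    words.map(_.toUpperCase).print()
    env.execute("import-sketch")
  }
}
```

Batch (DataSet) programs would instead import only org.apache.flink.api.scala._.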
Best,
Xingcan
> On 22 Feb 2018, at
Hi
It works now. Thank you ☺
How can I know which imports are incompatible with each other?
BR Esa
From: Xingcan Cui [mailto:xingc...@gmail.com]
Sent: Thursday, February 22, 2018 12:00 PM
To: Esa Heikkinen
Cc: Fabian Hueske ; user@flink.apache.org
Subject: Re: Problems to use toAppendS
Hello everyone,
I'm consuming from a Kafka topic to which I'm writing with a
FlinkKafkaProducer, with the timestamp-relative flag set to true.
From what I gather from the documentation [1], Flink is aware of the Kafka
record's timestamp, and only the watermark should be set with an appropriate
Times
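For context, keeping the timestamp that the Kafka consumer already attached while only generating watermarks typically looks something like this (a sketch; the 10-second out-of-orderness bound is an assumption):

```scala
import org.apache.flink.streaming.api.functions.AssignerWithPeriodicWatermarks
import org.apache.flink.streaming.api.watermark.Watermark

// Returns the Kafka-attached timestamp unchanged and emits a watermark
// trailing the highest timestamp seen by 10 seconds.
class KafkaDrivenWatermarks[T] extends AssignerWithPeriodicWatermarks[T] {
  private var maxTs = Long.MinValue

  override def extractTimestamp(el: T, previousElementTimestamp: Long): Long = {
    maxTs = math.max(maxTs, previousElementTimestamp)
    previousElementTimestamp // keep the timestamp Kafka already attached
  }

  override def getCurrentWatermark: Watermark =
    new Watermark(if (maxTs == Long.MinValue) Long.MinValue else maxTs - 10000)
}

// Usage sketch: consumer.assignTimestampsAndWatermarks(new KafkaDrivenWatermarks[MyEvent])
```

Attaching the assigner directly to the consumer (rather than to the stream) lets watermarks be tracked per Kafka partition.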
Hi
But the next problem ☹
When I try to run it, I get the error:
Exception in thread "main" java.lang.NoClassDefFoundError: scala/collection/Seq
at pack.CepTest2.main(CepTest2.scala)
Caused by: java.lang.ClassNotFoundException: scala.collection.Seq
a
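A NoClassDefFoundError for scala.collection.Seq usually means the Scala standard library is missing from the runtime classpath. One way to address it (assuming a Maven build targeting Flink 1.4.0 on Scala 2.11; versions here are illustrative) is to declare the Scala dependencies explicitly:

```xml
<dependency>
  <groupId>org.scala-lang</groupId>
  <artifactId>scala-library</artifactId>
  <version>2.11.12</version>
</dependency>
<dependency>
  <groupId>org.apache.flink</groupId>
  <artifactId>flink-streaming-scala_2.11</artifactId>
  <version>1.4.0</version>
</dependency>
```

When running from the IDE, the same error can also mean the run configuration lacks the "provided" dependencies on its classpath.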
Hey, I am new to Flink and I have a question; I want to see if anyone can
help here.
How can I use a dimension table in the Flink Table API with a StreamExecutionEnvironment?
I use a TableFunction to handle this, but it has a problem when debugging,
like this:
LogicalProject(col_1=[$0])
LogicalJoin(con
Hi Fabian,
Thank you for your suggestion. In the meantime I rethought this problem and
implemented an alternative solution without using windows at all.
I used a plain ProcessFunction with
1. Keyed state (by companyId) - to keep ratings per key
2. Event-time timers - to remove outdated ratings from stat
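A minimal sketch of that pattern (the Rating type, field names, and 90-day horizon are hypothetical; this is not the original code):

```scala
import org.apache.flink.api.common.state.{ListState, ListStateDescriptor}
import org.apache.flink.streaming.api.functions.ProcessFunction
import org.apache.flink.util.Collector
import scala.collection.JavaConverters._

case class Rating(companyId: Long, score: Double, timestamp: Long)

// Applied after `stream.keyBy(_.companyId)`, so the state is per company.
class RatingStats extends ProcessFunction[Rating, Double] {
  private val horizonMs = 90L * 24 * 60 * 60 * 1000 // 90 days

  private lazy val ratings: ListState[Rating] = getRuntimeContext.getListState(
    new ListStateDescriptor[Rating]("ratings", classOf[Rating]))

  override def processElement(r: Rating,
      ctx: ProcessFunction[Rating, Double]#Context,
      out: Collector[Double]): Unit = {
    ratings.add(r)
    // Event-time timer fires when this rating leaves the horizon.
    ctx.timerService().registerEventTimeTimer(r.timestamp + horizonMs)
    emitAvg(out)
  }

  override def onTimer(ts: Long,
      ctx: ProcessFunction[Rating, Double]#OnTimerContext,
      out: Collector[Double]): Unit = {
    // Evict outdated ratings; clear and re-add the survivors.
    val kept = ratings.get().asScala.filter(_.timestamp + horizonMs > ts).toList
    ratings.clear()
    kept.foreach(ratings.add)
    emitAvg(out)
  }

  private def emitAvg(out: Collector[Double]): Unit = {
    val rs = ratings.get().asScala.toList
    if (rs.nonEmpty) out.collect(rs.map(_.score).sum / rs.size)
  }
}
```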
An update - I was able to overcome these issues by setting the preallocate
flag to true. Not sure why this fixes the problem. Need to dig a bit deeper.
I am currently suffering through similar issues.
Had a job running happily, but when the cluster tried to restart it, it
would not find the JSON serializer. The job kept trying to restart in
a loop.
Just today I was running a job I built locally. The job ran fine. I added
two commits and
Something seems to be off with the user-code class loader. The only way I
can get my job to start is if I drop the job jar into the lib folder on the JM
and configure the JM's classloader.resolve-order to parent-first.
Suggestions?
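For reference, the workaround described above corresponds to this flink-conf.yaml setting (child-first is the default since Flink 1.4):

```yaml
# Revert to pre-1.4 behavior: resolve classes in the parent (Flink/JVM)
# classloader before consulting the user-code classloader.
classloader.resolve-order: parent-first
```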
On Thu, Feb 22, 2018 at 12:52 PM, Elias Levy
wrote:
> I am current