… elements are distributed to downstream tasks.
>
>
>
> From: Martin Frank Hansen
> Reply-To: "m...@berlingskemedia.dk"
> Date: Thursday, January 14, 2021 at 1:48 AM
> To: user
> Subject: Deterministic rescale for test
>
>
>
> Hi,
>
> I have the following setup:
>
val env = StreamExecutionEnvironment.getExecutionEnvironment
env.setStreamTimeCharacteristic(TimeCharacteristic.EventTime)
env.setParallelism(parallelism)
val rawStream: DataStream[String] = env.readTextFile(s3Path).setParallelism(1)
rawStream.rescale
...
best regards
--
Martin Frank Hansen
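Not an answer from the thread, just a hedged illustration of the question: continuing the snippet above (same rawStream, Flink's Scala DataStream API assumed, imports added), tagging each record with the consuming subtask index makes visible how rescale hands the parallelism-1 source over to the downstream subtasks.

import org.apache.flink.api.common.functions.RichMapFunction
import org.apache.flink.streaming.api.scala._

// Each downstream map subtask tags the records it receives with its own index,
// so printing the stream shows how rescale distributes elements round-robin.
val tagged: DataStream[(Int, String)] = rawStream.rescale
  .map(new RichMapFunction[String, (Int, String)] {
    override def map(value: String): (Int, String) =
      (getRuntimeContext.getIndexOfThisSubtask, value)
  })

tagged.print()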
> https://github.com/apache/flink/blob/master/flink-streaming-java/src/test/java/org/apache/flink/streaming/runtime/operators/windowing/WindowOperatorTest.java
>
> Cheers,
> Till
>
> On Wed, Dec 2, 2020 at 12:04 PM Martin Frank Hansen <
> m...@berlingskemedia.dk> wrote:
>
>> Hi,
>>
>> I …, so does anyone know if it will be supported in the future? (Or if
>> it is supported, how should I use it?)
>> Furthermore, does anyone have some good ideas for a test setup for
>> ProcessWindowFunctions?
>>
>> best regards
>> Martin Frank Hansen
>> Data Engineer
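Beyond the WindowOperatorTest that Till links to above, a lighter option is an integration-style test that runs a small bounded pipeline and asserts on a collecting sink. The sketch below assumes Flink's Scala API around 1.10/1.11; the event type, the counting logic and the CollectSink helper are made up for illustration, not taken from the thread.

import java.util.Collections
import scala.collection.JavaConverters._

import org.apache.flink.streaming.api.TimeCharacteristic
import org.apache.flink.streaming.api.functions.sink.SinkFunction
import org.apache.flink.streaming.api.scala._
import org.apache.flink.streaming.api.scala.function.ProcessWindowFunction
import org.apache.flink.streaming.api.windowing.assigners.TumblingEventTimeWindows
import org.apache.flink.streaming.api.windowing.time.Time
import org.apache.flink.streaming.api.windowing.windows.TimeWindow
import org.apache.flink.util.Collector

object ProcessWindowFunctionTest {

  // Hypothetical event type and window function, standing in for the real ones.
  case class PageView(page: String, timestamp: Long)

  class CountPerPage extends ProcessWindowFunction[PageView, (String, Long), String, TimeWindow] {
    override def process(key: String, context: Context,
                         elements: Iterable[PageView], out: Collector[(String, Long)]): Unit =
      out.collect((key, elements.size.toLong))
  }

  // Sink that collects results into a static list so the test can assert on them.
  object CollectSink {
    val values: java.util.List[(String, Long)] =
      Collections.synchronizedList(new java.util.ArrayList[(String, Long)]())
  }
  class CollectSink extends SinkFunction[(String, Long)] {
    override def invoke(value: (String, Long)): Unit = CollectSink.values.add(value)
  }

  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment
    env.setStreamTimeCharacteristic(TimeCharacteristic.EventTime)
    env.setParallelism(1)

    env
      .fromElements(PageView("a", 1000L), PageView("a", 2000L), PageView("b", 3000L))
      .assignAscendingTimestamps(_.timestamp) // deterministic event time from the test data
      .keyBy(_.page)
      .window(TumblingEventTimeWindows.of(Time.minutes(1)))
      .process(new CountPerPage)
      .addSink(new CollectSink)

    env.execute("ProcessWindowFunction test")

    // With a bounded source and event time, all windows fire when the input ends,
    // so the results are available as soon as execute() returns.
    assert(CollectSink.values.asScala.toSet == Set(("a", 2L), ("b", 1L)))
  }
}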
On another note, the case class in question has about 40 fields; is there a
maximum limit on the number of fields?
best regards
On Fri, 18 Sep 2020 at 13:05, Martin Frank Hansen <
m...@berlingskemedia.dk> wrote:
> Hi Dawid,
>
> Thanks for your reply, much appreciated.
>
>
> … what is going on in the StreamEnvironment.getStreamEnvironment). If you use
> the ProcessingTime the results will be emitted only after a minute passes.
> (You are using TimeWindow.minutes(1))
>
> Hope that helps.
>
> Best,
>
> Dawid
> On 18/09/2020 10:42, Martin Frank Hansen wrote:
…
.timeWindow(Time.minutes(1))
.process(new MetricsProcessFunction)
.addSink(target1Min)
// execute program
executionEnv.execute("Count pageview-based metrics")
}
}
--
Martin Frank Hansen
Data Engineer
Digital Service
M: +45 25 57 14 18
E: m...@berlingskemedia.dk
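As a side note on Dawid's point above: with event time, a window only fires once a watermark passes its end, so the source needs timestamps and watermarks assigned. A hedged sketch, assuming the pre-1.12 assigner API and a hypothetical PageView event type (the real job's type and timestamp field will differ):

import org.apache.flink.streaming.api.functions.timestamps.BoundedOutOfOrdernessTimestampExtractor
import org.apache.flink.streaming.api.scala._
import org.apache.flink.streaming.api.windowing.time.Time

// Hypothetical event type used only for this illustration.
case class PageView(page: String, timestamp: Long)

// Attach event-time timestamps and watermarks that lag 10 seconds behind the
// largest timestamp seen, so a 1-minute event-time window can actually fire.
def withEventTime(pageViews: DataStream[PageView]): DataStream[PageView] =
  pageViews.assignTimestampsAndWatermarks(
    new BoundedOutOfOrdernessTimestampExtractor[PageView](Time.seconds(10)) {
      override def extractTimestamp(view: PageView): Long = view.timestamp
    })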
Well, I got it working.
The varchar columns in the database were defined too small.
Thanks for your help!
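For reference, a hedged sketch of the JDBCOutputFormat route discussed further down in this thread, assuming the 1.10-era flink-jdbc module; the driver class, URL, table and column names are placeholders, not taken from the thread. One thing worth knowing is that rows are buffered and only flushed once the batch interval is reached (or the sink closes).

import org.apache.flink.api.java.io.jdbc.JDBCOutputFormat
import org.apache.flink.types.Row

// dsRow is assumed to be a DataStream[Row] whose fields line up with the
// placeholders in the INSERT statement (and with the varchar sizes in the table).
val jdbcOutput = JDBCOutputFormat.buildJDBCOutputFormat()
  .setDrivername("com.microsoft.sqlserver.jdbc.SQLServerDriver")
  .setDBUrl("jdbc:sqlserver://<host>:1433;databaseName=<db>")
  .setUsername("<user>")
  .setPassword("<password>")
  .setQuery("INSERT INTO page_metrics (page, views) VALUES (?, ?)")
  .setBatchInterval(100) // rows are flushed in batches of this size
  .finish()

dsRow.writeUsingOutputFormat(jdbcOutput)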
On Fri, 22 May 2020 at 13:30, Martin Frank Hansen <
m...@berlingskemedia.dk> wrote:
> Ah ok, thanks, no problem.
>
> My problem now is that nothing is sent; do I need to format it in an …
On Fri, 22 May 2020 at …:57, Flavio Pompermaier <
pomperma...@okkam.it> wrote:
> No sorry, you're right. The JDBCOutputFormat should work... I get confused
> with the Table API
>
> On Fri, May 22, 2020 at 11:51 AM Martin Frank Hansen <
> m...@berlingskemedia.dk> wrote:
>
>> Hi again,
>>
Hi again,
I am a bit confused as to why the generic JDBC connector would not work
with SQL Server. Can you explain a bit more?
On Fri, 22 May 2020 at 11:33, Martin Frank Hansen <
m...@berlingskemedia.dk> wrote:
> Hi Flavio,
>
> Thanks for your reply. I will try another way.
>> This should work if you import `org.apache.flink.table.api.scala._`.
>>
>> Maybe this example helps:
>>
>>
>> https://github.com/apache/flink/blob/release-1.10/flink-examples/flink-examples-table/src/main/scala/org/apache/flink/table/examples/scala/StreamSQLExample.scala
Hi,
I am trying to write input from Kafka to a SQL Server on AWS, but I am
having difficulties.
I get the following error: could not find implicit value for evidence
parameter of type
org.apache.flink.api.common.typeinfo.TypeInformation[org.apache.flink.types.Row]
[error] val dsRow = tableEnv.toAppen…
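The missing "evidence parameter" is the implicit TypeInformation that the imports suggested above provide. A hedged sketch of the wiring on a 1.10-era setup, mirroring the linked StreamSQLExample; the table name and the environment are placeholders.

import org.apache.flink.api.scala._ // implicit TypeInformation for DataStream types
import org.apache.flink.streaming.api.scala.{DataStream, StreamExecutionEnvironment}
import org.apache.flink.table.api.scala._ // Scala Table API conversions and StreamTableEnvironment
import org.apache.flink.types.Row

val env = StreamExecutionEnvironment.getExecutionEnvironment
val tableEnv = StreamTableEnvironment.create(env)

// "source_table" stands in for the Kafka-backed table registered elsewhere.
val table = tableEnv.sqlQuery("SELECT * FROM source_table")
val dsRow: DataStream[Row] = tableEnv.toAppendStream[Row](table)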
// "0.8", "0.9", "0.10", "0.11", and "universal"
> .topic("...") // required: topic name from which the table is read
>
>
> Best,
> Leonard Xu
> [1]
> https://ci.apache.org/projects/flin
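Leonard's fragment above quotes the connector descriptor options; for the DDL that Jark asks about further down, a hedged sketch of what a 1.10-era definition might look like. The table name, schema, topic and broker address are placeholders, and the format section depends on what is actually in the topic (and needs the matching format jar, e.g. flink-json, on the classpath).

// tableEnv is assumed to be a StreamTableEnvironment.
tableEnv.sqlUpdate(
  """
    |CREATE TABLE kafka_source (
    |  page STRING,
    |  user_id STRING,
    |  event_time BIGINT
    |) WITH (
    |  'connector.type' = 'kafka',
    |  'connector.version' = 'universal',
    |  'connector.topic' = 'my-topic',
    |  'connector.properties.bootstrap.servers' = 'localhost:9092',
    |  'connector.startup-mode' = 'earliest-offset',
    |  'format.type' = 'json'
    |)
  """.stripMargin)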
6 May 2020 at 04:57, Jark Wu wrote:
> Hi,
>
> Could you share the SQL DDL and the full exception message? It might be
> that you are using the wrong `connector.version` or other options.
>
> Best,
> Jark
>
> On Fri, 15 May 2020 at 20:14, Martin Frank Hansen
> wrote:
TableFactory allows creating different table-related instances from
string-based properties. All available factories are called for matching to
the given set of properties and a corresponding factory class.
Factories leverage Java's Service Provider Interfaces (SPI)
<https://docs.oracle.com/javase/tutorial/sound/SPI-intro.html> for
discovery. This means that every dependency and JAR file should contain a
file org.apache.flink.table.factories.TableFactory in the
META-INF/services resource
directory that lists all available table factories that it provides.
But how do I do that? I thought the sbt-file would take care of this.
Any help is highly appreciated!
Best regards
Martin Frank Hansen
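One common cause is that a fat-jar build keeps only one of the META-INF/services/org.apache.flink.table.factories.TableFactory files shipped by the connector and format jars. If sbt-assembly is used (which the thread does not confirm), a hedged sketch of a build.sbt merge strategy that concatenates those service files instead of picking one:

// build.sbt (sbt-assembly): concatenate service files so every TableFactory entry survives
assemblyMergeStrategy in assembly := {
  case PathList("META-INF", "services", xs @ _*) => MergeStrategy.concat
  case PathList("META-INF", xs @ _*)             => MergeStrategy.discard
  case x                                         => MergeStrategy.first
}

The services rule has to come before the catch-all META-INF rule, otherwise the service files are discarded along with the rest of META-INF and the factories are never discovered.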