I am not very familiar with Scala, but I am glad to hear that you solved
the problem.
Best,
Yang
Salva Alcántara wrote on Fri, Jan 15, 2021 at 9:19 PM:
Hi Yang,
Just to say that I finally found the problem:
lazy val log4j = "org.apache.logging.log4j" % "log4j-slf4j-impl" % "2.13.3" % "provided"
This dependency was not added to my jar (but sbt was including it for the
run task, which is why it worked with `sbt run`). After adding this
dependency, the problem was solved.
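The asymmetry described above (dependency present for `sbt run`, absent from the assembly jar) is exactly what the "provided" scope produces. A minimal build.sbt sketch, assuming the run task is re-bound (as in Flink's sbt quickstart template) so that provided dependencies stay on the `sbt run` classpath while being excluded from the fat jar:

```scala
// build.sbt (sketch): the log4j binding is marked "provided", so
// `sbt assembly` leaves it out of the fat jar.
lazy val log4j = "org.apache.logging.log4j" % "log4j-slf4j-impl" % "2.13.3" % "provided"

libraryDependencies += log4j

// Re-bind the run task to the full (including "provided") classpath,
// so `sbt run` still sees the dependency even though assembly does not.
Compile / run := Defaults.runTask(
  Compile / fullClasspath,
  Compile / run / mainClass,
  Compile / run / runner
).evaluated
```

With this setup, dropping the "provided" qualifier is what makes the dependency land in the assembly jar as well.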
The example you provided works just fine, but when I replace the jar with
mine I get this error instead:
```
The program finished with the following exception:

org.apache.flink.client.program.ProgramInvocationException: The main method caused an error: Could not create actor system
	at or
```
Actually, you do not need to start a Flink cluster beforehand, because a
local cluster will be started in the same process as the CliFrontend
automatically. The local cluster means "all-in-one-process".
`start-cluster.sh` will start one JobManager process and one TaskManager
process. It is a standalone
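For reference, the same "all-in-one-process" behavior can also be reached from code: `StreamExecutionEnvironment.createLocalEnvironment()` starts an embedded mini cluster inside the current JVM. A minimal sketch (object name and data are placeholders, not from the thread):

```scala
import org.apache.flink.streaming.api.scala._

object LocalSmokeTest {
  def main(args: Array[String]): Unit = {
    // createLocalEnvironment spins up an embedded mini cluster in this
    // JVM -- no external JobManager/TaskManager processes are needed.
    val env = StreamExecutionEnvironment.createLocalEnvironment()

    // A trivial pipeline just to exercise the local cluster.
    env.fromElements(1, 2, 3).map(_ * 2).print()

    env.execute("local-smoke-test")
  }
}
```

This is effectively what `flink run --target local my.jar` arranges for you from the CLI.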
Hi Yang,
Thanks for your reply. I've given it a try within my container (I
copy-pasted the Dockerfile for it), but running `flink run --target local
my.jar` results in a different error for me:
"org.apache.flink.client.program.ProgramInvocationException: The main method
caused an error: Could not
You could directly run the Flink job in local mode with the java command,
but you need to construct the Flink classpath and set it in the start
command. An easier way is to build your image based on the official Flink
image and copy your jar into the image. I believe you have done this. Then
you could
Can anyone explain why I am getting this error?
"Exception in thread "main" java.lang.IllegalStateException: No
ExecutorFactory found to execute the application."
I have tried a slightly different approach by running the jar that `sbt
assembly` produces inside a container that looks like this (Dockerfile
Hi Yang,
Many thanks for the summary of the different options. For now, as I
mentioned, I am interested in the simplest approach, since my purpose is to
run some smoke (e2e) tests. It is not entirely clear to me how to run Flink
using option 1. I'm using the official Scala template now
(https://git
Hi Salva,
I think we could have the following options to make a Flink application run
on a Kubernetes cluster.
1. Local cluster
This is what you have in mind. In this mode, Flink really is just a common
Java application, which you could start easily.
2. Standalone cluster on K8s
By applying some yaml files,
I would like to deploy Flink on a local cluster built with KIND for the
purposes of e2e testing. The Flink app is one of the components running
within the system, which consists of other components (mostly written in
Golang). I was wondering what would be the simplest way for me to deploy the
Flink