Hi Yang,

Many thanks for the summary of the different options. For now, as I
mentioned, I am interested in the simplest approach, since my purpose is to
run some smoke (e2e) tests. It is not entirely clear to me how to run Flink
using option 1. I'm currently using the official Scala template
(https://github.com/tillrohrmann/flink-project.g8/blob/master/src/main/g8/build.sbt):

```
ThisBuild / resolvers ++= Seq(
  "Apache Development Snapshot Repository" at
    "https://repository.apache.org/content/repositories/snapshots/",
  Resolver.mavenLocal
)

name := "$name$"

version := "$version$"

organization := "$organization$"

ThisBuild / scalaVersion := "$scala_version$"

val flinkVersion = "$flink_version$"

val flinkDependencies = Seq(
  "org.apache.flink" %% "flink-clients" % flinkVersion % "provided",
  "org.apache.flink" %% "flink-scala" % flinkVersion % "provided",
  "org.apache.flink" %% "flink-streaming-scala" % flinkVersion % "provided")

lazy val root = (project in file(".")).
  settings(
    libraryDependencies ++= flinkDependencies
  )

assembly / mainClass := Some("$organization$.Job")

// make run command include the provided dependencies
Compile / run  := Defaults.runTask(Compile / fullClasspath,
                                   Compile / run / mainClass,
                                   Compile / run / runner
                                  ).evaluated

// stay inside the sbt console when we press "ctrl-c" while a Flink
// program executes with "run" or "runMain"
Compile / run / fork := true
Global / cancelable := true

// exclude Scala library from assembly
assembly / assemblyOption := (assembly / assemblyOption).value.copy(includeScala = false)
```

For local manual testing, I'm just running `sbt run` in the terminal. I
would like to execute the same thing inside a Docker container, but without
having to run `sbt run`. Instead, I would like to run the jar directly.

The jar for my job is produced by running `sbt assembly`, but this jar does
not include the `provided` dependencies (see above). So I cannot run it
with `java -jar my.jar`, because those dependencies are not on the classpath.

As a quick check, if I remove the "provided" annotations above and replace
`includeScala = false` with `includeScala = true` in order to build a
self-contained fat jar, and then execute it with `java -jar my.jar`, I get
this error:

```
Exception in thread "main" java.lang.IllegalStateException: No ExecutorFactory found to execute the application.
```

I've just come across this jira ticket:
https://issues.apache.org/jira/browse/FLINK-19968

which seems to indicate that this is indeed a bug (I'm also on Flink
1.11.2) and that, in theory, what I am doing should work.
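One other thing I came across while searching (unverified): the fat jar can lose Flink's ServiceLoader registration files under `META-INF/services` when sbt-assembly deduplicates them, which would also produce the "No ExecutorFactory found" error even with all dependencies bundled. The suggested fix is to concatenate those files instead:

```scala
// Sketch, untested: concatenate ServiceLoader files from all Flink jars
// instead of keeping just one, so the PipelineExecutorFactory entries survive.
assembly / assemblyMergeStrategy := {
  case PathList("META-INF", "services", _*) => MergeStrategy.concat
  case other =>
    // fall back to the default strategy for everything else
    val oldStrategy = (assembly / assemblyMergeStrategy).value
    oldStrategy(other)
}
```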

In any case, is this the right approach for option 1 (local cluster) in your
list? Again, what I want is a single container that runs the Flink job jar
as a regular Java application.
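For context, my entry point is roughly the following (a simplified sketch, not my actual job). My understanding is that when it runs as a plain Java application, `getExecutionEnvironment` should fall back to an embedded local cluster, provided `flink-clients` is on the classpath:

```scala
import org.apache.flink.streaming.api.scala._

object Job {
  def main(args: Array[String]): Unit = {
    // Outside a cluster context this returns a local (embedded) environment,
    // which is what I want for single-container smoke tests.
    val env = StreamExecutionEnvironment.getExecutionEnvironment
    env.fromElements(1, 2, 3).map(_ * 2).print()
    env.execute("smoke-test")
  }
}
```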

Thanks again for your support!
