The question is cross-posted on Stack Overflow:
https://stackoverflow.com/questions/67001326/why-does-flink-quickstart-scala-suggests-adding-connector-dependencies-in-the-de

## Connector dependencies should be in default scope

This is what [flink-quickstart-scala](
https://github.com/apache/flink/blob/d12eeedfac6541c3a0711d1580ce3bd68120ca90/flink-quickstart/flink-quickstart-scala/src/main/resources/archetype-resources/pom.xml#L84)
suggests:

```xml
<!-- Add connector dependencies here. They must be in the default scope
(compile). -->

<!-- Example:
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-kafka_${scala.binary.version}</artifactId>
    <version>${flink.version}</version>
</dependency>
-->
```

It also aligns with [Flink project configuration](
https://ci.apache.org/projects/flink/flink-docs-release-1.12/dev/project-configuration.html#adding-connector-and-library-dependencies
):

> We recommend packaging the application code and all its required
> dependencies into one jar-with-dependencies which we refer to as the
> application jar. The application jar can be submitted to an already running
> Flink cluster, or added to a Flink application container image.
>
> Important: For Maven (and other build tools) to correctly package the
> dependencies into the application jar, these application dependencies must
> be specified in scope compile (unlike the core dependencies, which must be
> specified in scope provided).

## Hive connector dependencies should be in provided scope

However, the [Flink Hive Integration docs](
https://ci.apache.org/projects/flink/flink-docs-stable/dev/table/connectors/hive/#program-maven)
suggest the opposite:

> If you are building your own program, you need the following dependencies
> in your mvn file. It’s recommended not to include these dependencies in the
> resulting jar file. You’re supposed to add dependencies as stated above at
> runtime.
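For concreteness, this is roughly what the Hive docs' recommendation looks like in a pom — note this is only a sketch of the provided-scope pattern, reusing the quickstart's property placeholders (`${scala.binary.version}`, `${flink.version}`); `${hive.version}` is an assumed property you would define yourself:

```xml
<!-- Hive connector: NOT bundled into the application jar.
     The jars are expected to be on the Flink cluster's classpath
     (e.g. dropped into Flink's lib/ directory) at runtime. -->
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-hive_${scala.binary.version}</artifactId>
    <version>${flink.version}</version>
    <scope>provided</scope>
</dependency>

<!-- Hive itself, also expected at runtime rather than packaged. -->
<dependency>
    <groupId>org.apache.hive</groupId>
    <artifactId>hive-exec</artifactId>
    <version>${hive.version}</version>
    <scope>provided</scope>
</dependency>
```

So the Kafka connector is packaged into the fat jar (compile scope), while the Hive connector is expected to already be on the cluster (provided scope).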

## Why the difference?

Why does the quickstart recommend compile scope for connectors, while the Hive connector docs recommend provided scope?

Thanks!

Best,
Yik San
