wuchong commented on a change in pull request #8200: [FLINK-11614] [docs-zh] Translate the "Configuring Dependencies" page into Chinese
URL: https://github.com/apache/flink/pull/8200#discussion_r278376140
 
 

 ##########
 File path: docs/dev/projectsetup/dependencies.zh.md
 ##########
 @@ -132,63 +113,45 @@ Below is an example adding the connector for Kafka 0.10 as a dependency (Maven s
 </dependency>
 {% endhighlight %}
 
-We recommend to package the application code and all its required dependencies into one *jar-with-dependencies* which
-we refer to as the *application jar*. The application jar can be submitted to an already running Flink cluster,
-or added to a Flink application container image.
-
-Projects created from the [Java Project Template]({{ site.baseurl }}/dev/projectsetup/java_api_quickstart.html) or
-[Scala Project Template]({{ site.baseurl }}/dev/projectsetup/scala_api_quickstart.html) are configured to automatically include
-the application dependencies into the application jar when running `mvn clean package`. For projects that are
-not set up from those templates, we recommend to add the Maven Shade Plugin (as listed in the Appendix below)
-to build the application jar with all required dependencies.
+我们建议将应用程序代码及其所有需要的依赖项打包到一个 *jar-with-dependencies* 的 jar 包中。
+这个打包好的应用 jar 可以提交到已经运行的 Flink 集群中,或者添加到 Flink 应用容器镜像中。
+ 
+通过[Java 项目模板]({{ site.baseurl }}/dev/projectsetup/java_api_quickstart_zh.html) 或者
+[Scala 项目模板]({{ site.baseurl }}/dev/projectsetup/scala_api_quickstart_zh.html) 创建的应用,
+当使用命令 `mvn clean package` 打包的时候会自动将应用依赖类库打包进应用 jar 包。
+对于不是通过上面模板创建的应用,我们推荐添加 Maven Shade Plugin 去构建应用。(下面的附录会给出具体配置)
 
-**Important:** For Maven (and other build tools) to correctly package the dependencies into the application jar,
-these application dependencies must be specified in scope *compile* (unlike the core dependencies, which
-must be specified in scope *provided*).
+**注意:** 要使 Maven(以及其他构建工具)正确地将依赖项打包到应用程序 jar 中,必须将这些依赖项的作用域设置为 *compile* (与核心依赖项不同,后者作用域应该设置为 *provided* )。
 
+## Scala 版本
 
-## Scala Versions
+Scala 版本(2.10、2.11、2.12等)互相是不兼容的。因此,依赖 Scala 2.11 的 Flink 环境是不可以运行依赖 Scala 2.12 应用的。
 
-Scala versions (2.10, 2.11, 2.12, etc.) are not binary compatible with one another.
-For that reason, Flink for Scala 2.11 cannot be used with an application that uses
-Scala 2.12.
+所有依赖 Scala 的 Flink 类库都以它们依赖的 Scala 版本为后缀,例如 `flink-streaming-scala_2.11`。
 
-All Flink dependencies that (transitively) depend on Scala are suffixed with the
-Scala version that they are built for, for example `flink-streaming-scala_2.11`.
+只使用 Java 的开发人员可以选择任何 Scala 版本,Scala 开发人员需要选择与其应用程序相匹配的 Scala 版本。
 
-Developers that only use Java can pick any Scala version, Scala developers need to
-pick the Scala version that matches their application's Scala version.
+对于指定的 Scala 版本如何构建 Flink 应用可以参考 [构建指南]({{ site.baseurl }}/flinkDev/building_zh.html#scala-versions) 
 
 Review comment:
   ```suggestion
  对于指定的 Scala 版本如何构建 Flink 应用可以参考 [构建指南]({{ site.baseurl }}/zh/flinkDev/building.html#scala-versions) 
   ```
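As an editorial aside: the compile-vs-provided scope rule discussed in the hunk above can be illustrated with a minimal POM fragment. The artifact versions below are illustrative for the Flink 1.8 / Scala 2.11 line and are not taken from this PR:

```xml
<!-- Core Flink API: the cluster supplies it at runtime, so scope "provided"
     keeps it out of the application jar. -->
<dependency>
  <groupId>org.apache.flink</groupId>
  <artifactId>flink-streaming-java_2.11</artifactId>
  <version>1.8.0</version>
  <scope>provided</scope>
</dependency>

<!-- Connector: not part of the Flink runtime, so scope "compile"
     (Maven's default) lets the Maven Shade Plugin bundle it into the
     application jar. -->
<dependency>
  <groupId>org.apache.flink</groupId>
  <artifactId>flink-connector-kafka-0.10_2.11</artifactId>
  <version>1.8.0</version>
  <scope>compile</scope>
</dependency>
```

Note the `_2.11` suffix on both artifact IDs, matching the Scala-version point made later in the same hunk.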

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services
