ChengkaiYang2022 commented on code in PR #20510:
URL: https://github.com/apache/flink/pull/20510#discussion_r945569096


##########
docs/content.zh/docs/try-flink/table_api.md:
##########
@@ -28,51 +28,49 @@ under the License.
 
 # 基于 Table API 实现实时报表
 
-Apache Flink offers a Table API as a unified, relational API for batch and stream processing, i.e., queries are executed with the same semantics on unbounded, real-time streams or bounded, batch data sets and produce the same results.
-The Table API in Flink is commonly used to ease the definition of data analytics, data pipelining, and ETL applications.
+Apache Flink 提供 Table API 作为批流统一的关系型 API,例如,相同语法的查询在无界的实时流数据或者有界的批数据集上执行会得到一致的结果。  
+Table API 在 Flink 中常被用于简化数据分析、数据流水线以及 ETL 应用的定义。
 
-## What Will You Be Building? 
+## 你接下来要搭建的是什么系统?
 
-In this tutorial, you will learn how to build a real-time dashboard to track financial transactions by account.
-The pipeline will read data from Kafka and write the results to MySQL visualized via Grafana.
+在本教程中,你会学习到如何创建一个按账户追踪金融交易的实时看板。  

Review Comment:
   > Sorry, more tips?
   
   I think it might be more fluent if we use '通过' instead of '按' and remove '如何'.
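For context, the sentence under discussion introduces the walkthrough's core task: a real-time report that tracks financial transactions by account. A minimal Table API sketch of that idea follows; it is illustrative only and not the walkthrough's actual code. The table name `transactions`, the columns `account_id` and `amount`, and the use of the `datagen` connector as a stand-in source are assumptions made here for a self-contained example.

```java
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.TableEnvironment;

import static org.apache.flink.table.api.Expressions.$;

public class SpendReportSketch {
    public static void main(String[] args) {
        // Unified Table API entry point; the same query could also run in
        // batch mode via EnvironmentSettings.inBatchMode().
        TableEnvironment tEnv =
                TableEnvironment.create(EnvironmentSettings.inStreamingMode());

        // Stand-in source for illustration; the real walkthrough reads
        // transactions from Kafka and writes the report to MySQL.
        tEnv.executeSql(
                "CREATE TEMPORARY TABLE transactions ("
                        + "  account_id BIGINT,"
                        + "  amount     BIGINT"
                        + ") WITH ('connector' = 'datagen')");

        // Track transactions by account: group by the account and sum amounts.
        Table report = tEnv.from("transactions")
                .groupBy($("account_id"))
                .select($("account_id"), $("amount").sum().as("total_spend"));

        report.execute().print();
    }
}
```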



##########
docs/content.zh/docs/try-flink/table_api.md:
##########
@@ -28,51 +28,49 @@ under the License.
 
 # 基于 Table API 实现实时报表
 
-Apache Flink offers a Table API as a unified, relational API for batch and stream processing, i.e., queries are executed with the same semantics on unbounded, real-time streams or bounded, batch data sets and produce the same results.
-The Table API in Flink is commonly used to ease the definition of data analytics, data pipelining, and ETL applications.
+Apache Flink 提供 Table API 作为批流统一的关系型 API,例如,相同语法的查询在无界的实时流数据或者有界的批数据集上执行会得到一致的结果。  

Review Comment:
   Okay



##########
docs/content.zh/docs/try-flink/table_api.md:
##########
@@ -28,51 +28,49 @@ under the License.
 
 # 基于 Table API 实现实时报表
 
-Apache Flink offers a Table API as a unified, relational API for batch and stream processing, i.e., queries are executed with the same semantics on unbounded, real-time streams or bounded, batch data sets and produce the same results.
-The Table API in Flink is commonly used to ease the definition of data analytics, data pipelining, and ETL applications.
+Apache Flink 提供 Table API 作为批流统一的关系型 API,例如,相同语法的查询在无界的实时流数据或者有界的批数据集上执行会得到一致的结果。  
+Table API 在 Flink 中常被用于简化数据分析、数据流水线以及 ETL 应用的定义。
 
-## What Will You Be Building? 
+## 你接下来要搭建的是什么系统?
 
-In this tutorial, you will learn how to build a real-time dashboard to track financial transactions by account.
-The pipeline will read data from Kafka and write the results to MySQL visualized via Grafana.
+在本教程中,你会学习到如何创建一个按账户追踪金融交易的实时看板。  
+整条流水线为读 Kafka 中的数据并将结果写入到 MySQL 通过 Grafana 展示。
 
-## Prerequisites
+## 准备条件
 
-This walkthrough assumes that you have some familiarity with Java or Scala, but you should be able to follow along even if you come from a different programming language.
-It also assumes that you are familiar with basic relational concepts such as `SELECT` and `GROUP BY` clauses.
+本次代码练习假定你对 Java 或者 Scala 有一定的了解,当然如果你之前使用的是其他编程语言,也应该可以继续学习。  
+我们也假设你对关系型概念例如 `SELECT` 以及 `GROUP BY` 语句有一定的了解。
 
-## Help, I’m Stuck! 
+## 困难求助
 
-If you get stuck, check out the [community support resources](https://flink.apache.org/community.html).
-In particular, Apache Flink's [user mailing list](https://flink.apache.org/community.html#mailing-lists) consistently ranks as one of the most active of any Apache project and a great way to get help quickly. 
+如果遇到问题,可以参考 [社区支持资源](https://flink.apache.org/community.html)。
+Flink 的 [用户邮件列表](https://flink.apache.org/community.html#mailing-lists) 是 Apache 项目中最活跃的一个,这也是快速获得帮助的一个好方法。
 
 {{< hint info >}}
-If running docker on Windows and your data generator container is failing to start, then please ensure that you're using the right shell.
-For example **docker-entrypoint.sh** for **table-walkthrough_data-generator_1** container requires bash.
-If unavailable, it will throw an error **standard_init_linux.go:211: exec user process caused "no such file or directory"**.
-A workaround is to switch the shell to **sh** on the first line of **docker-entrypoint.sh**.
+如果你是在 Windows 上运行的 docker,并且生成数据的容器启动失败了,可以检查下是否使用了正确的脚本。

Review Comment:
   > Shall we specify the "docker" somewhere?
   > 
   > > 在 Windows 环境下,如果用来生成数据的 docker 容器启动失败,请检查使用的脚本是否正确。
   
   Okay


