morsapaes commented on a change in pull request #14003:
URL: https://github.com/apache/flink/pull/14003#discussion_r532042948



##########
File path: docs/dev/table/sql/gettingStarted.md
##########
@@ -0,0 +1,226 @@
+---
+title: "Getting Started - Flink SQL"
+nav-parent_id: sql
+nav-pos: 0
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+
+* This will be replaced by the TOC
+{:toc}
+
+Flink SQL enables SQL developers to design and develop batch or streaming 
applications without writing Java, Scala, or code in any other programming 
language. It provides a unified API for both stream and batch processing and 
lets users perform powerful transformations. Flink's SQL support is based on 
[Apache Calcite](https://calcite.apache.org/), which implements the SQL standard.
+
+In addition to the SQL API, Flink also has a Table API with semantics similar 
to SQL. The Table API is a language-integrated API: users write queries in a 
specific programming language by calling the API. For example, a job creates a 
table environment, reads a table, applies transformations and aggregations, 
and writes the results back to another table. The Table API supports several 
languages, e.g. Java, Scala, and Python. 
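The steps above could be sketched in Java roughly as follows. This is an illustrative sketch only: the table names (`Orders`, `Results`) are hypothetical placeholders and assume matching tables were registered in the catalog beforehand.

```java
// Hypothetical Table API job outline; requires the Flink Table API
// dependencies on the classpath, and "Orders"/"Results" are placeholder
// table names assumed to be registered already.
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;
import static org.apache.flink.table.api.Expressions.$;

public class TableApiSketch {
    public static void main(String[] args) {
        // 1. Create a table environment.
        TableEnvironment tableEnv = TableEnvironment.create(
            EnvironmentSettings.newInstance().inStreamingMode().build());

        // 2. Read a table, apply a transformation, ...
        // 3. ... and write the results back to another table.
        tableEnv.from("Orders")
            .filter($("amount").isGreater(10))
            .select($("user_name"), $("amount"))
            .executeInsert("Results");
    }
}
```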
+ 
+Flink SQL and the Table API are just two different ways to write queries that 
run on the same Flink runtime, and all queries are optimized for efficient 
execution. The SQL API is a declarative way of writing queries using 
well-known SQL syntax, e.g. `SELECT * FROM Table`. Table API queries, on the 
other hand, start from a table, continue with operations such as joins and 
filters, and end with a projection, e.g. `Table.filter(...).select(...)`. 
Standard SQL is easy and quick to learn, even for users with no programming 
background. This article focuses on the Flink SQL API; Table API details can 
be found [here]({{ site.baseurl }}/dev/table/).
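To make the contrast concrete, here is the same logical query written both ways. The snippet is a hedged sketch: `tableEnv` is assumed to be an existing `TableEnvironment`, and `Orders` with its columns are hypothetical names.

```java
// Assumes an existing TableEnvironment "tableEnv" and a registered table
// "Orders" with columns user_name and amount (all hypothetical here).
import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.TableEnvironment;
import static org.apache.flink.table.api.Expressions.$;

class SqlVsTableApi {
    static void compare(TableEnvironment tableEnv) {
        // SQL API: one declarative statement.
        Table viaSql = tableEnv.sqlQuery(
            "SELECT user_name, amount FROM Orders WHERE amount > 10");

        // Table API: start from the table, then filter, then project.
        Table viaTableApi = tableEnv.from("Orders")
            .filter($("amount").isGreater(10))
            .select($("user_name"), $("amount"));
    }
}
```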
+
+### Prerequisites
+You only need basic knowledge of SQL to follow along. You will not need to 
write Java or Scala code or use an IDE.
+
+### Installation
+There are various ways to [install]({{ site.baseurl }}/ops/deployment/) Flink. 
The easiest option for experimentation is to download the binaries and run 
them locally. We assume a [local installation]({{ site.baseurl 
}}/try-flink/local_installation.html) for the rest of the tutorial. You can 
start a local cluster with the following command from the installation folder:
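For reference, with a local binary installation the start and stop scripts look like this (run from the extracted Flink folder):

```shell
# From the Flink installation folder: start a local cluster
# (one JobManager and one TaskManager).
./bin/start-cluster.sh

# Stop the local cluster again when you are done.
./bin/stop-cluster.sh
```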

Review comment:
       Don't think it's relevant to link to deployment modes here — this could 
be confusing to new users.




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org
