davidradl commented on code in PR #27243:
URL: https://github.com/apache/flink/pull/27243#discussion_r2533334790


##########
docs/content/docs/dev/table/tableApi.md:
##########
@@ -2735,6 +2735,114 @@ result = t.select(col('a'), col('c')) \
 
 {{< query_state_warning >}}
 
+### Model Inference
+
+{{< label Streaming >}}
+
+The Table API supports model inference operations that allow you to integrate machine learning models directly into your data processing pipelines. You can create models with specific providers (like OpenAI) and use them to make predictions on your data.
+
+#### Creating and Using Models
+
+Models are created using `ModelDescriptor` which specifies the provider, input/output schemas, and configuration options. Once created, you can use the model to make predictions on tables.
+
+{{< tabs "model-inference" >}}
+{{< tab "Java" >}}
+
+```java
+// 1. Set up the local environment
+EnvironmentSettings settings = EnvironmentSettings.inStreamingMode();
+TableEnvironment tEnv = TableEnvironment.create(settings);
+
+// 2. Create a source table from in-memory data
+Table myTable = tEnv.fromValues(
+    ROW(FIELD("text", STRING())),
+    row("Hello"),
+    row("Machine Learning"),
+    row("Good morning")
+);
+
+// 3. Create model
+tEnv.createModel(

Review Comment:
   I suggest that we show the return value being assigned to a variable, and maybe call a method on that object. It is odd to see a build() whose return value is never used.
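
   For example, something along these lines (a rough sketch only; the exact `ModelDescriptor` builder methods are assumptions based on the surrounding docs, not necessarily the final API):

   ```java
   // Rough sketch of the suggestion, not the exact API: assign the built
   // descriptor to a variable so the build() result is visibly used, then
   // pass it to createModel when registering the model.
   ModelDescriptor descriptor = ModelDescriptor.forProvider("openai") // assumed factory method
           .inputSchema(Schema.newBuilder().column("text", DataTypes.STRING()).build())
           .outputSchema(Schema.newBuilder().column("response", DataTypes.STRING()).build())
           .option("endpoint", "https://example.com/v1/chat/completions") // placeholder option
           .build();

   // Register the model under a name; the local variable makes the build()
   // result explicit and can be reused or inspected afterwards.
   tEnv.createModel("MyModel", descriptor);
   ```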



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
