lihaosky commented on code in PR #27243:
URL: https://github.com/apache/flink/pull/27243#discussion_r2534984994
##########
docs/content/docs/dev/table/tableApi.md:
##########
@@ -2735,6 +2735,114 @@ result = t.select(col('a'), col('c')) \
{{< query_state_warning >}}
+### Model Inference
+
+{{< label Streaming >}}
+
+The Table API supports model inference operations that allow you to integrate machine learning models directly into your data processing pipelines. You can create models with specific providers (like OpenAI) and use them to make predictions on your data.
Review Comment:
I can remove `like OpenAI` here
##########
docs/content/docs/dev/table/tableApi.md:
##########
@@ -2735,6 +2735,114 @@ result = t.select(col('a'), col('c')) \
{{< query_state_warning >}}
+### Model Inference
+
+{{< label Streaming >}}
+
+The Table API supports model inference operations that allow you to integrate machine learning models directly into your data processing pipelines. You can create models with specific providers (like OpenAI) and use them to make predictions on your data.
Review Comment:
I can change `predictions` to `inference` here
##########
docs/content/docs/dev/table/tableApi.md:
##########
@@ -2735,6 +2735,114 @@ result = t.select(col('a'), col('c')) \
{{< query_state_warning >}}
+### Model Inference
+
+{{< label Streaming >}}
+
+The Table API supports model inference operations that allow you to integrate machine learning models directly into your data processing pipelines. You can create models with specific providers (like OpenAI) and use them to make predictions on your data.
+
+#### Creating and Using Models
+
+Models are created using `ModelDescriptor` which specifies the provider, input/output schemas, and configuration options. Once created, you can use the model to make predictions on tables.
+
+{{< tabs "model-inference" >}}
+{{< tab "Java" >}}
+
+```java
+// 1. Set up the local environment
+EnvironmentSettings settings = EnvironmentSettings.inStreamingMode();
+TableEnvironment tEnv = TableEnvironment.create(settings);
+
+// 2. Create a source table from in-memory data
+Table myTable = tEnv.fromValues(
+    ROW(FIELD("text", STRING())),
+    row("Hello"),
+    row("Machine Learning"),
+    row("Good morning")
+);
+
+// 3. Create model
+tEnv.createModel(
Review Comment:
I think what you suggested makes sense. The behavior here is consistent with
`createTable`, which returns nothing. I can follow up to make `createModel`
return `Model`.
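For reference, a minimal sketch of the two behaviors (assuming `descriptor` is the `ModelDescriptor` built in the snippet above); the `Model`-returning variant is only the proposed follow-up, not an existing API:

```java
// Current behavior, consistent with createTable: createModel returns void,
// so the Model handle is fetched in a second step.
tEnv.createModel("my_model", descriptor);
Model model = tEnv.fromModel("my_model");

// Proposed follow-up (hypothetical signature, not yet in the API):
// Model model = tEnv.createModel("my_model", descriptor);
```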
##########
docs/content/docs/dev/table/tableApi.md:
##########
@@ -2735,6 +2735,114 @@ result = t.select(col('a'), col('c')) \
{{< query_state_warning >}}
+### Model Inference
+
+{{< label Streaming >}}
+
+The Table API supports model inference operations that allow you to integrate machine learning models directly into your data processing pipelines. You can create models with specific providers (like OpenAI) and use them to make predictions on your data.
+
+#### Creating and Using Models
+
+Models are created using `ModelDescriptor` which specifies the provider, input/output schemas, and configuration options. Once created, you can use the model to make predictions on tables.
+
+{{< tabs "model-inference" >}}
+{{< tab "Java" >}}
+
+```java
+// 1. Set up the local environment
+EnvironmentSettings settings = EnvironmentSettings.inStreamingMode();
+TableEnvironment tEnv = TableEnvironment.create(settings);
+
+// 2. Create a source table from in-memory data
+Table myTable = tEnv.fromValues(
+    ROW(FIELD("text", STRING())),
+    row("Hello"),
+    row("Machine Learning"),
+    row("Good morning")
+);
+
+// 3. Create model
+tEnv.createModel(
+ "my_model",
+ ModelDescriptor.forProvider("openai")
+ .inputSchema(Schema.newBuilder().column("input", STRING()).build())
+ .outputSchema(Schema.newBuilder().column("output", STRING()).build())
+ .option("endpoint", "https://api.openai.com/v1/chat/completions")
+ .option("model", "gpt-4.1")
+ .option("system-prompt", "translate to chinese")
+ .option("api-key", "<your-openai-api-key-here>")
+ .build()
+);
+
+Model model = tEnv.fromModel("my_model");
+
+// 4. Use the model to make predictions
Review Comment:
It's in the `system-prompt` option: "translate to chinese". I can also mention it here.
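For example, the added note could point out that the output is driven by the system prompt. The `predict` call below is a hypothetical sketch for illustration, not necessarily the method this PR introduces:

```java
// 4. Use the model to make predictions. With the system prompt
// "translate to chinese", the "output" column is expected to contain the
// Chinese translation of each "text" value (e.g. "Hello" -> "你好").
// NOTE: `predict` is assumed here for illustration only.
Table result = model.predict(myTable);
result.execute().print();
```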