[ https://issues.apache.org/jira/browse/CAMEL-21019?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Tadayoshi Sato updated CAMEL-21019:
-----------------------------------
    Fix Version/s: 4.10.0

> Add a component for TensorFlow Serving
> --------------------------------------
>
>                 Key: CAMEL-21019
>                 URL: https://issues.apache.org/jira/browse/CAMEL-21019
>             Project: Camel
>          Issue Type: New Feature
>          Components: camel-ai
>    Affects Versions: 4.7.0
>            Reporter: Tadayoshi Sato
>            Assignee: Tadayoshi Sato
>            Priority: Major
>             Fix For: 4.10.0
>
>
> Running a TensorFlow model is already supported through the Camel DJL 
> component. However, Camel users might prefer to offload inference to an 
> external server instead of running it inside the Camel route. For TensorFlow 
> models, this is typically done with [TensorFlow 
> Serving|https://www.tensorflow.org/tfx/guide/serving], a model server that 
> exposes REST and gRPC APIs for inference with TensorFlow models. Camel should 
> provide a producer component that makes it easy to invoke the TensorFlow 
> Serving API from routes, as sketched below.
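>
> Until such a component exists, a route can already call TensorFlow Serving's 
> REST predict endpoint with the plain HTTP component. The following is only a 
> minimal sketch of that workaround, assuming a local TensorFlow Serving 
> instance on port 8501 serving a model named "half_plus_two" (both are 
> placeholders); the proposed component would wrap this call in a dedicated, 
> TensorFlow Serving-aware endpoint.
> {code:java}
> import org.apache.camel.Exchange;
> import org.apache.camel.builder.RouteBuilder;
>
> public class TensorFlowServingRoute extends RouteBuilder {
>     @Override
>     public void configure() {
>         // TensorFlow Serving's REST API accepts a JSON body such as
>         // {"instances": [1.0, 2.0, 5.0]} on /v1/models/{model}:predict
>         // and returns the predictions as JSON.
>         from("direct:predict")
>             .setHeader(Exchange.HTTP_METHOD, constant("POST"))
>             .setHeader(Exchange.CONTENT_TYPE, constant("application/json"))
>             .to("http://localhost:8501/v1/models/half_plus_two:predict");
>     }
> }
> {code}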



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
