Abhishek-kumar-samsung commented on PR #32649:
URL: https://github.com/apache/superset/pull/32649#issuecomment-2791584496

   1. Yes, the agent creation is currently tightly coupled with LangChain, but I 
can implement the agent without it, i.e. with a loop of thought, action, action 
input, and observation, and move the prompt template to the environment (I was 
waiting for a green signal to work on this). Regarding tight coupling with 
OpenAI: no, it is not coupled. One can run a custom LLM locally and configure 
'base_url' in superset_config; I am only using the openai library. I am not even 
using an OpenAI token while testing (I am using a locally running llama3.1-8b).
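   The loop described above could be sketched roughly as follows without LangChain. This is a minimal illustration only: the tool registry, prompt format, and `react_loop` helper are hypothetical, not the PR's actual code.

   ```python
   import re

   # Hypothetical tool registry; names and behavior are illustrative only.
   TOOLS = {
       "get_chart_data": lambda chart_title: f"<data for {chart_title}>",
   }

   ACTION_RE = re.compile(r"Action:\s*(\w+)\s*\nAction Input:\s*(.+)", re.DOTALL)

   def react_loop(llm, question, max_steps=5):
       """Minimal thought/action/action-input/observation loop.

       `llm` is any callable that takes the transcript so far and returns the
       model's next reply as text (e.g. an openai client pointed at a local
       'base_url').
       """
       transcript = f"Question: {question}\n"
       for _ in range(max_steps):
           reply = llm(transcript)
           transcript += reply + "\n"
           # The model signals completion with a "Final Answer:" line.
           if "Final Answer:" in reply:
               return reply.split("Final Answer:")[-1].strip()
           # Otherwise, parse the requested tool call and feed back an observation.
           match = ACTION_RE.search(reply)
           if match:
               tool, tool_input = match.group(1), match.group(2).strip()
               observation = TOOLS[tool](tool_input)
               transcript += f"Observation: {observation}\n"
       return None
   ```

   The same loop works with any OpenAI-compatible backend, local or remote, since only the `llm` callable changes.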
   
   2. Almost all LLM configuration is kept in superset_config. I thought that, 
based on use cases, we could have separate prompts in the backend, which is why I 
didn't keep the prompt in env (I can do that if wanted).
   
   3. No, users need not share data with external providers; I have kept 
'base_url' in superset_config. If anyone runs their own LLM, e.g. using TGI, they 
can configure the local LLM path and no data will go outside. Only if the user 
uses an OpenAI token is there an external call.
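   For illustration, a fully local setup in superset_config could look like the fragment below. The key names here are hypothetical placeholders, not necessarily the ones this PR defines; the point is that 'base_url' targets a local OpenAI-compatible server (e.g. TGI or vLLM), so no data leaves the host.

   ```python
   # superset_config.py -- illustrative key names only.
   LLM_BASE_URL = "http://localhost:8000/v1"  # local OpenAI-compatible endpoint
   LLM_API_KEY = "not-needed-for-local"       # local servers usually ignore the key
   LLM_MODEL = "llama3.1-8b"                  # locally served model
   ```

   Pointing LLM_BASE_URL at api.openai.com instead is what turns this into an external call.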
   
   4. Yes, I also thought of moving the prompt to env; I can do this.
   
   5. A specific user could configure this once the prompt is moved to env, but I 
didn't see a benefit to it, so I didn't do it.
   
   6. Yes, a RAG approach for deciding which chart-data tool should be called is 
possible. I didn't add it earlier because it would introduce a new vector DB 
Docker component (I thought that if I added a new container service, no one 
would review the PR). Currently it is not there; I have left it to the LLM to 
decide which chart-data tool to call based on chart titles.
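   The retrieval step could be sketched like this. This is a toy bag-of-words similarity standing in for a real embedding model plus vector DB; the function names are hypothetical and nothing here is from the PR.

   ```python
   import math
   from collections import Counter

   def embed(text):
       """Toy bag-of-words vector; a real setup would use an embedding model."""
       return Counter(text.lower().split())

   def cosine(a, b):
       dot = sum(a[w] * b[w] for w in a)
       norm = (math.sqrt(sum(v * v for v in a.values()))
               * math.sqrt(sum(v * v for v in b.values())))
       return dot / norm if norm else 0.0

   def pick_chart_tool(question, chart_titles):
       """Retrieve the chart whose title best matches the user's question."""
       q = embed(question)
       return max(chart_titles, key=lambda title: cosine(q, embed(title)))
   ```

   With a vector DB the titles (or richer chart metadata) would be indexed once and queried per question, instead of scored on every call as above.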
   
   7. Yes, async is already supported; I show Superset's built-in progress bar 
until the response is fetched.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

