On Friday, 17 October 2025 at 07:53:58 UTC, Mark Davies wrote:
I should have mentioned this in my previous reply: one of the code blobs I generated is the core of an agent written in DLang. It has a WebSocket interface to the HTML front end and an OpenAI-compatible interface to llama.cpp, with tool-calling support and conversation tracking. I've put it on the back burner for now, as I can't get tool calling working the way I want.
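
For illustration only, here is a minimal sketch of what such a tool-calling core might look like in D. The names (Tool, AgentCore, handleAssistantMessage) are hypothetical and not taken from the actual project; the only assumption is an OpenAI-style chat-completions response with a tool_calls array, as llama.cpp's server exposes.

```d
// Hypothetical sketch, not the actual agent code: a tiny tool registry
// plus conversation tracking around OpenAI-style tool calls.
import std.json;
import std.format;

// A registered tool: a name plus a handler taking JSON arguments.
struct Tool
{
    string name;
    string delegate(JSONValue args) run;
}

// Minimal agent core: keeps the message history and dispatches tool calls.
struct AgentCore
{
    Tool[string] tools;          // registry keyed by tool name
    JSONValue[] conversation;    // running message history

    void register(Tool t) { tools[t.name] = t; }

    // Handle one assistant message from the OpenAI-compatible endpoint
    // (e.g. llama.cpp's /v1/chat/completions). If it carries tool_calls,
    // run them and append the results to the conversation as role "tool"
    // so the next completion request can see them.
    void handleAssistantMessage(JSONValue msg)
    {
        conversation ~= msg;
        if ("tool_calls" !in msg.object) return;

        foreach (call; msg["tool_calls"].array)
        {
            auto fn   = call["function"];
            auto name = fn["name"].str;
            auto args = parseJSON(fn["arguments"].str);

            auto result = (name in tools)
                ? tools[name].run(args)
                : format("unknown tool: %s", name);

            conversation ~= JSONValue([
                "role":    JSONValue("tool"),
                "name":    JSONValue(name),
                "content": JSONValue(result),
            ]);
        }
    }
}
```

The WebSocket side to the HTML front end would sit around this, pushing each appended conversation entry out to the browser; that part is omitted here since it depends on the server library used.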

All the code was generated by AI models.

Mark.

Sounds nice.

How do you handle async tasks for agents?
