On Tuesday, November 4, 2025 at 8:32:11 PM UTC Hannes Stauss wrote:

This is an interesting approach. 


Thank you.
 

I have the following questions/feedback on the design:

   1. In view of the intent to broadcast a non-empty value to all receivers 
   of a channel, my intuition would be that the receivers should receive the 
   value only once. What was the rationale, from a logical point of view, 
   behind the continued receiving of the "sticky" value? Maybe a broadcast 
   send could be initiated with ch <* x. The receiving goroutine would receive 
   the value and would park until the channel is signaled to be ready for 
   another value, either a sticky or a non-sticky one. This could eliminate 
   the application-logic part of versioning the receives and the unnecessary 
   loops around that.

For other readers, Hannes is referring to this discussion
of how we can replace condition variable use cases
with sticky sends 
(https://github.com/glycerine/pont?tab=readme-ov-file#10-with-version-numbers-readers-can-poll-conditions).

That would be less general than condition variables,
and it is readily achieved today by having each receiver nil out
its own copy of the channel after it has
received on it however many times it wishes.
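For concreteness, a rough sketch of that nil-out trick in today's Go (the
channel and variable names here are mine, purely for illustration): receives
on a nil channel block forever, so assigning nil to the receiver's local copy
permanently disables that select case after the first receive.

```go
package main

import "fmt"

func main() {
	updates := make(chan int, 1)
	updates <- 42 // the value the receiver only wants to see once

	ch := updates // the receiver's own local copy of the channel
	for i := 0; i < 3; i++ {
		select {
		case v := <-ch:
			fmt.Println("got", v)
			// Receives on a nil channel block forever, so this
			// case never fires again for this receiver.
			ch = nil
		default:
			fmt.Println("nothing to receive")
		}
	}
}
```

Each receiver nils only its own copy, so other receivers of the same channel
are unaffected.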

Generally we don't want to store extra state in the 
channel if we can help it. Currently I've added only
two integers to each channel (one for sticky sends,
one for final sends), and even that feels kind of rich :)


   2. The block-until-full receive x <| ch seems to be a means to 
   synchronize on goroutine termination in a select, thus allowing one to 
   implement a timeout. However, one would need to know the number of 
   goroutines to synchronize on beforehand, which does not seem to be the 
   universal case. The intent to have a timeout on Wait could simply be 
   achieved by Wait() replicating the context Done() logic.

Correct. The full-receive operator defines "full" according
to the channel's capacity, which is a constant quantity
once the channel is made. If that isn't known, I'm not sure how
one would define "full".

Yes, this was indeed aimed at the common case: when
you can allocate a channel of fixed size corresponding
to the known number of subtask goroutines that will
be started shortly thereafter.

For more complex scenarios, I think you would need
to deploy supervision trees a la Erlang. I have written
a package for this. I typically use trees of idem.Halter
from my https://github.com/glycerine/idem
package, as it already solves the otherwise tricky
lock-ordering deadlocks that one can run into when multiple
packages use supervision trees together. 

Thanks again for your interest. Any other questions, let me know.

Jason

-- 
You received this message because you are subscribed to the Google Groups 
"golang-nuts" group.
To view this discussion visit 
https://groups.google.com/d/msgid/golang-nuts/3dfd718f-bb7b-4214-9410-9910a33fea1en%40googlegroups.com.