I'm a freshman. I went through the Tour of Go smoothly today until I 
encountered "Exercise: Web Crawler". The exercise is here: 
https://tour.golang.org/concurrency/10 .
This exercise requires me to implement a concurrent crawler. I wrote a main 
function like this:
 

> func main() {
>     go Crawl("http://golang.org/", 4, fetcher)
> }


Yes, the program terminated right after it created the goroutine, and nothing 
was printed. I thought the program would wait for all goroutines to finish 
without any explicit coding, but it doesn't. The canonical way to wait is 
sync.WaitGroup, roughly like the sketch below, but the tutorial never 
introduces it. I think it should be covered before this exercise.
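
For clarity, this is the kind of minimal sketch I have in mind. The actual 
Crawl call from the exercise is replaced by a placeholder so it compiles on 
its own:

> package main
>
> import (
>     "fmt"
>     "sync"
> )
>
> func main() {
>     var wg sync.WaitGroup
>
>     wg.Add(1)
>     go func() {
>         defer wg.Done()
>         // in the exercise this would be the Crawl call,
>         // e.g. Crawl("http://golang.org/", 4, fetcher)
>         fmt.Println("crawling...")
>     }()
>
>     // without this, main returns and the program exits
>     // before the goroutine gets a chance to run
>     wg.Wait()
> }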
