awm

Reputation: 1200

Go routine to download from a URL to a file

I am currently learning Go. After learning some of the basics, I have been trying to write a small program that concurrently downloads web pages (URLs in a slice) to different files using goroutines. Here is some code that I wrote:

func downloadFromUrl(url string) {
    tokens := strings.Split(url, "/")
    fileName := tokens[len(tokens)-1]
    // I took out the bit that downloads the file for testing.
    fmt.Println("Downloading", url, "to", fileName)
}

I commented out the bit that actually downloads the page for testing. In my main function I am doing this:

func main() {
    urls := []string{"http://www.google.com", "http://www.yahoo.com", "http://www.google.com"}

    for _, url := range urls {
        fmt.Println(url)
        go downloadFromUrl(url)
    }
}

The problem is that when I use the statement go downloadFromUrl(url), the function downloadFromUrl never runs. But if I just call downloadFromUrl(url) in the loop, it works fine. What am I doing wrong? Do I have to use channels with the goroutines?

Upvotes: 1

Views: 1973

Answers (1)

OneOfOne

Reputation: 99225

The problem is that main exits before your goroutines return; the easiest solution is to use a sync.WaitGroup.

func main() {
    urls := []string{"http://www.google.com", "http://www.yahoo.com", "http://www.google.com"}
    var wg sync.WaitGroup
    for _, url := range urls {
        wg.Add(1)
        log.Println(url)
        go downloadFromUrl(url, &wg)
    }
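    // Block until every goroutine has called Done.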
    wg.Wait()
}

func downloadFromUrl(url string, wg *sync.WaitGroup) {
    defer wg.Done()
    // ...
}
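
For completeness, the elided download step could look something like this. It's a minimal sketch using net/http, os, and io.Copy, assuming each page should be written to a file named after the last URL path segment (it also needs the net/http, os, io, strings, fmt, and log imports), with error handling kept deliberately simple:

func downloadFromUrl(url string, wg *sync.WaitGroup) {
    defer wg.Done()

    // Derive the output file name from the last path segment of the URL.
    tokens := strings.Split(url, "/")
    fileName := tokens[len(tokens)-1]
    fmt.Println("Downloading", url, "to", fileName)

    resp, err := http.Get(url)
    if err != nil {
        log.Println("error fetching", url, ":", err)
        return
    }
    defer resp.Body.Close()

    out, err := os.Create(fileName)
    if err != nil {
        log.Println("error creating", fileName, ":", err)
        return
    }
    defer out.Close()

    // Stream the response body straight into the file.
    if _, err := io.Copy(out, resp.Body); err != nil {
        log.Println("error writing", fileName, ":", err)
    }
}

You don't need channels for this; a channel would only come into play if you wanted to send results or errors from the goroutines back to main.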

Upvotes: 6
