Reputation: 19
I have an upload program that I am working on and I am running into an issue. I have n goroutines that handle uploading the parts of a big file. Essentially it will split the file into 100MB chunks and upload them concurrently, depending on the number of concurrent processes you specify in the config.
The issue I'm having is that when I create a buffer to read the file and upload, the make([]byte, 100000000) call hangs... but only if it's in a goroutine. (I'm using 100000000 to simplify the upload calculations.)
Here is an example.
This works: https://play.golang.org/p/tkn8JVir9S
package main

import (
    "fmt"
)

func main() {
    buffer := make([]byte, 100000000)
    fmt.Println(len(buffer))
}
This doesn't: https://play.golang.org/p/H8626OLpqQ
package main

import (
    "fmt"
)

func main() {
    go createBuffer()
    for {
    }
}

func createBuffer() {
    buffer := make([]byte, 100000000)
    fmt.Println(len(buffer))
}
It just hangs... I'm not sure if there is a memory constraint for a goroutine? I tried to research this but couldn't find anything. Any thoughts would be appreciated.
EDIT: Thanks everyone for the feedback. I will say I didn't explain the real issue very well and will try to provide more of a holistic view next time. I ended up using a channel to block to keep my goroutines ready for new files to process. This is for a DR backup uploading to a 3rd party all that requires large files to be split into 100mb chunks. I guess I should have been more clear as to the nature of my program.
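A minimal sketch of that channel-based setup (uploadAll, uploadChunk, and the worker/part counts are illustrative stand-ins, not the real program):

```go
package main

import (
	"fmt"
	"sync"
)

// uploadChunk stands in for the real 3rd-party upload call.
func uploadChunk(part int, data []byte) {
	fmt.Printf("uploaded part %d (%d bytes)\n", part, len(data))
}

// uploadAll fans chunk indices out to numWorkers goroutines over a
// channel; receiving on the channel blocks each worker until new
// work arrives. Returns the total number of bytes "uploaded".
func uploadAll(numWorkers, numParts, chunkSize int) int {
	parts := make(chan int)
	results := make(chan int)
	var wg sync.WaitGroup

	for w := 0; w < numWorkers; w++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			for part := range parts {
				buf := make([]byte, chunkSize) // allocating inside the goroutine is fine
				uploadChunk(part, buf)
				results <- len(buf)
			}
		}()
	}

	// Feed the work queue, then close the results channel once all
	// workers are done.
	go func() {
		for i := 0; i < numParts; i++ {
			parts <- i
		}
		close(parts)
		wg.Wait()
		close(results)
	}()

	total := 0
	for n := range results {
		total += n
	}
	return total
}

func main() {
	fmt.Println(uploadAll(4, 3, 100000000)) // 4 workers, 3 x 100MB chunks
}
```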
Upvotes: 0
Views: 341
Reputation: 8446
This program hangs because of the infinite for {} loop in your code, not because of the goroutine. Try running just the loop, like this, to prove it to yourself:
func main() {
for {
}
}
If you just want to see the fmt.Println(..) output before the program exits, then I'd recommend a time.Sleep call or similar.
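For example, a quick sketch of that (the one-second sleep is arbitrary, and createBuffer returns the length here just for illustration):

```go
package main

import (
	"fmt"
	"time"
)

func createBuffer() int {
	buffer := make([]byte, 100000000)
	fmt.Println(len(buffer))
	return len(buffer)
}

func main() {
	go createBuffer()
	// Crude but fine for a demo: give the goroutine a moment to run
	// before main returns and the program exits.
	time.Sleep(time.Second)
}
```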
If you would like to wait for a bunch of goroutines to complete, then I'd recommend this excellent answer to that exact question.
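That approach boils down to sync.WaitGroup; a minimal sketch applied to the code from the question:

```go
package main

import (
	"fmt"
	"sync"
)

func createBuffer() int {
	buffer := make([]byte, 100000000)
	fmt.Println(len(buffer))
	return len(buffer)
}

func main() {
	var wg sync.WaitGroup
	wg.Add(1)
	go func() {
		defer wg.Done()
		createBuffer()
	}()
	wg.Wait() // blocks until the goroutine finishes; no busy loop needed
}
```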
Upvotes: 1