squarism

Reputation: 3307

Reading a file concurrently

The reading part isn't concurrent but the processing is. I phrased the title this way because I'm most likely to search for this problem again using that phrase. :)

I'm getting a deadlock after trying to go beyond the examples so this is a learning experience for me. My goals are these:

  1. Read a file line by line (eventually use a buffer to do groups of lines).
  2. Pass off the text to a func() that does some regex work.
  3. Send the results somewhere but avoid mutexes or shared variables. I'm sending ints (always the number 1) to a channel. It's sort of silly but if it's not causing problems I'd like to leave it like this unless you folks have a neater option.
  4. Use a worker pool to do this. I'm not sure how I tell the workers to requeue themselves?

Here is the playground link. I tried to write helpful comments, hopefully this makes sense. My design could be completely wrong so don't hesitate to refactor.

package main

import (
  "bufio"
  "fmt"
  "regexp"
  "strings"
  "sync"
)

func telephoneNumbersInFile(path string) int {
  file := strings.NewReader(path)

  var telephone = regexp.MustCompile(`\(\d+\)\s\d+-\d+`)

  // do I need buffered channels here?
  jobs := make(chan string)
  results := make(chan int)

  // I think we need a wait group, not sure.
  wg := new(sync.WaitGroup)

  // start up some workers that will block and wait?
  for w := 1; w <= 3; w++ {
    wg.Add(1)
    go matchTelephoneNumbers(jobs, results, wg, telephone)
  }

  // go over a file line by line and queue up a ton of work
  scanner := bufio.NewScanner(file)
  for scanner.Scan() {
    // Later I want to create a buffer of lines, not just line-by-line here ...
    jobs <- scanner.Text()
  }

  close(jobs)
  wg.Wait()

  // Add up the results from the results channel.
  // The rest of this isn't even working so ignore for now.
  counts := 0
  // for v := range results {
  //   counts += v
  // }

  return counts
}

func matchTelephoneNumbers(jobs <-chan string, results chan<- int, wg *sync.WaitGroup, telephone *regexp.Regexp) {
  // Decreasing internal counter for wait-group as soon as goroutine finishes
  defer wg.Done()

  // eventually I want to have a []string channel to work on a chunk of lines not just one line of text
  for j := range jobs {
    if telephone.MatchString(j) {
      results <- 1
    }
  }
}

func main() {
  // An artificial input source.  Normally this is a file passed on the command line.
  const input = "Foo\n(555) 123-3456\nBar\nBaz"
  numberOfTelephoneNumbers := telephoneNumbersInFile(input)
  fmt.Println(numberOfTelephoneNumbers)
}

Upvotes: 18

Views: 25496

Answers (2)

jjm

Reputation: 6178

Edit: The answer by @tomasz above is the correct one. Please disregard this answer.

You need to do two things:

  1. use buffered channels so that sending doesn't block
  2. close the results channel so that receiving doesn't block.

Buffered channels matter here because an unbuffered channel needs a receiver ready for every send, and that's what's causing the deadlock you're hitting.

If you fix that, you'll run into a deadlock when you try to receive the results, because results hasn't been closed.

Here's the fixed playground: http://play.golang.org/p/DtS8Matgi5

Upvotes: 1

tomasz

Reputation: 13052

You're almost there; you just need a little work on the goroutines' synchronisation. Your problem is that you're trying to feed the parser and collect the results in the same routine, and that can't work.

I propose the following:

  1. Run the scanner in a separate routine, and close the input channel once everything is read.
  2. Run another routine that waits for the parsers to finish their job, then closes the output channel.
  3. Collect all the results in your main routine.

The relevant changes could look like this:

// Go over a file line by line and queue up a ton of work
go func() {
    scanner := bufio.NewScanner(file)
    for scanner.Scan() {
        jobs <- scanner.Text()
    }
    close(jobs)
}()

// Collect all the results...
// First, make sure we close the result channel when everything was processed
go func() {
    wg.Wait()
    close(results)
}()

// Now, add up the results from the results channel until closed
counts := 0
for v := range results {
    counts += v
}

Fully working example on the playground: http://play.golang.org/p/coja1_w-fY

It's worth adding that you don't strictly need the WaitGroup to achieve the same thing; all you really need to know is when to stop receiving results. This could be achieved, for example, by having the scanner advertise (on a channel) how many lines were read and having the collector read exactly that many results (the workers would need to send zeros for non-matching lines as well, though).

Upvotes: 16
