Konstantin

Reputation: 337

High memory consumption during method execution

For a project I want to manually create structs for each of the approximately 50 million rows of a CSV file. To do this, I iterate through the file line by line and append each struct to a slice. This is the dumbed-down method:

func readCSV(filePath string) DataFrame {
    file, _ := os.Open(filePath)
    defer file.Close()
    var rows []Row
    scanner := bufio.NewScanner(file)
    scanner.Scan()
    for scanner.Scan() {
        parts := strings.Split(scanner.Text(), ",")
        if len(parts) < 7 {
            continue
        }
        column1, _ := strconv.Atoi(parts[0])
        column2, _ := strconv.ParseFloat(parts[1], 32)
        column3, _ := strconv.ParseFloat(parts[2], 32)
        column4 := parts[3]
        column5, _ := strconv.ParseFloat(parts[4], 32)
        column6 := parts[5]
        column7 := parts[6]
        row := Row{
            Column1: column1,
            Column2: column2,
            Column3: column3,
            Column4: column4,
            Column5: column5,
            Column6: column6,
            Column7: column7,
        }
        rows = append(rows, row)
    }
    return DataFrame{
        Rows: rows,
    }
}

The resulting DataFrame occupies around 3 GB of memory. The problem is that RAM consumption goes through the roof during method execution: the Go process uses 15 GB+ of memory, which makes the function unusable for my purpose. Once the slice is returned, the RAM consumption of the process drops to the expected 3 GB.

The heap profile looks like this:

    3.26GB     5.81GB (flat, cum)   100% of Total
         .          .     62:   scanner := bufio.NewScanner(file)
         .          .     63:   scanner.Scan()
         .          .     64:   for scanner.Scan() {
         .     2.55GB     65:           parts := strings.Split(scanner.Text(), ",")
         .          .     66:           if len(parts) < 7 {
         .          .     67:                   continue
         .          .     68:           }
         .          .     69:           column1, _ := strconv.Atoi(parts[0])
         .          .     70:           column2, _ := strconv.ParseFloat(parts[1], 32)
         .          .     71:           column3, _ := strconv.ParseFloat(parts[2], 32)
         .          .     72:           column4 := parts[3]
         .          .     73:           column5, _ := strconv.ParseFloat(parts[4], 32)
         .          .     74:           column6 := parts[5]
         .          .     75:           column7 := parts[6]
         .          .     76:           row := Row{
         .          .     77:                   Column1: column1,
         .          .     78:                   Column2: column2,
         .          .     79:                   Column3: column3,
         .          .     80:                   Column4: column4,
         .          .     81:                   Column5: column5,
         .          .     82:                   Column6: column6,
         .          .     83:                   Column7: column7,
         .          .     84:           }
    3.26GB     3.26GB     85:           rows = append(rows, row)
         .          .     86:   }
         .          .     87:
         .          .     88:   return DataFrame{
         .          .     89:           Rows: rows,

I am clueless as to where the high RAM consumption comes from. I tried calling the garbage collector manually, without success. Can anyone give me a hint?
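
For reference, this is what a manual GC call looks like in Go (debug.FreeOSMemory is the variant that also tries to return freed memory to the OS):

import (
    "runtime"
    "runtime/debug"
)

func forceGC() {
    runtime.GC()         // force an immediate collection
    debug.FreeOSMemory() // collect, then return as much memory to the OS as possible
}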

Upvotes: 2

Views: 529

Answers (1)

Schwern

Reputation: 164639

rows is a slice of Row structs, not pointers. Each Row costs 32 bytes for the int and float64 fields, plus three string headers and however long the strings are. With 50 million rows that gets quite large. Worse, append grows the backing array geometrically, so it can wind up allocating a lot of extra capacity, and every time it grows it allocates a new, larger array and copies all the elements over, discarding the smaller one. Those discarded arrays must wait to be garbage collected, which bloats the memory usage while the function runs.
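
You can watch this happen. Here is a minimal sketch that logs every time append has to reallocate the backing array:

package main

import "fmt"

func main() {
    var xs []int
    lastCap := 0
    for i := 0; i < 1_000_000; i++ {
        xs = append(xs, i)
        if cap(xs) != lastCap {
            // append allocated a new, larger backing array and copied
            // everything over; the old array is now garbage.
            fmt.Println("len", len(xs), "cap", cap(xs))
            lastCap = cap(xs)
        }
    }
}

If the row count is known up front, make([]Row, 0, n) skips all of those reallocations.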

This can be mitigated by storing pointers instead, which makes rows itself dramatically smaller and therefore far cheaper to grow and copy.

var rows []*Row
...
rows = append(rows, &row)
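
To make the difference concrete, here is a sketch comparing element sizes; the Row layout is inferred from the question's parsing code, so treat it as an assumption:

package main

import (
    "fmt"
    "unsafe"
)

// Row layout inferred from the question: one int, three float64s, three strings.
type Row struct {
    Column1                   int
    Column2, Column3, Column5 float64
    Column4, Column6, Column7 string
}

func main() {
    // On 64-bit: 8 (int) + 24 (three float64) + 48 (three string headers).
    fmt.Println(unsafe.Sizeof(Row{}))  // 80
    fmt.Println(unsafe.Sizeof(&Row{})) // 8
}

The tradeoff is one extra heap allocation per Row, but each grow-and-copy of rows now moves 8 bytes per element instead of 80.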

However, the real problem is slurping everything in at once. This is Go! We can use channels and goroutines to read one row at a time, concurrently with whatever processing consumes the rows.

CSVs are deceptively tricky. Go already has a CSV library, encoding/csv, so we'll use that.
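
For example, a quoted field containing a comma defeats naive splitting:

package main

import (
    "encoding/csv"
    "fmt"
    "strings"
)

func main() {
    line := `1,"Doe, Jane",42`

    // strings.Split cuts inside the quotes: 4 fields, all wrong.
    fmt.Println(len(strings.Split(line, ","))) // 4

    // encoding/csv understands quoting: 3 fields, as intended.
    record, _ := csv.NewReader(strings.NewReader(line)).Read()
    fmt.Println(len(record), record[1]) // 3 Doe, Jane
}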

package main

import (
    "encoding/csv"
    "fmt"
    "io"
    "log"
    "os"
    "strconv"
)

// Row matches the struct implied by the question's parsing code.
type Row struct {
    Column1                   int
    Column2, Column3, Column5 float64
    Column4, Column6, Column7 string
}

// A handy function to make ignoring errors a bit less laborious.
func IgnoreError(value interface{}, err error) interface{} {
    return value
}

// It's more flexible to take an io.Reader.
// It returns a channel of individual rows.
func readCSV(input io.Reader) chan Row {
    rows := make(chan Row)
    go func() {
        defer close(rows)

        // Use encoding/csv.
        // Let it reuse its record slice for each row.
        // Skip rows with the wrong number of columns.
        reader := csv.NewReader(input)
        reader.FieldsPerRecord = 7
        reader.ReuseRecord = true

        for {
            parts, err := reader.Read()

            if err == io.EOF {
                break
            }
            if err != nil {
                continue
            }

            // Send each row down the channel.
            rows <- Row{
                Column1: IgnoreError(strconv.Atoi(parts[0])).(int),
                Column2: IgnoreError(strconv.ParseFloat(parts[1], 32)).(float64),
                Column3: IgnoreError(strconv.ParseFloat(parts[2], 32)).(float64),
                Column4: parts[3],
                Column5: IgnoreError(strconv.ParseFloat(parts[4], 32)).(float64),
                Column6: parts[5],
                Column7: parts[6],
            }
        }
    }()

    return rows
}

func main() {
    file, err := os.Open("test.csv")
    if err != nil {
        log.Fatal(err)
    }
    defer file.Close()

    rows := readCSV(file)
    for row := range rows {
        fmt.Println(row)
    }
}

Now only one row is loaded at a time. Memory usage should be constant.
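
One tuning note: the channel above is unbuffered, so the reader and consumer synchronize on every single row. If that handoff shows up in profiles, a buffered channel lets the reader run a bounded amount ahead; the 1024 here is an arbitrary illustrative size:

rows := make(chan Row, 1024)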

Upvotes: 2
