Reputation: 187
I need to regularly load over 300,000 rows x 78 columns of data into my Go program.
Currently I use excelize (import github.com/360EntSecGroup-Skylar/excelize):
xlsx, err := excelize.OpenFile("/media/test snaps.xlsm")
if err != nil {
    fmt.Println(err)
    return
}
// read all rows into df
df := xlsx.GetRows("data")
It takes about 4 minutes on a decent PC with a Samsung 960 EVO M.2 NVMe SSD.
Is there a faster way to load this data? Currently it takes more time to read the data than to process it. I'm also open to other file formats.
Upvotes: 3
Views: 1265
Reputation: 5582
As suggested in the comments, instead of using the XLSX/XLSM format, use a custom, fast data format for reading and writing your table.
In the most basic case, just write the number of rows and columns to a binary file, then write all the data in one go. This will be very fast. I created a small example that writes 300,000 x 40 float32s to a file and reads them back. On my machine the write takes about 400 ms and the read about 250 ms (note that the file is hot in the OS cache right after writing, so the initial read may take longer).
package main

import (
    "encoding/binary"
    "os"

    "github.com/gonutz/tic"
)

func main() {
    const (
        rowCount = 300000
        colCount = 40
    )

    // The whole table is kept as one flat slice of float32 values.
    values := make([]float32, rowCount*colCount)

    // Write the dimensions, then all values in a single call.
    func() {
        defer tic.Toc()("write")
        f, err := os.Create("file")
        check(err)
        defer f.Close()
        check(binary.Write(f, binary.LittleEndian, int64(rowCount)))
        check(binary.Write(f, binary.LittleEndian, int64(colCount)))
        check(binary.Write(f, binary.LittleEndian, values))
    }()

    // Read the dimensions back, then all values in a single call.
    func() {
        defer tic.Toc()("read")
        f, err := os.Open("file")
        check(err)
        defer f.Close()
        var rows, cols int64
        check(binary.Read(f, binary.LittleEndian, &rows))
        check(binary.Read(f, binary.LittleEndian, &cols))
        vals := make([]float32, rows*cols)
        check(binary.Read(f, binary.LittleEndian, vals))
    }()
}

func check(err error) {
    if err != nil {
        panic(err)
    }
}
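To get your spreadsheet into that cache format, you do the slow excelize read once, convert the cells, and write the binary file; every later run only does the fast binary read from the example above. Below is a rough sketch of that one-time conversion, under a few assumptions: the sheet is called "data" and holds only numeric cells, the output name snaps.bin is a placeholder, non-numeric or missing cells are stored as 0, and GetRows is the single-return v1 API you are already using. Error handling is kept minimal.

package main

import (
    "encoding/binary"
    "os"
    "strconv"

    "github.com/360EntSecGroup-Skylar/excelize"
)

func main() {
    // Slow one-time read of the spreadsheet via excelize.
    xlsx, err := excelize.OpenFile("/media/test snaps.xlsm")
    if err != nil {
        panic(err)
    }
    rows := xlsx.GetRows("data") // v1 API: returns [][]string

    rowCount := int64(len(rows))
    colCount := int64(0)
    if rowCount > 0 {
        colCount = int64(len(rows[0]))
    }

    // Flatten the cells into one float32 slice, padding short rows
    // so every row contributes exactly colCount values.
    values := make([]float32, 0, rowCount*colCount)
    for _, row := range rows {
        for c := int64(0); c < colCount; c++ {
            var v float64
            if c < int64(len(row)) {
                // assumption: empty or non-numeric cells become 0
                v, _ = strconv.ParseFloat(row[c], 32)
            }
            values = append(values, float32(v))
        }
    }

    // Write the binary cache: dimensions first, then all values at once.
    f, err := os.Create("snaps.bin")
    if err != nil {
        panic(err)
    }
    defer f.Close()
    binary.Write(f, binary.LittleEndian, rowCount)
    binary.Write(f, binary.LittleEndian, colCount)
    binary.Write(f, binary.LittleEndian, values)
}

With that in place, the 4-minute Excel parse only has to happen when the source spreadsheet actually changes; your processing runs just read snaps.bin in well under a second.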
Upvotes: 3