Pengin

Reputation: 4772

Using Scala to cut up a large CSV file

What's the best way to do file IO in Scala 2.8?

All I want to do is cut a massive CSV file into lots of smaller ones with, say 1000 lines of data per file, and each file retaining the header.

Upvotes: 3

Views: 954

Answers (2)

Nicolas Rinaudo

Reputation: 6168

Moritz' answer is good, provided you don't run into some of CSV's more annoying corner cases. A relevant example would be CSV data where one column is a string that might contain line breaks: you can't rely on a row being on a single line, or you'll end up cutting some rows in half.

I'd use a dedicated CSV parsing library to turn your data into an iterator. kantan.csv is an example (I'm the author), but there are other alternatives such as product-collections or opencsv.
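To make the corner case concrete, here is a minimal illustration (made-up data, not from the original answer) of why naive line splitting miscounts rows when a quoted field contains a newline:

```scala
// A two-row CSV where the second row's quoted field spans two physical lines
val csv = "id,comment\n1,\"first line\nsecond line\""

// Splitting on newlines treats the embedded break as a row boundary
val naiveLines = csv.split("\n").toList

// Result: 3 physical lines, but only 2 logical CSV rows
assert(naiveLines.length == 3)
```

A CSV-aware parser would return two records here, keeping the multi-line field intact.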

Upvotes: 0

Moritz

Reputation: 14212

For simple tasks like this I would use scala.io.Source. An example would look like this:

import java.io.PrintWriter

val linesPerFile = 1000
val input = io.Source.fromFile("input.csv").getLines()

if (input.hasNext) {
  // assuming one header line
  val header = List(input.next())

  // create a file for chunk i (example naming scheme)
  def createWriter(i: Int) = new PrintWriter("output-" + i + ".csv")

  for ((i, lines) <- Iterator.from(1) zip input.grouped(linesPerFile)) {
    val out = createWriter(i)
    // write the header followed by this chunk's data lines
    (header.iterator ++ lines.iterator).foreach(out.println)
    out.close()
  }
}
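As a quick sanity check of the chunking step (using made-up in-memory data instead of a file), `Iterator.from(1) zip input.grouped(n)` pairs each chunk with a 1-based index like this:

```scala
// five data lines, two lines per file -> three chunks
val data = Iterator("r1", "r2", "r3", "r4", "r5")
val chunks = (Iterator.from(1) zip data.grouped(2)).toList

assert(chunks.map(_._1) == List(1, 2, 3))
assert(chunks.head._2 == List("r1", "r2"))
assert(chunks.last._2 == List("r5"))   // the final chunk may be short
```

Note that the last chunk can contain fewer than `linesPerFile` lines; `grouped` does not pad it.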

Upvotes: 12
