Pramod R

Reputation: 73

How to insert a batch of 1000 records into the db?

I'm reading a huge file and inserting records into MySQL using a PreparedStatement and statement.executeBatch(). Below is the code:

int count = 0;
for (int i = 0; i < records.length; i++) {
      Record record = records[i];
      count += 1;
      statement.setString(1, record.id);
      statement.setString(2, record.date);
      statement.setString(3, record.msg);
      statement.addBatch();

      // flush a batch every 1000 rows
      if (count % 1000 == 0)
          statement.executeBatch();
}
statement.executeBatch(); // flush any remaining rows

How can Slick help here (perhaps mimicking the code above)?

Upvotes: 6

Views: 6815

Answers (1)

mmedina

Reputation: 256

Using Slick 2.0.2

Once you have a case class modeling your data, plus your Table class and TableQuery object defined, just read the file into a Seq and add it to the TableQuery object using the ++= function.

case class MyDataClass(id: String, date: String, msg: String)

class MyData(tag: Tag) extends Table[MyDataClass](tag, "mydatatableInDB") {
   def id = column[String]("id")
   def date = column[String]("date")
   def msg = column[String]("msg")

   def * = (id, date, msg) <> (MyDataClass.tupled, MyDataClass.unapply)
}

val myDataTableQuery = TableQuery[MyData]

// read your data into a Seq[MyDataClass] (for example) and...

myDataTableQuery ++= listWithData
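Getting the file into that Seq might look like this; note this is just a sketch, and the one-record-per-line, comma-separated format (and the parseLine/readData helpers) are assumptions about your data, not part of Slick:

```scala
import scala.io.Source

case class MyDataClass(id: String, date: String, msg: String)

// Hypothetical parser: assumes one record per line with three
// comma-separated fields (id, date, msg); the third field keeps
// any extra commas thanks to the split limit of 3
def parseLine(line: String): MyDataClass = {
  val Array(id, date, msg) = line.split(",", 3)
  MyDataClass(id, date, msg)
}

// Read and parse the whole file into memory
def readData(path: String): Seq[MyDataClass] =
  Source.fromFile(path).getLines().map(parseLine).toSeq
```

With that in place, `myDataTableQuery ++= readData(path)` does the bulk insert in one call.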

If you really need to insert in batches, you can split the elements into chunks using grouped and add each chunk to the table query in turn. Something like this:

// With your data already in a Seq
def insertIntoTableQuery(data: Seq[MyDataClass]) = {
    myDataTableQuery ++= data
}

// make groups of 1000 and iterate
listWithData.grouped(1000) foreach insertIntoTableQuery
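For reference, grouped(1000) from the standard library yields chunks of at most 1000 elements, with any remainder in the final chunk, so every record is inserted exactly once:

```scala
val items = (1 to 2500).toList

// grouped returns an Iterator of chunks; here we force it to a List
// just to inspect the chunk sizes
val chunks = items.grouped(1000).toList

println(chunks.map(_.size))  // List(1000, 1000, 500)
```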

Unless I'm missing something, I think this is pretty much covered in the documentation:

Slick 2.0.2 Documentation

Some weeks ago I had a similar task. The file I was working with had 350,000+ lines, and from each line I had to extract data and create a couple of objects, each of which was inserted into a different table. Mind you, I didn't need to do it in batches, and it was really fast.

Upvotes: 7

Related Questions