Javier Parra

Reputation: 2080

Optimizing big import in php

I have a simple importer, it goes through each line of a rather big csv and imports it to the database.

My question is: Should I call another method to insert each object (generating a DO and telling its mapper to insert), or should I hardcode the insert process in the import method, duplicating the code?

I know the elegant thing to do is to call the second method, but I keep hearing in my head that function calls are expensive.

What do you think?

Upvotes: 2

Views: 393

Answers (3)

Eiko

Reputation: 25632

It shouldn't matter, as the insertion will probably take orders of magnitude longer than the PHP code.

As others have stated, a bulk insert will give you much more benefit. Line-level micro-optimizations like this will only blind you to the good higher-level optimizations.

If you are unsure, do a simple timing of both ways; it shouldn't take more than a couple of minutes to find out.

Consider combining both approaches and doing batch inserts (sketched below) if inserting everything at once hits some memory/time/... limit.
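A minimal sketch of that batching idea with PDO, assuming a MySQL connection and a hypothetical two-column `items` table; the connection details, table, and batch size are placeholders, not part of the question:

```php
<?php
// Accumulate CSV rows and flush them as one multi-row INSERT every
// $batchSize lines, so memory stays bounded and round-trips are reduced.
// Assumes each CSV line has exactly two columns (name, price).

$pdo = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass');
$batchSize = 500;
$rows = [];

$handle = fopen('import.csv', 'r');
while (($line = fgetcsv($handle)) !== false) {
    $rows[] = $line;

    if (count($rows) >= $batchSize) {
        insertBatch($pdo, $rows);
        $rows = [];
    }
}
if ($rows) {
    insertBatch($pdo, $rows); // flush the remainder
}
fclose($handle);

function insertBatch(PDO $pdo, array $rows): void
{
    // Build one statement of the form: INSERT ... VALUES (?, ?), (?, ?), ...
    $placeholders = implode(', ', array_fill(0, count($rows), '(?, ?)'));
    $stmt = $pdo->prepare("INSERT INTO items (name, price) VALUES $placeholders");
    $stmt->execute(array_merge(...$rows)); // flatten rows into one parameter list
}
```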

Upvotes: 1

Bill Karwin

Reputation: 562250

Many RDBMS brands support a special command to do bulk imports. For example, MySQL has LOAD DATA INFILE, PostgreSQL has COPY, Microsoft SQL Server has BULK INSERT, and Oracle has SQL*Loader.

Using these commands is preferred over inserting one row at a time from a CSV data source because the bulk-loading command usually runs at least an order of magnitude faster.
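As an illustration of the MySQL case, here is a hedged sketch of issuing LOAD DATA LOCAL INFILE through PDO. It assumes MySQL with local-infile enabled on both client and server, and a hypothetical `items` table; the file path and column list are placeholders:

```php
<?php
// Push the whole CSV to the server in one statement instead of one
// INSERT per row. Requires PDO::MYSQL_ATTR_LOCAL_INFILE to be enabled.

$pdo = new PDO(
    'mysql:host=localhost;dbname=test',
    'user',
    'pass',
    [PDO::MYSQL_ATTR_LOCAL_INFILE => true]
);

$pdo->exec("
    LOAD DATA LOCAL INFILE '/path/to/import.csv'
    INTO TABLE items
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'
    LINES TERMINATED BY '\\n'
    IGNORE 1 LINES
    (name, price)
");
```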

Upvotes: 4

k_b

Reputation: 2480

I don't think this matters too much. Consider a bulk insert. At the very least make sure you're using a transaction, and consider disabling indexes before inserting.
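A minimal sketch of the transaction part with PDO, again assuming a hypothetical two-column `items` table; everything else about the schema is an assumption:

```php
<?php
// Wrap the whole import in one transaction: a single commit at the end
// avoids per-row commit overhead and lets you roll back if a line fails.

$pdo = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$stmt = $pdo->prepare('INSERT INTO items (name, price) VALUES (?, ?)');

$pdo->beginTransaction();
try {
    $handle = fopen('import.csv', 'r');
    while (($row = fgetcsv($handle)) !== false) {
        $stmt->execute($row);
    }
    fclose($handle);
    $pdo->commit();
} catch (Throwable $e) {
    $pdo->rollBack();
    throw $e;
}
```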

Upvotes: 1
