Reputation: 10616
I'm going to use serialize() to store a big array in MySQL, and I wonder whether this method has a noticeable performance cost?
[edit]
The array I'm going to store is the user's price lists. I made a dynamic form for users to type in their price lists, and each price list contains about 20 to 100 lines. Each line has 4 fields: product_name, unit, wholesale price, retail price.
I just want to save the lists and re-display them, not query them with SQL.
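For concreteness, a price list like the one described could be represented as an array of rows and round-tripped through serialize()/unserialize() like this (the data and the underscored field names are just illustrative):

```php
<?php
// Hypothetical price list: one row per product, four fields each.
$priceList = [
    ['product_name' => 'Widget A', 'unit' => 'box',  'wholesale_price' => 10.50, 'retail_price' => 14.99],
    ['product_name' => 'Widget B', 'unit' => 'each', 'wholesale_price' => 2.25,  'retail_price' => 3.49],
];

// serialize() turns the array into a string suitable for a TEXT/BLOB column...
$stored = serialize($priceList);

// ...and unserialize() restores it for re-display.
$restored = unserialize($stored);
var_dump($restored === $priceList); // bool(true) — the round trip is lossless
```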
Upvotes: 1
Views: 4256
Reputation: 212452
There is processor overhead with both serialize() and the corresponding unserialize(), although it's negligible: unless you're doing it thousands of times per request, or on seriously large data volumes, you won't even notice.
The serialized data is larger, and it's less practical to search for values in your database, so I'd recommend better normalization of your tables so you don't need to do this.
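As a sketch of what that normalization could look like for the price lists described in the question (table and column names are purely illustrative):

```sql
-- One row per price-list line, instead of one serialized blob per list
CREATE TABLE price_list_item (
    id              INT UNSIGNED AUTO_INCREMENT PRIMARY KEY,
    user_id         INT UNSIGNED  NOT NULL,
    product_name    VARCHAR(255)  NOT NULL,
    unit            VARCHAR(32)   NOT NULL,
    wholesale_price DECIMAL(10,2) NOT NULL,
    retail_price    DECIMAL(10,2) NOT NULL,
    KEY idx_user (user_id)
);
```

With this layout, individual fields remain searchable with ordinary WHERE clauses, which a serialized blob does not allow.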
Upvotes: 0
Reputation: 91963
How big is the array? Serialization should be a linear-time operation, so the performance hit probably won't be that significant compared to other approaches. You can always benchmark it if you are worried.
With that said, please remember that saving a serialized array in a relational database is a poor choice compared to normalised data. You will have a hard time using the serialized array from another language, and it will be almost impossible to query the data in a WHERE clause. I hope you have considered this.
Upvotes: 0
Reputation: 41424
Serialize takes a long time for big things and a short time for little things. How big are your things and how fast do you need things to be?
You could directly measure the cost in microseconds of serialize() for your particular arrays with something like
$startTime = microtime(true);
yourSerializeFunction(); // <-- you are timing this
$endTime = microtime(true);
var_dump($endTime - $startTime); // elapsed time in seconds
Or there are lots of other profiling options if you want to be professional about it.
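A self-contained version of the snippet above, with a sample array filled in where yourSerializeFunction() was (the array size and contents are arbitrary, chosen to match the 20-to-100-line lists described in the question):

```php
<?php
// Build a sample array roughly the size described in the question:
// 100 lines of 4 fields each.
$priceList = [];
for ($i = 0; $i < 100; $i++) {
    $priceList[] = [
        'product_name'    => "Product $i",
        'unit'            => 'each',
        'wholesale_price' => $i * 1.5,
        'retail_price'    => $i * 2.0,
    ];
}

$startTime = microtime(true);
$serialized = serialize($priceList); // <-- you are timing this
$endTime = microtime(true);

var_dump($endTime - $startTime); // elapsed time in seconds
```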
Upvotes: 5