Reputation: 19714
I have a PHP frontend and a C++ backend, and I need to be able to send groups of names to the frontend. What serialized format would be the most efficient/fastest for PHP to read?
Example data:
group1:
name1 3923
name2 9879
name3 8944
group2:
name5 9823
group3:
name9 9822
name1 4894
What would be the fastest for PHP to read?
Upvotes: 3
Views: 1944
Reputation: 40568
As Paolo pointed out, you can use json_decode(), which is very fast. On the C++ backend you have several options for producing JSON; the json.org website lists the available C++ JSON libraries.
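For illustration, here's a minimal sketch of the PHP side, assuming the C++ backend emits the example data as a JSON object keyed by group name (the payload shape is just an assumption for the example):

<?php
// Hypothetical JSON payload produced by the C++ backend.
$json = '{"group1":{"name1":3923,"name2":9879,"name3":8944},
          "group2":{"name5":9823},
          "group3":{"name9":9822,"name1":4894}}';

// Second argument true decodes objects into associative arrays.
$groups = json_decode($json, true);

echo $groups['group1']['name2']; // 9879
?>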
Upvotes: 0
Reputation: 342665
I've used PHP's serialize() and unserialize() on large text files, and it performed miserably (that was a couple of years ago - maybe it's better now). Anyway, I devised a little trick to overcome this: simply generate a PHP array declaration from the data you're exporting, straight into a text file, e.g.:
<?php
$groups = array('groups' => array(array('jeff' => 2343,
                                        'tom'  => 8477),
                                  array('baal'    => 2873,
                                        'scorpio' => 3210),
                                  array('jeff' => 2343,
                                        'tom'  => 8477)));
?>
...and then unserializing it by simply calling:
include 'groups.php'; // makes $groups available
Worked nicely back then.
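As a sketch of the export side: if you can run PHP when producing the file (or simply mirror the same syntax from C++ with ordinary string output), var_export() writes exactly this kind of array declaration. The filename groups.php is just the one used above:

<?php
// Hypothetical exporter: dump $groups as a ready-to-include PHP file.
$groups = array('groups' => array(array('jeff' => 2343, 'tom' => 8477)));
file_put_contents('groups.php',
                  "<?php\n\$groups = " . var_export($groups, true) . ";\n");
?>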
Upvotes: 2
Reputation: 74588
PHP's own serialized format will probably be the fastest. unserialize() is the function PHP uses to convert this data back to its own types. This post has various links to other languages' implementations of PHP's serialized format; I'm sure you could adapt one of those easily.
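To give a concrete idea of the format, here's a minimal sketch using the built-in functions (the values are just the ones from the question's example):

<?php
// PHP's native serialized representation of one group.
$group1 = array('name1' => 3923, 'name2' => 9879, 'name3' => 8944);
$payload = serialize($group1);
// $payload is: a:3:{s:5:"name1";i:3923;s:5:"name2";i:9879;s:5:"name3";i:8944;}

// The C++ side would have to emit strings in this format; PHP reads them back with:
$data = unserialize($payload);
echo $data['name3']; // 8944
?>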
Upvotes: 6
Reputation: 488504
JSON would be pretty easy using json_decode(). I'm not sure about speed, but unless you plan on transferring megabytes of this data between the systems, it should be irrelevant which one you go with.
Upvotes: 1