Reputation: 3232
Here is my dilemma:
I am creating a multilingual platform. For that purpose I created JSON files containing all the translated text for each language.
When a user lands on a page, I read from that file and store the array of translations in the $_SESSION variable, like so:
$_SESSION['website_text'] = json_decode(file_get_contents("content_".$language.".json"), true);
Then, every time I want to echo text in the views, I access the element from the session array:
$text = $_SESSION['website_text']['paragraph2_headline'];
Now I am wondering: since $_SESSION is stored on the server, is it faster to read from the session as I do, or to read from the file and decode the JSON every time? The second option would look like this:
$website_text = json_decode(file_get_contents("content_".$language.".json"), true);
$text = $website_text['paragraph2_headline'];
Thank you all for your help!
Upvotes: 0
Views: 1749
Reputation: 488
In both examples you store your localization data in a file but process it in different ways: one in a variable, the other in the session.
The question should really be "How do I process localization data efficiently?"
Upvotes: 0
Reputation: 2486
I do not know much about JSON, but I do know that $_SESSION is saved as a file on the server with
file_put_contents($path, serialize($_SESSION));
and read back with
$_SESSION = unserialize(file_get_contents($path));
Yes, as Jon says, if you want speed, store the data as native PHP code. For example, we can save the data as a PHP file like this:
function save_data($path, $data) {
    // var_export() produces a properly escaped single-quoted string literal;
    // addslashes() would also escape double quotes, which single-quoted
    // PHP strings do not un-escape, corrupting the serialized data.
    file_put_contents($path, "<?php \$data = unserialize(".var_export(serialize($data), true)."); ?>");
}
To load it, simply include the file and then read the data from $data.
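A minimal, self-contained sketch of that save/load round trip (paths and data are illustrative; var_export() is used to build the string literal, because addslashes() also escapes double quotes, which single-quoted PHP strings leave untouched):

```php
<?php
// Write the array into a small PHP file that rebuilds $data on include.
function save_data($path, $data) {
    file_put_contents($path, "<?php \$data = unserialize(".var_export(serialize($data), true)."); ?>");
}

$path = sys_get_temp_dir().'/content_en_data.php';
save_data($path, ['paragraph2_headline' => 'Welcome!']);

// Loading: including the file defines $data in the current scope.
include $path;
echo $data['paragraph2_headline'], "\n"; // Welcome!
```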
Upvotes: 1
Reputation: 437386
Most likely it's faster when pulling the data from $_SESSION, but $_SESSION is not a good place to store localization data, because the data ends up being duplicated for each user.
When you retrieve the strings from $_SESSION, PHP has to read the data from the session file (which it already does to read any other session data, so the cost of opening the file is somewhat amortized) and run unserialize on it; if you retrieve them from a JSON file, it has to open the file, read it and run json_decode. unserialize should be faster than json_decode, but please don't quote me on that.
If you are interested in making this fast, it would be better to read the strings directly from a PHP file where they are stored as an array:
// content_en.php
<?php
return array(
    'welcome' => 'Welcome to our website!',
    // ...
);
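Loading such a file is then a one-liner, since include returns whatever the included file's return statement yields. A self-contained sketch (the file is written to a temp directory here just so the example runs on its own):

```php
<?php
// Recreate the content_en.php shown above (illustrative location).
$file = sys_get_temp_dir().'/content_en.php';
file_put_contents($file, "<?php\nreturn array(\n    'welcome' => 'Welcome to our website!',\n);\n");

// include returns the array produced by the file's return statement.
$strings = include $file;
echo $strings['welcome'], "\n"; // Welcome to our website!
```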
Even if your localization files are in JSON, it would be very easy to "compile" the JSON into PHP and use the PHP code as a cache:
$lang = 'en';
$sourceFile = 'content_'.$lang.'.json';
$cacheFile = 'content_'.$lang.'.cache.php';
if (!is_file($cacheFile)) {
    $content = json_decode(file_get_contents($sourceFile), true);
    file_put_contents($cacheFile, "<?php\nreturn ".var_export($content, true).";");
} else {
    $content = include $cacheFile;
}
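One refinement worth noting (my addition, not part of the answer): as written, the cache is never invalidated, so stale translations would persist after the JSON changes. A filemtime() comparison fixes that; here is a self-contained sketch with illustrative temp-directory paths:

```php
<?php
// Hedged sketch: rebuild the cache whenever the JSON source is at least
// as new as the cache file. Paths and content are illustrative only.
$dir        = sys_get_temp_dir();
$sourceFile = $dir.'/content_en.json';
$cacheFile  = $dir.'/content_en.cache.php';

file_put_contents($sourceFile, json_encode(['welcome' => 'Hello!']));
clearstatcache(); // make sure filemtime() sees the fresh write

if (!is_file($cacheFile) || filemtime($sourceFile) >= filemtime($cacheFile)) {
    // Source is new or newer than the cache: decode the JSON and recompile.
    $content = json_decode(file_get_contents($sourceFile), true);
    file_put_contents($cacheFile, "<?php\nreturn ".var_export($content, true).";");
} else {
    // Cache is current: skip JSON parsing entirely.
    $content = include $cacheFile;
}

echo $content['welcome'], "\n"; // Hello!
```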
Upvotes: 4