Reputation: 20173
I am using the Google Translate API to translate my website into several languages.
I do it like this:
function translate($from_lan, $to_lan, $text) {
    // Build the request to the Google AJAX Language API (v1.0)
    $url = 'https://ajax.googleapis.com/ajax/services/language/translate?v=1.0'
         . '&q=' . urlencode($text)
         . '&langpair=' . $from_lan . '|' . $to_lan;
    $json = json_decode(file_get_contents($url));
    return $json->responseData->translatedText;
}
As explained in "Translate a PHP $string using Google Translator API".
Which works fine (let's not focus on the translation quality). The problem is that around 20-30 requests (or more) make the website unusably slow. If you want to check, just go to
http://funcook.com/ and test French http://funcook.com/?lan=3 or German http://funcook.com/?lan=4
I also tried looping over all the strings of my website, translating them, and saving the translated strings so I don't have to make so many requests,
but there are about 300 of them and roughly half fail in the process, I guess because of the delay.
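Roughly, the loop I tried looked like this (get_all_site_strings() and the output file are just placeholders for my actual setup):
function pretranslate_all($from_lan, $to_lan) {
    $strings = get_all_site_strings(); // placeholder, returns ~300 source strings
    $translations = array();
    foreach ($strings as $key => $text) {
        // about half of these requests fail, presumably timing out
        $translations[$key] = translate($from_lan, $to_lan, $text);
    }
    // persist the results so the site can load them without calling Google
    file_put_contents("translations_{$to_lan}.php",
        '<?php return ' . var_export($translations, true) . ';');
}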
So the question is: is there a better alternative?
Upvotes: 1
Views: 4290
Reputation: 4320
There is no better alternative. What you need to do is cache the results you get from Google Translate instead of calling Google Translate every single time. Write the translations to a file or database and load them from there; if a string does not exist in the cache yet, ask Google Translate and store the result. And flush/reset the cache every now and then if you think the translations change over time.
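A minimal file-based sketch of that idea, wrapping the translate() function from the question (the cache path and md5 key scheme are just one possible choice; it assumes a writable cache/ directory exists):
function translate_cached($from_lan, $to_lan, $text) {
    // One cache file per language pair
    $cache_file = __DIR__ . "/cache/translations_{$from_lan}_{$to_lan}.php";
    $cache = is_file($cache_file) ? include $cache_file : array();

    $key = md5($text); // stable key for the source string
    if (!isset($cache[$key])) {
        // Cache miss: ask Google Translate once, then persist the result
        $cache[$key] = translate($from_lan, $to_lan, $text);
        file_put_contents($cache_file,
            '<?php return ' . var_export($cache, true) . ';');
    }
    return $cache[$key];
}
With this in place each string hits Google only once per language pair; subsequent page loads read from the local file, and deleting the cache file acts as the flush/reset mentioned above.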
Never make repeated requests to outside services without caching the responses. You are wasting your own resources and will likely end up blocked by Google.
Upvotes: 4