Reputation: 29377
I mean the method where a user can request data for up to 50 articles in a single call, instead of abusing the API with multiple requests, so:
https://en.wikipedia.org/w/api.php?action=query&prop=revisions&rvprop=size&titles=Berlin|Paris|Rome
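For reference, a minimal sketch of extracting the sizes from that batched response. The JSON shape follows the Action API's `action=query` output (`query.pages`, with a `revisions` list per page); the byte counts below are made-up sample values, not real article sizes:

```python
import json

# Illustrative response, trimmed to the relevant fields.
# The page IDs and sizes are invented for the example.
sample = json.loads("""
{"query": {"pages": {
  "3354": {"title": "Berlin", "revisions": [{"size": 241543}]},
  "22989": {"title": "Paris", "revisions": [{"size": 218213}]},
  "25458": {"title": "Rome", "revisions": [{"size": 162948}]}
}}}
""")

# Map each title to the size of its latest revision.
sizes = {page["title"]: page["revisions"][0]["size"]
         for page in sample["query"]["pages"].values()}
print(sizes)
```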
gives the correct sizes for these articles. But how can I retrieve the sizes of the same article in different languages? The only method I know of is to change the language code at the beginning of the domain, like this:
https://en.wikipedia.org/w/api.php?action=query&prop=revisions&rvprop=size&titles=Berlin
https://de.wikipedia.org/w/api.php?action=query&prop=revisions&rvprop=size&titles=Berlin
https://sv.wikipedia.org/w/api.php?action=query&prop=revisions&rvprop=size&titles=Berlin
etc.
So the natural question is how to combine these requests into one, but everything I have tried fails. For example:
https://en.wikipedia.org/w/api.php?action=query&prop=revisions&rvprop=size&titles=en:Berlin|de:Berlin|sv:Berlin shows data only for the English page; for the others there is just a strange entry, for example: { "title": "sv:berlin", "iw": "sv" },
which contains no useful data.
Removing the language code from the domain also breaks the query: https://wikipedia.org/w/api.php?action=query&prop=revisions&rvprop=size&titles=en:Berlin|de:Berlin|sv:Berlin returns "invalidreason": "The requested page title contains invalid characters: \"%7c\"."
So is the only way to get such data to gently 'flood' their API servers and hope that they won't block me?
Upvotes: 0
Views: 127
Reputation: 2544
There is no reason to bundle those requests into one, especially as they are very cheap API requests. Just make all the requests you need.
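A minimal sketch of doing exactly that: one request per language wiki, throttled with a short pause so the request rate stays gentle. The helper names (`size_url`, `fetch_sizes`) and the pause length are my own choices, not anything official:

```python
import json
import time
import urllib.parse
import urllib.request

def size_url(lang, title):
    """Build the revisions/size query URL for one language wiki."""
    params = urllib.parse.urlencode({
        "action": "query",
        "prop": "revisions",
        "rvprop": "size",
        "titles": title,
        "format": "json",
    })
    return f"https://{lang}.wikipedia.org/w/api.php?{params}"

def fetch_sizes(title, langs, pause=0.5):
    """Query each wiki in turn; `pause` (seconds) throttles the loop."""
    sizes = {}
    for lang in langs:
        with urllib.request.urlopen(size_url(lang, title)) as resp:
            data = json.load(resp)
        for page in data["query"]["pages"].values():
            sizes[lang] = page["revisions"][0]["size"]
        time.sleep(pause)
    return sizes

# Usage (performs live requests): fetch_sizes("Berlin", ["en", "de", "sv"])
print(size_url("de", "Berlin"))
```

A few sequential requests like this are well within what the servers handle routinely; nobody gets blocked for that.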
Upvotes: 1