Pingping

Reputation: 15

Python Requests to cycle through pages in cURL API

I want to cycle through pages until the API returns None, but I'm not sure how to achieve this with a cURL-style API. I also want to combine all the results into one file at the end. I managed it with a beginner's method of repeating variables, but that is obviously inefficient. I looked for existing answers to similar questions but couldn't get anything to work for my case.

The API requires no headers, by the way, in case that matters.

I also tried downloading pycurl, but the pip install appears to be broken and I'm not experienced enough to install it manually from a file. I'm sure this can be achieved with requests, though.

import requests
import json

url = 'https://API.API.io/?page='
username = 'API key'
password = ''

params1={"page":"1","per_page":"500"}
params2={"page":"2","per_page":"500"}
params3={"page":"3","per_page":"500"}

r1=requests.get(url,params=params1,auth=(username,password))
r2=requests.get(url,params=params2,auth=(username,password))
r3=requests.get(url,params=params3,auth=(username,password))

rj1=r1.json()
rj2=r2.json()
rj3=r3.json()

writeFile = open('file.json','w',encoding='utf-8')
json.dump(
rj1+
rj2+
rj3,
writeFile)
writeFile.close()

Upvotes: 1

Views: 69

Answers (1)

Andrej Kesely

Reputation: 195428

You can use a for-loop to get the responses from the various pages. Also, use with open(...) when opening the file for writing, so it is closed automatically:

import json
import requests


url = "https://API.API.io/?page="
username = "API key"
password = ""

params = {"page": 1, "per_page": 500}

all_data = []
for params["page"] in range(1, 4):  # <--- this will get page 1, 2 and 3
    r = requests.get(url, params=params, auth=(username, password))
    all_data.extend(r.json())


with open("file.json", "w", encoding="utf-8") as f_out:
    json.dump(all_data, f_out, indent=4)
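If you don't know the number of pages in advance, you can instead loop with a while and stop as soon as a page comes back empty (the question mentions cycling "until returning None"). Here is a minimal sketch of that pattern; fetch_page is a hypothetical stand-in for the real requests.get(url, params=params, auth=(username, password)).json() call, and it assumes the API returns an empty list (or None) once you run past the last page:

```python
import json

# Fake paged data standing in for the real API (assumption for illustration).
PAGES = {1: [{"id": 1}, {"id": 2}], 2: [{"id": 3}]}

def fetch_page(page):
    # In real code this would be:
    #   r = requests.get(url, params={"page": page, "per_page": 500},
    #                    auth=(username, password))
    #   return r.json()
    return PAGES.get(page, [])

all_data = []
page = 1
while True:
    page_data = fetch_page(page)
    if not page_data:  # empty list or None -> no more pages, stop
        break
    all_data.extend(page_data)
    page += 1

with open("file.json", "w", encoding="utf-8") as f_out:
    json.dump(all_data, f_out, indent=4)
```

The `if not page_data` check covers both an empty list and None, so it works whichever way the API signals the end of its data.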

Upvotes: 1
