MDI

Reputation: 63

Multiple URLs to save JSON data

I'm trying to call multiple URLs (more than 10) at a time and save all of their responses, which are in JSON format, to a file on my machine.

Here is the code I have tried. With it, only the last URL's data ends up saved in my JSON file. How can I get all of the URLs' data stored in a single JSON file?

import json
import requests

URLs = ['http://httpbin.org/ip',
        'http://httpbin.org/user-agent',
        'http://httpbin.org/headers']

json_list = []
for url in URLs:
    data = requests.get(url)
    resolvedwo = data.json()
    with open('resolvedworesolution.json', 'w') as f:
        json.dump(resolvedwo, f)

Upvotes: 0

Views: 2706

Answers (5)

Sunitha

Reputation: 12015

Your problem is that you are overwriting the file on each iteration of the loop. Instead, collect the results in a list and write it to the file once, after the loop:

import requests
import json

URLs = ['http://httpbin.org/ip',
        'http://httpbin.org/user-agent',
        'http://httpbin.org/headers']

json_list = []

for url in URLs:
    data = requests.get(url)
    resolvedwo = data.json()
    json_list.append(resolvedwo)

with open('resolvedworesolution.json', 'w+') as f:
    json.dump(json_list, f, sort_keys=True, indent=4)

Output:

[
    {
        "origin": "137.221.143.66, 137.221.143.66"
    },
    {
        "user-agent": "python-requests/2.21.0"
    },
    {
        "headers": {
            "Accept": "*/*",
            "Accept-Encoding": "gzip, deflate",
            "Host": "httpbin.org",
            "User-Agent": "python-requests/2.21.0"
        }
    }
]
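To read the combined file back later, a single json.load call recovers the whole list. A minimal sketch, using sample data in place of the live httpbin requests above:

```python
import json

# Sample data standing in for the three httpbin responses
json_list = [
    {"origin": "137.221.143.66"},
    {"user-agent": "python-requests/2.21.0"},
]

# Write the whole list once, as in the answer above
with open('resolvedworesolution.json', 'w') as f:
    json.dump(json_list, f, sort_keys=True, indent=4)

# A single json.load returns the full list of responses
with open('resolvedworesolution.json') as f:
    loaded = json.load(f)

print(loaded == json_list)  # True
```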

Upvotes: 2

dzang

Reputation: 2260

You can store the responses in a single object and serialize it as a whole:

import json
import requests

URLs = ['http://httpbin.org/ip',
        'http://httpbin.org/user-agent',
        'http://httpbin.org/headers']

json_list = []
for url in URLs:
    data = requests.get(url)
    resolvedwo = data.json()
    json_list.append(resolvedwo)

with open('resolvedworesolution.json', 'w+') as f:
    json.dump(json_list, f)

Upvotes: 0

DirtyBit

Reputation: 16772

Use append mode when writing to the file so that the existing data is retained:

import json
import requests

URLs = ['http://httpbin.org/ip',
        'http://httpbin.org/user-agent',
        'http://httpbin.org/headers']

json_list = []
for url in URLs:
    data = requests.get(url)
    resolvedwo = data.json()
    with open('resolvedworesolution.json', 'a') as f:   # Using the append mode
        json.dump(resolvedwo, f)
        f.write("\n")                                   # new line for readability

OUTPUT:

{"origin": "159.122.207.241, 159.122.207.241"}
{"user-agent": "python-requests/2.21.0"}
{"headers": {"Accept": "*/*", "Accept-Encoding": "gzip, deflate", "Host": "httpbin.org", "User-Agent": "python-requests/2.21.0"}}

EDIT:

You could also write the response to the file in one go:

with open('resolvedworesolution.json', 'a') as f:
    f.write(str(resolvedwo))
    f.write("\n")

OR

for url in URLs:
    data = requests.get(url)
    with open('resolvedworesolution.json', 'a') as f:
        f.write(data.text)
        f.write("\n")

Upvotes: 0

C.Nivs

Reputation: 13106

When writing to files, opening a file in w mode erases/truncates the existing contents before anything is written. Open the file in append mode instead:

with open('resolvedworesolution.json', 'a') as f:

That should solve your problem
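One caveat: appending one object per iteration produces newline-delimited JSON (one document per line), not a single JSON array, so the file has to be read back line by line with json.loads rather than one json.load. A minimal sketch, using sample dicts in place of the live requests:

```python
import json

# Sample records standing in for the per-URL responses
records = [{"origin": "1.2.3.4"}, {"user-agent": "python-requests/2.21.0"}]

# Start from an empty file so repeated runs stay deterministic
open('resolvedworesolution.json', 'w').close()

for rec in records:
    with open('resolvedworesolution.json', 'a') as f:  # append mode
        json.dump(rec, f)
        f.write('\n')  # one JSON document per line

# Newline-delimited JSON must be parsed line by line
with open('resolvedworesolution.json') as f:
    loaded = [json.loads(line) for line in f]

print(loaded == records)  # True
```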

Upvotes: 0

alexthefifth

Reputation: 1

Instead of:

resolvedwo = data.json()

You probably want:

resolvedwo += data.json()

Upvotes: -1
