EB_Crypto

Reputation: 125

How to loop through full API collection by changing one number in URL

I am using this API ('https://api.opensea.io/asset/0x5d00d312e171be5342067c09bae883f9bcb2003b/1/') to collect information about individual digital assets within a certain game. The '1' at the end of the URL represents the asset with token_id = 1. I am trying to get my code to collect the variables that I need for each asset and then cycle to the next - token_id = 2 which would be a '2' at the end of the URL, token_id = 3, and so on. The final asset would be token_id = 51,450.

I would like to print the results and then somehow export them to Excel if possible; I've been trying to figure that out as well. All help is appreciated, thanks!

import json
import requests

r = requests.get('https://api.opensea.io/asset/0x5d00d312e171be5342067c09bae883f9bcb2003b/1/')
EMONA_json = r.json()
EMONA_str = json.dumps(EMONA_json, indent=2)

token_id = EMONA_json['token_id']
traits = EMONA_json['traits']
owner_address = EMONA_json['owner']['address']

print(token_id, traits, owner_address)

Best-case scenario: the code would write all the results into Excel. Each column would hold 'token_id', 'traits' (I will eventually extract the trait_type values class_name, level & catch_number) and owner_address.

Upvotes: 0

Views: 80

Answers (2)

Pitto

Reputation: 8589

Here's a very simple example that will hopefully guide you toward a more refined solution.

To keep things easy, I output the results to a CSV file, which you can easily import into Excel.

import requests
import csv

result = {}
max_id_to_get = 3

# range's upper bound is exclusive, so add 1 to actually include max_id_to_get
for token_id in range(1, max_id_to_get + 1):
    r = requests.get('https://api.opensea.io/asset/0x5d00d312e171be5342067c09bae883f9bcb2003b/{}'.format(token_id))
    EMONA_json = r.json()
    token_id = EMONA_json['token_id']
    traits = EMONA_json['traits']
    owner_address = EMONA_json['owner']['address']
    result[token_id] = {"traits": traits, "owner_address": owner_address}

# newline='' avoids blank rows on Windows; the with block closes the file for us
with open('output.csv', 'w', newline='') as csvFile:
    writer = csv.writer(csvFile)
    writer.writerow(["token_id", "traits", "owner_address"])
    for id in result:
        writer.writerow([id, result[id]['traits'], result[id]['owner_address']])
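Since you mentioned eventually wanting class_name, level and catch_number as separate columns, here is a small sketch of flattening the traits list. The sample payload below is an assumption based on OpenSea's usual traits shape (a list of {"trait_type": ..., "value": ...} dicts), and `extract_traits` is a hypothetical helper; adjust both to the real response.

```python
def extract_traits(asset_json, wanted=("class_name", "level", "catch_number")):
    # Build a name -> value lookup from the traits list, then pull the
    # columns we care about (None if a trait is missing for this asset).
    traits = {t["trait_type"]: t["value"] for t in asset_json.get("traits", [])}
    return [traits.get(name) for name in wanted]

# Hypothetical sample mimicking the API's traits structure
sample = {
    "token_id": "1",
    "traits": [
        {"trait_type": "class_name", "value": "Tygrush"},
        {"trait_type": "level", "value": 3},
        {"trait_type": "catch_number", "value": 12},
    ],
}

print(extract_traits(sample))  # ['Tygrush', 3, 12]
```

You can then write those three values as extra columns in the `writer.writerow` call instead of dumping the whole traits list into one cell.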

Upvotes: 1

Horatiu Jeflea

Reputation: 7414

Assuming that your question is how to handle 50,000+ API calls, here goes:

Use a queue (RabbitMQ, for example) that retries an event in case it fails. Create 50,000 events (or use batches) and send them to the queue. A service subscribed to this queue will make the actual API calls; this can be done in parallel. I would store the resulting data in a NoSQL store (key-value or document), with the token_id as the hash key. Once all calls are done, scan that table and create the Excel/CSV file.
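A lighter-weight, in-process version of the same idea is a thread pool with per-task retry, which avoids standing up a message broker for a one-off scrape. This is a sketch, not the queue-based design above: `fetch_asset` is a hypothetical stand-in for the real `requests.get` call.

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

def fetch_asset(token_id):
    # Stand-in for the real HTTP call; a production version would call
    # requests.get(...) and raise on a failed response so it gets retried.
    return {"token_id": token_id, "owner_address": "0x%040d" % token_id}

results = {}
failed = []
with ThreadPoolExecutor(max_workers=8) as pool:
    # Submit one task per token_id; map each future back to its id
    futures = {pool.submit(fetch_asset, tid): tid for tid in range(1, 21)}
    for fut in as_completed(futures):
        tid = futures[fut]
        try:
            results[tid] = fut.result()
        except Exception:
            failed.append(tid)  # a real version would re-queue these for retry

print(len(results))  # 20
```

Keyed results (here a dict by token_id) play the role of the NoSQL table: once all futures complete, you iterate the dict and write the CSV exactly as in the other answer.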

Upvotes: 1
