Patriots_25

Reputation: 73

Slow API Request Loop in Python

I am trying to make some simple API requests to IEX's cloud server to grab closing prices for a number of securities. I am successfully pulling the data, but it is taking forever, and it's not even a lot of data. It would be a HUGE help if anyone could help me restructure my code to make it faster and more efficient. Below is what I have thus far. Thanks so much guys

import json
import urllib.request

import numpy as np

for i, row in Universe.iterrows():
    try:
        request = urllib.request.urlopen('https://cloud.iexapis.com/stable/stock/' + Universe['ticker'][i] + '/chart/date/' + Px_1 + '?chartByDay=true&token=TOKEN').read()
        data = json.loads(request)
        Universe['Px_1'][i] = data[0]['close']
    except:
        Universe['Px_1'][i] = np.nan

Upvotes: 0

Views: 3113

Answers (1)

Alex Hall

Reputation: 36043

Use the requests package to create a reusable persistent connection; that should make each request faster:

import requests

session = requests.Session()

for i, row in Universe.iterrows():
    url = 'https://cloud.iexapis.com/stable/stock/' + row['ticker'] + '/chart/date/' + Px_1 + '?chartByDay=true&token=TOKEN'
    try:
        data = session.get(url).json()
        Universe['Px_1'][i] = data[0]['close']
    except:
        Universe['Px_1'][i] = np.nan

You can also run the requests in parallel using a thread pool:

from multiprocessing.dummy import Pool


def get_px1(ticker):
    url = 'https://cloud.iexapis.com/stable/stock/' + ticker + '/chart/date/' + Px_1 + '?chartByDay=true&token=TOKEN'
    try:
        data = session.get(url).json()
        return data[0]['close']
    except:
        return np.nan


Universe['Px_1'] = Pool(20).map(get_px1, Universe.ticker)

But it is not entirely clear whether a single session can safely be shared across threads; you may have to give each thread its own session (e.g. via threading.local) or fall back to plain requests.get. And the server may not be happy with such a high rate of requests.
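One common workaround (an illustrative sketch, not part of the original answer) is to lazily create one Session per worker thread with threading.local, so no Session object is ever shared between threads:

```python
import threading

import requests

# Thread-local storage: each thread sees its own 'session' attribute.
thread_local = threading.local()


def get_session():
    # Create a Session the first time a thread asks for one,
    # then reuse that same Session on all later calls in that thread.
    if not hasattr(thread_local, 'session'):
        thread_local.session = requests.Session()
    return thread_local.session
```

get_px1 can then call get_session().get(url) instead of touching a shared session, keeping the connection reuse within each thread.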

Also beware of that bare except: it may silently swallow bugs in your code.
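A safer variant of get_px1 catches only the failures you actually expect (network errors, unparseable JSON, a missing field) rather than everything; Px_1 and the token below are placeholders standing in for the values from the question:

```python
import requests
import numpy as np

Px_1 = '20200103'  # placeholder date segment; use your own
TOKEN = 'TOKEN'    # placeholder API token


def get_px1(ticker):
    url = ('https://cloud.iexapis.com/stable/stock/' + ticker +
           '/chart/date/' + Px_1 + '?chartByDay=true&token=' + TOKEN)
    try:
        data = requests.get(url, timeout=10).json()
        return data[0]['close']
    except (requests.RequestException, ValueError, LookupError):
        # Network/timeout errors, bad JSON, or a missing/empty result
        # become NaN; a genuine bug (e.g. a NameError) still raises.
        return np.nan
```

This way a typo in the body surfaces as a traceback instead of quietly filling the column with NaN.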

Upvotes: 4

Related Questions