Sid

Reputation: 4055

How to use requests (or another module) to get data from a page where the URL doesn't change?

I am currently using selenium to go to a page:

https://www.nseindia.com/products/content/derivatives/equities/historical_fo.htm

Then I select the relevant options and click the Get Data button.

Then I retrieve the generated table using BeautifulSoup.

Is there a way to use requests in this case? If so, can someone point me to a tutorial?

Upvotes: 1

Views: 767

Answers (2)

B.Adler

Reputation: 1539

When you select the options, you are essentially just setting the parameters for the request that the Get Data button sends to the backend. If you mimic that request, as this curl command does:

curl 'https://www.nseindia.com/products/dynaContent/common/productsSymbolMapping.jsp?instrumentType=FUTIDX&symbol=NIFTYMID50&expiryDate=31-12-2020&optionType=select&strikePrice=&dateRange=day&fromDate=&toDate=&segmentLink=9&symbolCount=' -H 'Pragma: no-cache' -H 'Accept-Encoding: gzip, deflate, br' -H 'Accept-Language: en-US,en;q=0.9' -H 'User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/71.0.3578.98 Safari/537.36' -H 'Accept: */*' -H 'Referer: https://www.nseindia.com/products/content/derivatives/equities/historical_fo.htm' -H 'X-Requested-With: XMLHttpRequest' -H 'Connection: keep-alive' -H 'Cache-Control: no-cache' --compressed

Then you can do the same thing in requests:

import requests

headers = {
    'Pragma': 'no-cache',
    'Accept-Encoding': 'gzip, deflate, br',
    'Accept-Language': 'en-US,en;q=0.9',
    'User-Agent': 'Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/71.0.3578.98 Safari/537.36',
    'Accept': '*/*',
    'Referer': 'https://www.nseindia.com/products/content/derivatives/equities/historical_fo.htm',
    'X-Requested-With': 'XMLHttpRequest',
    'Connection': 'keep-alive',
    'Cache-Control': 'no-cache',
}

params = (
    ('instrumentType', 'FUTIDX'),
    ('symbol', 'NIFTYMID50'),
    ('expiryDate', '31-12-2020'),
    ('optionType', 'select'),
    ('strikePrice', ''),
    ('dateRange', 'day'),
    ('fromDate', ''),
    ('toDate', ''),
    ('segmentLink', '9'),
    ('symbolCount', ''),
)

response = requests.get('https://www.nseindia.com/products/dynaContent/common/productsSymbolMapping.jsp', headers=headers, params=params)
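Since the response body is the HTML fragment containing the table, you can hand `response.text` straight to BeautifulSoup, as you were already doing with selenium. A minimal sketch (the inline HTML below is a stand-in for `response.text`; the actual markup NSE returns will differ):

```python
from bs4 import BeautifulSoup

# Stand-in for response.text; the real NSE markup will differ.
html = """
<table>
  <tr><th>Symbol</th><th>Close</th></tr>
  <tr><td>NIFTYMID50</td><td>1234.5</td></tr>
</table>
"""

soup = BeautifulSoup(html, "html.parser")

# Collect each row's cell text into a list of lists.
rows = []
for tr in soup.find("table").find_all("tr"):
    cells = [cell.get_text(strip=True) for cell in tr.find_all(["th", "td"])]
    rows.append(cells)

print(rows)  # first row is the header, the rest are data rows
```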

A good site for learning how to do this:

https://curl.trillworks.com/

Upvotes: 5

chitown88

Reputation: 28630

You'll have to experiment with different values in the query dictionary, but I was able to get the table back by building the request URL from those parameters:

import requests
import pandas as pd

query = {  # just mimicking a sample query that I saw after loading the link
    'instrumentType': 'OPTIDX',
    'symbol': 'BANKNIFTY',
    'expiryDate': 'select',
    'optionType': 'CE',
    'strikePrice': '23700',
    'dateRange': '',
    'fromDate': '05-06-2017',
    'toDate': '08-06-2017',
    'segmentLink': '9',
    'symbolCount': '',
}

url = 'https://www.nseindia.com/products/dynaContent/common/productsSymbolMapping.jsp'

# requests encodes the query dictionary into the URL for us
response = requests.get(url, params=query)

table = pd.read_html(response.text)
table[0]

Output:

0   Historical Contract-wise Price Volume Data        ...                      NaN
1                                       Symbol        ...         Underlying Value
2                                    BANKNIFTY        ...                 23459.65
3                                    BANKNIFTY        ...                 23459.65
4                                    BANKNIFTY        ...                 23459.65
5                                    BANKNIFTY        ...                 23459.65
...

[42 rows x 17 columns]
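As the output shows, `pd.read_html` returns the page title and column headers as ordinary data rows, so a little cleanup is usually needed. A sketch of one way to promote the header row, using a small stand-in DataFrame (the real `table[0]` has 17 columns):

```python
import pandas as pd

# Stand-in for table[0]; the real NSE table has 17 columns and more rows.
raw = pd.DataFrame([
    ['Historical Contract-wise Price Volume Data', None],
    ['Symbol', 'Underlying Value'],
    ['BANKNIFTY', '23459.65'],
    ['BANKNIFTY', '23459.65'],
])

df = raw.iloc[2:].reset_index(drop=True)  # drop the title and header rows
df.columns = raw.iloc[1]                  # use the second row as the header
df['Underlying Value'] = df['Underlying Value'].astype(float)

print(df)
```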

Upvotes: 2
