Reputation: 17
I am trying to fill a form using mechanize in Python. When I run the code, I get an error:
Error 403: request disallowed by robots.txt
I went through previously answered questions with a similar issue and saw that adding br.set_handle_robots(False) should fix it, but I am still getting the same error. So what am I missing here?
import re
import mechanize
from mechanize import Browser
br = mechanize.Browser()
br.set_handle_equiv(False)
br.set_handle_robots(False)
br.addheaders = [('User-agent','Mozilla/5.0 (X11; Linux x86_64; rv:18.0)Gecko/20100101 Firefox/18.0 (compatible;)'),('Accept', '*/*')]
text = "1500103233"
browser = Browser()
browser.open("http://kuhs.ac.in/results.htm")
browser.select_form(nr=0)
browser['Stream']=['Medical']
browser['Level']=['UG']
browser['Course']=['MBBS']
browser['Scheme']=['MBBS 2015 Admissions']
browser['Year']=['Ist Year MBBS']
browser['Examination']=['First Professional MBBS Degree Regular(2015 Admissions) Examinations,August2016']
browser['Reg No']=text
response = browser.submit()
Upvotes: 0
Views: 2779
Reputation: 324
You create br = mechanize.Browser() and disable robots handling on it, but then you open the page with a different, unconfigured instance: browser = Browser(). Every request goes through that second browser, so robots.txt is still enforced.
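A minimal sketch of the fix for that part alone, keeping your original URL: configure one browser and route every call through it.
import mechanize

# One browser, configured once; every request must go through this instance.
br = mechanize.Browser()
br.set_handle_robots(False)  # only affects this instance
br.addheaders = [('User-agent', 'Mozilla/5.0 (X11; Linux x86_64; rv:18.0) Gecko/20100101 Firefox/18.0'), ('Accept', '*/*')]

br.open("http://kuhs.ac.in/results.htm")  # no second Browser() anywhere
br.select_form(nr=0)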
There is a second problem: http://kuhs.ac.in/results.htm
If you look at the page source, the form is actually loaded from another URL: src="http://14.139.185.148/kms/index.php/results/create"
On that page, the field labeled Stream is actually named Results[streamId].
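To find those real control names yourself, you can dump every control on the page first. A quick sketch against the iframe URL above:
import mechanize

br = mechanize.Browser()
br.set_handle_robots(False)
br.open("http://14.139.185.148/kms/index.php/results/create")

# Print each control's name and type, so a label like "Stream"
# can be mapped to its real name, e.g. "Results[streamId]".
for form in br.forms():
    for control in form.controls:
        print("%s (%s)" % (control.name, control.type))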
So, you can try this:
import mechanize
br = mechanize.Browser()
br.set_handle_equiv(False)
br.set_handle_robots(False)
br.addheaders = [('User-agent','Mozilla/5.0 (X11; Linux x86_64; rv:18.0)Gecko/20100101 Firefox/18.0 (compatible;)'),('Accept', '*/*')]
text = "1500103233"
br.open("http://14.139.185.148/kms/index.php/results/create")

# Dump every form so you can see the real control names and option values
for form in br.forms():
    print(form)

br.select_form(nr=0)
br['Results[streamId]'] = ['1']  # option value '1' corresponds to "Medical"
# ... set the remaining controls the same way ...
response = br.submit()
print(response.read())
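If you are not sure which option value belongs to a label such as "Medical", mechanize can list a select control's items after the form is selected; item.name is the value you assign and get_labels() gives the visible text. A sketch, assuming the form was selected as above:
control = br.form.find_control("Results[streamId]")
for item in control.items:
    # item.name is the submitted value; the labels are what the page displays
    print("%s -> %s" % (item.name, [label.text for label in item.get_labels()]))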
See also: Submitting a form with mechanize (TypeError: ListControl, must set a sequence)
Hope this helps, it works for me!
Upvotes: 2