Reputation: 59
I am using the Python 2.7 requests module
to build a web crawler, but I am having trouble making requests to a site that requires a certificate. When I call requests.get(url)
, it throws an SSLError: certificate verify failed.
So I tried requests.get(url, verify=False)
. It works, but the response contains meta http-equiv="refresh" url='...'
, and that URL is not the one I requested. Is there a way to solve this, or do I need to send the certificate?
I saw in the requests docs that I can send a certificate and a key. I have the certificate.crt
file, but I don't have the key. Is there a way to get the key?
The certificate is AC Certisign Multipla G5, and the site uses TLS 1.2.
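As an aside on the key question: in requests, the verify parameter also accepts a path to a CA bundle, and validating a server that way needs no private key at all; the cert=(crt, key) pair is only for client certificates the server asks you to present. A minimal sketch, assuming certificate.crt is the CA certificate for the site (both the URL and the file name here are placeholders):

```python
import requests

def fetch_with_ca_bundle(url, ca_bundle_path):
    """Validate the server against a local CA bundle instead of
    passing verify=False. No private key is needed for this:
    cert=(crt, key) is only for *client*-side certificates."""
    return requests.get(url, verify=ca_bundle_path)

# Hypothetical usage:
# response = fetch_with_ca_bundle("https://example.com", "certificate.crt")
```

If the bundle doesn't contain the full chain, requests will still raise an SSLError, so this only helps when certificate.crt really is the issuing CA.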
Upvotes: 3
Views: 2374
Reputation: 59
After a long time trying to solve this issue, I figured it out. The problem was not with the SSL certificate.
I was making a request to a web page that requires a session; the URL I was using is redirected from another page. To access it correctly, I had to send a request to that page and follow the redirects to the final page.
So what I did was use Requests' Session
object:
session.get(url, verify=False)
where url
is the redirecting URL.
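The approach above can be sketched as follows; the start URL is a placeholder, and response.url / response.history are standard requests attributes exposing the redirect chain:

```python
import requests

def get_final_page(start_url):
    """Fetch the redirecting URL with a Session so cookies set along
    the redirect chain are kept, and return the final response."""
    session = requests.Session()
    response = session.get(start_url, verify=False)
    # response.url is the last page in the redirect chain;
    # response.history holds the intermediate redirect responses.
    return response

# Hypothetical usage:
# response = get_final_page("https://example.com/entry-page")
# print(response.url)  # the page you actually end up on
```

The Session matters here because a plain requests.get discards cookies between requests, while the session-backed request carries them through each hop, which is what lets the server recognize the session it expects.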
Upvotes: 1