user139260

Reputation: 1007

Python urllib request to Google sites slow

I'm trying to make urllib requests to http://google.com in Python 3 (I also rewrote it in 2.7 with urllib2 and hit the same issue). Below is some of my code:

import urllib.request
from urllib.request import urlopen
import http.cookiejar

# Build an opener with a cookie jar and a browser-like User-Agent header
cj = http.cookiejar.CookieJar()
opener = urllib.request.build_opener(urllib.request.HTTPCookieProcessor(cj))
opener.addheaders = [('User-agent', 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10_1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/40.0.2214.91 Safari/537.36')]

def makeRequest():
    search = 'http://google.com'

    print('About to search...')
    response = opener.open(search).read()  # open the URL and read the full response body
    print('Done')

makeRequest()

When I run this code, it runs in about 14 seconds:

real    0m14.386s
user    0m0.087s
sys     0m0.027s

This seems to be the case with any Google site (Gmail, Google Play, etc.). When I change the search variable to a different site, such as Stack Overflow or Twitter, it runs in well under half a second:

real    0m0.277s
user    0m0.085s
sys     0m0.017s

Does anyone know what could be causing the slow response from Google?

Upvotes: 2

Views: 885

Answers (1)

lqhcpsgbl

Reputation: 3782

First, you can use ping or traceroute to google.com and the other sites to compare the delays and see whether it is a DNS issue.
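If you'd rather stay in Python than use the shell, you can also time just the name lookup with socket.getaddrinfo (a rough sketch; the host list is only an example):

import socket
import time

# Compare how long DNS resolution alone takes for each host
for host in ('google.com', 'stackoverflow.com', 'twitter.com'):
    start = time.time()
    socket.getaddrinfo(host, 80)  # the same lookup urllib performs before connecting
    print('%s resolved in %.3f seconds' % (host, time.time() - start))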

Second, you can use Wireshark to sniff the packets and see if something is wrong with the communication.
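If you don't want to go through a full packet capture, a lighter-weight check is to time each phase of the request separately: the DNS lookup, the TCP connect, and the HTTP response. A sketch of that idea, assuming Python 3's http.client (an approximation, not the exact method above):

import http.client
import socket
import time

host = 'google.com'

start = time.time()
addr = socket.getaddrinfo(host, 80)[0][4][0]  # DNS lookup only
print('DNS:      %.3f s' % (time.time() - start))

start = time.time()
conn = http.client.HTTPConnection(addr, 80, timeout=30)
conn.connect()                                # TCP connect only
print('Connect:  %.3f s' % (time.time() - start))

start = time.time()
conn.request('GET', '/', headers={'Host': host})
response = conn.getresponse()
response.read()                               # wait for the full body
print('Response: %.3f s' % (time.time() - start))
conn.close()

Whichever phase eats most of the 14 seconds tells you whether to look at DNS, the network path, or the server response.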

I think it may be a DNS issue, but I can't be sure.

Upvotes: 2
