nzalle

Reputation: 29

How can I decrease the runtime of my code?

Are there any modifications I can make to this piece of code to make it run faster? My code currently works, but it takes upward of 10 hours to scrape all 50,000 profiles. Please let me know what I can do to decrease the runtime. Thank you!

from selenium import webdriver
from selenium.webdriver.common.keys import Keys
from time import sleep
from selenium.common.exceptions import NoSuchElementException


Data = []
text = "test"

driver = webdriver.Chrome("/Users/nzalle/Downloads/chromedriver")
driver.get("https://directory.bcsp.org/")
count = int(input("Number of Pages to Scrape: "))

body = driver.find_element_by_xpath("//body") 
profile_count = driver.find_elements_by_xpath("//div[@align='right']/a")

while len(profile_count) < count:   # Get links up to "count"
    body.send_keys(Keys.END)
    sleep(1)
    profile_count = driver.find_elements_by_xpath("//div[@align='right']/a")

for link in profile_count:   # Calling up links
    temp = link.get_attribute('href')   # temporary variable holding the profile URL
    driver.execute_script("window.open('');")   # open new tab
    driver.switch_to.window(driver.window_handles[1])   # focus new tab
    driver.get(temp)

    # scrape code

    Name = driver.find_element_by_xpath('/html/body/table/tbody/tr/td/table/tbody/tr/td[5]/div/table[1]/tbody/tr/td[1]/div[2]/div').text
    IssuedBy = "Board of Certified Safety Professionals"
    CertificationorDesignaationNumber = driver.find_element_by_xpath('/html/body/table/tbody/tr/td/table/tbody/tr/td[5]/div/table[1]/tbody/tr/td[3]/table/tbody/tr[1]/td[3]/div[2]').text
    CertfiedorDesignatedSince = driver.find_element_by_xpath('/html/body/table/tbody/tr/td/table/tbody/tr/td[5]/div/table[1]/tbody/tr/td[3]/table/tbody/tr[3]/td[1]/div[2]').text
    try:
        AccreditedBy = driver.find_element_by_xpath('/html/body/table/tbody/tr/td/table/tbody/tr/td[5]/div/table[1]/tbody/tr/td[3]/table/tbody/tr[5]/td[3]/div[2]/a').text

    except NoSuchElementException:
        AccreditedBy = "N/A"

    try:
        Expires = driver.find_element_by_xpath('/html/body/table/tbody/tr/td/table/tbody/tr/td[5]/div/table[1]/tbody/tr/td[3]/table/tbody/tr[5]/td[1]/div[2]').text

    except NoSuchElementException:
        Expires = "N/A"

    info = Name, IssuedBy, CertificationorDesignaationNumber, CertfiedorDesignatedSince, AccreditedBy, Expires + "\n"

    Data.extend(info)
    driver.close()
    driver.switch_to.window(driver.window_handles[0])


with open("Spredsheet.txt", "w") as output:
    output.write(','.join(Data))

driver.close()

Upvotes: 0

Views: 88

Answers (1)

carbaretta

Reputation: 324

Other than moving IssuedBy outside of the for loop, since it needn't be reassigned on every iteration (which would still make only a very, very insignificant difference), there doesn't seem to be much you can change. Since this is scraping data off of the internet, the most limiting factor will be your broadband speed. Overall, your script has a big-O complexity of O(n), meaning that as the data set grows, the processing time increases linearly.
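For what it's worth, a minimal sketch of that change, assuming the rest of the script stays exactly as posted:

IssuedBy = "Board of Certified Safety Professionals"   # constant value, so assign it once before the loop

for link in profile_count:
    temp = link.get_attribute('href')
    driver.execute_script("window.open('');")
    driver.switch_to.window(driver.window_handles[1])
    driver.get(temp)
    # ... scrape the remaining fields exactly as before ...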

The processing time of this script is mostly bottlenecked by your broadband speed, which you don't have much control over, so no, there isn't much you can change.

Upvotes: 2
