caitlinp

Reputation: 183

Problems saving scraped data from a webpage into an Excel file

I am new to scraping with Python. After using a lot of useful resources I was able to scrape the content of a page. However, I am having trouble saving this data into a .csv file.

Python:

import csv
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Firefox(executable_path=r'C:\Users\geckodriver.exe')

driver.get("myUrl.jsp")

username = driver.find_element_by_name('USER')
password = driver.find_element_by_name('PASSWORD')

username.send_keys("U")
password.send_keys("P")

main_frame = driver.find_element_by_xpath('//*[@id="Frame"]')
driver.switch_to.frame(main_frame)

table = driver.find_element_by_xpath("/html/body/div/div[2]/div[5]/form/div[7]/div[3]/table")
rows = table.find_elements(By.TAG_NAME, "tr")
for tr in rows:
    outfile = open("C:/Users/Scripts/myfile.csv", "w")
    with outfile:
        writers = csv.writer(outfile)
        writers.writerows(tr.text)

Problem:

Only one of the rows gets written to the file. However, when I print `tr.text` to the console, all the required rows show up. How can I get the text of every `tr` element written into the file?

Upvotes: 0

Views: 194

Answers (1)

Tranqodile

Reputation: 68

Currently your code opens the file on every iteration of the loop, writes one row, closes the file, then reopens it in `'w'` mode on the next iteration, which truncates the file and overwrites what was written before. That is why only one row survives. Please consider the following code snippet:

# Use 'with' to open the file once and auto-close it when done;
# newline='' prevents blank lines between rows on Windows
with open('C:/Users/Scripts/myfile.csv', 'w', newline='') as outfile:
    writer = csv.writer(outfile)

    # the file stays open while we loop through the rows,
    # so every row lands in the same file
    for tr in rows:
        # tr.text is a single string; wrap it in a list so writerow
        # emits one CSV row (passing the bare string to writerows
        # would write one character per row)
        writer.writerow([tr.text])
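The key point is the shape of the data: `csv.writer.writerows` expects a list of rows, where each row is itself a list of cell values. Here is a minimal, self-contained sketch of that contract using hypothetical scraped values in place of the Selenium `tr.text` strings (written to an in-memory buffer so it runs without a browser or filesystem):

```python
import csv
import io

# Hypothetical table data: each inner list is one row of cells,
# e.g. the .text of each <td> inside a <tr>
rows = [
    ["Alice", "2021-01-01", "42"],
    ["Bob", "2021-01-02", "17"],
]

# Open the destination once, outside any loop; a StringIO buffer
# stands in for open('myfile.csv', 'w', newline='')
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerows(rows)  # one call writes every row

print(buf.getvalue())
```

If you want one column per table cell rather than the whole row in a single column, collect the `td` elements of each `tr` first, e.g. `[td.text for td in tr.find_elements(By.TAG_NAME, "td")]`, and pass that list to `writerow`.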

Upvotes: 2
