James

Reputation: 1

Python lists into a CSV file

I created a scraper to get info from a website using a for loop, building a list on each pass. The result is what I expected, but I'm having a hard time putting this info together into a CSV file. My intention is to have the corresponding URL, price, location, etc. line up in the same row for each listing, but maybe the way I set up my loops is making this harder?

I thought the following syntax would do the trick, but it didn't: it only puts all the URLs in the same row. I did get all the info written out, but it still ended up in the same row.

with open('trademe.csv', 'w', newline='') as f:
    writer = csv.writer(f)
    writer.writerow(['URL', 'Price', 'Location', 'Flatmates'])
    writer.writerow([link, price, loc, mates])
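
For illustration, here is a minimal sketch of why this happens, using hypothetical stand-in lists: each writerow() call emits exactly one CSV row, so passing the whole lists produces a single data row whose cells contain the stringified lists.

import csv

# Hypothetical stand-ins for the scraped lists, for illustration only.
link = ['www.trademe.co.nz/a', 'www.trademe.co.nz/b']
price = ['$200 per week', '$250 per week']

with open('demo.csv', 'w', newline='') as f:
    writer = csv.writer(f)
    # One writerow() call -> one CSV row; each element of the argument
    # becomes one cell, so each cell here holds an entire list.
    writer.writerow([link, price])

# demo.csv now contains a single data row along the lines of:
# "['www.trademe.co.nz/a', 'www.trademe.co.nz/b']","['$200 per week', '$250 per week']"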

By the way, this is the code so far:

from urllib.request import urlopen
from bs4 import BeautifulSoup as soup

trademe = urlopen(url)
trademe_html = soup(trademe.read(), "html.parser")
trademe.close()

# Listing URLs
link = []
for i in trademe_html.find_all('div', attrs={'class': 'dotted'}):
    link.append('URL: www.trademe.co.nz' + i.a['href'])
    print('URL: www.trademe.co.nz' + i.a['href'])

# Price and availability
price = []
for i in trademe_html.find_all('div', attrs={'class': 'flatmates-list-view-card-price'}):
    price.append('Price and availability: ' + i.text.strip())
    print('Price and availability: ' + i.text)

# Location
loc = []
for i in trademe_html.find_all('div', attrs={'class': 'flatmates-card-subtitle'}):
    loc.append('Location: ' + i.text)
    print('Location: ' + i.text)

# Existing flatmates
mates = []
for i in trademe_html.find_all('div', attrs={'class': 'flatmates-card-existing-flatmates'}):
    mates.append(i.text.strip())
    print(i.text)

Upvotes: 0

Views: 38

Answers (1)

topgunner

Reputation: 52

  • Insert your data into a pandas DataFrame
  • Use the to_csv function in pandas

Pandas makes it really easy to export a DataFrame to CSV with the headers included.
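
As a minimal sketch of that approach, using the list variables built in the question (link, price, loc, mates) and assuming they all end up the same length:

import pandas as pd

# link, price, loc, mates are the lists filled by the scraping loops in the question.
df = pd.DataFrame({
    'URL': link,
    'Price': price,
    'Location': loc,
    'Flatmates': mates,
})

# Write one listing per row, with the column names as the CSV header.
df.to_csv('trademe.csv', index=False)

to_csv writes the header row by default; index=False just drops the DataFrame's row index from the output.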

Upvotes: 1
