Alexander Soare

Reputation: 3267

How to use scrapy to click on element and return JS

I am trying to scrape names and contact details from this page: https://www.realestate.com.au/find-agent/agents/sydney-cbd-nsw. Essentially, I want to click into each of the list items and get the information from the resulting page, but there is no href to follow.

I'm presuming that the class type somehow points to some JS code. When a list item is clicked, the JS redirects you to a new URL. Can I get at it somehow using Scrapy?

Note: I don't know much about JS

Upvotes: 0

Views: 223

Answers (2)

Michael Savchenko

Reputation: 1445

This will give you all the links you need without JS rendering.

response.css('script::text').re('"url":"(.+?)"')

Don't use Chrome for scraping unless there's no other way. It's really bad practice.
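
For context, here is a minimal sketch of how that one-liner could sit inside a spider. The start URL is the one from the question and the selector/regex is the one above; the spider name, the parse_agent callback, and the field selectors are just illustrative placeholders you would adapt to the real page structure.

import scrapy

class AgentSpider(scrapy.Spider):
    name = 'agents'
    start_urls = ['https://www.realestate.com.au/find-agent/agents/sydney-cbd-nsw']

    def parse(self, response):
        # The profile URLs are embedded as JSON inside the page's <script> tags,
        # so pull them out with the regex instead of looking for href attributes.
        for url in response.css('script::text').re('"url":"(.+?)"'):
            yield response.follow(url, callback=self.parse_agent)

    def parse_agent(self, response):
        # Placeholder selectors -- inspect the agent page and adjust as needed.
        yield {
            'name': response.css('h1::text').get(),
            'url': response.url,
        }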

Upvotes: 3

noahcoleman

Reputation: 89

I'd recommend using Selenium, which will automate an instance of an actual browser. This means that sessions, cookies, JavaScript execution, etc. are all handled for you.

Example:

from selenium import webdriver
from selenium.webdriver.common.by import By

# Start a Chrome session (requires chromedriver to be available on your PATH)
driver = webdriver.Chrome()
driver.get("http://example.com")

# Locate the element by its id and click it; the browser runs any JS attached to it
button = driver.find_element(By.ID, 'buttonID')
button.click()
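
If the click triggers a client-side redirect, as in the question, you could then read the landing page straight from the driver. A sketch, assuming the click navigates away from the original URL:

from selenium.webdriver.support.ui import WebDriverWait

# Wait until the URL changes, then grab the rendered HTML of the landing page
WebDriverWait(driver, 10).until(lambda d: d.current_url != "http://example.com")
new_url = driver.current_url
html = driver.page_source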

Upvotes: 0
