Jeongbin Kim

Reputation: 181

Trigger JS event during Python web scraping

# ./scrape.py
from lxml import html
import requests

url = "http://www.my-target-url.com"

page = requests.get(url)
# can I insert some js event codes to execute here?
tree = html.fromstring(page.content)
print(tree.xpath("/html/to/target/data/text()")[0])

I wrote this to scrape a target page that has several buttons which change the displayed data. I want all the data those buttons can produce. I searched for a way to send a POST or GET request with parameters, but there seems to be no option other than triggering the JS events on the page itself (the way I can in Chrome's developer console).

Is there any way to execute JS events in this Python code so that the response object contains the data I need? Should I use a library other than requests, or is there some other approach I can search for (e.g., running a web browser object behind the scenes and controlling it; if so, what can help)?

Upvotes: 4

Views: 2449

Answers (1)

Ibrahim

Reputation: 2072

Short answer: no, not with requests alone.

All JavaScript events are handled by the browser's JS engine; requests only fetches the raw HTML and never runs any scripts. This means you need a JavaScript engine too in order to execute the scripts and trigger the events, typically by driving a real browser from Python.
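As a sketch of the browser-driving approach, the snippet below uses Selenium to open the page in headless Chrome, click each data-changing button, and parse the rendered HTML with lxml after every click. The button selector (`button.change-data`) is a hypothetical placeholder; the URL and XPath are the ones from the question, and the imports sit inside the function so the sketch only needs Selenium and lxml when actually run.

```python
def scrape_with_buttons(url, button_css, data_xpath):
    """Open `url` in headless Chrome, click every button matching
    `button_css`, and collect the text at `data_xpath` after each click."""
    # Imports live inside the function: the sketch only needs
    # selenium/lxml installed when it is actually executed.
    from lxml import html
    from selenium import webdriver
    from selenium.webdriver.chrome.options import Options
    from selenium.webdriver.common.by import By

    options = Options()
    options.add_argument("--headless=new")
    driver = webdriver.Chrome(options=options)
    results = []
    try:
        driver.get(url)
        for button in driver.find_elements(By.CSS_SELECTOR, button_css):
            button.click()  # fires the page's own JS click handlers
            # driver.page_source is the DOM *after* the scripts ran
            tree = html.fromstring(driver.page_source)
            results.extend(tree.xpath(data_xpath))
    finally:
        driver.quit()
    return results


if __name__ == "__main__":
    data = scrape_with_buttons(
        "http://www.my-target-url.com",  # placeholder URL from the question
        "button.change-data",            # hypothetical button selector
        "/html/to/target/data/text()",   # XPath from the question
    )
    print(data)
```

If the buttons merely trigger background POST/GET requests, an alternative is to watch the Network tab in the developer tools and replicate those requests directly with requests, skipping the browser entirely.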

Upvotes: 1
