user10837120


Python Selenium: prevent website from blocking

I don’t really have a project in mind; I’m just trying to figure things out, and I started wondering how I can prevent a site from knowing that I’ve visited it repeatedly.

I had no idea how to use Selenium, so there was a lot of trial and error, and suddenly the website blocked me. I turned ProtonVPN (free) on, but it still wouldn’t let me onto the site. I’ve read about fake user-agents in Chrome, proxies, and all that, but what is the key? What do I need to do before visiting a second time so that no one would know it’s me again?

Is it enough to change my IP address? Is this the way to go? I can’t find a fitting Python-related answer.

Upvotes: 3

Views: 5522

Answers (1)

user10614851


The issue here sounds like it is two-fold:

  1. Many sites have user-agent detection methods that identify automation tools such as Selenium.

  2. Rapid execution of actions against a website often trips bot detection tools and is also ill-advised. Generally, when scraping sites, if you are unsure what kind of anti-bot or anti-spam systems are in place, you want to configure the scraper to have human-like action execution times.

Your best bet is to check the Selenium user-agent and configure it to something else, as per this post here. A rough sketch of both points follows below.
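For reference, a minimal sketch assuming Chrome with chromedriver and the standard Selenium Python bindings; the user-agent string and the URL are placeholder examples, not values from the post:

    # Override Chrome's user-agent via ChromeOptions and add randomized,
    # human-like pauses between actions. User-agent and URL are placeholders.
    import random
    import time

    from selenium import webdriver
    from selenium.webdriver.chrome.options import Options

    options = Options()
    # Replace the default user-agent with an ordinary browser string so the
    # requests look less like they come from an automation tool.
    options.add_argument(
        "user-agent=Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
        "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0 Safari/537.36"
    )

    driver = webdriver.Chrome(options=options)
    try:
        driver.get("https://example.com")  # placeholder target site

        # Wait a random, human-like interval before the next action instead
        # of firing requests as fast as possible.
        time.sleep(random.uniform(2.0, 6.0))
    finally:
        driver.quit()

Randomizing the sleep interval (rather than using one fixed delay) keeps the timing pattern irregular, which is the point of human-like execution times.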

Upvotes: 2
