Reputation: 413
The website I want to scrape is ScienceDirect. The affiliation only becomes visible after clicking the "Show more" button. I am able to click the button, but I am not able to scrape the affiliations that are loaded after the click. Here is the code; the for loop is not printing the dl tag that contains the affiliation.
import time
from selenium import webdriver
from selenium.common.exceptions import NoSuchElementException
from bs4 import BeautifulSoup

driver = webdriver.Firefox()
driver.get('https://www.sciencedirect.com/science/article/pii/S1571065308000656')
soup = BeautifulSoup(driver.page_source, 'html.parser')
time.sleep(7)
try:
    element = driver.find_element_by_css_selector('.show-hide-details.u-font-sans')
    element.click()
    time.sleep(15)
    for data in soup.find(id='author-group'):
        print(data)
        print('---')
except NoSuchElementException:
    pass
Upvotes: 1
Views: 683
Reputation: 84465
The data is loaded from a script tag, meaning you can just use requests, extract the script content, and parse it with the json library:
import requests, json
from bs4 import BeautifulSoup as bs

headers = {'User-Agent': 'Mozilla/5.0'}
url = 'https://www.sciencedirect.com/science/article/pii/S1571065308000656'
r = requests.get(url, headers=headers)
soup = bs(r.content, 'lxml')
data = json.loads(soup.select_one('[type="application/json"]').text)

for author in data['authors']['content']:
    print(' '.join([author['$$'][0]['$$'][0]['_'], author['$$'][0]['$$'][1]['_']]))
    print(author['$$'][1]['$$'][0]['_'])
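If the nested '$$' structure is hard to follow, one way to see where the name and affiliation strings live is to pretty-print a single author entry. This is just an exploration sketch; it assumes the same data dict as above and that data['authors']['content'] is a list, as the loop suggests.
import json

# Dump the first author entry with indentation so the nested '$$' lists
# can be inspected before relying on hard-coded index positions.
print(json.dumps(data['authors']['content'][0], indent=2))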
Upvotes: 0
Reputation: 10090
I think you need to move your soup instantiation down to after you've clicked on the "Show more" button.
If I run the following code:
import time
from selenium import webdriver
from selenium.common.exceptions import NoSuchElementException
from bs4 import BeautifulSoup

driver = webdriver.Firefox()
driver.get('https://www.sciencedirect.com/science/article/pii/S1571065308000656')
time.sleep(3)
try:
    element = driver.find_element_by_css_selector('.show-hide-details.u-font-sans')
    element.click()
    time.sleep(9)
    # build the soup only after the "Show more" content has loaded
    soup = BeautifulSoup(driver.page_source, 'html.parser')
    for data in soup.find(id='author-group'):
        print(data)
        print('---')
except NoSuchElementException:
    pass
my output is:
<span class="sr-only">Author links open overlay panel</span>
---
<a class="author size-m workspace-trigger" href="#!" name="baep-author-id6"><span class="content"><span class="text given-name">Ignaz</span><span class="text surname">Rutter</span><span class="author-ref" id="bfn001"><sup>1</sup></span><svg class="icon icon-envelope" focusable="false" height="24" viewbox="0 0 102 128" width="19.125"><path d="m55.8 57.2c-1.78 1.31-5.14 1.31-6.9 0l-31.32-23.2h69.54l-31.32 23.19zm-55.8-24.78l42.94 32.62c2.64 1.95 6.02 2.93 9.4 2.93s6.78-0.98 9.42-2.93l40.24-30.7v-10.34h-102zm92 56.48l-18.06-22.74-8.04 5.95 17.38 21.89h-64.54l18.38-23.12-8.04-5.96-19.08 24.02v-37.58l-1e1 -8.46v61.1h102v-59.18l-1e1 8.46v35.62"></path></svg></span></a>
---
<dl class="affiliation"><dd>Fakultät für Informatik, Universität Karlsruhe, Germany</dd></dl>
---
Upvotes: 2