Reputation: 23
I try to scrape a website using the following code:
library(RSelenium)
library(dplyr)
library(rvest)
rD <- rsDriver(browser = 'firefox', port = 4875L)
remDr <- rD$client

input_galaxus <- c('https://www.galaxus.ch/8606656',
                   'https://www.galaxus.ch/9796481',
                   'https://www.galaxus.ch/10592688')
vec_galaxus <- vector()
i <- 0

for (j in input_galaxus){
  remDr$navigate(j)
  i <- i + 1
  try(vec_galaxus[i] <- read_html(remDr$getPageSource()[[1]]) %>%
        html_nodes('div strong') %>%
        html_text() %>%
        nth(5))
  Sys.sleep(runif(1, min = 5, max = 10))
}
But when the loop moves on to the second webpage, it can no longer access the website.
Can someone help me figure out how to fix this problem?
Thank you very much!
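For reference, a small diagnostic I could add after each navigate() call to see what the browser actually receives (this is only a sketch, using the standard getCurrentUrl() and getTitle() client methods):

for (j in input_galaxus){
  remDr$navigate(j)
  print(remDr$getCurrentUrl()[[1]])  # did the browser land on the product page?
  print(remDr$getTitle()[[1]])       # a block or CAPTCHA page would show up here
  Sys.sleep(runif(1, min = 5, max = 10))
}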
Upvotes: 0
Views: 172
Reputation: 18425
I got it to work with an rvest session - no need for Selenium. Just remove the RSelenium lines and replace your for loop with:
sess <- session(input_galaxus[1])  # start the session

for (j in input_galaxus){
  sess <- sess %>% session_jump_to(j)  # jump to the next URL
  i <- i + 1
  try(vec_galaxus[i] <- read_html(sess) %>%  # can read directly from sess
        html_nodes('div strong') %>%
        html_text() %>%
        nth(5))
  Sys.sleep(runif(1, min = 5, max = 10))
}
vec_galaxus
[1] " 399.–" " 660.–" " 931.–"
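If you prefer to avoid the manual counter i, the same logic also fits into a vectorised call. This is just a sketch of one way to do it with sapply(), reusing the cookies from the initial session() call and the same positional 'div strong' / nth(5) selector:

library(rvest)
library(dplyr)

sess <- session(input_galaxus[1])

vec_galaxus <- sapply(input_galaxus, function(url) {
  Sys.sleep(runif(1, min = 5, max = 10))  # pause between requests
  session_jump_to(sess, url) %>%          # reuse the cookies from the first visit
    read_html() %>%
    html_nodes('div strong') %>%
    html_text() %>%
    nth(5)
})

sapply() names the result by the URLs, so wrap it in unname() if you want the bare vector shown above.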
Upvotes: 1