Reputation: 87
I recently wrote a script to look up search volumes through API calls.
Here is the script that I used:
install.packages("SEMrushR")
library(SEMrushR)
# Data frame to collect the results
final_result_useo_rumbo <- data.frame()
mes_keywords_to_check <- readLines("useo_rumbo_es.txt")
mes_keywords_to_check <- as.character(mes_keywords_to_check)
# Loop over each keyword in the list, fetch its search volume through the API call, and append the result to the data frame
for (i in 1:length(mes_keywords_to_check)) {
  test_keyword <- as.character(mes_keywords_to_check[i])
  df_test_2 <- keyword_overview_all(test_keyword, "es", "API KEY")
  final_result_useo_rumbo <- rbind(final_result_useo_rumbo, df_test_2)
}
The script works just fine, but the problem is that I have a LOT of keywords to check (800 000). When I ran it for 60 000 keywords, it took almost 4 hours to finish...
Do you know how I could speed up the process? Is there a better way to write the script?
Upvotes: 1
Views: 780
Reputation: 522007
You could try replacing the for loop with an apply function:
result <- sapply(mes_keywords_to_check, function(x) {
  keyword_overview_all(x, "es", "API KEY")
})
Then, you may data.frame the above, if you want a data frame and not a matrix:
result <- data.frame(result)
Or maybe take the transpose:
result <- data.frame(t(result))
You don't need to call as.character on each entry in mes_keywords_to_check, because you already converted the entire vector to character before the loop (or the apply call, in the case above). Also, you probably don't need to call rbind in each iteration of the loop. Rather, let R roll up the data for you, and then worry about what to do with it after the loop/apply has completed.
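For example, here is a minimal sketch of that pattern, assuming keyword_overview_all returns a data frame per keyword and using the same placeholder API key as above: collect the per-keyword results in a list with lapply, then bind them once at the end.
# Fetch each keyword and keep the results in a list (no rbind inside the loop)
results_list <- lapply(mes_keywords_to_check, function(x) {
  keyword_overview_all(x, "es", "API KEY")
})
# Combine everything into one data frame in a single step after the loop
final_result_useo_rumbo <- do.call(rbind, results_list)
Growing a data frame with rbind inside a loop copies the whole accumulated object on every iteration, so binding once at the end avoids that repeated copying.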
Upvotes: 1