Reputation: 1
My partners and I have a piece of code that extracts tweets in R and stores them in a database. What we'd like to know is how to run this code in a loop so that it executes periodically, preferably every 30 minutes.
Here's our code:
#Load twitter package for R
library(twitteR)
#load MySQL package for R
library(RMySQL)
#Load authentication files for twitter
load(file="twitter_authentication.Rdata")
registerTwitterOAuth(cred)
#Search Twitter for tweets mentioning e.g. @efteling
efteling <- searchTwitter("@efteling", n=100)
#Store the tweets into a dataframe
dataFrameEfteling <- do.call("rbind", lapply(efteling, as.data.frame))
#Set up the connection to the database
doConnect <- dbConnect(MySQL(), user="root", password="", dbname="portfolio", host="127.0.0.1")
#Append so repeated runs add rows instead of failing once the table exists
dbWriteTable(doConnect, "tweetsEfteling", dataFrameEfteling, append=TRUE, row.names=FALSE)
eftelingResult <- dbSendQuery(doConnect, "select text from tweetsEfteling")
showResultEfteling <- fetch(eftelingResult, n=20)
Upvotes: 0
Views: 145
Reputation: 4930
Do you have access to crontab? If so, you can set it to run the script however frequently you like.
Here is a little information on crontab.
If your server is running Linux, you can just type
crontab -e
to pull up your personal crontab file. After that, you schedule your command. For every 30 minutes, you would use this entry:
*/30 * * * * /path/to/script
Save and exit.
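Since the script is in R rather than a shell script, cron needs to invoke it through Rscript. A sketch of what the crontab entry might look like, assuming the code is saved as /path/to/fetch_tweets.R (the paths and log file name here are examples, not something from the question):

```
# Run the R script every 30 minutes; redirect stdout and stderr
# to a log file so failures are easy to diagnose
*/30 * * * * /usr/bin/Rscript /path/to/fetch_tweets.R >> /var/log/fetch_tweets.log 2>&1
```

Check where Rscript lives on your system with `which Rscript`, since cron runs with a minimal PATH and may not find it otherwise.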
Upvotes: 2
Reputation: 109
Have you considered using Twitter's streaming API instead of REST? If you leave the connection open for an extended period, it would likely accomplish the same thing, and it would cut down on API pulls. Try the streamR package.
If you still want to set it on a timer, http://statistics.ats.ucla.edu/stat/r/faq/timing_code.htm looks useful.
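If you go the timer route inside R itself, a minimal sketch would be to wrap the collection code in a function and loop with Sys.sleep. This assumes the code from the question, and the function name fetchTweets is a hypothetical label, not anything Twitter- or database-specific:

```
library(twitteR)
library(RMySQL)

# Hypothetical wrapper around the question's collection code
fetchTweets <- function() {
  efteling <- searchTwitter("@efteling", n=100)
  dataFrameEfteling <- do.call("rbind", lapply(efteling, as.data.frame))
  doConnect <- dbConnect(MySQL(), user="root", password="",
                         dbname="portfolio", host="127.0.0.1")
  # append=TRUE so repeated runs add rows instead of failing
  dbWriteTable(doConnect, "tweetsEfteling", dataFrameEfteling,
               append=TRUE, row.names=FALSE)
  dbDisconnect(doConnect)
}

repeat {
  try(fetchTweets())   # keep looping even if one pull errors out
  Sys.sleep(30 * 60)   # wait 30 minutes
}
```

The drawback compared with cron is that the R session has to stay alive the whole time; if it crashes or the machine reboots, collection stops until you restart it, which is why the crontab approach is usually more robust.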
Upvotes: 1