Reputation: 526
I deployed a Shiny app to our company's AWS server. It serves as an internal dashboard. I recently found some articles about how to make the charts and tables refresh dynamically from the database every x milliseconds. However, this causes the plots to be re-rendered every x milliseconds whether it is necessary or not. Aside from making the dashboard jittery, I'm somewhat worried about the resource drain this might represent once the dashboard is full (about 60-100 plots). Is there a way to make the charts refresh only when the database table being queried changes?
Suppose the following dataframe is in the database:
df <- tibble(id = as.factor(1:10),
             value = sample(1:50, 10, replace = TRUE),
             variable = sample(c('blue', 'red', 'green'), 10, replace = TRUE))
And the 'df' object now represents the pool connector to the database:
renderPlot({
  invalidateLater(10000)
  myplot <- df %>%
    ggplot(aes(id, value, color = variable)) +
    geom_point()
  myplot
})
This plot is re-rendered every 10 seconds, which refreshes the entire plot. What I would like instead is to check the database every 10 seconds and re-render the plot only if the table being queried has changed. Any ideas?
Upvotes: 1
Views: 1464
Reputation: 33500
Below please find an example of how to detect updates to a database table using reactivePoll.
The upper part of the script starts a background R session that continuously writes data to a SQLite database (to illustrate a realistic scenario).
In the shiny app part I'm using a global reactivePoll, which provides an update to the displayed data once the row count of the monitored database table changes.
Observing the row count instead of a timestamp is advantageous because not every database table provides a timestamp, and moreover you might need to deal with clock shifts of local daylight saving time instead of UTC.
However, to realize a chunked update of the table we need an extra reactiveVal:
library(DT)
library(DBI)
library(shiny)
library(callr)
library(RSQLite)
library(datasets)
library(data.table)
# create dummy data -------------------------------------------------------
bgprocess <- r_bg(func = function(){
  # runs in a fresh R session, so all calls are namespace-qualified
  con <- DBI::dbConnect(RSQLite::SQLite(), "iris_db.sqlite")
  on.exit(DBI::dbDisconnect(con))
  if(DBI::dbIsValid(con) && !DBI::dbExistsTable(con, "iris")){
    DBI::dbWriteTable(con, "iris", datasets::iris[1,])
  }
  for (i in 2:NROW(datasets::iris)){
    Sys.sleep(4)
    print("Updating DB")
    DBI::dbWriteTable(con, "iris", datasets::iris[i,], append = TRUE)
    if(i == NROW(datasets::iris)){
      print("Finished updating DB")
    }
  }
})
# bgprocess$is_alive()
print(paste("bgprocess pid:", bgprocess$get_pid()))
# shiny app ---------------------------------------------------------------
con <- dbConnect(RSQLite::SQLite(), "iris_db.sqlite")
row_count <- 0L
previous_row_count <- 0L
display_data <- reactiveVal(NULL)

ui <- fluidPage(DTOutput("my_db_data"))

db_data_chunk <- reactivePoll(
  intervalMillis = 1000L, # check for a db update every second
  session = NULL,
  checkFunc = function() {
    print(paste("Running checkFunc:", Sys.time()))
    if(DBI::dbIsValid(con) && dbExistsTable(con, "iris")){
      row_count <<- dbGetQuery(con, "select count(*) from iris")[[1]]
    } else {
      0L
    }
  },
  valueFunc = function() {
    if(DBI::dbIsValid(con) && dbExistsTable(con, "iris")){
      print(paste("Running valueFunc: Updating display_data | Current row count:", row_count))
      # fetch only the rows added since the last poll
      DT <- setDT(dbGetQuery(con, sprintf("select * from iris LIMIT %s OFFSET %s",
                                          row_count - previous_row_count, previous_row_count)))
      previous_row_count <<- row_count
      DT
    } else {
      NULL
    }
  }
)

server <- function(input, output, session) {
  observeEvent(db_data_chunk(), {
    if(is.null(display_data())){
      display_data(db_data_chunk())
    } else {
      display_data(rbindlist(list(display_data(), db_data_chunk())))
    }
  })
  # check ?dataTableProxy() and ?replaceData() to avoid re-rendering the table
  output$my_db_data <- renderDT({req(display_data())}, server = FALSE)
}

shinyApp(ui, server, onStart = function() {
  onStop(function() {
    dbDisconnect(con)
  })
})
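As the comment above renderDT hints, DT's dataTableProxy()/replaceData() can swap the data inside an already-rendered table instead of re-rendering the whole widget. A minimal sketch, assuming a reactiveVal like the display_data above (note that replaceData() requires server-side processing, i.e. server = TRUE, unlike the server = FALSE call in the app):

```r
library(DT)
library(shiny)

ui <- fluidPage(DTOutput("my_db_data"))

server <- function(input, output, session) {
  display_data <- reactiveVal(datasets::iris[0, ])

  # render the widget once; isolate() keeps later updates out of this expression
  output$my_db_data <- renderDT(isolate(display_data()), server = TRUE)

  proxy <- dataTableProxy("my_db_data")

  observeEvent(display_data(), {
    # push the new data into the existing widget without a full re-render
    replaceData(proxy, display_data(), resetPaging = FALSE)
  })
}
# shinyApp(ui, server)
```

The isolate() around the initial render is the key design choice: renderDT then never invalidates, and all subsequent updates flow through the proxy only.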
Upvotes: 3
Reputation: 29407
You can maybe do something like this. Note that data <- sendquery(query) # send the query to DB is a placeholder for however you query the DB for new records, so adjust accordingly. Data will be shared across all sessions. You could also use reactivePoll, but it will compare the entire dataset with the previous one before updating, unless you get the last timestamp or something similar into the checkFunc so that the operation stays small. A good example is provided by @ismirsehregal here. It's best to get the deltas (new records) from the database directly and use R for small things like displaying the data.
library(shiny)
library(tibble)
library(dplyr) # provides %>%
library(ggplot2)

df <- tibble(id = as.factor(1:10),
             value = sample(1:50, 10, replace = TRUE),
             variable = sample(c('blue','red','green'), 10, replace = TRUE))

autoInvalidate <- reactiveTimer(1000, session = NULL)
v <- reactiveValues(mytimestamp = Sys.time())
data <- NULL

new_data_check <- reactive({
  autoInvalidate()
  query <- 'select * from table where timestamp > LASTTIMESTAMP'
  query <- gsub("LASTTIMESTAMP", v$mytimestamp, query)
  #data <- sendquery(query) # send the query to the DB for new records
  if(!is.null(data)){
    df <<- rbind(df, data)
    v$mytimestamp <- max(data$timestamp)
  }
})

ui <- fluidPage(
  plotOutput("myplot")
)

server <- function(input, output, session) {
  observe({
    new_data_check()
  })
  newdata <- eventReactive(v$mytimestamp, {
    df
  })
  output$myplot <- renderPlot({
    # use newdata() (not the global df) so the plot re-renders
    # only when the timestamp changes
    newdata() %>%
      ggplot(aes(id, value, color = variable)) +
      geom_point()
  })
}

shinyApp(ui, server)
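The timestamp-based delta fetch described above can be sketched without any Shiny wiring. Here is a minimal, self-contained illustration against an in-memory SQLite table (the table and column names `events`, `ts`, `value` are made up for the example); the two functions correspond to a cheap checkFunc and a delta-fetching valueFunc for reactivePoll:

```r
library(DBI)
library(RSQLite)

con <- dbConnect(RSQLite::SQLite(), ":memory:")
dbWriteTable(con, "events",
             data.frame(id = 1:3, ts = c(10, 20, 30), value = c("a", "b", "c")))

last_ts <- 0  # highest timestamp already fetched

# checkFunc candidate: a cheap query whose result reactivePoll compares
check_max_ts <- function() {
  dbGetQuery(con, "SELECT MAX(ts) FROM events")[[1]]
}

# valueFunc candidate: fetch only the rows newer than last_ts
fetch_delta <- function() {
  new_rows <- dbGetQuery(
    con,
    sprintf("SELECT * FROM events WHERE ts > %s ORDER BY ts", last_ts)
  )
  if (nrow(new_rows) > 0) last_ts <<- max(new_rows$ts)
  new_rows
}

delta  <- fetch_delta()  # first call: all 3 existing rows
dbExecute(con, "INSERT INTO events VALUES (4, 40, 'd')")
delta2 <- fetch_delta()  # second call: only the newly inserted row
dbDisconnect(con)
```

In a real app these would be passed as reactivePoll(1000, session, checkFunc = check_max_ts, valueFunc = fetch_delta), and the delta appended to the displayed data as in the answers above.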
Upvotes: 3