Reputation: 1741
The HTML generated by the code below has a file size of 1.2 MB, due in part to the embedded web image. I'm making a simple web page that will eventually include dozens of these pictures, so the file size will quickly become overwhelming.
Is there a way to programmatically downsample the web images so that the resulting HTML file stays a manageable size? I would be okay with hard-coding the downsampled resolution for the images; they will all be very similar to the one in the example.
Adjusting the `dpi` and `out.width` chunk options did not change the file size, though the latter did change the display size of the picture in the output.
Thanks very much for your assistance.
---
output: html_document
---
```{r setup, include=FALSE}
knitr::opts_chunk$set(echo = FALSE, message = FALSE, warning = FALSE, cache = TRUE)
options(scipen=999)
library(knitr)
```
```{r, out.width = "600px", dpi = 36}
include_graphics("https://i.imgur.com/BacCbVa.png")
```
Upvotes: 0
Views: 764
Reputation: 1741
Thanks to J_F for pointing me to the `magick` package in the comments.
The code below took my HTML from 1.2 MB to 940 KB with no resolution loss. Tweaking the scale value while keeping the same `out.width` in the chunk options lets me reduce the file size further if I'm willing to accept some loss of image quality.
This works for now, but I'm going to keep this question open for a bit longer in case anyone has a solution that reduces the size further.
---
output: html_document
---
```{r setup, include=FALSE}
knitr::opts_chunk$set(echo = FALSE, message = FALSE, warning = FALSE, cache = TRUE)
options(scipen=999)
library(knitr)
library(magick)
library(tidyverse)
# Read an image from a path or URL and scale it to the requested pixel
# width before embedding; `size` may be numeric or a string like "300px".
add_image <- function(filepath, size) {
  image_read(filepath) %>%
    image_scale(str_extract(as.character(size), "[0-9]+"))
}
```
```{r, out.width = "600px", dpi = 300}
add_image("https://i.imgur.com/BacCbVa.png", 300)
```
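As a sketch of that trade-off (the scale value of 200 here is just an illustrative number, not what I actually used): scaling the embedded image below its display width shrinks the file further, at the cost of some sharpness when the browser stretches it back out to 600px.
```{r, out.width = "600px", dpi = 300}
# Embed a 200px-wide copy (illustrative value); out.width still displays
# it at 600px, so some visible quality loss is expected.
add_image("https://i.imgur.com/BacCbVa.png", 200)
```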
Upvotes: 2