
Reputation: 186

Get the URL of an .url (Windows URL shortcut) file

I want to get the URL of an .url shortcut file (made in Windows) in R.

The file format looks like this:

[{000214A0-0000-0000-C000-000000000046}]
Prop4=31,Stack Overflow - Where Developers Learn, Share, & Build Careers
Prop3=19,11
[{A7AF692E-098D-4C08-A225-D433CA835ED0}]
Prop5=3,0
Prop9=19,0
[InternetShortcut]
URL=https://stackoverflow.com/
IDList=
IconFile=https://cdn.sstatic.net/Sites/stackoverflow/img/favicon.ico?v=4f32ecc8f43d
IconIndex=1
[{9F4C2855-9F79-4B39-A8D0-E1D42DE1D5F3}]
Prop5=8,Microsoft.Website.E7533471.CBCA5933

The format also has some documentation.

I have tried file.info(), but it only seems to show information from the first properties header.

I need to do this in R, because I have a long list of .url files whose URLs I need to extract.

Upvotes: 0

Views: 2270

Answers (1)

hrbrmstr

Reputation: 78822

Crude way (I'll update this in a sec):

ini::read.ini("https://rud.is/dl/example.url")$InternetShortcut$URL
## [1] "https://rud.is/b/2017/11/11/measuring-monitoring-internet-speed-with-r/"

Made slightly less crude:

read_url_shortcut <- function(x) {
  require(ini)
  x <- ini::read.ini(x)  
  x[["InternetShortcut"]][["URL"]]
}

Without the ini package dependency:

read_url_shortcut <- function(x) {
  x <- readLines(x)
  x <- grep("^URL", x, value=TRUE)
  gsub("^URL[[:space:]]*=[[:space:]]*", "", x)
}
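Since the question mentions a long list of .url files, the dependency-free helper can be applied over a whole folder. A usage sketch (the folder path is a placeholder, and the function is repeated here so the snippet is self-contained):

```r
# Dependency-free extractor, repeated for a self-contained snippet;
# [1] takes the first URL= field in case a file carries more than one.
read_url_shortcut <- function(x) {
  x <- readLines(x, warn = FALSE)
  x <- grep("^URL", x, value = TRUE)
  gsub("^URL[[:space:]]*=[[:space:]]*", "", x)[1]
}

# Placeholder directory: point this at wherever the shortcuts live.
files <- list.files("~/url-shortcuts", pattern = "\\.url$", full.names = TRUE)
urls  <- vapply(files, read_url_shortcut, character(1))
```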

More "production-worthy" version:

#' Read in internet shortcuts (.url or .webloc) and extract URL target
#' 
#' @param shortcuts character vector of file path+names or web addresses
#'        to .url or .webloc files to have URL fields extracted from.
#' @return character vector of URLs
read_shortcut <- function(shortcuts) {

  require(ini)
  require(xml2)
  require(purrr)

  purrr::map_chr(shortcuts, ~{

    if (!grepl("^https?://", .x)) { # "s" optional so plain http:// matches too
      .x <- path.expand(.x)
      if (!file.exists(.x)) return(NA_character_)
    }

    if (grepl("\\.url$", .x)) {
      .ini <- suppressWarnings(ini::read.ini(.x)) # get encoding issues otherwise
      .ini[["InternetShortcut"]][["URL"]][1] # some evidence multiple are supported but not sure so being safe
    } else if (grepl("\\.webloc$", .x)) {
      .x <- xml2::read_xml(.x)
      xml2::xml_text(xml2::xml_find_first(.x, ".//dict/key[contains(., 'URL')]/../string"))[1] # some evidence multiple are supported but not sure so being safe
    } else {
      NA_character_
    }  

  })

}

Ideally, such a function would return a single data frame row with all relevant info that could be found (title, URL and icon URL, creation/mod dates, etc). I'd rather not keep my Windows VM up long enough to generate sufficient samples to do that.
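One possible shape for that idea, sketched as an assumption rather than a tested parser: read only the [InternetShortcut] section and return its key/value pairs as a one-row data frame. The field names beyond URL/IconFile/IconIndex are whatever a given file happens to contain.

```r
# Sketch: parse the [InternetShortcut] section of a .url file into a
# one-row data frame. read_shortcut_df() is a hypothetical name.
read_shortcut_df <- function(x) {
  lines <- readLines(x, warn = FALSE)
  start <- grep("^\\[InternetShortcut\\]$", lines)
  if (length(start) == 0) return(data.frame(URL = NA_character_))
  rest <- lines[(start[1] + 1):length(lines)]
  stop_at <- grep("^\\[", rest)          # stop at the next [section] header
  if (length(stop_at) > 0) rest <- rest[seq_len(stop_at[1] - 1)]
  kv   <- strsplit(rest[grepl("=", rest)], "=", fixed = TRUE)
  keys <- vapply(kv, `[`, character(1), 1)
  vals <- vapply(kv, function(p) paste(p[-1], collapse = "="), character(1))
  as.data.frame(as.list(setNames(vals, keys)), stringsAsFactors = FALSE)
}
```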

NOTE: Said "production"-ready version still doesn't gracefully handle edge cases where the file or web address is not readable/reachable nor does it deal with malformed .url or .webloc files.
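One way to paper over those edge cases, sketched here with tryCatch() (safe_read_shortcut is a made-up name): any read or parse failure collapses to NA_character_ instead of aborting the whole batch.

```r
# Sketch: wrap extraction in tryCatch() so unreadable or malformed
# files yield NA instead of an error.
safe_read_shortcut <- function(x) {
  tryCatch({
    lines <- readLines(x, warn = FALSE)
    url <- grep("^URL=", lines, value = TRUE)
    if (length(url) == 0) NA_character_ else sub("^URL=", "", url[1])
  }, error = function(e) NA_character_)
}
```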

Upvotes: 2
