R script for unpacking data from a journal article

I am looking to use freely available data from published journal articles to explore social networks. Can anyone help me with some R script to accomplish this?

We need a bit more information, especially what format the papers are in (PDF, HTML, word processor, etc.) and the type of data needed. For instance, are you looking at tables, text, images, or something else?

The best idea might be to point us to an example journal document and explain what data you want to extract.

Hi John

Thank you for coming back to me. I'm looking to extract data from articles such as this:
Tables like Table 2


Have a look at the tabulizer package. It seems intended for this, but I have not used it.
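In case it helps, a minimal sketch of pulling tables from a PDF with tabulizer might look like the following. The file name "article.pdf" and pages = 2 are placeholders for your own document, and note that tabulizer depends on Java via rJava:

```r
# Sketch only: extract tables from a local PDF with tabulizer.
# "article.pdf" and pages = 2 are placeholders -- substitute your own file/page.
library(tabulizer)

# Returns a list with one element per table detected on the requested page(s)
tabs <- extract_tables("article.pdf", pages = 2, output = "data.frame")

# Inspect the first table found on that page
table2 <- tabs[[1]]
head(table2)
```

Automatic detection can mangle multi-row headers, so it is worth eyeballing each extracted table before using it.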

Oops, I was thinking of an article in pdf format.



# Read the article page and pull out every HTML table on it
library(rvest)

content <- read_html("https://onlinelibrary.wiley.com/doi/full/10.1111/mec.15259")
tables <- content %>% html_table(fill = TRUE)

# Table 2 is the second table on the page
mytable <- tables[[2]]

Shamelessly stolen from "Scrape HTML Table using rvest" on R-bloggers.
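If you want to check that pattern without hitting the journal's website, rvest's minimal_html() lets you run html_table() on an inline snippet (the table contents here are made up for illustration):

```r
# Self-contained demo of the same rvest pattern, using an inline HTML snippet
library(rvest)

page <- minimal_html("
  <table>
    <tr><th>node</th><th>degree</th></tr>
    <tr><td>A</td><td>3</td></tr>
    <tr><td>B</td><td>1</td></tr>
  </table>")

# html_table() returns a list of tibbles, one per <table> element
tables <- page %>% html_table(fill = TRUE)
tables[[1]]
```

One caveat for paywalled journals: read_html() only sees what the server returns to an anonymous request, so tables rendered by JavaScript or behind a login will not appear in the result.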