I'm wondering what the best practice is for creating a loop to continually pull data via HTTP requests when the API caps the response size, in this case at a maximum of 30 records per request.
library(httr)
library(jsonlite)

# 30 CVEs with exploits, by id - get the data and assign to a variable
codeexecexpl <- GET("http://www.cvedetails.com/json-feed.php?",
  query = list(numrows = "30", vendor_id = "0", product_id = "0",
               version_id = "0", hasexp = "1", opec = "0", opov = "0",
               opcsrf = "0", opfileinc = "0", opgpriv = "0", opsqli = "0",
               orderby = "3")
)
# get content of response (note: this must reference codeexecexpl, not codeexec)
codeexecexpl_content <- content(codeexecexpl, as = "text", encoding = "UTF-8")
fromJSONTrue <- fromJSON(codeexecexpl_content, simplifyVector = TRUE)
I'm still not sure how to deal with the fact that the API limits the number of records I can request. Can you please explain how I can download 30 records at a time until I've retrieved all records for code execution, for example?
Do you know how to iterate on your search? Which parameters change between pages? Creating a list of parameter sets and then iterating over it with a purrr function wrapping the httr call could do the job.
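As a minimal sketch of that idea: keep the fixed query from the question, vary only a paging parameter, and let purrr collect the pages. Note that `startindex` here is an assumed name for the API's offset parameter, and the page count is hard-coded; check the cvedetails.com docs for the real pagination parameter, or loop until a page returns fewer than 30 rows.

```r
library(httr)
library(jsonlite)
library(purrr)

base_url <- "http://www.cvedetails.com/json-feed.php"

# Fixed query parameters, taken from the question
base_query <- list(numrows = "30", vendor_id = "0", product_id = "0",
                   version_id = "0", hasexp = "1", opec = "0", opov = "0",
                   opcsrf = "0", opfileinc = "0", opgpriv = "0", opsqli = "0",
                   orderby = "3")

# One offset per page of 30 records; 0, 30, 60, ... (first 10 pages here).
offsets <- seq(0, 270, by = 30)

fetch_page <- function(offset) {
  # "startindex" is a HYPOTHETICAL offset parameter - replace it with
  # whatever pagination parameter the API actually documents.
  resp <- GET(base_url,
              query = c(base_query, list(startindex = as.character(offset))))
  stop_for_status(resp)  # fail loudly on HTTP errors
  fromJSON(content(resp, as = "text", encoding = "UTF-8"),
           simplifyVector = TRUE)
}

# map_dfr() calls fetch_page() once per offset and row-binds the pages
# into a single data frame.
all_cves <- map_dfr(offsets, fetch_page)
```

If the total record count isn't known up front, a `repeat` loop that breaks when `nrow()` of the parsed page is less than 30 is the usual alternative to a precomputed offset list.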