Looping HTTP requests with rate limits - best practice

Hi all,

I'm wondering what best practice is for creating a loop to continually pull data via HTTP requests where there is a limit set by the API. In this case, a max of 30 records per request.

# 30 CVEs with exploits, by id - get the data & assign to a variable
library(httr)
library(jsonlite)

codeexecexpl <- GET(
  "http://www.cvedetails.com/json-feed.php",
  query = list(numrows = "30", vendor_id = "0", product_id = "0", version_id = "0",
               hasexp = "1", opec = "0", opov = "0", opcsrf = "0", opfileinc = "0",
               opgpriv = "0", opsqli = "0", orderby = "3")
)

# get the content of the response and parse the JSON
codeexecexpl_content <- content(codeexecexpl, as = "text", encoding = "UTF-8")
fromJSONTrue <- fromJSON(codeexecexpl_content, simplifyVector = TRUE)

Thanks!

I think you could be interested in the polite :package:

It has a rate-limiting feature, built on the ratelimitr and memoise :package:s.
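
As a quick illustration (just a sketch - the 30-requests-per-minute limit below is made up, check the API's real policy), ratelimitr lets you wrap httr::GET so it can't fire faster than the limit:

library(httr)
library(ratelimitr)

# a GET that fires at most 30 times per 60 seconds;
# extra calls simply wait until the window frees up
slow_get <- limit_rate(GET, rate(n = 30, period = 60))

resp <- slow_get("http://www.cvedetails.com/json-feed.php",
                 query = list(numrows = "30", hasexp = "1", orderby = "3"))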

Also, purrr now has some functions for rate-limited iteration:
See the purrr 0.3.0 news.
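
For example, purrr::slowly() (new in 0.3.0) makes a delayed version of any function - a minimal sketch, assuming a 1-second pause between requests is enough:

library(httr)
library(purrr)

# same GET, but purrr waits 1 second between successive calls
slow_get <- slowly(GET, rate = rate_delay(pause = 1))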


Awesome thanks! I'll check them out!


I'm still not sure how to deal with the fact that the API limits the number of records I can request. Can you please explain how I can keep downloading 30 records at a time until I've pulled all of the code-execution records, for example?

Do you know how to iterate on your search? Which parameters will change? Creating a list of query parameters, then iterating over it with a purrr function wrapping the httr call, could do the job - something like the sketch below.
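
A rough sketch, not a working solution: the page parameter here is a placeholder I made up just to show the shape of the loop, so you'll need to check the json-feed.php documentation for whichever parameter actually lets you move through the results 30 at a time.

library(httr)
library(jsonlite)
library(purrr)

# one query per element; only the (hypothetical) paging parameter changes
queries <- map(1:10, ~ list(numrows = "30", hasexp = "1", orderby = "3", page = .x))

fetch_one <- function(q) {
  resp <- GET("http://www.cvedetails.com/json-feed.php", query = q)
  stop_for_status(resp)
  fromJSON(content(resp, as = "text", encoding = "UTF-8"), simplifyVector = TRUE)
}

# slowly() spaces the requests out; map_dfr() row-binds the 30-record chunks
all_cves <- map_dfr(queries, slowly(fetch_one, rate = rate_delay(pause = 1)))

In practice you would stop iterating once a request comes back with fewer than 30 rows (or empty), which tells you there is nothing left to fetch.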

Note that there also seem to be other APIs for this data:

they may offer the same info with different limitations.
