All,
I have a set of polygons that are our units of observation. Let's call them our sectors of interest.
I want to find the distance from each sector of interest to the nearest police station. I have polygons of the police stations, but I don't care about the full m x n matrix of distances from every sector to every station. My strategy is to use st_combine
to merge all the police stations into one combined geometry and then call st_distance once
to get a single distance per sector.
Unfortunately it is very slow, much slower than the equivalent operation using arcpy,
and slower than I would expect considering it's calling optimized compiled
code (GEOS, or s2 for geographic coordinates) under the hood.
Here is the code I am using to find the distances:

library(sf)
library(dplyr)
library(rlang)

match_polygons_distance_to = function(polygons, shape, idvar) {
  idvar = enquo(idvar)
  # Combine all the shapes into one geometry so we don't get a dense m x n matrix.
  combined_shape = st_combine(shape)
  # A vector of distances, one per row of `polygons`.
  # tolerance = 100 trades accuracy for speed (only used for geographic coordinates).
  distances = st_distance(polygons, combined_shape,
                          tolerance = 100, by_element = TRUE)
  # Make a plain data frame of ids and distances to return. Will merge in later.
  polygons_df = polygons %>%
    st_set_geometry(NULL) %>%
    mutate(distance = distances) %>%
    select(!!idvar, distance)
  return(polygons_df)
}
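For context, one alternative I have seen suggested (a sketch, not a drop-in replacement for the function above; it assumes both layers share a projected CRS) is to first index the nearest station per sector with sf::st_nearest_feature, which uses a spatial index, and then compute only those n element-wise distances:

```r
library(sf)

# Sketch: find the index of the nearest station for each polygon, then
# compute only those n distances instead of reducing everything to one
# combined geometry. `polygons` and `stations` are assumed to be sf
# objects in the same projected CRS.
nearest_station_distance = function(polygons, stations) {
  nearest = st_nearest_feature(polygons, stations)  # one index per polygon
  st_distance(polygons, stations[nearest, ], by_element = TRUE)
}
```

I don't know whether this beats the st_combine approach in every case, but it avoids building any large distance matrix.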
I don't care too much about the exact distances, and would be fine with accuracy to within 50 or 100 meters.
What I imagine a good algorithm would do is, for each polygon, draw a 50 m ring around it and check whether it intersects a police station, then a 100 m ring, and so on. But I'm not sure which methods to call to get this procedure.
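If it helps to make the question concrete, I think the ring idea roughly maps onto sf::st_is_within_distance. A sketch (the thresholds and helper name are just illustrative; the result is only an upper bound at each threshold, not an exact distance):

```r
library(sf)

# Sketch of the expanding-ring idea: test increasing distance thresholds
# and record, for each polygon, the first threshold at which any station
# is within reach. Polygons that match no threshold stay NA.
approx_distance_to_any = function(polygons, stations,
                                  thresholds = c(50, 100, 200, 400)) {
  out = rep(NA_real_, nrow(polygons))
  for (d in thresholds) {
    todo = is.na(out)                       # only re-test unresolved polygons
    if (!any(todo)) break
    hits = lengths(st_is_within_distance(polygons[todo, ],
                                         stations, dist = d)) > 0
    out[which(todo)[hits]] = d              # distance known only up to d
  }
  out
}
```

Is something like this the intended use of st_is_within_distance, or is there a better built-in way?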
Any help is appreciated.