I can recommend partial least squares (PLS) regression as a well-established way to include highly correlated variables in a regression. It collapses multiple correlated predictors into a smaller number of latent variables. In your case, with ~15 variables, it may be that only 3 latent variables capture the majority of the information in the predictors.
You do have to choose the number of latent variables to keep. There are heuristics that may give you a quick answer, but the most rigorous approach is to pick the number by cross-validation. The caret package with method = "pls" is a convenient way to do this in R.
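To make that concrete, here is a minimal sketch of tuning the number of latent variables with caret. The simulated data (15 predictors driven by 3 underlying factors) is purely illustrative, not from your problem:

```r
library(caret)

# Illustrative data: 15 correlated predictors generated from 3 latent factors
set.seed(1)
n <- 200
latent <- matrix(rnorm(n * 3), n, 3)
X <- latent %*% matrix(rnorm(3 * 15), 3, 15) +
     matrix(rnorm(n * 15, sd = 0.3), n, 15)
y <- latent %*% c(1, -2, 0.5) + rnorm(n)
dat <- data.frame(y = as.vector(y), X)

# Tune the number of latent variables (ncomp) by 10-fold cross-validation
fit <- train(y ~ ., data = dat,
             method = "pls",
             tuneLength = 10,   # try ncomp = 1..10
             trControl = trainControl(method = "cv", number = 10))

fit$bestTune   # CV-selected number of components
```

With data like this, the cross-validated RMSE typically stops improving after roughly 3 components, mirroring the latent structure.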
If your data are high-frequency, each observation may be autocorrelated with recent observations. That would lead you to choose a cross-validation scheme that separates the folds in time, either with method = "timeslice" in trainControl or simply by specifying the fold indices yourself via the index argument to trainControl.
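A sketch of the rolling-origin approach, again with illustrative simulated data (the window sizes here are assumptions you would set from your own sampling frequency):

```r
library(caret)

# Illustrative time-ordered data with autocorrelated noise
set.seed(1)
n <- 300
X <- matrix(rnorm(n * 15), n, 15)
y <- rowSums(X[, 1:3]) + arima.sim(list(ar = 0.8), n)
dat <- data.frame(y = as.vector(y), X)

# Rolling-origin CV: train on a 100-point window, evaluate on the next 20
ctrl <- trainControl(method = "timeslice",
                     initialWindow = 100,
                     horizon = 20,
                     fixedWindow = TRUE)

fit <- train(y ~ ., data = dat,
             method = "pls",
             tuneLength = 10,
             trControl = ctrl)
```

Because each assessment set lies strictly after its training window, the tuning of ncomp is not inflated by temporal leakage between folds.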