Question re: Advanced R, functions

From this page:
http://adv-r.had.co.nz/Functions.html#lexical-scoping

I'm confused by the following. The first and second code blocks below behave identically for me, not as the text suggests, so I can't reconcile the statement that "x is lazily evaluated the first time that you call one of the adder functions. At this point, the loop is complete and the final value of x is 10. Therefore all of the adder functions will add 10 on to their input, probably not what you wanted!" Any help would be appreciated.

add <- function(x) {
  function(y) x + y
}
adders <- lapply(1:10, add)

adders[[1]](10)
[1] 11

adders[[10]](10)
[1] 20

x is lazily evaluated the first time that you call one of the adder functions. At this point, the loop is complete and the final value of x is 10. Therefore all of the adder functions will add 10 on to their input, probably not what you wanted! Manually forcing evaluation fixes the problem:

add <- function(x) {
  force(x)
  function(y) x + y
}
adders2 <- lapply(1:10, add)

adders2[[1]](10)
[1] 11

adders2[[10]](10)
[1] 20

You should probably read the 2nd edition of Advanced R (https://adv-r.hadley.nz/).

The answer to your question is there:

The apply functions changed in this respect in R 3.2.0. From the R release notes:

Higher order functions such as the apply functions and Reduce() now force arguments to the functions they apply in order to eliminate undesirable interactions between lazy evaluation and variable capture in closures.
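
To see concretely what that forcing buys you, here is a minimal sketch of a non-forcing lapply() look-alike (my_lapply is just an illustrative name, not a real function); because it never evaluates the argument it hands to FUN, all the closures end up sharing the final value, which is exactly the behaviour the first edition describes:

# A hand-rolled, non-forcing version of lapply().
# Base lapply() has forced FUN's argument since R 3.2.0; this one does not.
my_lapply <- function(X, FUN) {
  out <- vector("list", length(X))
  for (i in seq_along(X)) {
    out[[i]] <- FUN(X[[i]])  # X[[i]] is left as an unevaluated promise
  }
  out
}

add <- function(x) {
  function(y) x + y
}

adders_lazy <- my_lapply(1:10, add)
# The promise X[[i]] is only evaluated on the first call, when the loop is
# finished and i is 10, so every adder adds 10:
adders_lazy[[1]](10)
#> [1] 20
adders_lazy[[10]](10)
#> [1] 20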

You can also replicate the behavior that the passage mentions with a for loop (it's in the link I've provided):

add <- function(x) {
  function(y) x + y
}

adders <- list()
for (i in 1:10) {
  adders[[i]] <- add(i)
}

adders[[1]](10)
#> [1] 20
adders[[10]](10)
#> [1] 20
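
And, as the passage you quoted says, putting force(x) inside add() fixes the for-loop version as well; a quick check (this just combines the force() variant from your post with the loop above):

add <- function(x) {
  force(x)  # evaluate x now, while i still holds its value for this iteration
  function(y) x + y
}

adders <- list()
for (i in 1:10) {
  adders[[i]] <- add(i)
}

adders[[1]](10)
#> [1] 11
adders[[10]](10)
#> [1] 20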

Thank you! My bad for reading the old version, and thanks for the correct link!


Keep in mind that the 2nd edition is a work in progress, so some things will certainly change before it is published. But the additions to the book are worth the effort, IMHO.