I am picturing something similar to the `with_*()` and `local_*()` functions from `withr`. Example use case: ropensci/drake#561. When a `drake` user selects a distributed parallel backend, the imported functions and files are still processed locally. (If we are just hashing an input data file, it does not make sense to send the job to a compute node on a cluster.) In `drake`, it might be nice to write something like this:
```r
make <- function(...) {
  future::with_plan(multiprocess, {
    process_imports(...)
  })
  build_targets(...) # Uses whatever future::plan() the user selects before make().
}
```
Or alternatively:
```r
make <- function(...) {
  process_imports(...)
  build_targets(...) # Uses whatever future::plan() the user selects before make().
}
```
where
```r
process_imports <- function(...) {
  if (jobs > 1L) {
    future::local_plan(multiprocess)
    launch_futures_for_imports(...)
  } else {
    just_loop_through_imports(...)
  }
}
```