
DEoptim - Future Package #13

Open

johncancian opened this issue Nov 11, 2021 · 7 comments

@johncancian

Hello, my group uses DEoptim with a fitness function that is computationally intensive and requires a significant amount of RAM. We cannot run DEoptim in parallel because the server it executes on runs out of RAM, so we have been running it sequentially, which unfortunately takes quite a long time to find a solution. DEoptim is part of a larger calculation that we are transitioning to the future and doFuture packages on a Kubernetes cluster. Can DEoptim with parallelType = 2 use doFuture to parallelize the population evaluation across multiple nodes?

@joshuaulrich (Collaborator)

Hi @johncancian, DEoptim uses foreach to run in parallel, so you can use any foreach backend.
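
For example, a minimal sketch with doFuture as the backend (the toy fitness function and the plan() settings are placeholders, not a tested Kubernetes configuration):

    ## Register doFuture as the foreach backend; DEoptim's parallelType = 2
    ## then dispatches the population evaluation over it.
    library(DEoptim)
    library(doFuture)
    registerDoFuture()               # %dopar% now runs on future backends
    plan(multisession, workers = 4)  # e.g. plan(cluster, ...) for remote nodes

    rosenbrock <- function(x)        # toy fitness function for illustration
      100 * (x[2] - x[1]^2)^2 + (1 - x[1])^2

    out <- DEoptim(rosenbrock, lower = c(-10, -10), upper = c(10, 10),
                   control = DEoptim.control(NP = 40, itermax = 50,
                                             parallelType = 2))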

@johncancian (Author)

Hi Joshua,
Thank you for your assistance. Have you had any experience with running DEoptim using parallelType=2 where the fitness function is also executing parallel tasks? This would be a nested parallelization case.

@joshuaulrich (Collaborator) commented Dec 13, 2021 via email

@johncancian (Author)

Our intention is to set up a Kubernetes cluster following the work of Chris Paciorek (https://www.jottr.org/2021/04/08/future-and-kubernetes/). We are currently running DEoptim with parallelType = 0 and using foreach parallelization within the fitness function. We believe that being able to use DEoptim with parallelType = 2 together with foreach parallelization in our fitness function could significantly improve our performance.
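
A simplified sketch of that current setup (the fitness function and worker counts below are placeholders, not our actual code):

    ## DEoptim stays sequential (parallelType = 0); the fitness function
    ## fans its own work out through foreach/%dopar%.
    library(DEoptim)
    library(doFuture)
    registerDoFuture()
    plan(multisession, workers = 4)

    fitness <- function(par) {
      ## placeholder for several expensive, independent sub-computations
      pieces <- foreach(k = 1:4, .combine = c) %dopar% sum((par - k)^2)
      sum(pieces)
    }

    out <- DEoptim(fitness, lower = c(-5, -5), upper = c(5, 5),
                   control = DEoptim.control(parallelType = 0, itermax = 20))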

@joshuaulrich (Collaborator)

That should work. I don't have any experience doing something like that, though.

@braverock (Collaborator)

foreach loops can be nested as long as you use different backends, as you've suggested here. So the parent foreach will run single-threaded, but your fitness function will use parallelType = 2. This should be fine.
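
With the future framework specifically, the two layers can be declared as a nested topology via plan(list(...)) (a sketch of future's documented nesting mechanism; the worker counts are placeholders):

    ## Outer layer could serve DEoptim's foreach backend; inner layer the
    ## fitness function's own %dopar% loops.
    library(future)
    plan(list(
      tweak(multisession, workers = 2),  # outer: population chunks
      tweak(multisession, workers = 2)   # inner: per-candidate work
    ))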

@johncancian (Author) commented Mar 8, 2022

Hello Joshua and Brian, we have continued migrating our process to a horizontally parallelized infrastructure on Kubernetes. We are running DEoptim sequentially with a fitness function that contains a foreach loop using %dopar%; our parallel backend handles the parallelization inside the fitness function.

Would it be possible to have a nested foreach construct in DEoptim itself? I tried adding a parallelType = 3 option in the attached code, replacing foreach::"%dopar%" with foreach::"%:%", but it fails with "Error in apply(i, 1, fn, ...) : object 'i' not found". Would you have any advice in this area?

  else if(ctrl$parallelType == 3) { ## use nested foreach 
    if(!foreach::getDoParRegistered()) {
      foreach::registerDoSEQ()
    }
    args <-  ctrl$foreachArgs
    fnPop <- function(params, ...) {
      my_chunksize <- ceiling(NROW(params)/foreach::getDoParWorkers())
      my_iter <- iterators::iter(params,by="row",chunksize=my_chunksize)
      args$i <- my_iter
      args$.combine <- c
      if (!is.null(args$.export))
        args$.export = c(args$.export, "fn")
      else
        args$.export = "fn"
      if (is.null(args$.errorhandling))
        args$.errorhandling = c('stop', 'remove', 'pass')
      if (is.null(args$.verbose))
        args$.verbose = FALSE
      if (is.null(args$.inorder))
        args$.inorder = TRUE
      if (is.null(args$.multicombine))
        args$.multicombine = FALSE
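      ## NOTE: unlike %dopar%, %:% does not capture its right operand as an
      ## unevaluated expression; it expects a second foreach object, so
      ## apply(i, 1, fn, ...) is evaluated immediately and 'i' is not found.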
      foreach::"%:%"(do.call(foreach::foreach, args), apply(i,1,fn,...))
      
    }
  }
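
For comparison, the standard composition of %:% (a minimal sketch, independent of DEoptim internals) joins two foreach() calls, and a single %dopar% then executes the whole chain; %:% cannot join a foreach object with a plain expression such as apply(i, 1, fn, ...):

    library(foreach)
    library(doParallel)
    registerDoParallel(2)

    ## %:% merges two foreach iterators; %dopar% runs the inner expression
    ## once per (a, b) combination.
    res <- foreach(a = 1:2, .combine = rbind) %:%
      foreach(b = 1:3, .combine = c) %dopar% {
        a * b
      }
    res  # 2 x 3 matrix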

Attachment: DEoptim.zip
