Currently `skip_existing` operates on all keys, without any granularity.
This is a problem in RL: in a loss module, for example, you may want to skip existing `"value"` entries, but you never want to skip an existing `"memory"` entry in a memory-based model (e.g. an RNN). In other words, if you apply `skip_existing` to memory keys, your memory will never be updated.
This is needed to support RNNs in TorchRL (issue pytorch/rl#1060).
We need a solution to make `skip_existing` more granular.
This is really simple and consists in passing the keys we actually want to skip to the `set_skip_existing` function:

```python
with set_skip_existing(["value", "value_target"]):
    loss(td)  # will reuse existing values but recompute the hidden memory
```

By default, if no keys are passed, the behavior remains the same as the current `set_skip_existing=True`.
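For illustration, here is a minimal sketch of how a key-granular `set_skip_existing` could work. The `_SKIP_EXISTING_KEYS` global and the `should_skip` helper are hypothetical names for this sketch, not tensordict's actual internals; the idea is that a module would call `should_skip` for each output key before deciding whether to recompute it.

```python
import contextlib

# Hypothetical global for this sketch:
#   None -> feature off,
#   True -> skip every existing key (current behavior),
#   list -> skip only these keys.
_SKIP_EXISTING_KEYS = None

@contextlib.contextmanager
def set_skip_existing(keys=True):
    """Restrict skip-existing behavior to ``keys`` within the block."""
    global _SKIP_EXISTING_KEYS
    prev = _SKIP_EXISTING_KEYS
    _SKIP_EXISTING_KEYS = keys
    try:
        yield
    finally:
        _SKIP_EXISTING_KEYS = prev  # restore the previous setting on exit

def should_skip(key, td):
    """Return True if ``key`` is already in ``td`` and is marked skippable."""
    if _SKIP_EXISTING_KEYS is None or key not in td.keys():
        return False
    if _SKIP_EXISTING_KEYS is True:
        return True  # all-keys behavior, as today
    return key in _SKIP_EXISTING_KEYS
```

With this, `should_skip("value", td)` returns `True` inside `set_skip_existing(["value", "value_target"])` when `"value"` is already present, while `should_skip("memory", td)` stays `False`, so the recurrent state is always refreshed.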