Commit
Empty cache after model unloading for normal vram and lower.
comfyanonymous committed Jul 9, 2023
1 parent d3f5998 commit 0ae81c0
Showing 1 changed file with 2 additions and 0 deletions.
2 changes: 2 additions & 0 deletions comfy/model_management.py
@@ -238,6 +238,8 @@ def unload_model():
         current_loaded_model.model_patches_to(current_loaded_model.offload_device)
         current_loaded_model.unpatch_model()
         current_loaded_model = None
+        if vram_state != VRAMState.HIGH_VRAM:
+            soft_empty_cache()
 
     if vram_state != VRAMState.HIGH_VRAM:
         if len(current_gpu_controlnets) > 0:
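For orientation, here is a minimal, self-contained sketch of the pattern this commit introduces. The VRAMState subset, the vram_state global, and the simplified unload_model() below are illustrative stand-ins rather than the repository's actual code, and the assumption that soft_empty_cache() wraps torch.cuda.empty_cache() is inferred from the function's name, not from this diff.

    import enum
    import torch

    class VRAMState(enum.Enum):
        # Hypothetical subset of the VRAM states used in comfy/model_management.py.
        NORMAL_VRAM = 0
        HIGH_VRAM = 1

    vram_state = VRAMState.NORMAL_VRAM

    def soft_empty_cache():
        # Assumed behavior: ask PyTorch to return cached, currently unused
        # GPU memory to the driver so later allocations can reuse it.
        if torch.cuda.is_available():
            torch.cuda.empty_cache()

    def unload_model(model):
        # Simplified stand-in for the real unload path: drop the reference,
        # then (as of this commit) empty the cache unless running in
        # HIGH_VRAM mode.
        del model
        if vram_state != VRAMState.HIGH_VRAM:
            soft_empty_cache()

The guard matches the commit title: the cache is emptied only for normal VRAM states and lower, while HIGH_VRAM mode skips the call.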
