Description
I'm using the 🖥️LLM-GGUF Loader and the 🖥️Local LLM general link nodes. After 3 queues the VRAM is 98% full. Is there a way to unload the model after each request? Even the ComfyUI unload buttons don't work; I have to restart ComfyUI to free the VRAM.
I've disabled is_locked and is_memory, but that doesn't change anything.
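For reference, a minimal workaround sketch, assuming the loader node keeps the llama-cpp-python model cached on a Python attribute (the `model` attribute name and the `unload_llm` helper below are hypothetical, not part of the node's actual API). Dropping the reference lets llama.cpp free its own CUDA buffers, and clearing PyTorch's cache returns anything PyTorch itself was holding:

```python
import gc
import torch

def unload_llm(node):
    """Hypothetical helper: drop the cached GGUF model and release VRAM.

    Assumes the loader node stores the llama-cpp-python model on a
    ``model`` attribute; adjust the name to whatever the node uses.
    """
    node.model = None            # drop the reference so llama.cpp's destructor frees its CUDA buffers
    gc.collect()                 # force collection of the now-unreferenced model object
    if torch.cuda.is_available():
        torch.cuda.empty_cache() # release memory cached by PyTorch's allocator
```

If the node itself offers no such hook, running the same sequence from a small custom node wired after the LLM node could release the model between queue runs instead of restarting ComfyUI.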