From the docs, under "Setting context length": "Setting a larger context length will increase the amount of memory required to run a model. Ensure you have enough VRAM available to increase the context length."
This setting is in the Ollama desktop interface. Does it also apply to models run from the terminal with `ollama run`, or are the desktop app and the CLI two separate instances with separate settings?
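For reference, these are the terminal-side ways I'm aware of to set the context length (the `num_ctx` parameter name is from the Modelfile docs; the model name here is just an example):

```
# Per-session, inside the interactive prompt started by `ollama run <model>`:
/set parameter num_ctx 8192

# Persistent, via a Modelfile:
FROM llama3
PARAMETER num_ctx 8192
```

With the Modelfile approach you'd build a variant with `ollama create llama3-8k -f Modelfile` and run that instead. What I can't tell is whether the desktop slider changes any of this for CLI sessions.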