Set up a local LLM and a generic application for carrying out common LLM tasks.
Ollama allows you to host models locally on your own machine.
AnythingLLM is a generic LLM utility application that can be used with both local and remotely hosted LLMs.
Download and install Ollama on your machine.
You can watch this video to learn more about Ollama.
Open a terminal window on your machine and run the command below. This will download the gemma2 model weights to the file system cache.
Watch this video to learn more about how to use other models with Ollama.
ollama run gemma2
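After the pull finishes, you can confirm that the local Ollama server is running and that gemma2 is installed. A minimal Python sketch using Ollama's `/api/tags` endpoint at its default address `http://localhost:11434`; the helper names here are illustrative, not part of Ollama:

```python
import json
from urllib.request import urlopen

# Ollama's default local REST endpoint.
OLLAMA_URL = "http://localhost:11434"

def model_available(tags_json: dict, name: str) -> bool:
    """Check whether a model name appears in an /api/tags response.

    Ollama reports installed models under the "models" key, with full
    names like "gemma2:latest", so we match on the name prefix.
    """
    return any(m["name"].startswith(name) for m in tags_json.get("models", []))

def list_local_models() -> dict:
    """Fetch the installed-model list from the local Ollama server."""
    with urlopen(f"{OLLAMA_URL}/api/tags") as resp:
        return json.load(resp)
```

With the server running, `model_available(list_local_models(), "gemma2")` should return `True` once the pull above has completed.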
Launch the AnythingLLM desktop application
Provide an email address to register
Set up AnythingLLM to use the local Ollama
Think of this as a session, e.g., name it 'test'
Interact with the local LLM!
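Besides the AnythingLLM UI, you can also interact with the local model programmatically. A minimal sketch against Ollama's `/api/generate` endpoint; the endpoint, request fields, and `response` key come from Ollama's REST API, while the function names are illustrative:

```python
import json
from urllib.request import Request, urlopen

# Ollama's default local REST endpoint.
OLLAMA_URL = "http://localhost:11434"

def build_generate_request(model: str, prompt: str) -> dict:
    """Build a non-streaming request body for Ollama's /api/generate."""
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model: str, prompt: str) -> str:
    """Send a prompt to the local Ollama server and return the reply text."""
    body = json.dumps(build_generate_request(model, prompt)).encode()
    req = Request(
        f"{OLLAMA_URL}/api/generate",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urlopen(req) as resp:
        return json.load(resp)["response"]
```

For example, `generate("gemma2", "Why is the sky blue?")` returns the model's text reply, assuming the `ollama run gemma2` step above has already pulled the model and the server is running.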