Running local AI code assist to power your IDEs with Ollama
May 24, 2024
I'm intrigued by how effective it is to run code assist models locally, and I'm keen to explore the available IDE extensions and AI models. Let's start with VS Code, the Code GPT extension, and models running locally with Ollama.
Locally Running GenAI and Large Language Models with Ollama
May 16, 2024
If you are interested in exploring Generative AI without relying on cloud services, Ollama can run open models entirely locally, giving you a chance to explore GenAI APIs and capabilities on your own machine.