This post outlines the process of setting up Ollama as a server and Ellama as an Emacs client. The result is a localized AI companion that enhances your Emacs experience.
Embracing the world of Emacs has been a delightful journey, and I can’t wait to share it with all of you!
Here’s another addition to my aspiring blog: https://www.rahuljuliato.com/posts/ellama
that’s so cool, thanks!
I think I’m gonna need new hardware if I want to play with this…
Nice!
Ollama runs in CPU mode if you don’t have a GPU, and there are models that run in as little as 4 GB of RAM.
Things won’t be fast, but for experimenting it’s enough :)
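To give an idea of what the Emacs side looks like, here is a minimal sketch of pointing Ellama at a locally running Ollama server. This assumes the `ellama` package (and the `llm` library it depends on) are installed, and the model name is just an example of a small model — substitute whatever you pulled with `ollama pull`:

```emacs-lisp
;; Minimal sketch: wire Ellama to a local Ollama server.
;; Assumes `ellama' and `llm' are installed, and that a small model
;; has already been pulled, e.g.:  ollama pull tinyllama
(require 'llm-ollama)
(setopt ellama-provider
        (make-llm-ollama
         :chat-model "tinyllama"))   ; example model; any pulled model works
```

On a CPU-only machine a small model like this stays within a few gigabytes of RAM, so it’s a reasonable starting point before investing in new hardware.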