Running your own AI in-house – Part 3 – LLM/LVM – Ollama on Telegram
In this quick guide I'll show you how to connect Ollama to Telegram with a pre-made bot – ready to use!
In the previous part of this series we took a look at the HTTP API, optimized the systemd service, put Nginx in front of it with HTTPS, made sure authentication is required, and finally configured our code editor to use the remote instance.
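To recap how that authenticated HTTP API is used, here is a minimal sketch of a non-streaming request against the Nginx-fronted instance. The hostname, credentials, and model name below are placeholders for your own setup, not values from this guide:

```python
import base64
import json
import urllib.request

# Placeholders -- replace with your Nginx-fronted Ollama endpoint
# and the credentials you configured for Basic Auth.
OLLAMA_URL = "https://ollama.example.com/api/generate"
USER, PASSWORD = "ollama", "secret"

def basic_auth_header(user: str, password: str) -> str:
    """Build the value of the HTTP Basic Auth header Nginx checks."""
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    return f"Basic {token}"

def ask_ollama(prompt: str, model: str = "llama3") -> str:
    """Send one non-streaming prompt to /api/generate, return the reply text."""
    body = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    req = urllib.request.Request(
        OLLAMA_URL,
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": basic_auth_header(USER, PASSWORD),
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    print(ask_ollama("Why is the sky blue?"))
```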
In the first article of the series I showed you how to install Ollama and run your first LLM/LVM, recommended some models, gave examples of scripting with it, and finally showed how to describe images.
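The pre-made bot handles all of this for you, but the underlying mechanics are simple enough to sketch: long-poll Telegram's `getUpdates` endpoint, forward each incoming text message to Ollama's `/api/generate`, and post the reply back with `sendMessage`. The bot token and model name below are placeholders, and this is a bare-bones illustration rather than the bot used in this guide:

```python
import json
import urllib.parse
import urllib.request

BOT_TOKEN = "123456:ABC-your-token"  # placeholder: get yours from @BotFather
TG_API = f"https://api.telegram.org/bot{BOT_TOKEN}"
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_ollama_payload(prompt: str, model: str = "llama3") -> dict:
    """Request body for Ollama's /api/generate endpoint (non-streaming)."""
    return {"model": model, "prompt": prompt, "stream": False}

def ask_ollama(prompt: str) -> str:
    """Ask the local Ollama instance and return the generated text."""
    data = json.dumps(build_ollama_payload(prompt)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

def send_message(chat_id: int, text: str) -> None:
    """Post a reply back into the Telegram chat."""
    query = urllib.parse.urlencode({"chat_id": chat_id, "text": text})
    urllib.request.urlopen(f"{TG_API}/sendMessage?{query}")

def poll_forever() -> None:
    """Long-poll Telegram for new messages and answer each one with Ollama."""
    offset = 0
    while True:
        url = f"{TG_API}/getUpdates?timeout=30&offset={offset}"
        with urllib.request.urlopen(url) as resp:
            updates = json.loads(resp.read())["result"]
        for update in updates:
            offset = update["update_id"] + 1  # acknowledge this update
            message = update.get("message", {})
            if "text" in message:
                send_message(message["chat"]["id"], ask_ollama(message["text"]))

if __name__ == "__main__":
    poll_forever()
```

A pre-made bot adds the conveniences this sketch lacks, such as error handling, streaming replies, per-chat history, and image support.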