Ollama

Using Ollama Web Search API in Python

Build AI search agents with Python and Ollama

Ollama’s Python library now includes native web search capabilities. With just a few lines of code, you can augment your local LLMs with real-time information from the web, reducing hallucinations and improving accuracy.
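
A minimal sketch of what that can look like, assuming the ollama package exposes web_search() and chat() as described in Ollama's announcement; the result field names (title, url, content) and the model name below are assumptions, so check your installed library version. Web search also requires an Ollama API key in the OLLAMA_API_KEY environment variable.

    # Minimal sketch: ground a local model's answer in web search results.
    # Assumes ollama.web_search() exists and returns results with title,
    # url, and content fields (check your library version).
    import ollama

    def answer_with_search(question: str, model: str = "qwen3:4b") -> str:
        # Fetch web results for the question (needs OLLAMA_API_KEY set).
        search = ollama.web_search(question)

        # Turn the results into a plain-text context block for the model.
        context = "\n\n".join(
            f"{r.title} ({r.url})\n{r.content}" for r in search.results
        )

        # Ask the local model to answer using the retrieved context.
        response = ollama.chat(
            model=model,
            messages=[
                {"role": "system", "content": "Answer using the provided web results."},
                {"role": "user", "content": f"{question}\n\nWeb results:\n{context}"},
            ],
        )
        return response.message.content

    if __name__ == "__main__":
        print(answer_with_search("What is new in the Ollama Python library?"))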

Ollama Enshittification - the Early Signs

My view on the current state of Ollama development

Ollama has quickly become one of the most popular tools for running LLMs locally. Its simple CLI and streamlined model management have made it a go-to option for developers who want to work with AI models outside the cloud. But as with many promising platforms, there are already early signs of enshittification.

Chat UIs for Local Ollama Instances

A quick overview of the most prominent UIs for Ollama in 2025

A locally hosted Ollama instance lets you run large language models on your own machine, but using it from the command line isn’t user-friendly. Here are several open-source projects that provide ChatGPT-style interfaces for a local Ollama instance.
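
For context, these chat UIs all wrap the same local HTTP API that Ollama serves on http://localhost:11434 by default. A rough sketch of that call in Python (the model name is just an example, and the requests package is assumed to be installed):

    # Minimal sketch of the local HTTP API that the chat UIs talk to.
    # Ollama listens on http://localhost:11434 by default; "llama3.2" is
    # only an example model name -- use any model you have already pulled.
    import requests

    resp = requests.post(
        "http://localhost:11434/api/chat",
        json={
            "model": "llama3.2",
            "messages": [{"role": "user", "content": "Hello from a local client"}],
            "stream": False,  # ask for one JSON response instead of a stream
        },
        timeout=60,
    )
    resp.raise_for_status()
    print(resp.json()["message"]["content"])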