llms.py
llms.py is a Python-based interface for local language models that emphasizes extensibility and customization over polish. While Open WebUI provides a ChatGPT-like experience out of the box, llms.py gives you a framework for building exactly the AI interface you need.
The tool provides core functionality for model interaction, conversation management, and tool integration, but leaves the interface and workflow design to you. This makes it more work to set up initially but more flexible for specific use cases. You're building your own AI interface rather than configuring someone else's vision.
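To make the "model interaction" piece concrete, here is a minimal sketch of the kind of chat layer you end up writing yourself. It assumes a local OpenAI-compatible endpoint such as Ollama's default server; the URL, model name, and function names are illustrative assumptions, not llms.py's actual API.

```python
# Hypothetical sketch of a minimal chat layer against a local
# OpenAI-compatible server. The endpoint and model name are assumptions
# (Ollama's default port shown); llms.py's real internals may differ.
import json
import urllib.request

LOCAL_ENDPOINT = "http://localhost:11434/v1/chat/completions"  # assumed

def build_chat_request(model: str, messages: list) -> dict:
    """Build an OpenAI-style chat completion payload."""
    return {"model": model, "messages": messages, "stream": False}

def chat(model: str, messages: list) -> str:
    """POST the conversation to the local server and return the reply text."""
    payload = json.dumps(build_chat_request(model, messages)).encode()
    req = urllib.request.Request(
        LOCAL_ENDPOINT,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

Usage would look like `chat("llama3", [{"role": "user", "content": "Hello"}])`, with the model name depending on what your local server has pulled.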
Extensibility Over Polish
The value proposition is control. You can modify how conversations are structured, how context is managed, how tools are integrated, and how results are presented. The codebase is straightforward Python, making it easy to understand and extend. When you need specific functionality, you add it directly rather than working around limitations of a pre-built system.
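As an example of the kind of context management you can change directly, here is a small sketch that trims older turns to fit a budget while always keeping the system prompt. The character-count budget (as a rough proxy for tokens) and the function shape are assumptions for illustration, not llms.py's built-in behavior.

```python
# Illustrative context-management policy: drop the oldest turns when the
# conversation exceeds a budget, but never drop the system prompt.
# Using characters as a crude stand-in for tokens is an assumption here.

def trim_context(messages, max_chars=8000):
    """Keep system messages plus the most recent turns under the budget."""
    system = [m for m in messages if m["role"] == "system"]
    turns = [m for m in messages if m["role"] != "system"]
    kept = []
    used = sum(len(m["content"]) for m in system)
    for msg in reversed(turns):  # walk newest-first
        if used + len(msg["content"]) > max_chars:
            break  # everything older than this is dropped
        kept.append(msg)
        used += len(msg["content"])
    return system + list(reversed(kept))
```

Because the policy is just a plain function, swapping it for summarization, per-tool pinning, or a real token counter is a local change rather than a fight with a framework.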
This approach works well when you have specific requirements that don't fit standard chat interfaces. Maybe you want to integrate with particular APIs, process data in specific ways, or present results in custom formats. llms.py provides the foundation without imposing constraints on what you build.
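A concrete example of "adding functionality directly": tool integration can be as simple as a registry of plain Python callables dispatched by name. The decorator and registry below are a hypothetical sketch of that pattern, not llms.py's actual tool API.

```python
# Hypothetical tool-integration pattern: tools are ordinary Python
# functions registered by name and dispatched with keyword arguments.
# Names and shapes are illustrative assumptions.

TOOLS = {}

def tool(fn):
    """Register a function as a callable tool under its own name."""
    TOOLS[fn.__name__] = fn
    return fn

@tool
def word_count(text: str) -> int:
    """Example tool: count whitespace-separated words."""
    return len(text.split())

def dispatch(name: str, **kwargs):
    """Invoke a registered tool by name, e.g. from a model's tool call."""
    if name not in TOOLS:
        raise KeyError(f"unknown tool: {name}")
    return TOOLS[name](**kwargs)
```

Wiring a new API or data processor in then means writing one decorated function, rather than configuring it through someone else's plugin system.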
Mobile Experience Challenges
The main current limitation is mobile interface design. The backend works well, but a mobile-friendly interface requires additional development. This is the trade-off of extensibility: you get exactly what you build, and you have to build it. For desktop use through a browser, the flexibility is valuable; on mobile, the lack of polish is harder to ignore.
The path forward is either investing time in mobile interface development or accepting that llms.py serves desktop use cases while other tools handle mobile. The extensibility that makes it powerful for custom workflows also makes it more work to achieve feature parity with polished alternatives.
Related Topics:
- Open WebUI - Polished alternative
- Ollama Help - Local LLM server
- Self-Hosting a Home Server - Infrastructure guide