Title: Integrating Ollama and Matrix with Baibot
Date: 2025-06-25 20:00
Modified: 2025-06-30 08:00
Category: AI, Data, Matrix
Tags: ai, kubernetes, matrix
Slug: ollama-matrix-integration
Authors: Andrew Ridgway
Summary: Integrating a local LLM with a personal Matrix server: all the fun AND data sovereignty

### _Human Introduction_

I've been experimenting with AI integrations, and I'm particularly excited by the idea of using LLMs to integrate between different systems (stay tuned for a blog on [MCP](https://modelcontextprotocol.io/introduction) at some point in the future!)

Below I've thrown together some notes and had AI build a very quick how-to on a cool little project that took next to no time to put together, and that I thought might be interesting for the group. Enjoy!

# Matrix AI Integrations with baibot: A Fun Journey into Home Automation and LLMs

Alright, so I’ve been messing around with this cool project called **baibot**, which is a locally deployable bot for integrating Large Language Models (LLMs) into Matrix chatrooms. If you’re anything like me, you run your own Matrix server to keep things private and under your control, whether it’s for family communication or interacting with the tech community. But one day, I thought, “Why not have my LLMs right where I’m already managing everything else?” Enter baibot.

**Setting Up My Own Matrix Server with baibot**

First off, I’ve got a home Matrix server running Element. Integrating baibot into this environment makes sense because it allows me to connect directly via the same platform. The key was getting the configuration right using examples from [baibot’s GitHub](https://github.com/etkecc/baibot/blob/main/docs/sample-provider-configs/ollama.yml). For instance, connecting to an Ollama gemma3 model with a specific prompt ensures it’s lighthearted yet responsive:

```yaml
base_url: http://<my_ollama_ip>:11434/v1
text_generation:
  model_id: gemma3:latest
  prompt: 'You are a lighthearted bot...'
  temperature: 0.9
  max_response_tokens: 4096
  max_context_tokens: 128000
```

This gives me precise control over the bot’s behavior, ensuring each instance in Matrix rooms behaves exactly as intended.
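Before pointing baibot at the model, it's handy to sanity-check the Ollama endpoint by hand. Here's a rough Python sketch of the same OpenAI-style chat call the config above describes; the helper names are mine, and the `<my_ollama_ip>` placeholder still needs your actual Ollama host:

```python
import json
import urllib.request

# Placeholder from the config above; substitute your Ollama host.
OLLAMA_BASE_URL = "http://<my_ollama_ip>:11434/v1"

def build_chat_payload(user_message: str) -> dict:
    """Mirror the text_generation settings from the config as an
    OpenAI-style chat completion payload."""
    return {
        "model": "gemma3:latest",
        "messages": [
            {"role": "system", "content": "You are a lighthearted bot..."},
            {"role": "user", "content": user_message},
        ],
        "temperature": 0.9,
        "max_tokens": 4096,
    }

def ask_ollama(user_message: str) -> str:
    """POST the payload to Ollama's OpenAI-compatible endpoint and
    return the assistant's reply."""
    req = urllib.request.Request(
        OLLAMA_BASE_URL + "/chat/completions",
        data=json.dumps(build_chat_payload(user_message)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

If that round-trips a reply, baibot's config should work against the same URL.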

**Deploying to Kubernetes**

To ensure reliability, I used Kubernetes. Here's a breakdown of the key files:

* **Deployment.yaml**: Manages pod replicas, security contexts, and volume mounts for persistence.

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  labels:
    app: ridgway-bot
  name: ridgway-bot
spec:
  replicas: 1
  selector:
    matchLabels:
      app: ridgway-bot
  strategy:
    type: Recreate
  template:
    metadata:
      labels:
        app: ridgway-bot
    spec:
      containers:
        - image: ghcr.io/etkecc/baibot:v1.7.4
          name: baibot
          volumeMounts:
            - name: ridgway-bot-cm
              mountPath: /app/config.yml
            - name: ridgway-bot-pv
              mountPath: /data
      volumes:
        - name: ridgway-bot-cm
          configMap:
            name: ridgway-bot
        - name: ridgway-bot-pv
          persistentVolumeClaim:
            claimName: ridgway-bot-storage
```

* **Persistent Volume Claim (PVC)**: Ensures data storage for baibot.

```yaml
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: ridgway-bot-storage
spec:
  accessModes:
    - ReadWriteMany
  resources:
    requests:
      storage: 500Mi
```

The deployment script handles namespace creation, config maps, PVCs, and waits for the pod to be ready before copying data.
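The script itself isn't reproduced in my notes, but the flow it follows looks roughly like the sketch below. The namespace, manifest file names, and label selector are my assumptions for illustration; the `run` parameter just makes the flow easy to dry-run:

```python
import subprocess

def deploy_baibot(namespace: str = "baibot", run=None):
    """Sketch of the deployment flow: namespace, ConfigMap, PVC,
    Deployment, then wait for the pod to be ready."""
    # Default runner shells out to kubectl; tests can inject a recorder.
    run = run or (lambda cmd: subprocess.run(cmd, check=True))
    steps = [
        ["kubectl", "create", "namespace", namespace],
        ["kubectl", "-n", namespace, "create", "configmap",
         "ridgway-bot", "--from-file=config.yml"],
        ["kubectl", "-n", namespace, "apply", "-f", "pvc.yaml"],
        ["kubectl", "-n", namespace, "apply", "-f", "deployment.yaml"],
        ["kubectl", "-n", namespace, "wait", "pod",
         "-l", "app=ridgway-bot", "--for=condition=Ready", "--timeout=120s"],
    ]
    for cmd in steps:
        run(cmd)
    return steps
```

Once the wait step succeeds, the pod is ready and any data copy (e.g. `kubectl cp`) can proceed.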

**Integrating with OpenWebUI for RAG**

Another cool aspect is integrating baibot with **OpenWebUI**, which acts as an OpenAI-compatible API. This allows me to leverage models I’ve created in OpenWebUI that include knowledge bases (RAG). The config here uses OpenWebUI’s endpoints:

```yaml
base_url: 'https://<my-openwebui-endpoint>/api/'
api_key: <my-openwebui-api-key>
text_generation:
  model_id: andrew-knowledge-base
  prompt: 'Your name is Rodergast...'
```

This setup lets me access RAG capabilities directly within Matrix chats, all without writing a single line of code. It’s like having my very own AI research assistant right there in the chatroom.
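Under the hood this is still just an OpenAI-style chat completion call, authenticated with the OpenWebUI API key as a bearer token. A minimal sketch of building that request (the endpoint path and header follow the usual OpenAI-compatible convention; the function is mine, not part of baibot):

```python
import json
import urllib.request

def openwebui_chat_request(base_url: str, api_key: str,
                           model_id: str, message: str) -> urllib.request.Request:
    """Build (but don't send) an OpenAI-style chat completion request
    against an OpenWebUI base URL, with bearer-token auth."""
    payload = {
        "model": model_id,
        "messages": [{"role": "user", "content": message}],
    }
    return urllib.request.Request(
        base_url.rstrip("/") + "/chat/completions",
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )
```

Because the model id points at a knowledge-base-backed model, OpenWebUI does the retrieval server-side and the caller (baibot, here) never has to know RAG is involved.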

**Future Steps and Challenges**

Now that baibot is up and running, I’m already thinking about expanding its use cases. The next step might be integrating it with **Home Assistant** for alarm notifications or other automation tasks. However, my current setup uses an older gaming PC, which struggles with the computational demands. This could lead to a rearchitecting effort, perhaps moving to a dedicated server or optimizing the hardware.

**Conclusion**

Baibot has been a fantastic tool for experimenting with AI integrations in Matrix. By leveraging existing infrastructure and OpenWebUI’s capabilities, I’ve achieved full control over data privacy and customization. The next frontier is expanding these integrations into more practical applications like home automation. Stay tuned for updates!

**Final Thoughts**

It’s incredibly rewarding to see how open-source projects like baibot democratize AI access. Whether you’re a hobbyist or a pro, having tools that let you run LLMs locally without vendor lock-in is game-changing. If you’re interested in diving deeper, check out the [baibot GitHub](https://github.com/etkecc/baibot) and explore its documentation. Happy coding!