Matrix AI Integrations with baibot

I've been experimenting with baibot (https://github.com/etkecc/baibot), a locally deployable bot that brings Large Language Models (LLMs) into Matrix chatrooms. This setup lets me talk to LLMs directly from rooms on my own Matrix homeserver, for both personal and community use.

Key Setup Steps

  1. Configuration: Write the bot's config.yml (homeserver connection details plus the agent settings shown in the example configurations below).
  2. Kubernetes Deployment:
    • Deploy using a custom Deployment.yaml and a PersistentVolumeClaim (PVC) for storage persistence.
    • Example Deployment.yaml (volumes wired to the ConfigMap and PVC created below):
apiVersion: apps/v1
kind: Deployment
metadata:
  ...
spec:
  template:
    spec:
      containers:
        - name: baibot
          env:
            # Tell baibot where to keep its persistent state.
            - name: BAIBOT_PERSISTENCE_DATA_DIR_PATH
              value: /data
          image: ghcr.io/etkecc/baibot:v1.7.4
          volumeMounts:
            # config.yml comes from the ConfigMap created in the script below.
            - name: config
              mountPath: /app/config.yml
              subPath: config.yml
            - name: data
              mountPath: /data
              subPath: data
      volumes:
        - name: config
          configMap:
            name: ridgway-bot
        - name: data
          persistentVolumeClaim:
            claimName: ridgway-bot-storage
    • PVC setup (pvc-ridgway-bot.yaml):
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: ridgway-bot-storage
spec:
  storageClassName: longhorn
  accessModes:
    - ReadWriteMany
  resources:
    requests:
      storage: 500Mi
  3. Kubernetes Deployment Script (a sturdier readiness check is sketched after this list):
kubectl delete namespace ridgway-bot --ignore-not-found
kubectl create namespace ridgway-bot
kubectl -n ridgway-bot create cm ridgway-bot --from-file=config.yml=./config.yml
kubectl apply -n ridgway-bot -f pvc-ridgway-bot.yaml
kubectl apply -n ridgway-bot -f Deployment.yaml
# Give the pod time to start, then seed its /data directory. kubectl cp takes
# a single source, so copy the directory itself rather than a data/* glob.
sleep 90
POD=$(kubectl get pods -n ridgway-bot --no-headers -o custom-columns=":metadata.name" | head -n1)
kubectl cp ./data "ridgway-bot/${POD}:/"
  4. Post-Deployment:
    • Connect the bot to Matrix rooms via Element's admin interface.
    • Fine-tune configurations (e.g., temperature, prompts) for specific rooms.
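
The sleep 90 in the deployment script is a blunt readiness gate. A minimal alternative sketch, using only standard kubectl behaviour (nothing baibot-specific), blocks until the pod actually reports Ready before seeding its data directory:

# Block until every pod in the namespace is Ready, or time out after 3 minutes.
kubectl wait --for=condition=Ready pod --all -n ridgway-bot --timeout=180s
POD=$(kubectl get pods -n ridgway-bot --no-headers -o custom-columns=":metadata.name" | head -n1)
kubectl cp ./data "ridgway-bot/${POD}:/"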

Example Configurations

Ollama Integration:

base_url: http://192.168.178.45:11434/v1
text_generation:
  model_id: gemma3:latest
  prompt: 'You are a lighthearted bot...'
  temperature: 0.9
  max_response_tokens: 4096
  max_context_tokens: 128000
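
baibot reaches Ollama through Ollama's OpenAI-compatible API, which is what the /v1 suffix on base_url points at. Before wiring up the bot, a quick curl sanity check confirms the endpoint and model name are right; the address and model below are simply the values from the config above:

curl http://192.168.178.45:11434/v1/chat/completions \
  -H 'Content-Type: application/json' \
  -d '{
        "model": "gemma3:latest",
        "messages": [{"role": "user", "content": "Say hello in one sentence."}]
      }'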

Open WebUI Integration (RAG):

base_url: https://ai.aridgwayweb.com/api/
api_key: <my-openwebui-api-key>
text_generation:
  model_id: andrew-knowledge-base
  prompt: 'Your name is Rodergast...'
  temperature: 0.7
  max_response_tokens: 4096
  max_context_tokens: 128000
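
The same kind of check works for Open WebUI, which serves an OpenAI-compatible chat completions route under the /api base path shown in base_url. This sketch assumes the API key is exported as OPENWEBUI_API_KEY (a placeholder name) and that andrew-knowledge-base is the RAG-backed model configured in Open WebUI:

curl https://ai.aridgwayweb.com/api/chat/completions \
  -H "Authorization: Bearer ${OPENWEBUI_API_KEY}" \
  -H 'Content-Type: application/json' \
  -d '{
        "model": "andrew-knowledge-base",
        "messages": [{"role": "user", "content": "What do you know about Andrew?"}]
      }'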

Benefits of Local Deployment

  • Full Control: Data privacy and compliance without third-party dependencies.
  • Scalability: Kubernetes enables easy scaling as needed.
  • Flexibility: Combine with services like Open WebUI for rich contextual responses.

Future Plans

Next, I aim to integrate baibot with Home Assistant for alarm notifications. However, current hardware limitations (a 10-year-old PC) may necessitate a more powerful setup in the future.

Stay tuned for updates!

Conclusion

baibot enhances Matrix interactions by enabling direct LLM integration, with straightforward control over room-specific behaviour. Combining local deployment with RAG capabilities via Open WebUI shows just how much a DIY tech stack can do.

Explore further and share your experiences! 🚀🤖