Title: MCP and Ollama - Local Assistant is getting nearer Date: 2025-07-24 20:00 Modified: 2025-07-24 20:00 Category: AI Tags: tech, ai, ollama, mcp, ai-tools Slug: mcp-ollama-local-assistant-soon Authors: Andrew Ridgway Summary: An Exploration of the Model Context Protocol and its potential to revolutionise how we interact with AI
Human Introduction
So for today's blog I've upped the model parameters on the editor and a couple of the drafters, and I have to say I think we've nailed what my meagre hardware can achieve in terms of content production. The process takes 30 minutes longer to churn now, but the quality output more than makes up for it. For context, we are now using:
- Editor: Gemma3:27b
- Journalist 1: phi4-mini:latest
- Journalist 2: phi4:latest
- Journalist 3: deepseek-r1:14b (I know, but it's good, even if it won't talk about Tiananmen Square)
- Journalist 4: qwen3:14b
As you can see if you compare some of the other blogs, this one has really nailed tone and flow. Some of the content was wrong: it thought I "wrote" mcpo, which I didn't, I wrapped it, and the sign-off was very cringe, but otherwise the blog is largely what came out of the editor.
It's very exciting to see; as I get better hardware and can run better models, I fully expect this could end up needing very little editing on my side. We'll have to see how it goes moving forward. Anyway, without further ado, behold: MCP and Ollama - A blog ABOUT AI BY AI
Introduction: Beyond the Buzzwords – A Real Shift in AI
For the last couple of weeks, I’ve been diving deep into MCP – both for work and personal projects. It’s that weird intersection where hobbies and professional life collide. Honestly, I was starting to think the whole AI hype was just that – hype. But MCP? It’s different. It’s not just another buzzword; it feels like a genuine shift in how we interact with AI. It’s like finally getting a decent internet connection after years of dial-up.
The core of this change is the Model Context Protocol itself. It’s an open specification, spearheaded by Anthropic, but rapidly gaining traction across the industry. Google’s thrown its weight behind it with MCP Tools, and Amazon’s building it into Bedrock Agent Core. Even Apple, with its usual air of exclusivity, is likely eyeing this space.
What Is MCP, Anyway? Demystifying the Protocol
Okay, let’s break it down. MCP is essentially a standardized way for Large Language Models (LLMs) to interact with tools. Think of it as giving your AI a set of keys to your digital kingdom. Instead of just talking about doing things, it can actually do them.
Traditionally, getting an LLM to control your smart home, access your code repository, or even just send an email required a ton of custom coding and API wrangling. MCP simplifies this process by providing a common language and framework. It’s like switching from a bunch of incompatible power adapters to a universal charger.
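To make that "common language" concrete, here's a minimal sketch of the JSON-RPC 2.0 shape MCP uses for a tool invocation. The tool name and arguments are purely illustrative, not from any specific server:

```python
import json

# MCP messages are JSON-RPC 2.0. A client asks a server to run a tool
# with a "tools/call" request; the tool name and arguments here are
# invented for illustration.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "dim_lights",  # hypothetical tool exposed by a server
        "arguments": {"room": "lounge", "brightness": 30},
    },
}

# The server replies with a result keyed to the same id. Tool results
# come back as a list of content items (text, images, etc.).
response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {"content": [{"type": "text", "text": "lights dimmed to 30%"}]},
}

print(json.dumps(request, indent=2))
```

Every compliant server speaks this same request/response shape, which is exactly why you no longer need bespoke glue code per integration.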
The beauty of MCP is its openness. It’s not controlled by a single company, which fosters innovation and collaboration. It’s a bit like the early days of the internet – a wild west of possibilities.
My MCP Playground: Building a Gateway with mcpo
I wanted to get my hands dirty, so I built a little project wrapping mcpo in a container that can pull in config to create a containerised service. It’s a gateway that connects OpenWebUI – a fantastic tool for running LLMs locally – with various MCP servers.
The goal? To create a flexible and extensible platform for experimenting with different AI agent tools within my build pipeline. I wanted to be able to quickly swap out different models, connect to different services, and see what happens. It’s a bit like having a LEGO set for AI – you can build whatever you want.
You can check out the project here. If you’re feeling adventurous, I encourage you to clone it and play around. I’ve got it running in my k3s cluster (a lightweight Kubernetes distribution), but you can easily adapt it to Docker or other containerization platforms.
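As a rough sketch of what the pulled-in config can look like: mcpo accepts a Claude-Desktop-style `mcpServers` file, where each entry names a command that launches an MCP server over stdio. The server names and arguments below are illustrative, not copied from my actual deployment:

```json
{
  "mcpServers": {
    "time": {
      "command": "uvx",
      "args": ["mcp-server-time", "--local-timezone=Australia/Brisbane"]
    },
    "gitea": {
      "command": "/usr/local/bin/gitea-mcp",
      "args": ["-t", "stdio"]
    }
  }
}
```

mcpo then exposes each configured server as its own OpenAPI route, which is what lets OpenWebUI discover and call the tools.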
Connecting the Dots: Home Assistant and Gitea Integration
Right now, my wrapper supports two key services: Home Assistant and Gitea.
Home Assistant is my smart home hub – it controls everything from the lights and thermostat to the security system. Integrating it with mcpo allows me to control these devices using natural language commands. Imagine saying, “Hey AI, dim the lights and play some jazz,” and it just happens. It’s like living in a sci-fi movie.
Gitea is my self-hosted Git service – it’s where I store all my code. Integrating it with mcpo allows me to use natural language to manage my repositories, create pull requests, and even automate code reviews. It’s like having a personal coding assistant.
I initially built a custom Gitea MCP server to get familiar with the protocol. But the official Gitea-MCP project (here) is much more robust and feature-rich. It’s always best to leverage existing tools when possible.
The Low-Parameter Model Challenge: Balancing Power and Efficiency
I’m currently experimenting with low-parameter models like Qwen3:4B and DeepSeek-R1:14B. These models are relatively small and efficient, which makes them ideal for running on local hardware. However, they also have limitations.
One of the biggest challenges is getting these models to understand complex instructions. They require very precise and detailed prompts. It’s like explaining something to a child – you have to break it down into simple steps.
Another challenge is managing the context window. These models have a limited memory, so they can only remember a certain amount of information. This can make it difficult to have long and complex conversations.
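One workable, if blunt, way to manage this is to trim the oldest turns until the history fits the model's context budget. A minimal sketch, using a naive 4-characters-per-token estimate rather than a real tokenizer:

```python
# Naive context-window management: drop the oldest turns until the
# conversation fits a token budget. The 4-chars-per-token estimate is a
# rough rule of thumb, not an exact tokenizer.
def estimate_tokens(text: str) -> int:
    return max(1, len(text) // 4)

def trim_history(messages: list[dict], budget: int) -> list[dict]:
    """Keep the system prompt (first message) plus the most recent
    turns that fit within `budget` estimated tokens."""
    system, turns = messages[0], messages[1:]
    kept: list[dict] = []
    used = estimate_tokens(system["content"])
    for msg in reversed(turns):  # walk newest-first
        cost = estimate_tokens(msg["content"])
        if used + cost > budget:
            break
        kept.append(msg)
        used += cost
    return [system] + list(reversed(kept))

history = [
    {"role": "system", "content": "You are a terse assistant."},
    {"role": "user", "content": "first question " * 50},
    {"role": "assistant", "content": "first answer " * 50},
    {"role": "user", "content": "latest question"},
]
trimmed = trim_history(history, budget=60)
print([m["role"] for m in trimmed])
```

It's crude, since you lose everything that falls off the end, but for a low-parameter model it's often better to hand it a short, relevant history than a long, truncated one.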
The Future of AI Agents: Prompt Engineering and Context Management
I believe the future of AI lies in the development of intelligent agents that can seamlessly interact with the world around us. These agents will need to be able to understand natural language, manage complex tasks, and adapt to changing circumstances.
Prompt engineering will be a critical skill for building these agents. We'll need to learn how to craft prompts that elicit the desired behaviour from the models. Almost like coding in a way, but with far less structure and no need to understand the "syntax". But we're a long way from there yet.
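For low-parameter models in particular, I've found it helps to treat the prompt almost like a function signature. A minimal sketch of that idea; the section names are just my own convention, not any standard:

```python
def build_prompt(task: str, steps: list[str], output_format: str) -> str:
    """Assemble a rigid, sectioned prompt. Small models cope much
    better when the task, the steps, and the expected output are
    spelled out explicitly rather than implied."""
    numbered = "\n".join(f"{i}. {s}" for i, s in enumerate(steps, 1))
    return (
        f"TASK:\n{task}\n\n"
        f"STEPS:\n{numbered}\n\n"
        f"OUTPUT FORMAT:\n{output_format}\n"
    )

prompt = build_prompt(
    task="Summarise the pull request description below in two sentences.",
    steps=[
        "Read the description.",
        "Identify the main change.",
        "Write the summary.",
    ],
    output_format="Plain text, two sentences, no preamble.",
)
print(prompt)
```

The same template can be reused across models, so when I swap a drafter out, only the contents change, not the structure.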
Context management will also be crucial. We’ll need to develop techniques for storing and retrieving relevant information, so the models can make informed decisions.
Papering Over the Cracks: Using MCP to Integrate Legacy Systems
At my workplace, we’re exploring how to use MCP to integrate legacy systems. Many organizations have a patchwork of different applications and databases that don’t easily communicate with each other.
MCP can act as a bridge between these systems, allowing them to share data and functionality. It’s like building a universal translator for your IT infrastructure.
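The core of that bridge is wrapping each legacy call behind a tool with a declared schema, so any MCP-capable client can discover and invoke it. A toy sketch of the shape; the legacy function, registry, and schema are invented for illustration (a real MCP server SDK would handle registration and dispatch for you):

```python
# Toy sketch of exposing a legacy function as a discoverable "tool".
TOOLS: dict[str, dict] = {}

def tool(name: str, description: str, schema: dict):
    """Decorator that records a function in the tool registry along
    with its human-readable description and input schema."""
    def register(fn):
        TOOLS[name] = {"description": description, "schema": schema, "fn": fn}
        return fn
    return register

@tool(
    name="lookup_customer",
    description="Fetch a customer record from the legacy billing system.",
    schema={"type": "object",
            "properties": {"customer_id": {"type": "string"}}},
)
def lookup_customer(customer_id: str) -> dict:
    # Stand-in for a call into the legacy database.
    return {"customer_id": customer_id, "status": "active"}

def call_tool(name: str, arguments: dict) -> dict:
    """Dispatch a tool call by name, mirroring MCP's tools/call flow."""
    return TOOLS[name]["fn"](**arguments)

print(call_tool("lookup_customer", {"customer_id": "C-042"}))
```

Once every legacy system is wrapped this way, the boilerplate really is the hard part: the tool surface stays stable even as the systems behind it change.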
This can significantly reduce the cost and complexity of integrating new applications and services, if we get the boilerplate right.
Conclusion: The Dawn of a New Era in AI
MCP is not a silver bullet, but it’s a significant step forward in the evolution of AI. It provides a standardized and flexible framework for building intelligent agents that can seamlessly interact with the world around us.
I’m excited to see what the future holds for this technology. I believe it has the potential to transform the way we live and work.
If you’re interested in learning more about MCP, I encourage you to check out the official website (https://modelcontextprotocol.io/introduction) and explore the various projects and resources that are available.
And if you’re feeling adventurous, I encourage you to clone my mcpo project (https://git.aridgwayweb.com/armistace/mcpo_mcp_servers) and start building your own AI agents.
It's been a bit of a ride. Hopefully I'll get a few more projects that can utilise some of these services, but with so much new stuff happening my 'ooo, squirrel' mentality could prove a bit of a headache... might be time to crack open the blog_creator and use CrewAI and MCP to create some research assistants on top of the drafters and editor!
Talk soon!