Creating an Ollama Blog Writer: Your Ultimate DevOps Guide
As software developers and DevOps enthusiasts, we all know the struggles of creating blog content. Whether it's finding inspiration or dealing with writer's block, the process can be daunting. But fear not, dear engineers! In this blog, we're going to share our secret weapon: Ollama, a tool for running large language models locally, which will help you draft high-quality content with far less effort.
Step 1: Communicate with your local Ollama instance
First, make sure you have a local Ollama instance running on your machine. If you don't already have one, the official documentation will get you started. Once it's up and running, you can talk to its REST API, which listens on port 11434 by default and needs no API key for local use, from a short Python script. Here's an example:
import requests

# Ollama's REST API listens on port 11434 by default; no API key is needed locally
ollama_url = "http://localhost:11434"

# Prompt Ollama to generate some content for us (use any model you have pulled,
# e.g. `ollama pull mistral`)
prompt = "Write a 1000-word blog post about DevOps best practices."
response = requests.post(
    f"{ollama_url}/api/generate",
    json={"model": "mistral", "prompt": prompt, "stream": False},
)
response.raise_for_status()
content = response.json()["response"]
print(content)
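If you prefer a client library over raw HTTP, the official ollama Python package wraps the same API. Here's a minimal sketch, assuming you have installed it with pip install ollama and pulled a model such as mistral (depending on the package version the result is a dict or a typed response object, but both expose the generated text under "response"):
import ollama

# Same request through the ollama Python package instead of raw HTTP
response = ollama.generate(
    model="mistral",
    prompt="Write a 1000-word blog post about DevOps best practices.",
)
print(response["response"])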
Step 2: Connect to Trilium for structured notes as prompts
Trilium is a powerful personal knowledge base and note-taking app that lets you structure your notes hierarchically and link them together. Its external API (ETAPI) has a Python client, trilium-py, which we can use to fetch notes and turn them into prompts for Ollama. Here's an example:
import requests
from trilium_py.client import ETAPI

# Set your ETAPI token and Trilium URL here (method names below follow
# trilium-py's ETAPI client; check them against your installed version)
trilium_token = "your_etapi_token"
trilium_url = "http://localhost:8080"

# Connect to the Trilium instance
ea = ETAPI(trilium_url, trilium_token)

# Fetch a specific note to use as our prompt
note_id = "some_note_id"
note = ea.get_note(note_id)
note_content = ea.get_note_content(note_id)

# Build a prompt from the note's title and body
prompt_text = (
    f"Write a blog post based on these notes titled '{note['title']}':\n{note_content}"
)

# Call Ollama with our prompt, reusing the endpoint from Step 1
ollama_url = "http://localhost:11434"
response = requests.post(
    f"{ollama_url}/api/generate",
    json={"model": "mistral", "prompt": prompt_text, "stream": False},
)
response.raise_for_status()
content = response.json()["response"]
print(content)
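To keep the script tidy, you can fold the two steps into one small helper. This is just a sketch that reuses the ea client and ollama_url defined above; the function name and default model are placeholders:
def generate_post_from_note(note_id, model="mistral"):
    """Fetch a Trilium note and turn it into a blog draft via Ollama."""
    note = ea.get_note(note_id)
    body = ea.get_note_content(note_id)
    prompt = f"Write a blog post based on these notes titled '{note['title']}':\n{body}"
    resp = requests.post(
        f"{ollama_url}/api/generate",
        json={"model": model, "prompt": prompt, "stream": False},
    )
    resp.raise_for_status()
    return resp.json()["response"]

content = generate_post_from_note("some_note_id")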
Step 3: Create a blog entry in your Git repo and send a notification to Matrix
Once Ollama has generated some content, it's time to create the actual blog post. For this, we recommend keeping the blog in Git and letting a CI/CD pipeline deploy it to your website automatically. The snippet below uses GitPython to commit the post on a new branch and push it, then posts a review request to a Matrix room via the client-server API; the pull request itself is opened on your Git forge. Here's an example of how you might do this:
import os
import uuid
import requests
from urllib.parse import quote
from git import Repo
from git.exc import InvalidGitRepositoryError

# Set the path to your blog repo here
blog_repo = "/path/to/your/blog/repo"
branch_name = "new-post"
post_file = "my-blog-post.md"
pr_title = "DevOps best practices"

try:
    repo = Repo(blog_repo)
except InvalidGitRepositoryError:
    raise SystemExit(f"{blog_repo} is not a Git repository")

# Check out a new branch and write the generated content into a new file
repo.git.checkout("-b", branch_name)
with open(os.path.join(blog_repo, post_file), "w") as f:
    f.write(content)

# Add and commit the changes to Git
repo.index.add([post_file])
repo.index.commit(f"Add new blog post: {pr_title}")

# Push the branch; the pull request is opened afterwards on your Git forge
repo.remote("origin").push(refspec=f"{branch_name}:{branch_name}")

# Send a notification to Matrix about the new branch (client-server API v3)
matrix_server = "https://matrix.example.com"
matrix_token = "your_matrix_access_token"
room_id = "!your_room_id:example.com"  # the internal room ID, not the #alias
message = f"[Blog] {pr_title} - review please!"
requests.put(
    f"{matrix_server}/_matrix/client/v3/rooms/{quote(room_id)}/send/m.room.message/{uuid.uuid4()}",
    headers={"Authorization": f"Bearer {matrix_token}"},
    json={"msgtype": "m.text", "body": message},
)
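The Matrix message asks for a review, but the snippet above only pushes the branch; the pull request itself is created through your forge. Here's a minimal sketch assuming the blog lives on GitHub (owner, repository name, and token are placeholders; Gitea and GitLab expose similar endpoints):
# Open the pull request via GitHub's REST API (adjust for your forge)
github_token = "your_github_token"
pr = requests.post(
    "https://api.github.com/repos/your-user/your-blog/pulls",
    headers={
        "Authorization": f"Bearer {github_token}",
        "Accept": "application/vnd.github+json",
    },
    json={"title": pr_title, "head": branch_name, "base": "main"},
)
pr.raise_for_status()
print(pr.json()["html_url"])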
And that's it! With these steps, you can create high-quality blog posts effortlessly using Ollama and Trilium. Happy blogging!