Compare commits


13 Commits

Author SHA1 Message Date
4119b2ec41 fix dockerifle 2025-05-26 00:18:07 +10:00
01b7f1cd78 untested git stuff 2025-05-24 00:25:35 +10:00
c606f72d90 env vars and starting work on repo_manager 2025-05-23 15:47:25 +10:00
8a64d9c959 fix pyrefly typuing errors 2025-05-19 11:38:15 +10:00
0c090c8489 add vscode to gitignore 2025-05-19 11:28:10 +10:00
e0b2c80bc9 latest commits 2025-05-19 11:07:41 +10:00
44141ab545 pre attempt at langchain 2025-03-25 15:26:56 +10:00
e57d6eb6b6 getting gemma3 in the mix 2025-03-17 16:33:16 +10:00
c80f692cb0 update main.py 2025-02-27 09:44:19 +10:00
bc2f8a8bca move to vm 2025-02-27 09:41:01 +10:00
e7f7a79d86 integrating agentic chroma 2025-02-26 23:16:00 +10:00
9b11fea0e7 integrating agentic chroma 2025-02-26 23:13:27 +10:00
6320571528 set up chroma 2025-02-25 22:11:45 +10:00
14 changed files with 129 additions and 221 deletions

View File

@@ -1,56 +0,0 @@
name: Create Blog Article if new notes exist
on:
schedule:
- cron: "15 3 * * *"
push:
branches:
- master
jobs:
prepare_blog_drafts_and_push:
runs-on: ubuntu-latest
steps:
- name: Checkout repository
uses: actions/checkout@v4
- name: Install dependencies
shell: bash
run: |
apt update && apt upgrade -y
apt install rustc cargo python-is-python3 pip python3-venv python3-virtualenv libmagic-dev git -y
virtualenv .venv
source .venv/bin/activate
pip install --upgrade pip
pip install -r requirements.txt
git config --global user.name "Blog Creator"
git config --global user.email "ridgway.infrastructure@gmail.com"
git config --global push.autoSetupRemote true
- name: Create .env
shell: bash
run: |
echo "TRILIUM_HOST=${{ vars.TRILIUM_HOST }}" > .env
echo "TRILIUM_PORT='${{ vars.TRILIUM_PORT }}'" >> .env
echo "TRILIUM_PROTOCOL='${{ vars.TRILIUM_PROTOCOL }}'" >> .env
echo "TRILIUM_PASS='${{ secrets.TRILIUM_PASS }}'" >> .env
echo "TRILIUM_TOKEN='${{ secrets.TRILIUM_TOKEN }}'" >> .env
echo "OLLAMA_PROTOCOL='${{ vars.OLLAMA_PROTOCOL }}'" >> .env
echo "OLLAMA_HOST='${{ vars.OLLAMA_HOST }}'" >> .env
echo "OLLAMA_PORT='${{ vars.OLLAMA_PORT }}'" >> .env
echo "EMBEDDING_MODEL='${{ vars.EMBEDDING_MODEL }}'" >> .env
echo "EDITOR_MODEL='${{ vars.EDITOR_MODEL }}'" >> .env
export PURE='["${{ vars.CONTENT_CREATOR_MODELS_1 }}", "${{ vars.CONTENT_CREATOR_MODELS_2 }}", "${{ vars.CONTENT_CREATOR_MODELS_3 }}", "${{ vars.CONTENT_CREATOR_MODELS_4 }}"]'
echo "CONTENT_CREATOR_MODELS='$PURE'" >> .env
echo "GIT_PROTOCOL='${{ vars.GIT_PROTOCOL }}'" >> .env
echo "GIT_REMOTE='${{ vars.GIT_REMOTE }}'" >> .env
echo "GIT_USER='${{ vars.GIT_USER }}'" >> .env
echo "GIT_PASS='${{ secrets.GIT_PASS }}'" >> .env
echo "N8N_SECRET='${{ secrets.N8N_SECRET }}'" >> .env
echo "N8N_WEBHOOK_URL='${{ vars.N8N_WEBHOOK_URL }}'" >> .env
echo "CHROMA_HOST='${{ vars.CHROMA_HOST }}'" >> .env
echo "CHROMA_PORT='${{ vars.CHROMA_PORT }}'" >> .env
- name: Create Blogs
shell: bash
run: |
source .venv/bin/activate
python src/main.py
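The `export PURE=...` step above is a workaround for passing a list through a single environment variable: the model names are written into `.env` as a JSON-encoded array, so the Python side has to parse the string back into a list. A minimal sketch of that parsing (the helper name and sample model names here are illustrative, not from the repo):

```python
import json
import os

def parse_model_list(var_name: str = "CONTENT_CREATOR_MODELS") -> list[str]:
    # The workflow writes this variable as a JSON array in .env,
    # so json.loads recovers the list of model names.
    raw = os.environ.get(var_name, "[]")
    models = json.loads(raw)
    if not isinstance(models, list):
        raise ValueError(f"{var_name} must be a JSON list, got: {raw!r}")
    return models

# Simulate what the workflow's `export PURE=...` step produces
# (model names are placeholders):
os.environ["CONTENT_CREATOR_MODELS"] = '["gemma3", "llama3.1", "phi4", "qwen2.5"]'
print(parse_model_list())
```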

.gitignore
View File

@@ -5,6 +5,3 @@ __pycache__
.vscode
.zed
pyproject.toml
.ropeproject
generated_files/*
pyright*

View File

@@ -8,11 +8,7 @@ ENV PYTHONUNBUFFERED 1
ADD src/ /blog_creator
RUN apt-get update && apt-get install -y rustc cargo python-is-python3 pip python3-venv libmagic-dev git
# Need to set up git here or we get funky errors
RUN git config --global user.name "Blog Creator"
RUN git config --global user.email "ridgway.infrastructure@gmail.com"
RUN git config --global push.autoSetupRemote true
#Get a python venv going as well cause safety
RUN python -m venv /opt/venv
ENV PATH="/opt/venv/bin:$PATH"

View File

@@ -1,2 +0,0 @@
*
!.gitignore

View File

@@ -0,0 +1,29 @@
```markdown
# Creating an Ollama Blog Writer: A Hilariously Tedious Adventure
Hey tech enthusiasts! 👋 I'm back with another installment of my tech journey, but this time it's personal. I decided to create a Python script that not only writes blogs for me (please don't tell my boss), but also uses Ollama for some AI-assisted content creation and connects with Trilium for structured note-taking. Let's dive into the details!
### Step 1: Get Your Ollama On
First things first, I needed a Python file that could talk to my local Ollama instance. If you haven't heard of Ollama, it's like a tiny llama in your terminal that helps with text generation. It took me a while to figure out how to configure the `.env` file and set up the connection properly. But once I did, I was off to a running start!
### Step 2: Connecting Trilium for Structured Notes
For this part, I used a Python library called `trilium-py` (because why not?). It's like having a brain that can store and retrieve information in an organized way. To make sure my notes are super structured, I had to find the right prompts and ensure they were fed into Ollama correctly. This part was more about figuring out how to structure the data than actually coding—but hey, it's all part of the fun!
### Step 3: Automating the Blog Creation
Now that I have my notes and AI-generated content sorted, it was time to automate the blog creation process. Here's where things got a bit Git-y (yes, I made up that word). I wrote a script that would create a new branch in our company's blog repo, push the changes, and voilà—a PR! Just like that, my humble contributions were ready for review by the big boss.
### Step 4: Sending Notifications to Matrix
Finally, as any good DevRel should do, I sent out a notification to our internal Matrix channel. It's like Slack but with more tech talk and fewer memes about dogs in hats. The message was short and sweet—just a summary of the blog changes and a request for feedback. Hey, if Elon can tweet at Tesla shareholders, why not send a quick Matrix message?
### Wrapping Up
Creating this Ollama Blog Writer wasn't just about writing better blogs (though that would be nice). It was about embracing the joy of automation and the occasional struggle to get things working right. I learned a lot about Python libraries, local server configurations, and how to communicate effectively with my team via Matrix.
So there you have it—a step-by-step guide on how not to write blogs but definitely how to automate the process. If you're into tech, automation, or just want to laugh at someone else's coding mishaps, this blog is for you!
Keep on hacking (and automating), [Your Name]
```
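For anyone sketching Step 1 themselves, the connection boils down to assembling a base URL from the `OLLAMA_PROTOCOL`/`OLLAMA_HOST`/`OLLAMA_PORT` variables and handing it to the `ollama` client, which is how the generator class later in this diff constructs its `Client`. A minimal sketch (the helper name and defaults are illustrative):

```python
import os

def build_ollama_url() -> str:
    # Mirror the env vars the workflow writes into .env; the defaults
    # here are assumptions for local use, not values from the repo.
    protocol = os.environ.get("OLLAMA_PROTOCOL", "http")
    host = os.environ.get("OLLAMA_HOST", "localhost")
    port = os.environ.get("OLLAMA_PORT", "11434")
    return f"{protocol}://{host}:{port}"

print(build_ollama_url())
# The real script then hands this to the client:
#   from ollama import Client
#   client = Client(host=build_ollama_url())
```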

View File

@@ -0,0 +1,23 @@
Title: When Data Visualization Meets Frustration: A Comic Take on PowerBI's API Woes
---
In the ever-evolving world of data and tech, few tools hold as much promise—or frustration—as Microsoft's PowerBI. Its sleek interface, intuitive visuals, and promise to simplify data into digestible insights have made it a favorite among many. But beneath its polished surface lies a storm of challenges that can leave even the most seasoned developers in its dust.
Imagine this: you've spent hours refining your data model, only to find that your team's hierarchy resists your attempt to share sensitive information without breaking hearts. "We're all on different tiers," you mutter, your frustration evident. But here's the kicker—PowerBI won't even let everyone in your company join the party if they're not up to tier 5. And guess what? In reality, most companies operate at tier 3 at best. So, step one: API calls to PowerBI. You'd think pulling data would be straightforward, but oh, how it pulls you into a tailspin.
Here's where things get interesting: PowerBI APIs are mostly limited to small tables. It's like trying to fit furniture through a door that's slightly too narrow—it just doesn't work unless you have a magic wand (or in this case, an API upgrade). Imagine needing to fetch data from three different on-premises databases seamlessly; PowerBI might just give you the finger.
Now, if your company happens to be in the Microsoft ecosystem—like the Azure universe—then maybe things are a bit easier. But here's the kicker: it only counts as success if you're top-to-bottom within that ecosystem. If even one part is outside, you're facing performance issues akin to driving through a snowstorm without an umbrella. You get the picture.
So what does this mean for the average user? Unless you've got no choice but to use PowerBI... well, let's just say it might not be your friend in such scenarios. It's like having a GPS that only works if you're willing to drive on a dirt road and expect it to guide you through with zero warnings—sounds great until you end up stranded.
But wait, maybe there's a silver lining. Other tools have learned the hard lessons PowerBI has taught us. They allow APIs beyond just small tables and handle mixed ecosystems with ease, making them more versatile for real-world applications. It's like upgrading your car's GPS to one that not only knows all the roads but also can navigate through different weather conditions without complaints.
In conclusion, while PowerBI is undeniably a powerful tool when used correctly—like driving in calm weather on perfectly paved roads—it has its limitations. Its API restrictions and ecosystem integration issues make it less than ideal for many real-world scenarios. So unless you're in a controlled environment where these issues don't arise, maybe it's time to explore other options that can handle the data journey with more grace.
After all, Data Overload isn't just a Star Trek term—it could be your reality if you're not careful with PowerBI.
---
*So, is PowerBI still your best friend in this complex tech world? Or are there better tools out there waiting to be discovered? Share your thoughts and experiences below!*

View File

@@ -0,0 +1,35 @@
# Wrangling Data: A Reality Check
Okay, let's be honest. Data wrangling isn't glamorous. It's not a sleek, automated process of magically transforming chaos into insights. It's a messy, frustrating, and surprisingly human endeavor. Let's break down the usual suspects: the steps we take to get even a vaguely useful dataset, and why they're often a monumental task.
**Phase 1: The Hunt**
First, you're handed a dataset. Let's call it “Customer_Data_v2”. It's… somewhere. Maybe a CSV file, maybe a database table, maybe a collection of spreadsheets that haven't been updated since 2008. Finding it is half the battle. It's like searching for a decent cup of coffee in Melbourne: you know it's out there, but it's often hidden behind a wall of bureaucracy.
**Phase 2: Deciphering the Ancient Texts**
Once you *find* it, you start learning what it *means*. This is where things get… interesting. You're trying to understand what fields represent, what units of measurement are used, and why certain columns have bizarre names (seriously, “Customer_ID_v3”?). It takes x amount of time (depends on the industry, right?). One week for a small bakery, six months for a multinational insurance company. It's a wild ride.
You'll spend a lot of time trying to understand the business context. "CRMs" for Customer Relationship Management? Seriously? It's a constant stream of jargon and acronyms that make your head spin.
**Phase 3: The Schema Struggle**
Then there's the schema. Oh, the schema. It takes a couple of weeks to learn the schema. It's like deciphering ancient hieroglyphics, except instead of predicting the rise and fall of empires, you're trying to understand why a field called “Customer_ID_v3” exists. It's a puzzle, and a frustrating one at that.
**Phase 4: The Tooling Tango**
You'll wrestle with the tools. SQL interpreters, data transformation software: they're all there, but they're often clunky, outdated, and require a surprising amount of manual effort. It's like finding a decent cup of coffee in Melbourne: you know it's out there, but it's often hidden behind a wall of bureaucracy.
**Phase 5: The Reporting Revelation (and Despair)**
Finally, you get to the reporting tool. And cry. Seriously, who actually *likes* this part? It's a soul-crushing exercise in formatting and filtering, and the output is usually something that nobody actually reads.
**The AI Factor: A Realistic Perspective**
Now, everyone's talking about AI. And, look, I'm not saying AI is a bad thing. It's got potential. But let's be realistic. This will, for quite some time, be the point where we need people. AI can automate the process of extracting data from a spreadsheet. But it can't understand *why* that spreadsheet was created in the first place. It can't understand the context, the assumptions, the biases. It can't tell you if the data is actually useful.
We can use tools like DataHub to capture some of this business knowledge, but those tools are only as good as the people who use them. We need to make sure AI is used for the uniform parts: schema discovery, finding the tools, and (ugh) reporting. But where the rubber hits the road, that's where we need people, making sure there is a person interpreting not only what goes out, but what goes in.
**The Bottom Line**
It's a bit like trying to build a great BBQ. You can buy the fanciest gadgets and the most expensive wood, but if you don't know how to cook, you're going to end up with a burnt mess. So, let's not get carried away with the hype. Let's focus on building a data culture that values human intelligence, critical thinking, and a good dose of common sense. And let's keep wrangling. Because, let's be honest, someone's gotta do it.

View File

@@ -4,5 +4,3 @@ gitpython
PyGithub
chromadb
langchain-ollama
PyJWT
dotenv

View File

@@ -1,9 +1,8 @@
import os, re, json, random, time, string
import os, re, json, random, time
from ollama import Client
import chromadb
from langchain_ollama import ChatOllama
class OllamaGenerator:
def __init__(self, title: str, content: str, inner_title: str):
@@ -11,13 +10,7 @@ class OllamaGenerator:
self.inner_title = inner_title
self.content = content
self.response = None
print("In Class")
print(os.environ["CONTENT_CREATOR_MODELS"])
try:
chroma_port = int(os.environ['CHROMA_PORT'])
except ValueError as e:
raise Exception(f"CHROMA_PORT is not an integer: {e}")
self.chroma = chromadb.HttpClient(host=os.environ['CHROMA_HOST'], port=chroma_port)
self.chroma = chromadb.HttpClient(host="172.19.0.2", port=8000)
ollama_url = f"{os.environ["OLLAMA_PROTOCOL"]}://{os.environ["OLLAMA_HOST"]}:{os.environ["OLLAMA_PORT"]}"
self.ollama_client = Client(host=ollama_url)
self.ollama_model = os.environ["EDITOR_MODEL"]
@@ -26,14 +19,14 @@
self.llm = ChatOllama(model=self.ollama_model, temperature=0.6, top_p=0.5) #This is the level head in the room
self.prompt_inject = f"""
You are a journalist, Software Developer and DevOps expert
writing a 3000 word draft blog article for other tech enthusiasts.
writing a 1000 word draft blog for other tech enthusiasts.
You like to use almost no code examples and prefer to talk
in a light comedic tone. You are also Australian
As this person write this blog as a markdown document.
The title for the blog is {self.inner_title}.
Do not output the title in the markdown.
The basis for the content of the blog is:
<blog>{self.content}</blog>
{self.content}
"""
def split_into_chunks(self, text, chunk_size=100):
@@ -94,13 +87,11 @@
embeds = self.ollama_client.embed(model=self.embed_model, input=draft_chunks)
return embeds.get('embeddings', [])
def id_generator(self, size=6, chars=string.ascii_uppercase + string.digits):
return ''.join(random.choice(chars) for _ in range(size))
def load_to_vector_db(self):
'''Load the generated blog drafts into a vector database'''
collection_name = f"blog_{self.title.lower().replace(" ", "_")}_{self.id_generator()}"
collection = self.chroma.get_or_create_collection(name=collection_name)#, metadata={"hnsw:space": "cosine"})
collection_name = f"blog_{self.title.lower().replace(" ", "_")}"
collection = self.chroma.get_or_create_collection(name=collection_name, metadata={"hnsw:space": "cosine"})
#if any(collection.name == collectionname for collectionname in self.chroma.list_collections()):
# self.chroma.delete_collection("blog_creator")
for model in self.agent_models:
@@ -122,15 +113,14 @@
prompt_system = f"""
You are an editor taking information from {len(self.agent_models)} Software
Developers and Data experts
writing a 3000 word blog article. You like when they use almost no code examples.
You are also Australian. The content may have light comedic elements,
you are more professional and will attempt to tone these down
As this person produce the final version of this blog as a markdown document
keeping in mind the context provided by the previous drafts.
writing a 3000 word blog for other tech enthusiasts.
You like when they use almost no code examples and the
voice is in a light comedic tone. You are also Australian
As this person produce an amalgamation of this blog as a markdown document.
The title for the blog is {self.inner_title}.
Do not output the title in the markdown. Avoid repeated sentences
The basis for the content of the blog is:
<blog>{self.content}</blog>
{self.content}
"""
try:
query_embed = self.ollama_client.embed(model=self.embed_model, input=prompt_system)['embeddings']
@@ -139,9 +129,7 @@
print("Showing pertinent info from drafts used in final edited edition")
pertinent_draft_info = '\n\n'.join(collection.query(query_embeddings=query_embed, n_results=100)['documents'][0])
#print(pertinent_draft_info)
prompt_human = f"""Generate the final, 3000 word, draft of the blog using this information from the drafts: <context>{pertinent_draft_info}</context>
- Only output in markdown, do not wrap in markdown tags, Only provide the draft not a commentary on the drafts in the context
"""
prompt_human = f"Generate the final document using this information from the drafts: {pertinent_draft_info} - ONLY OUTPUT THE MARKDOWN"
print("Generating final document")
messages = [("system", prompt_system), ("human", prompt_human),]
self.response = self.llm.invoke(messages).text()
@@ -163,7 +151,9 @@
with open(filename, "w") as f:
f.write(self.generate_markdown())
def generate_system_message(self, prompt_system, prompt_human):
def generate_commit_message(self):
prompt_system = "You are a blog creator commiting a piece of content to a central git repo"
prompt_human = f"Generate a 10 word git commit message describing {self.response}"
messages = [("system", prompt_system), ("human", prompt_human),]
ai_message = self.llm.invoke(messages).text()
return ai_message
commit_message = self.llm.invoke(messages).text()
return commit_message
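The diff shows the `split_into_chunks(self, text, chunk_size=100)` signature but not its body. One plausible word-based reading, for illustration only (not the repo's actual implementation):

```python
def split_into_chunks(text: str, chunk_size: int = 100) -> list[str]:
    # Break a draft into chunks of roughly chunk_size words so each
    # chunk can be embedded and stored in the vector DB separately.
    words = text.split()
    return [
        " ".join(words[i:i + chunk_size])
        for i in range(0, len(words), chunk_size)
    ]

draft = "word " * 250
print(len(split_into_chunks(draft, chunk_size=100)))  # 250 words -> 3 chunks
```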

View File

@@ -1,13 +1,7 @@
import ai_generators.ollama_md_generator as omg
import trilium.notes as tn
import repo_management.repo_manager as git_repo
from notifications.n8n import N8NWebhookJwt
import string,os
from datetime import datetime
from dotenv import load_dotenv
load_dotenv()
print(os.environ["CONTENT_CREATOR_MODELS"])
tril = tn.TrilumNotes()
@@ -30,51 +24,11 @@ for note in tril_notes:
ai_gen = omg.OllamaGenerator(os_friendly_title,
tril_notes[note]['content'],
tril_notes[note]['title'])
blog_path = f"generated_files/{os_friendly_title}.md"
blog_path = f"/blog_creator/generated_files/{os_friendly_title}.md"
ai_gen.save_to_file(blog_path)
# Generate commit messages and push to repo
print("Generating Commit Message")
git_sytem_prompt = "You are a blog creator commiting a piece of content to a central git repo"
git_human_prompt = f"Generate a 5 word git commit message describing {ai_gen.response}. ONLY OUTPUT THE RESPONSE"
commit_message = ai_gen.generate_system_message(git_sytem_prompt, git_human_prompt)
git_user = os.environ["GIT_USER"]
commit_message = ai_gen.generate_commit_message()
git_user = os.environ["GIT_USER"]
git_pass = os.environ["GIT_PASS"]
repo_manager = git_repo.GitRepository("blog/", git_user, git_pass)
print("Pushing to Repo")
repo_manager = git_repo("blog/", git_user, git_pass)
repo_manager.create_copy_commit_push(blog_path, os_friendly_title, commit_message)
# Generate notification for Matrix
print("Generating Notification Message")
git_branch_url = f'https://git.aridgwayweb.com/armistace/blog/src/branch/{os_friendly_title}/src/content/{os_friendly_title}.md'
n8n_system_prompt = f"You are a blog creator notifiying the final editor of the final creation of blog available at {git_branch_url}"
n8n_prompt_human = f"""
Generate an informal 100 word
summary describing {ai_gen.response}.
Don't address it or use names. ONLY OUTPUT THE RESPONSE.
ONLY OUTPUT IN PLAINTEXT STRIP ALL MARKDOWN
"""
notification_message = ai_gen.generate_system_message(n8n_system_prompt, n8n_prompt_human)
secret_key = os.environ['N8N_SECRET']
webhook_url = os.environ['N8N_WEBHOOK_URL']
notification_string = f"""
<h2>{tril_notes[note]['title']}</h2>
<h3>Summary</h3>
<p>{notification_message}</p>
<h3>Branch</h3>
<p>{os_friendly_title}</p>
<p><a href="{git_branch_url}">Link to Branch</a></p>
"""
payload = {
"message": f"{notification_string}",
"timestamp": datetime.now().isoformat()
}
webhook_client = N8NWebhookJwt(secret_key, webhook_url)
print("Notifying")
n8n_result = webhook_client.send_webhook(payload)
print(f"N8N response: {n8n_result['status']}")

View File

@@ -1,45 +0,0 @@
from datetime import datetime, timedelta
import jwt
import requests
from typing import Dict, Optional
class N8NWebhookJwt:
def __init__(self, secret_key: str, webhook_url: str):
self.secret_key = secret_key
self.webhook_url = webhook_url
self.token_expiration = datetime.now() + timedelta(hours=1)
def _generate_jwt_token(self, payload: Dict) -> str:
"""Generate JWT token with the given payload."""
# Include expiration time (optional)
payload["exp"] = self.token_expiration.timestamp()
encoded_jwt = jwt.encode(
payload,
self.secret_key,
algorithm="HS256",
)
return encoded_jwt #jwt.decode(encoded_jwt, self.secret_key, algorithms=['HS256'])
def send_webhook(self, payload: Dict) -> Dict:
"""Send a webhook request with JWT authentication."""
# Generate JWT token
token = self._generate_jwt_token(payload)
# Set headers with JWT token
headers = {
"Authorization": f"Bearer {token}",
"Content-Type": "application/json"
}
# Send POST request
response = requests.post(
self.webhook_url,
json=payload,
headers=headers
)
# Handle response
if response.status_code == 200:
return {"status": "success", "response": response.json()}
else:
return {"status": "error", "response": response.status_code, "message": response.text}
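For reference, the HS256 token that `jwt.encode` produces in `_generate_jwt_token` above is just two base64url-encoded JSON segments plus an HMAC-SHA256 signature over them. A stdlib-only sketch of the same construction (illustrative; not the PyJWT internals verbatim):

```python
import base64
import hashlib
import hmac
import json

def b64url(data: bytes) -> str:
    # JWT segments use unpadded URL-safe base64
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def encode_hs256(payload: dict, secret: str) -> str:
    # token = base64url(header) . base64url(payload) . base64url(signature)
    header_b64 = b64url(json.dumps({"alg": "HS256", "typ": "JWT"},
                                   separators=(",", ":")).encode())
    payload_b64 = b64url(json.dumps(payload, separators=(",", ":")).encode())
    signing_input = f"{header_b64}.{payload_b64}"
    sig = hmac.new(secret.encode(), signing_input.encode(),
                   hashlib.sha256).digest()
    return f"{signing_input}.{b64url(sig)}"

token = encode_hs256({"sub": "blog-bot"}, "not-a-real-secret")
print(token.count("."))  # 2 -> three dot-separated segments
```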

View File

@@ -1,5 +1,4 @@
import os, shutil
from urllib.parse import quote
from git import Repo
from git.exc import GitCommandError
@@ -11,20 +10,11 @@ class GitRepository:
def __init__(self, repo_path, username=None, password=None):
git_protocol = os.environ["GIT_PROTOCOL"]
git_remote = os.environ["GIT_REMOTE"]
#if username is not set we don't need parse to the url
if username==None or password == None:
remote = f"{git_protocol}://{git_remote}"
else:
# of course if it is we need to parse and escape it so that it
# can generate a url
git_user = quote(username)
git_password = quote(password)
remote = f"{git_protocol}://{git_user}:{git_password}@{git_remote}"
remote = f"{git_protocol}://{username}:{password}@{git_remote}"
if os.path.exists(repo_path):
shutil.rmtree(repo_path)
self.repo_path = repo_path
print("Cloning Repo")
Repo.clone_from(remote, repo_path)
self.repo = Repo(repo_path)
self.username = username
@@ -50,9 +40,8 @@
def pull(self, remote_name='origin', ref_name='main'):
"""Pull updates from a remote repository with authentication"""
print("Pulling Latest Updates (if any)")
try:
self.repo.remotes[remote_name].pull(ref_name)
self.repo.remotes[remote_name].pull(ref_name=ref_name)
return True
except GitCommandError as e:
print(f"Pulling failed: {e}")
@@ -63,21 +52,19 @@
return [branch.name for branch in self.repo.branches]
def create_and_switch_branch(self, branch_name, remote_name='origin', ref_name='main'):
def create_branch(self, branch_name, remote_name='origin', ref_name='main'):
"""Create a new branch in the repository with authentication."""
try:
print(f"Creating Branch {branch_name}")
# Use the same remote and ref as before
self.repo.git.branch(branch_name)
except GitCommandError:
print("Branch already exists switching")
# ensure remote commits are pulled into local
self.repo.git.checkout(branch_name)
self.repo.git.branch(branch_name, commit=True)
return True
except GitCommandError as e:
print(f"Failed to create branch: {e}")
return False
def add_and_commit(self, message=None):
"""Add and commit changes to the repository."""
try:
print("Commiting latest draft")
# Add all changes
self.repo.git.add(all=True)
# Commit with the provided message or a default
@@ -85,18 +72,20 @@
commit_message = "Added and committed new content"
else:
commit_message = message
self.repo.git.commit(message=commit_message)
self.repo.git.commit(commit_message=commit_message)
return True
except GitCommandError as e:
print(f"Commit failed: {e}")
return False
def create_copy_commit_push(self, file_path, title, commit_messge):
self.create_and_switch_branch(title)
self.create_branch(title)
self.pull(ref_name=title)
shutil.copy(f"{file_path}", f"{self.repo_path}src/content/")
self.add_and_commit(f"'{commit_messge}'")
self.add_and_commit(commit_messge)
self.repo.git.push()
self.repo.git.push(remote_name='origin', ref_name=title, force=True)
def remove_repo(self):
shutil.rmtree(self.repo_path)
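The removed `quote(...)` branch in `__init__` above existed for a reason: credentials embedded in a clone URL must be percent-encoded, or characters like `@`, `:` and `/` in a password corrupt the URL. A standalone sketch of that logic (the function name and host are placeholders, not from the repo):

```python
from urllib.parse import quote

def build_remote(protocol, remote, username=None, password=None):
    # Without credentials the URL is used as-is; with them, percent-encode
    # both parts (safe="" also escapes "/") before embedding in the URL.
    if username is None or password is None:
        return f"{protocol}://{remote}"
    user = quote(username, safe="")
    pw = quote(password, safe="")
    return f"{protocol}://{user}:{pw}@{remote}"

print(build_remote("https", "git.example.com/armistace/blog.git",
                   "bot", "p@ss:w/rd"))
# -> https://bot:p%40ss%3Aw%2Frd@git.example.com/armistace/blog.git
```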