Compare commits: master...rag_inclus (13 commits)

Commits (SHA1):

- 4119b2ec41
- 01b7f1cd78
- c606f72d90
- 8a64d9c959
- 0c090c8489
- e0b2c80bc9
- 44141ab545
- e57d6eb6b6
- c80f692cb0
- bc2f8a8bca
- e7f7a79d86
- 9b11fea0e7
- 6320571528
Workflow file (deleted):

```diff
@@ -1,56 +0,0 @@
-name: Create Blog Article if new notes exist
-
-on:
-  schedule:
-    - cron: "15 3 * * *"
-  push:
-    branches:
-      - master
-
-jobs:
-  prepare_blog_drafts_and_push:
-    runs-on: ubuntu-latest
-    steps:
-      - name: Checkout repository
-        uses: actions/checkout@v4
-
-      - name: Install dependencies
-        shell: bash
-        run: |
-          apt update && apt upgrade -y
-          apt install rustc cargo python-is-python3 pip python3-venv python3-virtualenv libmagic-dev git -y
-          virtualenv .venv
-          source .venv/bin/activate
-          pip install --upgrade pip
-          pip install -r requirements.txt
-          git config --global user.name "Blog Creator"
-          git config --global user.email "ridgway.infrastructure@gmail.com"
-          git config --global push.autoSetupRemote true
-
-      - name: Create .env
-        shell: bash
-        run: |
-          echo "TRILIUM_HOST=${{ vars.TRILIUM_HOST }}" > .env
-          echo "TRILIUM_PORT='${{ vars.TRILIUM_PORT }}'" >> .env
-          echo "TRILIUM_PROTOCOL='${{ vars.TRILIUM_PROTOCOL }}'" >> .env
-          echo "TRILIUM_PASS='${{ secrets.TRILIUM_PASS }}'" >> .env
-          echo "TRILIUM_TOKEN='${{ secrets.TRILIUM_TOKEN }}'" >> .env
-          echo "OLLAMA_PROTOCOL='${{ vars.OLLAMA_PROTOCOL }}'" >> .env
-          echo "OLLAMA_HOST='${{ vars.OLLAMA_HOST }}'" >> .env
-          echo "OLLAMA_PORT='${{ vars.OLLAMA_PORT }}'" >> .env
-          echo "EMBEDDING_MODEL='${{ vars.EMBEDDING_MODEL }}'" >> .env
-          echo "EDITOR_MODEL='${{ vars.EDITOR_MODEL }}'" >> .env
-          export PURE='["${{ vars.CONTENT_CREATOR_MODELS_1 }}", "${{ vars.CONTENT_CREATOR_MODELS_2 }}", "${{ vars.CONTENT_CREATOR_MODELS_3 }}", "${{ vars.CONTENT_CREATOR_MODELS_4 }}"]'
-          echo "CONTENT_CREATOR_MODELS='$PURE'" >> .env
-          echo "GIT_PROTOCOL='${{ vars.GIT_PROTOCOL }}'" >> .env
-          echo "GIT_REMOTE='${{ vars.GIT_REMOTE }}'" >> .env
-          echo "GIT_USER='${{ vars.GIT_USER }}'" >> .env
-          echo "GIT_PASS='${{ secrets.GIT_PASS }}'" >> .env
-          echo "N8N_SECRET='${{ secrets.N8N_SECRET }}'" >> .env
-          echo "N8N_WEBHOOK_URL='${{ vars.N8N_WEBHOOK_URL }}'" >> .env
-          echo "CHROMA_HOST='${{ vars.CHROMA_HOST }}'" >> .env
-          echo "CHROMA_PORT='${{ vars.CHROMA_PORT }}'" >> .env
-
-      - name: Create Blogs
-        shell: bash
-        run: |
-          source .venv/bin/activate
-          python src/main.py
```
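The `Create .env` step above stores `CONTENT_CREATOR_MODELS` as a JSON-encoded list inside the `.env` file. A minimal sketch of how the consuming side might parse that value back into a Python list; the function name `load_creator_models` is hypothetical and not from this repo:

```python
import json


def load_creator_models(env: dict) -> list:
    # CONTENT_CREATOR_MODELS is written to .env as a JSON-encoded list of
    # model names, e.g. '["llama3.1", "mistral"]'; parse it back into a list.
    raw = env.get("CONTENT_CREATOR_MODELS", "[]")
    models = json.loads(raw)
    if not isinstance(models, list):
        raise ValueError("CONTENT_CREATOR_MODELS must be a JSON list")
    return models
```

In practice `env` would be `os.environ` after the `.env` file is loaded; it is a plain dict here so the sketch stays self-contained.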
.gitignore (vendored, 3 lines changed)
```diff
@@ -5,6 +5,3 @@ __pycache__
 .vscode
 .zed
 pyproject.toml
-.ropeproject
-generated_files/*
-pyright*
```
Dockerfile:

```diff
@@ -7,12 +7,8 @@ ENV PYTHONUNBUFFERED 1

 ADD src/ /blog_creator

 RUN apt-get update && apt-get install -y rustc cargo python-is-python3 pip python3-venv libmagic-dev git
-# Need to set up git here or we get funky errors
-RUN git config --global user.name "Blog Creator"
-RUN git config --global user.email "ridgway.infrastructure@gmail.com"
-RUN git config --global push.autoSetupRemote true
-#Get a python venv going as well cause safety
 RUN python -m venv /opt/venv
 ENV PATH="/opt/venv/bin:$PATH"
```
generated_files/.gitignore (vendored, 2 lines changed)
```diff
@@ -1,2 +0,0 @@
-*
-!.gitignore
```
generated_files/creating_an_ollama_blog_writer.md (new file, 29 lines)
New file content (`@@ -0,0 +1,29 @@`):

````
```markdown

# Creating an Ollama Blog Writer: A Hilariously Tedious Adventure

Hey tech enthusiasts! 👋 I’m back with another installment of my tech journey, but this time it’s personal. I decided to create a Python script that not only writes blogs for me (please don’t tell my boss), but also uses Ollama for some AI-assisted content creation and connects with Trilium for structured note-taking. Let’s dive into the details!

### Step 1: Get Your Ollama On

First things first, I needed a Python file that could talk to my local Ollama instance. If you haven't heard of Ollama, it's like a tiny llama in your terminal that helps with text generation. It took me a while to figure out how to configure the `.env` file and set up the connection properly. But once I did, I was off to a running start!

### Step 2: Connecting Trilium for Structured Notes

For this part, I used a Python library called `trilium-py` (because why not?). It's like having a brain that can store and retrieve information in an organized way. To make sure my notes are super structured, I had to find the right prompts and ensure they were fed into Ollama correctly. This part was more about figuring out how to structure the data than actually coding—but hey, it’s all part of the fun!

### Step 3: Automating the Blog Creation

Now that I have my notes and AI-generated content sorted, it was time to automate the blog creation process. Here’s where things got a bit Git-y (yes, I made up that word). I wrote a script that would create a new branch in our company's blog repo, push the changes, and voilà—a PR! Just like that, my humble contributions were ready for review by the big boss.

### Step 4: Sending Notifications to Matrix

Finally, as any good DevRel should do, I sent out a notification to our internal Matrix channel. It’s like Slack but with more tech talk and less memes about dogs in hats. The message was short and sweet—just a summary of the blog changes and a request for feedback. Hey, if Elon can tweet at Tesla shareholders, why not send a quick matrix message?

### Wrapping Up

Creating this Ollama Blog Writer wasn’t just about writing better blogs (though that would be nice). It was about embracing the joy of automation and the occasional struggle to get things working right. I learned a lot about Python libraries, local server configurations, and how to communicate effectively with my team via Matrix.

So there you have it—a step-by-step guide on how not to write blogs but definitely how to automate the process. If you’re into tech, automation, or just want to laugh at someone else’s coding mishaps, this blog is for you!

Keep on hacking (and automating), [Your Name]
```
````
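The post above describes talking to a local Ollama instance configured through `.env` variables. A minimal sketch of the connection logic, assuming the `ollama` Python client and the `OLLAMA_*` variable names used elsewhere in this diff; `draft_blog` is a hypothetical helper, not the repo's actual method:

```python
def ollama_base_url(env: dict) -> str:
    # Builds the server URL from the OLLAMA_* variables the workflow writes
    # into .env, e.g. OLLAMA_PROTOCOL=http, OLLAMA_HOST=localhost, OLLAMA_PORT=11434.
    return f"{env['OLLAMA_PROTOCOL']}://{env['OLLAMA_HOST']}:{env['OLLAMA_PORT']}"


def draft_blog(client, model: str, title: str, notes: str) -> str:
    # `client` is expected to behave like ollama.Client (constructed with
    # host=ollama_base_url(...)); it is passed in as a parameter so the
    # sketch stays runnable without a live Ollama server.
    messages = [
        {"role": "system", "content": "You write markdown blog drafts."},
        {"role": "user", "content": f"Title: {title}\n\nNotes:\n{notes}"},
    ]
    return client.chat(model=model, messages=messages)["message"]["content"]
```

With a running server this would be used as `draft_blog(Client(host=ollama_base_url(os.environ)), ...)`.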
generated_files/powerbi_and_api_performance.md (new file, 23 lines)
New file content (`@@ -0,0 +1,23 @@`):

```
Title: When Data Visualization Meets Frustration: A Comic Take on PowerBI's API Woes

---

In the ever-evolving world of data and tech, few tools hold as much promise—or frustration—as Microsoft's PowerBI. Its sleek interface, intuitive visuals, and promise to simplify data into digestible insights have made it a favorite among many. But beneath its polished surface lies a storm of challenges that can leave even the most seasoned developers in its dust.

Imagine this: you've spent hours refining your data model, only to find that your team's hierarchy resists your attempt to share sensitive information without breaking hearts. "We're all on different tiers," you mutter, your frustration evident. But here's the kicker—PowerBI won't even let everyone in your company join the party if they're not up to tier 5. And guess what? Most companies operate in reality tier 3 at best. So, step one: API calls to PowerBI. You'd think pulling data would be straightforward, but oh, how it pulls you into a tailspin.

Here's where things get interesting: PowerBI APIs are mostly limited to small tables. It's like trying to fit furniture through a door that's slightly too narrow—it just doesn't work unless you have a magic wand (or in this case, an API upgrade). Imagine needing to fetch data from three different on-premises databases seamlessly; PowerBI might just give you the finger.

Now, if your company happens to be in the Microsoft ecosystem—like the Azure universe—then maybe things are a bit easier. But here's the kicker: it's not being top-to-bottom within that ecosystem that counts as success. If even one part is outside, you're facing performance issues akin to driving through a snowstorm without an umbrella. You get the picture.

So what does this mean for the average user? Unless you've got no choice but to use PowerBI... well, let's just say it might not be your friend in such scenarios. It's like having a GPS that only works if you're willing to drive on a dirt road and expect it to guide you through with zero warnings—sounds great until you end up stranded.

But wait, maybe there's silver lining. Other tools have learned the hard lessons PowerBI has taught us. They allow APIs beyond just small tables and handle ecosystems with ease, making them more versatile for real-world applications. It's like upgrading your car's GPS to one that not only knows all the roads but also can navigate through different weather conditions without complaints.

In conclusion, while PowerBI is undeniably a powerful tool when used correctly—like driving in calm weather on perfectly paved roads—it has its limitations. Its API restrictions and ecosystem integration issues make it less than ideal for many real-world scenarios. So unless you're in a controlled environment where these issues don't arise, maybe it's time to explore other options that can handle the data journey with more grace.

After all, Data Overload isn't just a Star Trek term—it could be your reality if you're not careful with PowerBI.

---

*So, is PowerBI still your best friend in this complex tech world? Or are there better tools out there waiting to be discovered? Share your thoughts and experiences below!*
```
generated_files/the_melding_of_data_engineering_and_ai.md (new file, 35 lines)
New file content (`@@ -0,0 +1,35 @@`):

```
# Wrangling Data: A Reality Check

Okay, let’s be honest. Data wrangling isn't glamorous. It’s not a sleek, automated process of magically transforming chaos into insights. It’s a messy, frustrating, and surprisingly human endeavor. Let’s break down the usual suspects – the steps we take to get even a vaguely useful dataset, and why they’re often a monumental task.

**Phase 1: The Hunt**

First, you’re handed a dataset. Let’s call it “Customer_Data_v2”. It’s… somewhere. Maybe a CSV file, maybe a database table, maybe a collection of spreadsheets that haven’t been updated since 2008. Finding it is half the battle. It's like searching for a decent cup of coffee in Melbourne – you know it’s out there, but it’s often hidden behind a wall of bureaucracy.

**Phase 2: Deciphering the Ancient Texts**

Once you *find* it, you start learning what it *means*. This is where things get… interesting. You’re trying to understand what fields represent, what units of measurement are used, and why certain columns have bizarre names (seriously, “Customer_ID_v3”?). It takes x amount of time (depends on the industry, right?). One week for a small bakery, six months for a multinational insurance company. It’s a wild ride.

You’ll spend a lot of time trying to understand the business context. "CRMs" for Customer Relationship Management? Seriously? It’s a constant stream of jargon and acronyms that make your head spin.

**Phase 3: The Schema Struggle**

Then there’s the schema. Oh, the schema. It takes a couple of weeks to learn the schema. It’s like deciphering ancient hieroglyphics, except instead of predicting the rise and fall of empires, you’re trying to understand why a field called “Customer_ID_v3” exists. It’s a puzzle, and a frustrating one at that.

**Phase 4: The Tooling Tango**

You’ll wrestle with the tools. SQL interpreters, data transformation software – they’re all there, but they’re often clunky, outdated, and require a surprising amount of manual effort. It's like finding a decent cup of coffee in Melbourne – you know it’s out there, but it’s often hidden behind a wall of bureaucracy.

**Phase 5: The Reporting Revelation (and Despair)**

Finally, you get to the reporting tool. And cry. Seriously, who actually *likes* this part? It’s a soul-crushing exercise in formatting and filtering, and the output is usually something that nobody actually reads.

**The AI Factor – A Realistic Perspective**

Now, everyone’s talking about AI. And, look, I’m not saying AI is a bad thing. It’s got potential. But let’s be realistic. This will for quite some time be the point where we need people. AI can automate the process of extracting data from a spreadsheet. But it can’t understand *why* that spreadsheet was created in the first place. It can’t understand the context, the assumptions, the biases. It can’t tell you if the data is actually useful.

We can use tools like datahub to capture some of this business knowledge but those tool are only as good as the people who use them. We need to make sure AI is used for those uniform parts – schema discovery, finding the tools, ugh reporting. But where the rubber hits the road… thats where we need people and that we are making sure that there is a person interpreting not only what goes out.. but what goes in.

**The Bottom Line**

It’s a bit like trying to build a great BBQ. You can buy the fanciest gadgets and the most expensive wood, but if you don’t know how to cook, you’re going to end up with a burnt mess. So, let’s not get carried away with the hype. Let’s focus on building a data culture that values human intelligence, critical thinking, and a good dose of common sense. And let’s keep wrangling. Because, let’s be honest, someone’s gotta do it.
```
generated_files/when_to_use_ai.md (new empty file)
```diff
@@ -4,5 +4,3 @@ gitpython
 PyGithub
 chromadb
 langchain-ollama
-PyJWT
-dotenv
```
```diff
@@ -1,9 +1,8 @@
-import os, re, json, random, time, string
+import os, re, json, random, time
 from ollama import Client
 import chromadb
 from langchain_ollama import ChatOllama


 class OllamaGenerator:

     def __init__(self, title: str, content: str, inner_title: str):
@@ -11,13 +10,7 @@ class OllamaGenerator:
         self.inner_title = inner_title
         self.content = content
         self.response = None
-        print("In Class")
-        print(os.environ["CONTENT_CREATOR_MODELS"])
-        try:
-            chroma_port = int(os.environ['CHROMA_PORT'])
-        except ValueError as e:
-            raise Exception(f"CHROMA_PORT is not an integer: {e}")
-        self.chroma = chromadb.HttpClient(host=os.environ['CHROMA_HOST'], port=chroma_port)
+        self.chroma = chromadb.HttpClient(host="172.19.0.2", port=8000)
         ollama_url = f"{os.environ["OLLAMA_PROTOCOL"]}://{os.environ["OLLAMA_HOST"]}:{os.environ["OLLAMA_PORT"]}"
         self.ollama_client = Client(host=ollama_url)
         self.ollama_model = os.environ["EDITOR_MODEL"]
@@ -26,14 +19,14 @@ class OllamaGenerator:
         self.llm = ChatOllama(model=self.ollama_model, temperature=0.6, top_p=0.5) #This is the level head in the room
         self.prompt_inject = f"""
         You are a journalist, Software Developer and DevOps expert
-        writing a 3000 word draft blog article for other tech enthusiasts.
+        writing a 1000 word draft blog for other tech enthusiasts.
         You like to use almost no code examples and prefer to talk
         in a light comedic tone. You are also Australian
         As this person write this blog as a markdown document.
         The title for the blog is {self.inner_title}.
         Do not output the title in the markdown.
         The basis for the content of the blog is:
-        <blog>{self.content}</blog>
+        {self.content}
         """

     def split_into_chunks(self, text, chunk_size=100):
@@ -94,13 +87,11 @@ class OllamaGenerator:
         embeds = self.ollama_client.embed(model=self.embed_model, input=draft_chunks)
         return embeds.get('embeddings', [])

-    def id_generator(self, size=6, chars=string.ascii_uppercase + string.digits):
-        return ''.join(random.choice(chars) for _ in range(size))
-
     def load_to_vector_db(self):
         '''Load the generated blog drafts into a vector database'''
-        collection_name = f"blog_{self.title.lower().replace(" ", "_")}_{self.id_generator()}"
-        collection = self.chroma.get_or_create_collection(name=collection_name)#, metadata={"hnsw:space": "cosine"})
+        collection_name = f"blog_{self.title.lower().replace(" ", "_")}"
+        collection = self.chroma.get_or_create_collection(name=collection_name, metadata={"hnsw:space": "cosine"})
         #if any(collection.name == collectionname for collectionname in self.chroma.list_collections()):
         #    self.chroma.delete_collection("blog_creator")
         for model in self.agent_models:
@@ -122,15 +113,14 @@ class OllamaGenerator:
         prompt_system = f"""
         You are an editor taking information from {len(self.agent_models)} Software
         Developers and Data experts
-        writing a 3000 word blog article. You like when they use almost no code examples.
-        You are also Australian. The content may have light comedic elements,
-        you are more professional and will attempt to tone these down
-        As this person produce the final version of this blog as a markdown document
-        keeping in mind the context provided by the previous drafts.
+        writing a 3000 word blog for other tech enthusiasts.
+        You like when they use almost no code examples and the
+        voice is in a light comedic tone. You are also Australian
+        As this person produce and an amalgamtion of this blog as a markdown document.
         The title for the blog is {self.inner_title}.
         Do not output the title in the markdown. Avoid repeated sentences
         The basis for the content of the blog is:
-        <blog>{self.content}</blog>
+        {self.content}
         """
         try:
             query_embed = self.ollama_client.embed(model=self.embed_model, input=prompt_system)['embeddings']
@@ -139,9 +129,7 @@ class OllamaGenerator:
             print("Showing pertinent info from drafts used in final edited edition")
             pertinent_draft_info = '\n\n'.join(collection.query(query_embeddings=query_embed, n_results=100)['documents'][0])
             #print(pertinent_draft_info)
-            prompt_human = f"""Generate the final, 3000 word, draft of the blog using this information from the drafts: <context>{pertinent_draft_info}</context>
-            - Only output in markdown, do not wrap in markdown tags, Only provide the draft not a commentary on the drafts in the context
-            """
+            prompt_human = f"Generate the final document using this information from the drafts: {pertinent_draft_info} - ONLY OUTPUT THE MARKDOWN"
             print("Generating final document")
             messages = [("system", prompt_system), ("human", prompt_human),]
             self.response = self.llm.invoke(messages).text()
@@ -163,7 +151,9 @@ class OllamaGenerator:
         with open(filename, "w") as f:
             f.write(self.generate_markdown())

-    def generate_system_message(self, prompt_system, prompt_human):
+    def generate_commit_message(self):
+        prompt_system = "You are a blog creator commiting a piece of content to a central git repo"
+        prompt_human = f"Generate a 10 word git commit message describing {self.response}"
         messages = [("system", prompt_system), ("human", prompt_human),]
-        ai_message = self.llm.invoke(messages).text()
-        return ai_message
+        commit_message = self.llm.invoke(messages).text()
+        return commit_message
```
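The hunks above remove `id_generator` and the random suffix on the Chroma collection name. Reconstructed from the visible diff lines, the removed approach looked roughly like this; the body of `split_into_chunks` is not shown in the diff, so the one here is a plausible sketch, and `collection_name` is a hypothetical standalone form of the inline f-string:

```python
import random
import string


def split_into_chunks(text, chunk_size=100):
    # Fixed-size character chunks, handed to the embedding model as one list.
    return [text[i:i + chunk_size] for i in range(0, len(text), chunk_size)]


def id_generator(size=6, chars=string.ascii_uppercase + string.digits):
    # Random suffix so repeated runs get distinct Chroma collection names.
    return ''.join(random.choice(chars) for _ in range(size))


def collection_name(title):
    # Mirrors f"blog_{title.lower().replace(' ', '_')}_{id_generator()}".
    return f"blog_{title.lower().replace(' ', '_')}_{id_generator()}"
```

Without the suffix, re-running the pipeline for the same title reuses (and pollutes) the previous run's collection, which is presumably why the suffix exists on the other side of the diff.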
src/main.py (54 lines changed)
```diff
@@ -1,13 +1,7 @@
 import ai_generators.ollama_md_generator as omg
 import trilium.notes as tn
 import repo_management.repo_manager as git_repo
-from notifications.n8n import N8NWebhookJwt
 import string,os
-from datetime import datetime
-from dotenv import load_dotenv
-load_dotenv()
-print(os.environ["CONTENT_CREATOR_MODELS"])


 tril = tn.TrilumNotes()
@@ -30,51 +24,11 @@ for note in tril_notes:
     ai_gen = omg.OllamaGenerator(os_friendly_title,
                                  tril_notes[note]['content'],
                                  tril_notes[note]['title'])
-    blog_path = f"generated_files/{os_friendly_title}.md"
+    blog_path = f"/blog_creator/generated_files/{os_friendly_title}.md"
     ai_gen.save_to_file(blog_path)

     # Generate commit messages and push to repo
-    print("Generating Commit Message")
-    git_sytem_prompt = "You are a blog creator commiting a piece of content to a central git repo"
-    git_human_prompt = f"Generate a 5 word git commit message describing {ai_gen.response}. ONLY OUTPUT THE RESPONSE"
-    commit_message = ai_gen.generate_system_message(git_sytem_prompt, git_human_prompt)
-    git_user = os.environ["GIT_USER"]
+    commit_message = ai_gen.generate_commit_message()
+    git_user = os.environp["GIT_USER"]
     git_pass = os.environ["GIT_PASS"]
-    repo_manager = git_repo.GitRepository("blog/", git_user, git_pass)
-    print("Pushing to Repo")
+    repo_manager = git_repo("blog/", git_user, git_pass)
     repo_manager.create_copy_commit_push(blog_path, os_friendly_title, commit_message)

-    # Generate notification for Matrix
-    print("Generating Notification Message")
-    git_branch_url = f'https://git.aridgwayweb.com/armistace/blog/src/branch/{os_friendly_title}/src/content/{os_friendly_title}.md'
-    n8n_system_prompt = f"You are a blog creator notifiying the final editor of the final creation of blog available at {git_branch_url}"
-    n8n_prompt_human = f"""
-    Generate an informal 100 word
-    summary describing {ai_gen.response}.
-    Don't address it or use names. ONLY OUTPUT THE RESPONSE.
-    ONLY OUTPUT IN PLAINTEXT STRIP ALL MARKDOWN
-    """
-    notification_message = ai_gen.generate_system_message(n8n_system_prompt, n8n_prompt_human)
-    secret_key = os.environ['N8N_SECRET']
-    webhook_url = os.environ['N8N_WEBHOOK_URL']
-    notification_string = f"""
-    <h2>{tril_notes[note]['title']}</h2>
-    <h3>Summary</h3>
-    <p>{notification_message}</p>
-    <h3>Branch</h3>
-    <p>{os_friendly_title}</p>
-    <p><a href="{git_branch_url}">Link to Branch</a></p>
-    """
-
-    payload = {
-        "message": f"{notification_string}",
-        "timestamp": datetime.now().isoformat()
-    }
-
-    webhook_client = N8NWebhookJwt(secret_key, webhook_url)
-
-    print("Notifying")
-    n8n_result = webhook_client.send_webhook(payload)
-
-    print(f"N8N response: {n8n_result['status']}")
```
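The Matrix-notification block in the diff above assembles an HTML message plus a timestamp into the webhook payload. A self-contained sketch of that assembly; the function name `build_payload` is hypothetical, not from the repo:

```python
from datetime import datetime


def build_payload(title, summary, branch, branch_url):
    # Mirrors the notification_string/payload construction in main.py:
    # an HTML fragment plus an ISO-8601 timestamp.
    message = (
        f"<h2>{title}</h2>"
        f"<h3>Summary</h3><p>{summary}</p>"
        f"<h3>Branch</h3><p>{branch}</p>"
        f'<p><a href="{branch_url}">Link to Branch</a></p>'
    )
    return {"message": message, "timestamp": datetime.now().isoformat()}
```

The resulting dict is what gets passed to `send_webhook` (and, notably, also becomes the JWT payload in the deleted `N8NWebhookJwt` class below).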
File deleted:

```diff
@@ -1,45 +0,0 @@
-from datetime import datetime, timedelta
-import jwt
-import requests
-from typing import Dict, Optional
-
-class N8NWebhookJwt:
-    def __init__(self, secret_key: str, webhook_url: str):
-        self.secret_key = secret_key
-        self.webhook_url = webhook_url
-        self.token_expiration = datetime.now() + timedelta(hours=1)
-
-    def _generate_jwt_token(self, payload: Dict) -> str:
-        """Generate JWT token with the given payload."""
-        # Include expiration time (optional)
-        payload["exp"] = self.token_expiration.timestamp()
-        encoded_jwt = jwt.encode(
-            payload,
-            self.secret_key,
-            algorithm="HS256",
-        )
-        return encoded_jwt #jwt.decode(encoded_jwt, self.secret_key, algorithms=['HS256'])
-
-    def send_webhook(self, payload: Dict) -> Dict:
-        """Send a webhook request with JWT authentication."""
-        # Generate JWT token
-        token = self._generate_jwt_token(payload)
-
-        # Set headers with JWT token
-        headers = {
-            "Authorization": f"Bearer {token}",
-            "Content-Type": "application/json"
-        }
-
-        # Send POST request
-        response = requests.post(
-            self.webhook_url,
-            json=payload,
-            headers=headers
-        )
-
-        # Handle response
-        if response.status_code == 200:
-            return {"status": "success", "response": response.json()}
-        else:
-            return {"status": "error", "response": response.status_code, "message": response.text}
```
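The deleted `N8NWebhookJwt` class signs the payload with PyJWT's HS256. For illustration, the same token shape can be sketched with the standard library alone (base64url-encoded header and payload, HMAC-SHA256 signature); this is not the repo's implementation, which uses `jwt.encode`:

```python
import base64
import hashlib
import hmac
import json


def _b64url(data: bytes) -> str:
    # base64url without '=' padding, per the JWT wire format.
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()


def hs256_jwt(payload: dict, secret: str) -> str:
    # header.payload.signature, each part base64url-encoded.
    header = {"alg": "HS256", "typ": "JWT"}
    signing_input = (
        f"{_b64url(json.dumps(header, separators=(',', ':')).encode())}."
        f"{_b64url(json.dumps(payload, separators=(',', ':')).encode())}"
    )
    sig = hmac.new(secret.encode(), signing_input.encode(), hashlib.sha256).digest()
    return f"{signing_input}.{_b64url(sig)}"
```

The receiving n8n workflow verifies the signature with the shared `N8N_SECRET`, which is why the secret must match on both sides.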
```diff
@@ -1,5 +1,4 @@
 import os, shutil
-from urllib.parse import quote
 from git import Repo
 from git.exc import GitCommandError
@@ -11,20 +10,11 @@ class GitRepository:
     def __init__(self, repo_path, username=None, password=None):
         git_protocol = os.environ["GIT_PROTOCOL"]
         git_remote = os.environ["GIT_REMOTE"]
-        #if username is not set we don't need parse to the url
-        if username==None or password == None:
-            remote = f"{git_protocol}://{git_remote}"
-        else:
-            # of course if it is we need to parse and escape it so that it
-            # can generate a url
-            git_user = quote(username)
-            git_password = quote(password)
-            remote = f"{git_protocol}://{git_user}:{git_password}@{git_remote}"
+        remote = f"{git_protocol}://{username}:{password}@{git_remote}"

         if os.path.exists(repo_path):
             shutil.rmtree(repo_path)
         self.repo_path = repo_path
-        print("Cloning Repo")
         Repo.clone_from(remote, repo_path)
         self.repo = Repo(repo_path)
         self.username = username
@@ -50,9 +40,8 @@ class GitRepository:

     def pull(self, remote_name='origin', ref_name='main'):
         """Pull updates from a remote repository with authentication"""
-        print("Pulling Latest Updates (if any)")
         try:
-            self.repo.remotes[remote_name].pull(ref_name)
+            self.repo.remotes[remote_name].pull(ref_name=ref_name)
             return True
         except GitCommandError as e:
             print(f"Pulling failed: {e}")
@@ -61,23 +50,21 @@ class GitRepository:
     def get_branches(self):
         """List all branches in the repository"""
         return [branch.name for branch in self.repo.branches]

-    def create_and_switch_branch(self, branch_name, remote_name='origin', ref_name='main'):
+    def create_branch(self, branch_name, remote_name='origin', ref_name='main'):
         """Create a new branch in the repository with authentication."""
         try:
-            print(f"Creating Branch {branch_name}")
             # Use the same remote and ref as before
-            self.repo.git.branch(branch_name)
-        except GitCommandError:
-            print("Branch already exists switching")
-            # ensure remote commits are pulled into local
-            self.repo.git.checkout(branch_name)
+            self.repo.git.branch(branch_name, commit=True)
+            return True
+        except GitCommandError as e:
+            print(f"Failed to create branch: {e}")
+            return False

     def add_and_commit(self, message=None):
         """Add and commit changes to the repository."""
         try:
-            print("Commiting latest draft")
             # Add all changes
             self.repo.git.add(all=True)
             # Commit with the provided message or a default
@@ -85,18 +72,20 @@ class GitRepository:
             commit_message = "Added and committed new content"
         else:
             commit_message = message
-        self.repo.git.commit(message=commit_message)
+        self.repo.git.commit(commit_message=commit_message)
         return True
     except GitCommandError as e:
         print(f"Commit failed: {e}")
         return False

     def create_copy_commit_push(self, file_path, title, commit_messge):
-        self.create_and_switch_branch(title)
+        self.create_branch(title)
-        self.pull(ref_name=title)
         shutil.copy(f"{file_path}", f"{self.repo_path}src/content/")
-        self.add_and_commit(f"'{commit_messge}'")
+        self.add_and_commit(commit_messge)
-        self.repo.git.push()
+        self.repo.git.push(remote_name='origin', ref_name=title, force=True)
+
+    def remove_repo(self):
+        shutil.rmtree(self.repo_path)
```
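The hunks above swap between a credential-escaped remote URL (via `urllib.parse.quote`) and a raw f-string. The escaped variant matters whenever the password contains URL-reserved characters such as `@` or `:`; as a standalone sketch of the escaped approach (the helper `build_remote` is hypothetical, the repo does this inline in `__init__`):

```python
from urllib.parse import quote


def build_remote(protocol, git_remote, username=None, password=None):
    # With no credentials the remote URL is used as-is; otherwise both parts
    # are percent-escaped so reserved characters can't corrupt the URL that
    # Repo.clone_from receives.
    if username is None or password is None:
        return f"{protocol}://{git_remote}"
    return f"{protocol}://{quote(username)}:{quote(password)}@{git_remote}"
```

With the unescaped f-string, a password like `p@ss:w` produces a URL with two `@` signs and an ambiguous port separator, which git will misparse.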