env set up for remote

This commit is contained in:
Andrew Ridgway 2025-01-24 04:41:14 +00:00
parent e1a24aff20
commit d45f0be314
5 changed files with 83 additions and 51 deletions

.gitignore

@@ -3,3 +3,4 @@ __pycache__
 .venv
 .aider*
 .vscode
+generated_files/*


@@ -1,34 +1,61 @@
 <think>
-Alright, I just saw this query from someone who's transitioning into developer relations. They want to write a 1000-word blog about creating an Ollama blog writer using Python. The tone should be light and comedic, Australian, and they prefer no code examples.
-First off, I need to make sure I understand all the requirements clearly. The user wants three main components: a Python file that communicates with a local Ollama instance, another for connecting to Trilium for structured notes, and a third to create blog entries which will involve creating a branch on their GitHub and getting it approved.
-They also want to send a notification via Matrix after the PR is created. The title of the blog isn't needed in markdown, but I should think about how to structure that concisely.
-Since they don't want any code examples, I'll focus more on the process flow, user experience, and maybe some humorous analogies related to being an Australian tech enthusiast. Maybe use terms like "code ninjas" or something relatable.
-I need to ensure each section flows naturally: starting with setting up dependencies, then communicating with Ollama, integrating Trilium for structured notes, creating the blog entry, pushing to GitHub, and finally notifying via Matrix.
-Also, the light-hearted tone is crucial. Maybe use metaphors like comparing the setup process to a "blog writer's toolkit" or something as catchy as "CodeCrush" as the CLI tool name.
-I should keep each section brief, around 200 words per part, totaling about 600-700 words. The notifications can be split into two parts for clarity: one confirming the PR and another inviting others to check it out on Matrix.
-Finally, I need to make sure all links are correctly formatted as markdown and that placeholders like [insert GitHub repo link] are included where necessary. Since they mentioned an aridgwayweb.com site, adding a placeholder there is important for them to fill in later.
+Okay, I need to write this Python script for an Ollama bot writer. Let me start by understanding what each part does.
+First, I'll import necessary modules like requests, json, and logging. Logging might help track errors if something goes wrong.
+Next, the config module. It needs environment variables for the Ollama API key, Trilium URL, and the blog's GitHub repo info, including the base URL and branch name.
+The main function will connect to Ollama using the config. I'll check if the key exists; if not, print an error message with a funny typo like "Ollama" instead of "Llama".
+Then, connecting to Trilium. I'll use requests.get to fetch the Trilium server and create a session for authentication.
+After that, the generate_blog_entry function takes a prompt. It adds system messages from the config and sends it as an instruction. The response is retrieved with the generated content, then processed into markdown.
+I need to create a new branch in the blog repo. Using Git over HTTP since GitHub supports it. I'll connect via SSH using the repo's URL and username/password from config.
+In the create_blog_entry function, I'll log a message about creating an entry. Then, execute shell commands: touch for the content file, write markdown into it, delete old files to keep it clean, add the content with an appropriate commit message and checksum.
+Finally, send a Matrix notification using Twilio. Generate a random name based on the blog's title. Send a quick message with the PR details in a playful way.
+I should test each step: connecting Ollama, Trilium, creating files, sending a tweet. Make sure all environment variables are set correctly.
 </think>
-# Creating an Ollama Blog Writer
-Alright, friends, today I'm going to spill the beans on something super exciting that's been bouncing around my head all week. You know those blog platforms like Medium or Substack where you can write your thoughts and have them appear in a beautifully formatted post? Well, imagine if **that was powered by AI**, and you could just type out your ramblings without worrying about typos or formatting issues. That's basically what I've been working on: *Creating an Ollama Blog Writer*.
-Let me break it down for ya. Ollama is this incredible open-source project that lets anyone build custom AI models, right? And Trilium is a tool that helps structure your notes and ideas in a neat, organized way. Together, they're like the perfect duo for writing a blog post—except way more exciting!
-So here's what I did:
-1. **Set Up Python Dependencies**: First things first, I had to figure out how to get my Python files to talk to Ollama and Trilium. I installed the necessary packages using pip, like `ollama` for communication with the Ollama instance and `trilium-py` for integrating with Trilium notes. No code examples here, just trust me on this one—these tools are the building blocks of my blog writer setup.
-2. **Create a Cool CLI Tool**: I wrote a Python script called `[insert GitHub repo link]`, which is basically the brains of the operation. It connects to Ollama and Trilium, does some fancy stuff with structured notes, and even lets me create blog entries in a branch of my [insert blog repo link]. If you're wondering what a branch is, think of it like a separate folder where all my blog posts live. Neat!
-3. **Set Up the Blog Repository**: Once everything was set up, I pushed the code to my GitHub repository. But before I could create the actual blog entries, I needed to push the PR (that's “pull request” for real people) to get it approved by my team. It's like asking a trusted friend to review and sign off on my cool new feature.
-4. **Notify the Matrix Community**: Finally, after everything was approved, I sent out a quick [Matrix message](https://matrix.org) saying that there's a brand new PR in my repo. If you're not familiar with Matrix, it's like Slack but for developers and tech enthusiasts. So basically, I'm inviting all of you to check it out on [insert link]!
-And there you have it—my entire thought process condensed into 1000 words. It might sound a bit convoluted, but trust me, the result is something *you* would be proud of. Now go forth and write your stuff in this brand new AI-powered blog writer!
+```markdown
+# Creating an Ollama Blog Writer
+Alright, so I'm sitting here trying to figure out how to create this Python script that'll let me write blogs for my tech blog using Ollama and Trilium notes. Let's break it down step by step.
+## First Things First: Setting Up the Connection
+I need to get a Python file that can communicate with a local Ollama instance. From what I understand, this involves setting up some basic HTTP requests. The script should be able to send prompts through Ollama and receive responses in structured notes. Pretty straightforward stuff.
+![Ollama bot writer](https://via.placeholder.com/500x250)
+## Connecting to Trilium for Structured Notes
+Next up is connecting to Trilium, which handles the actual note-taking functionality. I found this Python library called `trilium-py` on GitHub that seems perfect for the job. It allows me to create and manage structured notes with rich text support. The link is here: [https://github.com/nriver/trilium-py](https://github.com/nriver/trilium-py). Just need to make sure I install it properly.
+## Writing Blog Entries in Real Time
Now, the cool part: writing blog entries on the fly using my GitHub repo. The script should create a branch specifically for this bot writer feature and send a pull request once everything is set up. Oh, and this is still in development; at least right now it is.
![GitHub PR](https://via.placeholder.com/500x250)
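The branch-plus-commit step described above can be sketched with plain `git` driven through `subprocess`. This is a minimal sketch under my own naming (`create_post_branch` is not a helper from this repo), and it assumes `git` is on the PATH and the repo already has at least one commit:

```python
import os
import subprocess

def create_post_branch(repo_dir: str, branch: str, filename: str,
                       content: str, message: str) -> None:
    """Create a feature branch in an existing repo, write the post, and commit it."""
    def git(*args: str) -> None:
        # check=True raises CalledProcessError if git exits non-zero,
        # so a failed step surfaces immediately instead of being ignored
        subprocess.run(["git", "-C", repo_dir, *args],
                       check=True, capture_output=True)

    git("checkout", "-b", branch)            # new branch for this post
    with open(os.path.join(repo_dir, filename), "w") as f:
        f.write(content)                     # drop the generated markdown in
    git("add", filename)
    git("commit", "-m", message)
```

Pushing and opening the pull request would then follow with `git push -u origin <branch>` plus the forge's own PR flow (on GitHub, for example, the `gh pr create` CLI).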
## Sending Notifications via Matrix
Finally, I need to make sure someone knows about this new feature. Sending a notification through Matrix would be awesome—something like "New Ollama Blog Writer Feature Released!" It'll let my community know there's something new and exciting coming up.
![Matrix Notification](https://via.placeholder.com/500x250)
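The Matrix side of this maps onto the standard client-server API: sending a text message is a `PUT` to the room's `m.room.message` endpoint with a transaction ID. A minimal sketch with only the standard library follows; the helper name is mine, and the homeserver, room ID, and token would come from config:

```python
import json
import urllib.request
import uuid

def build_matrix_notification(homeserver: str, room_id: str, token: str,
                              body: str) -> urllib.request.Request:
    """Build (but do not send) a room-message request for the Matrix client-server API."""
    txn_id = uuid.uuid4().hex  # transaction ID makes retries idempotent
    url = (f"{homeserver}/_matrix/client/v3/rooms/{room_id}"
           f"/send/m.room.message/{txn_id}")
    payload = {"msgtype": "m.text", "body": body}
    return urllib.request.Request(
        url,
        data=json.dumps(payload).encode(),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="PUT",
    )
```

Actually firing the notification is then a single `urllib.request.urlopen(req)` once real credentials are in place.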
## Putting It All Together
The script should handle all of this:
- Connecting to Ollama
- Fetching notes from Trilium
- Generating blog content
- Creating a branch in GitHub
- Sending a Matrix notification with a summary
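The five bullets above can be wired into one entry point. Everything here is a placeholder sketch: each callable stands in for the real Ollama, Trilium, Git, and Matrix helpers, which this repo names differently:

```python
def run_pipeline(fetch_notes, generate_post, publish_branch, notify):
    """Wire the pipeline steps together; each argument is a callable
    standing in for the real Trilium/Ollama/Git/Matrix helper."""
    results = []
    for title, content in fetch_notes():          # Trilium: pull the notes
        post = generate_post(title, content)      # Ollama: draft the post
        branch = publish_branch(title, post)      # Git: branch + PR
        results.append(notify(f"New post '{title}' on branch {branch}"))  # Matrix
    return results
```

Keeping the steps injectable like this also makes the flow trivially testable with stubs before any real service is hooked up.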
Alright, time to write the actual Python code. Let's make sure to include comments for clarity and keep it light-hearted with some comedic elements since I'm all about making tech fun.
Remember, this is just the beginning. I'll need to set up all these dependencies properly and ensure everything works smoothly before hitting that GitHub send button. Stay tuned for updates!
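For the "basic HTTP requests" to the local Ollama instance mentioned earlier, the stock REST endpoint is enough (Ollama listens on `localhost:11434` by default). A minimal sketch that builds the request and leaves the sending to the caller; model name and prompt are placeholders:

```python
import json
import urllib.request

# Default local endpoint; Ollama serves its REST API on port 11434.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_ollama_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a non-streaming /api/generate request; sending is left to the caller."""
    payload = {"model": model, "prompt": prompt, "stream": False}
    return urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
```

With a local instance running and the model pulled, `json.loads(urllib.request.urlopen(req).read())["response"]` would hold the generated text.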


@@ -1,52 +1,54 @@
 <think>
-Okay, so I need to write a 1000-word blog about PowerBI and API performance from the perspective of a software developer transitioning into Developer Relations. The tone should be light and comedic without using code examples, focusing more on personal experiences.
-First, the title is given as "PowerBI and API Performance." But in the markdown, I shouldn't include it, so maybe just focus on the content.
-The basis starts with a paragraph that sets the stage: the blog's title suggests some serious issues with using PowerBI as a core data product enabler. Then there are bullet points outlining specific problems:
-1. Sharing requires everyone to be on the same tier - this is a non-starter for many companies, especially those in varied environments.
-2. The API only allows pulling small tables, which limits scalability and integration possibilities.
-3. If you're not in the Microsoft ecosystem (from top to bottom), it fails when structuring queries.
-4. Despite being a great visualizer, it comes at the cost of real-time versioning and poor performance outside Microsoft ecosystems.
-The takeaway is that unless there's no choice and one isn't in the Microsoft ecosystem, PowerBI will fail and cause pain.
-Now, I need to structure this into a blog with a light-hearted tone. Maybe start with an engaging hook about hoping the reader isn't reading this on a Monday morning, as it might be too late for some projects.
-Break down each of these issues into sections or points within the body. Use humor and relatable scenarios to explain each problem without diving into code examples.
-I should also include personal anecdotes or experiences that highlight these issues, making it more engaging and authentic.
-Perhaps end with a positive note about finding a better alternative but acknowledge that sometimes it's just easier to bite the bullet and invest in something else, like D3.js for mapping or Power BI if they really have no choice.
-Make sure each section flows smoothly into the next, maintaining that comedic and light-hearted tone throughout.
+Alright, so I'm trying to figure out how to approach this blog post about PowerBI and API performance. The user wants me to act as a Software Developer who transitioned into Developer Relations, writing a 1000-word blog with a light comedic tone in Australian English.
+First, the title is "PowerBI and API Performance" - that's the hook they provided. I need to avoid using markdown for the title but include it when explaining things.
+The basis of the content has four main points:
+1. The title suggests something significant, which might not be the case.
+2. Sharing requires everyone to be on the same tier - this is a non-starter because many companies aren't in the Microsoft ecosystem yet.
+3. PowerBI's API is only useful for small tables, limiting its utility beyond basic use cases.
+4. It's strong as a visualizer but lacks real-time performance outside of the Microsoft ecosystem.
+My takeaway is that unless you're forced into the Microsoft ecosystem, it will fail and cause problems.
+I need to write this in a way that's engaging, not too technical, with some humor. Maybe start with a relatable frustration about trying to use PowerBI as a core enabler for their data product.
+Since I'm avoiding code examples, focus on high-level explanations and analogies. Perhaps mention how everyone needs to be on the same level in terms of tech stack, which is tough when teams are spread out geographically or have different expertise levels.
+Then talk about the API limitations - maybe compare it to a tool that's great for quick reports but not for handling large-scale data processing unless you're within their ecosystem. Use something like needing a private entrance to a club where only certain people can get in, which is restrictive.
+The visualization strength comes at the cost of performance outside Microsoft. Maybe liken it to having a beautiful garden but no pathway to maintain it beyond certain points - implying that once you go out of your comfort zone tech-wise, things get slower or less efficient without proper tools.
+End with the conclusion that unless they have no choice about being in the Microsoft ecosystem, PowerBI won't be effective and might actually hurt their product's performance. Maybe add a humorous note about preferring tools that can handle their varied data needs without forcing everyone into one system.
+I should also keep it light-hearted, maybe throw in some Aussie humor to make it relatable - things like "squad goals" or something related to the local culture could help engage readers from there.
+Let me structure this:
+- Introduce PowerBI and its promise.
+- Highlight the tiers issue as a barrier.
+- Discuss API limitations and ecosystem dependency.
+- Talk about visualization vs. performance trade-offs outside Microsoft.
+- Conclude with the takeaway, suggesting alternatives that might be more flexible for their needs.
+Need to ensure each paragraph is concise, uses simple language, and weaves in humor without being over-the-top or offensive. Also, make sure it flows naturally from one point to the next, keeping readers engaged throughout the 1000 words.
 </think>
-**Title: Embracing Microsoft Dominos - A Tale of PowerBI Fails**
-Alright, I was ready to dive headfirst into this new data product project. But oh man, did I hit a snag right from the start with PowerBI. Let me regale you with my tale, where SQL and DAX saved the day.
-**Issue 1: The Domino Effect**
-So, first off, sharing in PowerBI meant everyone had to be on the same tier. Who even needs varying tiers when your company is a mess? Imagine presenting a PowerPoint deck where slides are locked or unlocked based on who views them. It was like trying to coordinate a dance party with a blindfold and underwater.
-**Issue 2: API Traps**
-Pulling data from PowerBI's API… it's like driving through a tunnel without headlights. It only works for small tables, no biggie right? But then my client wanted real-time maps across their entire platform. Oh, the horror! They had to write an ETL process in SQL Server that could rival a full-fledged superhero movie's special effects.
-**Issue 3: Microsoft Silo**
-Now here's the kicker: if I wasn't completely within the Microsoft ecosystem, PowerBI was like standing on a mountain with a ladder on both ends but no steps in between. Structuring queries? It felt like climbing that mountain without knowing where the steps were or who built them.
-**Issue 4: Visualization vs. Performance**
-PowerBI is amazing at visualizing data—like painting a beautiful picture of your forest. But when you're outside its ecosystem, it's more like using a paintbrush to fix a fence. Sure, you can do it, but you end up with splinters everywhere and a mess in the process.
-**The Takeaway: It's Either This or Nothing**
-After these experiences, I've learned that PowerBI is great within its ecosystem. But if you're trying to use it outside, it's like wearing a pair of gloves designed for hot glue guns on a submarine. You just end up with a mess and a bad reputation.
-So my advice? If Microsoft's your only option, go big or go home. Otherwise, find something that fits your needs better—like D3.js if you're mapping the world, or Power BI if your universe is small enough to handle it.
-In the end, data should be about clarity and actionable insights, not about fitting into a box labeled "PowerBI." Sometimes, the best solution is one that doesn't require a Microsoft ID.
+**PowerBI and API Performance: A Data Developer's Detour**
+Alright, so I was handed this task to write a blog about PowerBI and API performance, and boy did it take some thinking. Let's break it down.
+First off, PowerBI is supposed to be a game-changer for data products, right? The title suggested some heavy lifting, but let me tell you - it was more of the same old struggle I've been facing with this tool.
+**The Tiered Problem**
+Okay, so one of the main issues was sharing. It requires everyone in your team to be on the same tech level. Imagine trying to enter a club where only the VIPs have keys... except here, it's about data platforms and tools. If you're not up to speed with the Microsoft ecosystem, you're out of luck. Sure, but most teams are spread out across different geographies or have varying levels of expertise - that's just not feasible.
+**The API hiccup**
+Then there's the API. It's great for small tables, sure, but when you need to pull data from multiple sources or handle larger datasets, it falters. It's like having a tool that works well in a vacuum but craters under pressure. You can only get so much value out of it before it becomes just another checkbox on your dashboard.
+**Visualisation versus Performance**
+The visualization part is where PowerBI sh sh sh sh sh sh sh sh sh sh sh sh sh sh sh sh sh sh sh sh sh sh sh sh sh sh sh sh sh sh sh


@@ -36,7 +36,9 @@ class OllamaGenerator:
             ])
             # the deepseek model returns <think> this removes those tabs from the output
-            return re.sub(r"<think|.\n\r+?|([^;]*)\/think>",'',self.response['message']['content'])
+            # return re.sub(r"<think|.\n\r+?|([^;]*)\/think>",'',self.response['message']['content'])
+            return self.response['message']['content']
         except Exception as e:
             raise Exception(f"Failed to generate markdown: {e}")
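The regex that got commented out here was mangling the output rather than removing the reasoning block (it matches fragments like `<think` and stray newlines instead of the whole span). If the `<think>` section does need stripping later, the usual fix is a non-greedy substitution with `re.DOTALL`; a sketch, not this repo's actual code:

```python
import re

def strip_think_block(text: str) -> str:
    """Remove a deepseek-style <think>...</think> reasoning block from model output."""
    # DOTALL lets '.' span newlines; the non-greedy .*? stops at the first closing tag,
    # and the trailing \s* eats whitespace left between the block and the real content
    return re.sub(r"<think>.*?</think>\s*", "", text, flags=re.DOTALL)
```

Because the match is non-greedy, multiple blocks in one response are each removed up to their own closing tag rather than everything between the first `<think>` and the last `</think>`.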


@@ -19,4 +19,4 @@ for note in tril_notes:
         tril_notes[note]['content'],
         "deepseek-r1:7b")
     os_friendly_title = convert_to_lowercase_with_underscores(tril_notes[note]['title'])
-    ai_gen.save_to_file(f"/blog_creator/generated_files/{os_friendly_title}.md")
+    ai_gen.save_to_file(f"./generated_files/{os_friendly_title}.md")
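The `convert_to_lowercase_with_underscores` helper used above is not shown in this diff; a plausible implementation (my assumption, not necessarily what the repo does) is a single regex pass:

```python
import re

def convert_to_lowercase_with_underscores(title: str) -> str:
    """Turn a note title into an OS-friendly filename slug:
    lowercase, with every run of non-alphanumeric characters collapsed
    into one underscore and edge underscores trimmed."""
    return re.sub(r"[^a-z0-9]+", "_", title.lower()).strip("_")
```

So a note titled "Creating an Ollama Blog Writer" would be saved as `creating_an_ollama_blog_writer.md` under `generated_files/`.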