- This step is even more complex than it might be as I want to cache checkouts between builds to avoid having to download all 50GB of my site on every build—the Act Runner differs from the GitHub Actions runner in that it doesn’t use stable checkout locations, necessitating some contortions. ↩
- download links
- screenshots
- documentation
- consolidate shared resources in the source tree
Week 4—Monday
Archiving, Emulation, and Painting
The week got off to a slow start as I found myself spending much of Monday writing up the remainder of week 3. Beyond that, I spent time putting a final base coat of paint on the sign for our friend’s coffee shop, briefly revisited the world of Psion emulation, and set my intentions for the week ahead.
Psion ROMs
Nigel, the Psion community’s resident MAME expert, has been on a renewed push to get folks to dig out their Psions (not something that needs much encouragement), check their ROM versions, and dump them if they’re not already on record.
Much to my surprise, neither of the builds on my Revo Plus or Series 7 had been dumped, so I broke out PsiROMx and set about rectifying this travesty. Fortunately, the process of dumping an EPOC32 ROM is easy: simply select ‘Save ROM’ and specify the output location. (There seem to be some issues dumping Series 5mx Pro devices, but there are still many earlier devices we need to archive, and PsiROMx serves us well here.)
PsiROMx’s interface is wonderfully simple
I uploaded these two new ROMs to the Psion-ROM archive on GitHub and was rewarded not long after by the following screenshots of my Revo Plus ROM running in MAME:
Since I was already poking around in the Psion-ROM repository, I also took a few minutes to set up automated builds that package ROMs for use with MAME—I’d love to establish a single source of ROMs for MAME-based Psion emulation and use this in PsiEmu to make it easy for folks to get started.
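I haven’t shared the exact workflow here, but a minimal Actions-style packaging job might look something like the sketch below. MAME expects each ROM set zipped under its machine name; the `roms/` directory layout and the idea of using directory names as machine names are assumptions, not the repository’s actual structure.

```yaml
# Sketch only: zip each dumped ROM directory into a MAME-style
# <machine>.zip and publish the results as a build artifact.
# The roms/ layout is an assumption.
name: package-roms
on: push
jobs:
  package:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Zip ROM sets
        run: |
          mkdir -p dist
          for machine in roms/*/; do
            name="$(basename "$machine")"
            (cd "$machine" && zip -j "$GITHUB_WORKSPACE/dist/${name}.zip" ./*)
          done
      - uses: actions/upload-artifact@v4
        with:
          name: mame-roms
          path: dist/*.zip
```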
Next Steps
Having seen some indication that the hanging issues I’ve been seeing with my website builds might be a bug in InContext—my static site builder—I’ve decided to focus on that for the week. Beyond debugging the hang, I have a growing list of fixes and improvements to make and, if time permits, I’d love to flesh out the Linux support.
Week 3—Tuesday Onwards
Nothing but Infrastructure
Keen to make forward progress in spite of other (mostly administrative) distractions, I found myself spending the rest of week 3 bouncing between infrastructural tasks that I hope will help lay the foundations for future work: trying Portainer, self-hosting Forgejo, and installing FreshRSS. I also took a little time out to continue with some real-world maintenance.
Trying Portainer
While I’ve been running home infrastructure for a little while, I’m fairly new to the whole thing and I’ve yet to establish my own preferences and best practices. Using Docker to run services, for example, still makes me deeply uncomfortable: I’ve been using Docker Compose—docker-compose.yml files are easy to version using a combination of Git and Ansible—but I find updates and container life cycles hard to manage. With that in mind, I decided to take a shot at using Portainer.
Bootstrapping Portainer proved incredibly easy using Docker Compose:
services:
  portainer:
    container_name: portainer
    image: portainer/portainer-ce:lts
    restart: always
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock
      - /storage/services/portainer/data:/data
    ports:
      - 9443:9443
      - 8000:8000

networks:
  default:
    name: portainer_network
(This deviates very slightly from the off-the-shelf configuration, mapping /data to /storage/services/portainer/data to ensure it’s stored on my ZFS pool and easy to back up.)
Once it was up and running (docker compose up -d), I found Portainer offers a comprehensive management interface, allowing you to create new container ‘stacks’ using Docker Compose files:
Given how easy it was to set up, I wish I’d tried Portainer earlier: I’m already finding its container and image management very convenient, and it feels like a great low-effort tool for trying out new services. I may, however, still turn to manually managed compose files for services that I choose to keep around.
Self-Hosting Forgejo
Over the past couple of weeks, I’ve noticed an uptick in hangs with my website builds using GitHub Actions. This has got in the way of writing, and I decided to see what I could do to improve things. Thinking the issue might be related to a longstanding issue in .NET process management and something that’s unlikely to get fixed any time soon (that the GitHub Actions runner might be failing to notice my build script terminating successfully), I decided to make the most of the opportunity to try out Forgejo to store and build my site.
With Portainer ready to go, installing Forgejo proved incredibly easy—I just used the Docker Compose file from their installation instructions (again mapping the data volume to my ZFS storage):
services:
  forgejo:
    image: codeberg.org/forgejo/forgejo:13
    container_name: forgejo
    restart: always
    environment:
      - USER_UID=1000
      - USER_GID=1000
    volumes:
      - /storage/services/forgejo/data:/data
      - /etc/timezone:/etc/timezone:ro
      - /etc/localtime:/etc/localtime:ro
    ports:
      - "3000:3000"
      - "222:22"
Everything just worked and, armed with a working Forgejo instance, I pushed my website—including 50GB of LFS files. Much to my surprise, this also worked (once I remembered to run git lfs fetch --all origin main to ensure I had a local copy of all LFS data)—I had been fully expecting Git LFS to require further setup, but it seems it’s provisioned out of the box with Forgejo.
Configuring automated builds using Forgejo Actions proved a little more nuanced, however: the Forgejo runner doesn’t support macOS, so you have to use Gitea’s Act Runner instead. I also encountered real problems with Git LFS checkouts on the runner, necessitating a manual—and worryingly precarious—checkout step1:
- name: Checkout source
  shell: bash
  env:
    GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
  run: |
    set -euo pipefail
    URL_BASE="${{ github.server_url }}"
    mkdir -p "$CHECKOUT_ROOT"
    if [ ! -d "$CHECKOUT_ROOT/.git" ]; then
      git clone --origin origin --no-checkout "https://${URL_BASE#https://}/${{ github.repository }}.git" "$CHECKOUT_ROOT"
    fi
    cd "$CHECKOUT_ROOT"
    git clean -fdx
    git config url."https://${GITHUB_TOKEN}@${URL_BASE#https://}/".insteadOf "https://${URL_BASE#https://}/"
    git config --unset-all http.${{ github.server_url }}/.extraheader || true # Unhappy LFS workaround.
    git fetch --prune --tags origin
    git checkout "${{ github.ref_name }}"
    git reset --hard "origin/${{ github.ref_name }}"
From the discussion on Forgejo’s issue tracker, it seems the LFS issue might be related to my running Forgejo behind an nginx proxy. I’d like to investigate this but, for the time being, I have an approach that works.
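If the proxy really is the culprit, the usual nginx-side suspects for LFS misbehaviour are request body limits and buffering. A sketch of the directives I’d check first—the server name and upstream port here are assumptions based on my setup, not a verified fix:

```nginx
# Illustrative fragment for proxying Forgejo, with the directives most
# often implicated in Git LFS upload/download failures behind nginx.
server {
    listen 443 ssl;
    server_name forgejo.example.com;

    # LFS objects can be large; lift the default 1MB request body limit.
    client_max_body_size 0;

    location / {
        proxy_pass http://127.0.0.1:3000;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-Proto $scheme;
        # Stream large uploads straight through rather than buffering to disk.
        proxy_request_buffering off;
    }
}
```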
With the checkout issues resolved, the rest of the build worked with no changes, lifted verbatim from my GitHub Actions workflow—it’s really quite impressive and fills me with hope for migrating other projects. I think I might still be seeing infrequent hangs in my builds though, so I wonder if there’s a race condition buried deep in InContext’s asynchronous code. Something to keep an eye on. 👀
Installing FreshRSS
Having written about the process of installing two services already this week, I’ll not go into much detail about FreshRSS. Suffice to say, I used Docker, Portainer, and nginx as a reverse proxy. It went smoothly.
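For completeness, the stack I pasted into Portainer looked roughly like this—the host paths follow the same ZFS layout as my other services and, along with the host port, are assumptions rather than a copy of my actual file:

```yaml
# Sketch of a FreshRSS Compose stack; host paths and port mapping
# are assumptions based on my ZFS layout described above.
services:
  freshrss:
    image: freshrss/freshrss:latest
    container_name: freshrss
    restart: always
    volumes:
      - /storage/services/freshrss/data:/var/www/FreshRSS/data
      - /storage/services/freshrss/extensions:/var/www/FreshRSS/extensions
    ports:
      - "8080:80"
```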
It’s nice to finally have a self-hosted feed reader again after nearly 20 years (though I still long for Shaun Inman’s Fever feed reader), and I’ve been pleasantly surprised to discover it’s well supported by client apps like NetNewsWire. I’m hopeful the switch will unlock syncing with various retro computers in the future.
Signage
The week also brought an analogue pursuit in the form of some maintenance work on the sign for Mele Mele, our friend’s coffee shop. The Hawaiian climate is brutal and will destroy near-everything, so I took the opportunity of Sarah repainting the sign to reinforce it in the hope it’ll last another few years.
Week 3—Monday
Documentation and Downloads
Week 3 of the year continues the concerted (if somewhat piecemeal) process of project spring cleaning. Shipping the Qt version of OpoLua remains a priority and I spent much of the day on release-adjacent tasks. I also found my thoughts turning to grand ideas of finally packaging things like Reporter, changes, and InContext, but I must resist! (For the time being, at least.) That said, I did take a little time out to work on the endless task that is home infrastructure.
OpoLua
Last week, before getting distracted by packaging all the things, I identified a few remaining tasks for shipping OpoLua, our modern OPL runtime:
Of these, the highest priorities seemed to be ‘download links’ and ‘documentation’. Both will help users install the app: links to downloadable packages and, for Linux, instructions for adding our (as yet non-existent) package repositories.
Download Links
The OpoLua website is a Jekyll static site that lives in the docs folder of the source tree, and is built as part of our GitHub Actions build workflow. This makes it really easy to inject details like the current version number into the site build and, armed with the jekyll-environment-variables plugin, it was easy to update the template to add a link to the latest GitHub release:
<p class="download-links">
  <a href="https://apps.apple.com/app/opolua/id1604029880">
    <img src="images/Download_on_the_App_Store_Badge_US-UK_RGB_blk_092917.svg" />
  </a>
  <br/>
  <a href="https://github.com/inseven/opolua/releases/tag/{{ site.env.VERSION_NUMBER }}">
    Download for macOS, Windows, and Linux
  </a>
</p>
While I’d like to increase the prominence of these new platforms (the App Store link dominates somewhat), something is better than nothing, and I’m pretty pleased with the result:
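For anyone wiring up something similar: jekyll-environment-variables exposes the build environment under site.env, so the only pieces needed are the plugin entry in _config.yml and a VERSION_NUMBER variable exported by the workflow before the Jekyll build runs. A sketch—deriving the version from the tag name is an assumption; our build computes it elsewhere:

```yaml
# _config.yml — enable the plugin so environment variables are
# exposed to templates as site.env.<NAME>.
plugins:
  - jekyll-environment-variables

# In the Actions workflow, export the variable before building
# (using the tag name as the version is an assumption):
#
#   - name: Build site
#     env:
#       VERSION_NUMBER: ${{ github.ref_name }}
#     run: bundle exec jekyll build
```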
Documentation
As OpoLua has grown to target multiple platforms, support different OPL versions, and include a suite of CLI utilities, it’s become clear that the short FAQ we created when we first shipped was insufficient. Keen to make room for growth, I added a documentation section, styled after Just the Docs:
Our documentation is sparse but now there’s somewhere to put it
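I haven’t described the mechanics, but a Jekyll collection is one natural way to structure a section like this—a sketch of the _config.yml changes, where the collection name, permalink, and layout are all assumptions rather than our actual configuration:

```yaml
# Sketch: a dedicated 'docs' collection rendered with its own layout.
# Names and paths here are assumptions.
collections:
  docs:
    output: true
    permalink: /docs/:path/

defaults:
  - scope:
      path: ""
      type: docs
    values:
      layout: documentation
```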
To complement these structural changes, I also added highlight.js, Mermaid, and jekyll-gfm-admonitions to provide us with conveniences for syntax highlighting, diagramming, and GitHub-style admonitions respectively.
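The first two are client-side and mostly a matter of includes in the page template; jekyll-gfm-admonitions, being a plugin, only needs adding to _config.yml. A sketch of the head additions, where the CDN URLs, pinned versions, and init options are assumptions, not what our site actually ships:

```html
<!-- Sketch: client-side syntax highlighting and diagram rendering.
     CDN URLs, versions, and options are assumptions. -->
<link rel="stylesheet"
      href="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.9.0/styles/default.min.css">
<script src="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.9.0/highlight.min.js"></script>
<script>hljs.highlightAll();</script>

<script type="module">
  import mermaid from "https://cdn.jsdelivr.net/npm/mermaid@10/dist/mermaid.esm.min.mjs";
  mermaid.initialize({ startOnLoad: true });
</script>
```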
Home Infrastructure
Over the past few days, I’ve been slowly pushing on with my home infrastructure setup. I finally resigned myself to using Ansible for managing my NAS/server—hopefully this will make it easier to keep the configuration in source control and increase my confidence as I continue to move away from Big Tech hosted services.
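My playbooks aren’t public, but the shape is roughly: a template for each service’s Compose file, plus a task to (re)deploy the stack. A sketch, assuming the community.docker collection is installed and using my /storage/services layout—the host group, service, and file names are illustrative:

```yaml
# Sketch: install a templated Compose file and bring the stack up.
# Paths, names, and the 'nas' host group are assumptions.
- name: Deploy Forgejo
  hosts: nas
  become: true
  tasks:
    - name: Install compose file
      ansible.builtin.template:
        src: forgejo/docker-compose.yml.j2
        dest: /storage/services/forgejo/docker-compose.yml
        mode: "0644"

    - name: Bring the stack up
      community.docker.docker_compose_v2:
        project_src: /storage/services/forgejo
        state: present
```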
I also took delivery of a new 1U rack shelf and spent a little while moving my LincStation N2 NAS/server into the rack, indulging in the prerequisite CAD and 3D printing to mount the network switch alongside it: