Compare commits

...

2 Commits

Author SHA1 Message Date
3da4e12f15 New dev post with simple image (Build and Deploy Resume Site / setup-website-content (push): successful in 14s) 2024-10-08 16:02:53 -07:00
6ab5a735ff New dev post :cowboy: 2024-10-08 15:59:05 -07:00
2 changed files with 91 additions and 0 deletions


@@ -0,0 +1,88 @@
---
date: '2024-10-08T15:33:49-07:00'
title: Easier Updates Finally
description: Back on track to updating things normally
thumbnail: /img/dev/hammer-wrench.png
article: true
---
# On Updating this Site
For a while now I've been updating this whole site manually, ever since I took
down my old Fargate + S3 infrastructure. Even though that setup made updating
the site super easy, it also meant I was basically vendor locked, and switching
to a new provider like Vultr was a massive pain.

Now that I have my own Git server, plus CI/CD running on that Git server, I have
a setup that lets me comfortably update my stuff without being totally locked
into someone else's CI. GitLab is great but I really wanted my own thing, so
now I'm basically on Gitea.
## Ergonomics of the new pipeline
Because Gitea uses the same type of workflow files as GitHub, I can literally
do the following:
```yaml
name: Build and Deploy Resume Site
on:
  push:
    branches:
      - master
jobs:
  setup-website-content:
    runs-on: ubuntu-latest
    container:
      image: shockrah/hugo
    steps:
      - run: git init
      - run: git remote add origin https://git.shockrah.xyz/shockrah/temper-tv.git
      - run: git pull origin master
      - name: Build website content
        run: cd main-site && pwd && ls -a && hugo
      - name: Copy files with rsync
        uses: tempersama/rsync-copy@2.6
        with:
          host: shockrah.xyz
          username: ${{ secrets.USER }}
          source: "main-site/public/"
          destination: /opt/nginx/temper.tv/
          key: ${{ secrets.PRIVATE_KEY }}
```
General steps, per pipeline run, are basically:
* Clone the repo
* Run `hugo` to build all the content
* Copy files over using an rsync GitHub action that I wrote myself
The last part was a bit tricky to get working: Gitea running in a container,
acceptable key distribution, and user administration all had to line up.
It ended up working out though, and now I have my own GitHub action for rsync, which is pretty neat.
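The key-distribution side can be sketched roughly like this (a hedged sketch: the `deploy_key` filename, comment, and paths are illustrative, not the actual values I used):

```shell
# Generate a dedicated deploy key pair with no passphrase
# (the filename and comment here are just illustrative)
ssh-keygen -t ed25519 -f deploy_key -N '' -C 'gitea-deploy'

# The public half gets appended to authorized_keys on the web box, e.g.:
#   cat deploy_key.pub >> /home/deploy/.ssh/authorized_keys
# The private half becomes the PRIVATE_KEY secret in the repo settings,
# which is what the workflow's `key: ${{ secrets.PRIVATE_KEY }}` reads.
ls deploy_key deploy_key.pub
```

A dedicated key like this means the CI runner never touches my personal SSH identity.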
> Wait, why not just Ansible/scp?

I'm not using Ansible because I'm just copying files... I don't need such a
massive tool to accomplish that, even if I do have an
[ansible dockerhub image](https://hub.docker.com/r/shockrah/ansible) that works.
While I could in theory keep a super up-to-date and clean transfer flow with
Ansible... this is a short meme blog that doesn't need that much engineering :)
I tried scp (and it worked) but I ran into the issue that plain scp doesn't
copy directory trees by default, which makes copying static site structures super annoying.
> You could tar the site then scp that over!

Yes... and end up scp'ing a Hugo tarball every single time... no thanks,
data transfer rates are already bad enough as it is.
## Going Forward
Now that I have keys and a simple dev setup on Windows (where I game/stream from),
it should be super easy to actually update this site without having to
constantly hop between Windows/Linux all the time. So here's to lots of fun &
easy updates in the future :partying_face:

BIN
main-site/static/img/dev/hammer-wrench.png (Stored with Git LFS) Executable file

Binary file not shown.