Compare commits
6 Commits
b96391d813 ... master

| SHA1 |
|---|
| 4873fd9981 |
| 8b2c0f301e |
| ff671f3c35 |
| bcc9753a81 |
| ca66e70b4a |
| fea2ed5d51 |
.gitea/workflows/build-and-deploy.yaml (Normal file, 27 lines)

@@ -0,0 +1,27 @@
name: Build and Deploy Blog Site

on:
  push:
    branches:
      - master

jobs:
  setup-website-content:
    runs-on: ubuntu-latest
    container:
      image: shockrah/hugo
    steps:
      - run: git init
      - run: git remote add origin https://git.shockrah.xyz/shockrah/blog.git
      - run: git pull origin master
      - name: Build website content
        run: hugo
      - name: Copy files with rsync
        uses: tempersama/rsync-copy@2.6
        with:
          host: shockrah.xyz
          username: ${{ secrets.USER }}
          source: "public/"
          destination: /opt/nginx/shockrah.xyz/
          key: ${{ secrets.PRIVATE_KEY }}
@@ -1,6 +1,6 @@
 ---
 title: "How this site came to be"
-date: July 22, 2018
+date: 2018-06-22
 draft: true
 ---

@@ -1,6 +1,6 @@
 ---
 title: "Delivering whole OS's in Gitlab's CI/CD"
-date: 0000-00-00
+date: 2021-08-15
 draft: true
 ---

@@ -1,6 +1,6 @@
 ---
 title: "Drive Recover"
-date: 0000-00-00
+date: 2021-08-14
 draft: true
 ---

@@ -1,6 +1,6 @@
 ---
 title: Economical Eats
-date: 0000-00-00
+date: 2021-08-14
 draft: true
 ---

@@ -1,6 +1,6 @@
 ---
 title: Esports Post
-date: 0000-00-00
+date: 2021-08-14
 draft: true
 ---

@@ -1,6 +1,6 @@
 ---
 title: "DWM and POP!\_OS"
-date: 0000-00-00
+date: 2021-08-14
 draft: true
 ---

@@ -24,5 +24,4 @@ It's _very_ stream of conscious-y and often not super coherent so often there gap

 ## Design things

-* Nothing Yet
+* :wave: [Bubble Chat and its User Data](/notes/rationalizing-user-data)
content/notes/rationalizing-user-data.md (Normal file, 43 lines)

@@ -0,0 +1,43 @@
---
title: Rationalizing User Data
description: "Like how do I approach storing sensitive user data in Bubble?"
date: 2025-01-07T21:34:12-08:00
draft: false
category: article
---

This is a stream-of-consciousness post where I go through the process of figuring out
how I am going to store user data in my [bubble project](https://git.shockrah.xyz/shockrah/bubble).

The schema for users can be found here: [link](https://git.shockrah.xyz/shockrah/bubble/src/branch/main/db/setup-tables.sql)

# Pre-requisites

* Using Postgres
* Hashed and salted passwords

# User ID's

By default I was going to use regular ID's that increment on each insertion.
Now I'm considering some form of UUID's for the sake of a slight increase in security.
Even though this isn't a major form of security, every layer towards
better security counts, and I see this as one more layer to add.

UUID's tend to suffer from indexing issues, but later versions of UUID seem to
have this figured out.

After some reading, v7 seems like the move:

* Time based, meaning sorting (and thus searching) is reasonably performant
* Sufficient entropy for this case (74 bits). We want _some_ entropy but also
don't want to nuke performance
* While we want to make it harder to guess, aiming for "unguessable" is just not
reasonable. UUIDv4 is best for that but compromises performance so hard it's not
worth it in the grand scheme of things

IDK, v7 feels like a good middle ground of security + performance for the **user id**.
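As an aside on the note above: a minimal sketch of the UUIDv7 layout it leans on (a 48-bit millisecond timestamp followed by 74 random bits, per RFC 9562) looks roughly like this. The `uuid7` helper below is hypothetical illustration only, not code from this commit or from the bubble schema.

```python
# Minimal UUIDv7 sketch following the RFC 9562 layout: a 48-bit unix-millisecond
# timestamp up front, then 74 random bits. Illustration only, not repo code.
import os
import time
import uuid


def uuid7() -> uuid.UUID:
    ts_ms = time.time_ns() // 1_000_000                              # 48-bit timestamp (ms)
    rand_a = int.from_bytes(os.urandom(2), 'big') & 0x0FFF           # 12 random bits
    rand_b = int.from_bytes(os.urandom(8), 'big') & ((1 << 62) - 1)  # 62 random bits

    value = (ts_ms & ((1 << 48) - 1)) << 80  # timestamp occupies the top 48 bits
    value |= 0x7 << 76                       # version nibble = 7
    value |= rand_a << 64
    value |= 0b10 << 62                      # RFC 4122 variant bits
    value |= rand_b
    return uuid.UUID(int=value)


if __name__ == '__main__':
    # IDs minted in later milliseconds sort after earlier ones, so index inserts
    # stay roughly append-only instead of scattering like UUIDv4 keys do.
    print(uuid7(), uuid7())
```

Because the timestamp sits in the most significant bits, newer IDs compare greater than older ones, which is what keeps b-tree inserts close to append-only while still leaving 74 bits of randomness per ID.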
@@ -1,6 +1,6 @@
 ---
 title: Building a Video Streaming Service with Rust
-date: 2021-13-10
+date: 2021-10-13
 draft: true
 description: The real state of Rocket right now
 category: article
content/posts/migrating-to-vultr.md (Normal file, 64 lines)

@@ -0,0 +1,64 @@
---
title: Migrating to Vultr
description: Finally moving away from AWS little by little
date: 2024-10-27T20:59:05-07:00
draft: false
category: article
---

# What and Why

For some time now I've been trying my best to get away from using AWS
for my infrastructure due to the constantly rising price of everything.

## Main causes for higher cost

* Fargate

This one is mostly my own fault lmao, since Fargate (without an application load
balancer) is actually not that bad in terms of pricing. The issue comes in
if you are trying to host a variety of services on one host like I was,
services which are only ever really used for personal, singular use.

Recall that with Fargate we are paying by core count, and if you
containerize everything this effectively means you are paying per container,
more or less. Couple this with lots of containers and your pricing starts to
get really expensive really fast.

* Application Load Balancer

These are just expensive for small projects, what else can I say...
I wouldn't suggest hosting personal sites behind one of these basically ever.

* Scaling

If you seriously need an ALB to sit in front of lots of microservices then
you're probably dealing with either an interesting project or just a need
to handle a lot of traffic. After a while I didn't really want a web server
to be my "interesting project", since this ended up eating way more of my time
than I would have ever liked it to...

# Why Vultr

Shit's cheap yo...

Roughly $10-$20 for a bare minimum Kubernetes cluster, or about $10 per
host as I'm doing now. Provision hosts with Terraform, then configure them with
Ansible, and you have a somewhat reasonable infrastructure for hosting
personal projects.

## What do I host now?

* shockrah.xyz
* git.shockrah.xyz <-- Gitea instance
* temper.tv <-- vr/funsies blog

Basically I'm hosting more stuff more effectively, and it's an infrastructure
that is "platform agnostic", given it's all Terraform anyway and
Ansible can be used basically anywhere there's a host.
content/posts/reviving-an-old-project.md (Normal file, 41 lines)

@@ -0,0 +1,41 @@
---
title: Reviving an Old Project
description: Clippable and its current future
date: 2025-08-19T21:26:49-07:00
draft: false
category: article
---

# Context

A while back I created [Clippable](https://git.shockrah.xyz/shockrah/clippable),
which was meant to be a way for me to share gaming clips with others because I
didn't like using streamable, youtube, etc. I'm still not a huge fan of them,
especially considering I have the technical know-how to build my own clip
sharing system/website.

Now that I'm once again looking for work :wink: I'm in a position where I
need to have a new "main" project to bring up in interviews & resumes.
My choice this time around is of course [Clippable](https://git.shockrah.xyz/shockrah/clippable),
which should provide ample room for me to _demonstrate_ my skills.

_Also I can actually use the end result, which is nice_.

# What is it?

I'll be honest, it's a self-hosted streamable clone whose
front-end got a huge facelift and is now being packaged with Gitea.

# Now what?

Now comes the fun part, which is packaging everything and updating the docs
so that this can be fully presentable and remain a proper "receipt"
of my skill, because it's easier if my work can speak for itself in interviews :wink:

# Where?

The link to the project is here :link: https://git.shockrah.xyz/shockrah/clippable

Link to the docs is here :link: https://shockrah.gitlab.io/clippable/
scripts/compare.py (Normal file, 7 lines)

@@ -0,0 +1,7 @@
import hashlib

with open('public/tags/index.xml') as file:
    data = hashlib.sha1(file.read().encode('utf-8'))

remote = '99d66a9e171feaf11be88b831bc69c55d85c1b4b'
print(remote == data.hexdigest())
scripts/neocities.py (Normal file, 29 lines)

@@ -0,0 +1,29 @@
from os import environ
from requests import get


def request(key: str, uri: str):
    url = f'https://neocities.org{uri}'
    headers = { 'Authorization': f'Bearer {key}' }
    response = get(url, headers=headers)
    return response.json()


def get_new_content(key: str):
    '''
    Fetches a list of all files on neocities
    '''
    # First fetch the hashes of all our files
    response = request(key, '/api/list')
    remote_files = response['files']

    # Compare remote hashes to local hashes
    return remote_files


if __name__ == '__main__':
    key = environ.get('NEOCITIES_API_KEY')
    if key is None:
        print('Check to ensure NEOCITIES_API_KEY is set in the env vars')
        exit(1)
    else:
        get_new_content(key)
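The comparison step that `get_new_content` leaves as a comment could plausibly look like the sketch below. This is a guess at the missing piece rather than code from the commit: it assumes each entry from `/api/list` exposes `path`, `is_directory`, and `sha1_hash` fields (the hardcoded SHA1 in `scripts/compare.py` hints that the API reports SHA1 digests), and that the built site lives under `public/`.

```python
# Hypothetical continuation of get_new_content(): compare remote SHA1 digests
# against the local Hugo output in public/. The entry field names (path,
# is_directory, sha1_hash) are assumptions about the Neocities list response.
import hashlib
from pathlib import Path


def local_sha1(path: Path) -> str:
    return hashlib.sha1(path.read_bytes()).hexdigest()


def changed_files(remote_files: list[dict], public_dir: str = 'public') -> list[str]:
    '''Return paths under public/ whose SHA1 no longer matches the remote copy.'''
    remote_hashes = {
        entry['path']: entry.get('sha1_hash')
        for entry in remote_files
        if not entry.get('is_directory')
    }
    changed = []
    for path in Path(public_dir).rglob('*'):
        if not path.is_file():
            continue
        rel = str(path.relative_to(public_dir))
        if remote_hashes.get(rel) != local_sha1(path):
            changed.append(rel)
    return changed
```

With something like this in place, the `__main__` block could print `changed_files(get_new_content(key))` to show which files actually need re-uploading.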