Uses

══════════════════════════════

This page is a window into the workshop behind Hidden Den Cafe. It is less about a perfect spec sheet and more about the tools I actually reach for to write code, run infrastructure, and keep exploring how much of the internet I can build on my own terms. Some of it lives on my desk, some of it lives in the homelab, and some of it lives in carefully chosen external systems.

Personal Computing Environment

Windows + Ubuntu on WSL2

My day-to-day machine is a Windows workstation with an Ubuntu environment inside WSL2. It gives me a practical desktop setup while keeping most of my actual work in a Linux shell where I can script, build, and debug with less friction.

Terminal + Bash

Bash is still where a lot of real work happens for me. If a task can be expressed as a clean command, script, or container build, that is usually the route I prefer because it stays close to the system instead of hiding it behind layers of UI.
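As an illustration of that preference, here is a hypothetical helper (the `largest` function and its arguments are invented for this sketch, not part of any real tooling of mine) that turns a "what is eating my disk?" question into one reusable command:

```shell
# Hypothetical example: list the N largest entries directly under a directory.
# Plain pipes and coreutils, nothing hidden behind a UI.
largest() {
  local dir="${1:-.}" count="${2:-5}"   # directory and how many entries to show
  du -sh "$dir"/* 2>/dev/null | sort -rh | head -n "$count"
}

largest /var/log 3   # e.g. the three biggest things under /var/log
```

Once a task looks like this, it composes: the same function works in a cron job, a container build step, or an interactive session.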

Zed

Zed is my primary editor. I like it because it is extremely fast, modern without feeling bloated, and built for the kind of collaborative and AI-assisted development workflow I spend a lot of time experimenting with.

Fast tools, low friction

I am happiest when tools respond immediately and get out of the way. Markdown, plain text configs, and lightweight utilities matter to me because they keep the system inspectable and make it easier to move from idea to experiment without losing momentum.

Development Tools

Python first, Astro on purpose

A lot of my background is in Python and Flask-style thinking: simple services, clear behavior, and tooling I can reason about. Cozy Den is also where I am learning Astro because it fits the small-web, static-first approach I want for personal sites.

Git + self-hosted Gitea

Version control lives on my own Gitea instance at git.hiddenden.cafe. I use it both as a code forge and as part of the deployment path.
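As a sketch of what that wiring looks like from a fresh project, with a placeholder repository path (`den/cozy-den.git` is invented for illustration, not the real repo):

```shell
# Sketch: pointing a new project at the self-hosted Gitea forge.
# The repository path (den/cozy-den.git) is a hypothetical placeholder.
workdir=$(mktemp -d) && cd "$workdir"
git init -q cozy-den && cd cozy-den
git remote add origin ssh://git@git.hiddenden.cafe/den/cozy-den.git
git remote -v   # confirm both fetch and push point at the forge
```

From there, pushes land on infrastructure I run myself, and the deployment path can key off the forge instead of a hosted platform.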

Docker

Containers are the default way I package and move projects. I like being able to build something once, understand its runtime, and ship the same artifact to the machines that need it.

TypeScript, Markdown, and plain files

I use typed configs where they help, Markdown where writing should stay lightweight, and plain files wherever possible so the project remains durable without a huge stack around it.

Editor-driven exploration

A lot of experimentation starts in the editor: quick prototypes, note fragments, config drafts, and partial implementations. I prefer tools that make it easy to move between writing, coding, terminal work, and AI-assisted iteration in one place.

AI Tools

Model mix, not model worship

I actively experiment with multiple AI systems, including OpenAI, OpenAI Business, Claude, Mistral, Ollama, OpenRouter, and Microsoft Foundry. Different tools are better at different kinds of work, so I treat them as instruments rather than as a single source of truth.

Development and drafting support

AI is useful for implementation support, debugging odd edge cases, brainstorming approaches, writing rough drafts, and pressure-testing architectural ideas. It helps me move faster, but it does not get to replace judgment, taste, or authorship.

Local and hosted experimentation

Some experiments belong in cloud APIs, some belong in local model runtimes. I am interested in both, especially where privacy, controllability, and cost start to matter.

Human ideas stay human

The point is not to automate away authorship. The point is to extend what I can build, test, and think through while keeping the taste, priorities, and final decisions my own.

Infrastructure & Homelab

Proxmox cluster

A lot of the lab side of my work is built around a Proxmox-based cluster. It acts as a workshop environment where services and ideas get built, tested, broken, rebuilt, and slowly turned into something stable enough to keep.

Multiple nodes, mixed workloads

The homelab is not a single-box setup. It spans multiple nodes running a mix of experimental and production services, so new ideas can be tried out without landing in the same environment as the things that need to stay up.

Self-hosted services

Gitea is part of that stack already, and more services tend to follow the same philosophy: if I can run it myself without making life worse, I would rather understand the system than outsource it by default.

Persistent storage

Persistent data matters, so storage gets treated as infrastructure, not an afterthought. That includes self-hosted storage with tools like OpenMediaVault and the kind of planning that keeps the lab useful as it grows rather than turning into a graveyard of disks.

Networking foundation

The network is built around a UniFi UDM Pro Max, which acts as the core router and network controller for the environment. I want the network to be stable first, then segmented where useful, with secure remote access and reliable routing between local services and the systems that need to be reachable from the outside.

Tailscale for remote reachability

Tailscale is part of the connective tissue that makes the lab feel usable from anywhere. I like tools that make remote access simple without turning the whole setup into a networking puzzle.

Identity & Cloud Services

Microsoft Entra

Entra is part of how I think about identity and access in the broader environment. It is useful where centralized identity, policy, and account management make sense.

Microsoft Intune

Intune fits into the device and policy side of the stack. I do not see that as a replacement for understanding systems directly, but it is practical where managed devices and consistent policy matter.

Microsoft Azure

Azure is part of the cloud experimentation side of the workshop: useful for testing ideas, running services that do not belong at home, and understanding how managed infrastructure behaves in the real world.

Homelab first, cloud where it helps

These Microsoft services complement the homelab rather than replacing it. I still prefer running and understanding systems directly whenever possible, but I am not interested in pretending cloud tooling has no value when it clearly does.

External Hosting

OVH

OVH is part of the external VPS layer I use when a service needs public accessibility, geographic separation, or a home outside the local lab.

VPS.play.hosting

VPS.play.hosting fills a similar role for additional external services. I like having more than one place to run things when the goal is resilience rather than putting every dependency in one box.

Hybrid infrastructure

Not everything belongs in the homelab, and not everything belongs in the cloud. The useful middle ground is a hybrid setup where local infrastructure and external VPS systems complement each other instead of competing.

Security & Identity

Hardware-backed authentication

I prefer security that is boring and strong. Hardware keys such as YubiKeys make more sense to me than pretending a password alone is enough for important accounts.

GPG and identity hygiene

Cryptographic identity is part of the workflow here. I publish my keys, use GPG where it is useful, and try to keep trust anchored in things I can verify instead of platforms asking to be trusted.
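As a rough sketch of that workflow, using a throwaway identity (the name and email below are placeholders) and a temporary keyring so nothing touches a real GPG home:

```shell
# Sketch: generate a key and export the public half for publication.
# The identity is a placeholder; GNUPGHOME points at a throwaway keyring.
export GNUPGHOME="$(mktemp -d)"
chmod 700 "$GNUPGHOME"
gpg --batch --pinentry-mode loopback --passphrase '' \
    --quick-generate-key "Example <key@example.com>" default default never
# ASCII-armored public key, ready to publish on a site or keyserver.
gpg --armor --export key@example.com > pubkey.asc
head -n 1 pubkey.asc
```

Publishing that armored block is what lets other people verify signatures against something I control, rather than trusting a platform's say-so.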

Password managers and compartmentalization

Good security is mostly consistency: a password manager for unique secrets, sensible separation between roles and systems, and fewer scattered accounts beat dramatic security theater every time.

Privacy as a design constraint

I do not chase privacy because it sounds noble. I care about it because it changes architecture choices: fewer unnecessary dependencies, less telemetry, tighter control over where data ends up, and a better understanding of what the system is doing.

Website Stack

Astro

Cozy Den itself is built with Astro because static HTML is still one of the nicest ways to publish a personal site.

Vanilla CSS + TypeScript

Styling stays close to the markup, and the TypeScript that exists is there to make content and structure safer rather than heavier.

Docker + nginx

The site builds in a container and ships as static files served by nginx, which keeps deployment small, repeatable, and easy to audit.
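A minimal sketch of that kind of build, assuming a stock Node base image and Astro's default `dist/` output (this is illustrative, not the site's actual Dockerfile):

```dockerfile
# Illustrative multi-stage build; names and versions are assumptions.
# Stage 1: build the Astro site to static files.
FROM node:20-alpine AS build
WORKDIR /site
COPY package*.json ./
RUN npm ci
COPY . .
RUN npm run build              # Astro emits static HTML into dist/

# Stage 2: serve only the built files with nginx.
FROM nginx:alpine
COPY --from=build /site/dist /usr/share/nginx/html
```

The final image contains nothing but nginx and static files, which is what keeps the deployed artifact small and easy to audit.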

Gitea-driven deployment path

Source, registry, and release flow all stay close to my own infrastructure instead of depending on a hosted publishing platform.

Secondary to the workshop

The site stack matters, but it is only one part of the broader ecosystem. Cozy Den exists because of the surrounding tools, machines, services, and habits that make building on my own terms possible.

This page is a living snapshot of the tools and systems behind Cozy Den. It changes from time to time as experiments evolve and the workshop grows.