Under the Bonnet: How My Website Functions and Is Structured

If you've ever wondered what makes my website tick, you're in the right place. I thought I'd take a moment to pull back the curtain and share a bit about how everything fits together: partly for my own documentation, partly because I find this sort of thing oddly satisfying.

The Foundation: Forking and Customising

My website is, at its core, a personal fork of WhiteBreeze, which itself is a frontend for WhiteWind, a Markdown blog service built on the AT Protocol. (You're probably reading this post on WhiteWind itself, but if not, that's what it is.) I've tweaked and tailored it to suit my own needs, but the underlying architecture remains true to the original. This means I get the best of both worlds: a robust, well-tested base and the freedom to make it my own.

Project Structure: Where Everything Lives

The project is neatly organised (well, as neatly as any project can be after a few rounds of tinkering):

  • src/: The heart of the application. Here you'll find all the Svelte components, utilities, and routes that power the site.
    • lib/: Shared logic and reusable components. For example, profile/profile.ts handles fetching and caching ATProto profile data, which is used across different parts of the site (there's a rough sketch of this just after the list).
    • routes/: Each route corresponds to a page or section. The blog lives under routes/blog/, with individual posts handled by dynamic routes.
  • static/: Assets that are served directly, like images, favicons, and the all-important .well-known/atproto-did file, which ties my domain to my AT Protocol identity so it can act as my handle.
  • Configuration files: Everything from package.json (dependencies and scripts) to tailwind.config.ts (styling) and docker-compose.yaml (for containerisation) lives at the root.
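
To give a concrete flavour of the lib/ layer, here's a simplified sketch of what the profile helper does: fetch a profile from the public Bluesky AppView and keep it in a small in-memory cache. The cache shape and timings below are my own illustration rather than the repo's exact code.

```typescript
// Simplified sketch of src/lib/profile/profile.ts: fetch an ATProto profile
// via the public Bluesky AppView and keep it in an in-memory cache.

type Profile = {
  handle: string;
  displayName?: string;
  avatar?: string;
  description?: string;
};

const cache = new Map<string, { profile: Profile; fetchedAt: number }>();
const TTL_MS = 5 * 60 * 1000; // treat a cached profile as fresh for five minutes

export async function getProfile(actor: string): Promise<Profile> {
  const hit = cache.get(actor);
  if (hit && Date.now() - hit.fetchedAt < TTL_MS) return hit.profile;

  const url = `https://public.api.bsky.app/xrpc/app.bsky.actor.getProfile?actor=${encodeURIComponent(actor)}`;
  const res = await fetch(url);
  if (!res.ok) throw new Error(`Profile lookup failed: ${res.status}`);

  const profile = (await res.json()) as Profile;
  cache.set(actor, { profile, fetchedAt: Date.now() });
  return profile;
}
```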

I've tried to keep things as modular as possible, so if I ever want to swap out a component or add a new feature, it's (usually) just a matter of dropping it in the right place.

How Content Gets From Markdown to Blog Post

One of the joys of this setup is that all my posts are written in plain Markdown. There's a pipeline in place—built on the unified ecosystem—that takes care of parsing, sanitising, and rendering everything. Here's the gist:

  • Markdown parsing: Using remark-parse and remark-gfm for GitHub-flavoured Markdown.
  • HTML transformation: remark-rehype and rehype-raw convert Markdown (and any embedded HTML) into safe, display-ready content.
  • Sanitisation: A custom schema ensures that only safe HTML makes it through, with a few tweaks to support things like embedded YouTube videos.
  • Image upgrades: Any image URLs are automatically upgraded to HTTPS, because security matters.
  • Excerpt generation: For previews and metadata, a snippet of plain text is extracted from each post.
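
If you fancy seeing how those pieces chain together, here's a minimal sketch of such a pipeline. It's a simplified version rather than the site's exact code, and the schema tweak for iframes (to let YouTube embeds through) and the little image-upgrade plugin are my own illustration of the steps above:

```typescript
import { unified } from 'unified';
import remarkParse from 'remark-parse';
import remarkGfm from 'remark-gfm';
import remarkRehype from 'remark-rehype';
import rehypeRaw from 'rehype-raw';
import rehypeSanitize, { defaultSchema } from 'rehype-sanitize';
import rehypeStringify from 'rehype-stringify';
import { visit } from 'unist-util-visit';
import type { Root, Element } from 'hast';

// Widen the default sanitisation schema slightly so embedded players survive.
const schema = {
  ...defaultSchema,
  tagNames: [...(defaultSchema.tagNames ?? []), 'iframe'],
  attributes: {
    ...defaultSchema.attributes,
    iframe: ['src', 'width', 'height', 'allowFullScreen', 'frameBorder'],
  },
};

// Tiny rehype plugin: upgrade plain-HTTP image sources to HTTPS.
function rehypeUpgradeImages() {
  return (tree: Root) => {
    visit(tree, 'element', (node: Element) => {
      if (node.tagName !== 'img') return;
      const src = node.properties?.src;
      if (typeof src === 'string' && src.startsWith('http://')) {
        node.properties = { ...node.properties, src: src.replace('http://', 'https://') };
      }
    });
  };
}

// Markdown in, sanitised HTML out.
export async function renderMarkdown(markdown: string): Promise<string> {
  const file = await unified()
    .use(remarkParse)
    .use(remarkGfm)
    .use(remarkRehype, { allowDangerousHtml: true })
    .use(rehypeRaw)
    .use(rehypeUpgradeImages)
    .use(rehypeSanitize, schema)
    .use(rehypeStringify)
    .process(markdown);
  return String(file);
}
```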

This means I can focus on writing, knowing that the technical side will handle itself.
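
The excerpt step is similarly small. I won't swear this is exactly how the site does it, but with the same toolchain it can be as simple as stripping the Markdown back to plain text and trimming, something along these lines:

```typescript
import { unified } from 'unified';
import remarkParse from 'remark-parse';
import stripMarkdown from 'strip-markdown';
import remarkStringify from 'remark-stringify';

// Turn a Markdown post into a short plain-text snippet for previews and meta tags.
export async function makeExcerpt(markdown: string, maxLength = 200): Promise<string> {
  const file = await unified()
    .use(remarkParse)
    .use(stripMarkdown)   // drops formatting, keeps the text content
    .use(remarkStringify)
    .process(markdown);

  const text = String(file).replace(/\s+/g, ' ').trim();
  return text.length > maxLength ? `${text.slice(0, maxLength)}…` : text;
}
```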

Deployment: From Local Edits to Live Site

For development, it's as simple as running npm install and npm run dev. When it's time to go live, I have a couple of options:

  • Standalone: Build and run with Node.js, setting environment variables as needed.
  • Dockerised: Use Docker Compose for a fully containerised setup, which keeps things consistent across environments.

I've mapped the default port to avoid conflicts with other projects, and the multi-stage Docker build ensures the final image is lean and production-ready.
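
For the curious, "setting environment variables" on the standalone build boils down to something like the snippet below. The variable names here are hypothetical placeholders of mine, not necessarily the ones WhiteBreeze actually reads, so treat it as a sketch of the pattern rather than documentation:

```typescript
// src/lib/config.ts (illustrative only; the variable names are hypothetical)
import { env } from '$env/dynamic/private';

export const config = {
  // Which ATProto account the blog pulls posts from.
  handle: env.ATPROTO_HANDLE ?? 'example.com',
  // Port the standalone Node build listens on; SvelteKit's adapter-node defaults to 3000.
  port: Number(env.PORT ?? 3000),
};
```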

Why This Structure?

Honestly, it's about balance. I wanted something that's easy to maintain, flexible enough for future changes, and robust enough to handle whatever I throw at it. By building on top of WhiteBreeze and WhiteWind, I get seamless integration with the AT Protocol and Bluesky ecosystem, while still keeping everything under my own domain.

Final Thoughts

There's a certain satisfaction in knowing exactly how your website works—from the way content is parsed to how it's served to visitors. It's not just a blog; it's a little digital home that I've built, brick by brick (or, more accurately, file by file). If you're curious about the code, it's all open source and available on GitHub.

If you've made it this far, cheers for indulging my ramble. Maybe you'll find something useful here for your own projects—or at the very least, a bit of insight into how I like to do things.