---
description: "How a fake terminal UI spiraled into a browser-based AI prototype with attitude."
published: false
tags: [webdev, ai, llm, portfolio, gamedev, personalproject]
---
This started as a placeholder.
I wanted a portfolio. Something simple. Retro terminal aesthetic, green-on-black, maybe a flickering cursor for flavor. Basic stuff. I tossed in a few fake commands just to make it feel alive—one of them a totally nonsense link that led nowhere. Pure style.
But then something in my brain broke in exactly the right way.
That fake command? I made it do something. And then something else. Then I gave it a response. Then I wired in a bare-bones, fully browser-based AI model—no backend, no server, just a CPU-friendly fallback LLM running in the client. It talked back.
Poorly.
And rudely.
Now I’ve got a half-broken terminal UI that insults you if you ask it stupid questions, runs in-browser with no install, and is actively being shaped into a full-blown interactive frontend for my future AI stack.
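For the curious, "wired in a browser LLM" looks roughly like this. It's a minimal sketch assuming transformers.js and a tiny placeholder model; the actual library and model in the prototype may differ.

```ts
// Hypothetical sketch: a CPU-friendly LLM running entirely in the browser.
// The library (transformers.js) and model ID are assumptions for illustration.
import { pipeline } from "@xenova/transformers";

// Create the text-generation pipeline once; the model is fetched and
// cached client-side, so there's no backend or server involved.
const generatorPromise = pipeline("text-generation", "Xenova/distilgpt2");

export async function askTerminal(prompt: string): Promise<string> {
  const generator = await generatorPromise;
  // text-generation pipelines return an array of { generated_text } objects.
  const output: any = await generator(prompt, { max_new_tokens: 64 });
  return output[0].generated_text;
}
```

Everything downloads and runs inside the tab, which is the whole "no backend, no server" trick.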
## The Spiral
I wasn’t planning any of this. I was just messing around. But when you let a late-night brain with a game design background and an unquenchable curiosity about computer systems and software poke around long enough, things escalate. The moment the AI replied—even in its glitchy, snarky prototype state—it clicked.
This wasn’t a portfolio anymore. It was the start of something dumber, weirder, and a lot more fun.
## So What Actually Works?
Right now? Just the basics:
- Fully client-side terminal UI
- DOS-style vibe with some flicker and retro noise
- Light LLM running in-browser
- Simple command routing (rough sketch at the end of this section)
That’s it. No local stack. No big model switching yet. But the vibe’s there. When you load it up, it feels like a forgotten machine booting back up. Half-formed thoughts in green neon. Broken UI elements hiding in the shadows. And a little text prompt that just dares you to type something stupid.
And if you do? The prototype will let you know.
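Here's roughly the shape of that "simple command routing" item. The command names and replies are illustrative placeholders, not the prototype's real command set.

```ts
// Hypothetical sketch of the command router. Known commands get handlers;
// anything unrecognized falls through to the in-browser model.
type CommandHandler = (args: string[]) => string | Promise<string>;

const commands: Record<string, CommandHandler> = {
  help: () => "Commands: help, about, clear. Or type something stupid.",
  about: () => "A portfolio that grew a mouth.",
};

export async function route(input: string): Promise<string> {
  const [name, ...args] = input.trim().split(/\s+/);
  const handler = commands[name.toLowerCase()];
  if (handler) return handler(args);
  // No matching command: let the model (and its attitude) handle it.
  return askTerminal(input); // from the earlier LLM sketch
}
```

Anything the router doesn't recognize goes straight to the model, which is where the insults come from.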
## What It’s Becoming
I'm working on autodetect scripts that benchmark hardware on load. If you're on a budget phone, you get the fallback AI. If you're on something beefier, it'll upgrade automatically to a smarter model. Eventually, once that part works, the next layer will tie into my local stack—Stable Diffusion, video generation, command proxies. But always through the same interface: the terminal.
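A rough sketch of what that load-time autodetect could look like, probing what the browser exposes rather than running a real benchmark; the thresholds and model IDs are made-up placeholders, not the real selection logic.

```ts
// Hypothetical sketch of the load-time hardware check. Thresholds and model
// IDs are placeholders, not the project's actual selection logic.
function pickModel(): string {
  const cores = navigator.hardwareConcurrency ?? 2;
  // deviceMemory is Chromium-only, so treat it as a hint, not a guarantee.
  const memoryGB = (navigator as any).deviceMemory ?? 4;
  const hasWebGPU = "gpu" in navigator;

  if (hasWebGPU && cores >= 8 && memoryGB >= 8) {
    return "smarter-model";  // placeholder: the beefier upgrade
  }
  return "fallback-model";   // placeholder: the budget-phone fallback
}
```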
The whole point is that you talk to it like it’s alive. But not like... sentimental alive. More like sarcastic, low-key homicidal alive... for now.
## Why I’m Writing This
Mostly because I thought I was making a portfolio. But now I’m accidentally prototyping a UI for my personal, local-only AI stack, complete with Stable Diffusion, video gen, and custom proxies for various interfaces, all wired through custom command logic. It’s part frontend, part test bed, part interactive joke that became a serious project.
I’ll probably drop a link later when it’s stable enough not to tell everyone to go to hell. I mean, part of it is up if you’re curious. It’s out there.
Until then? Well, you can always follow me.