Vibe Coding SwiftUI: The Joy and Cost of Building macOS Apps Without Knowing Swift
Ever had this experience? Your computer is crawling. You open Activity Monitor hoping to catch the culprit. You see a wall of process names that mean absolutely nothing to you. You quietly close it and pretend nothing happened.
It’s like going to the doctor for a stomachache. They pull up a bunch of X-ray images and explain things for ten minutes. You walk out remembering exactly one thing: “So do I need medicine or not?”
Simon Willison had the same problem. But instead of giving up, his reaction was: “If your interface won’t show me what I need, I’ll build one that does.”
The catch — he doesn’t know Swift.
Clawd can't help but chime in:
“Doesn’t know the language, built the app anyway” — this used to be either genius or a lie. In 2026, it’s called vibe coding. And the person saying it is Simon Willison — Django co-creator, author of hundreds of technical notes on LLM tooling. When someone at that level says “I didn’t look at the code,” he’s not being lazy. He’s volunteering as a test subject, pushing this new paradigm to see where the ceiling is (◕‿◕)
His method is the hottest thing in coding right now: vibe coding. You tell the AI what you want, the AI writes the code, you barely look at it, and if the result works — ship it. This post documents how he used Claude Opus 4.6 and GPT-5.4 to vibe code two practical macOS menu bar apps without ever opening Xcode.
New Laptop, New Toys, New Frustrations
Simon recently got a 128GB M5 MacBook Pro. Plenty of power for running local LLMs, but Activity Monitor was getting on his nerves — he wanted to see network traffic and GPU usage clearly, and the built-in tools just weren’t cutting it.
You know the feeling. It’s like buying a top-of-the-line espresso machine, only to find the measuring cup has two markings: “big” and “small.” You want milliliter precision? Figure it out yourself.
So he decided to build his own tools. This was actually his second time vibe coding a macOS app — he’d already built a presentation app the same way, so he knew the path was walkable.
This time he discovered something key: both Claude Opus 4.6 and GPT-5.4 are remarkably good at SwiftUI. Even better, a complete SwiftUI app can live in a single .swift file. No Xcode, no storyboards, no Russian-nesting-doll .xcodeproj folder structures — just talk to the AI in your terminal and the app appears.
Clawd murmurs:
Bandwidther is 1,063 lines. Gpuer is 880. One file, one whole macOS app. Think that sounds like a lot? Run `ls -la` on any Xcode project and watch the auto-generated boilerplate scroll for three screens. The old entry ticket to macOS development was “spend two weeks understanding Apple’s dev ecosystem.” The new entry ticket is “can type.” That’s not a lowered bar — the bar was picked up and thrown off a cliff (๑•̀ㅂ•́)و✧
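To make “one file, no Xcode” concrete, here is a minimal single-file menu bar app in the same spirit. This is a hedged sketch of the pattern, not code from Bandwidther or Gpuer; the app name and labels are made up, and `MenuBarExtra` requires macOS 13 or later.

```swift
// Illustrative sketch of a single-file SwiftUI menu bar app --
// not Simon's actual code. Requires macOS 13+ for MenuBarExtra.
import SwiftUI
import AppKit

@main
struct MiniMonitorApp: App {
    var body: some Scene {
        // The whole app lives in the menu bar: no storyboard,
        // no Interface Builder, no .xcodeproj.
        MenuBarExtra("MiniMonitor", systemImage: "gauge") {
            VStack(alignment: .leading, spacing: 8) {
                Text("CPU: --")   // a real tool would poll system stats here
                Text("RAM: --")
                Divider()
                Button("Quit") { NSApplication.shared.terminate(nil) }
            }
            .padding()
        }
    }
}
```

On a Mac with the command line tools installed, something like `swiftc -parse-as-library MiniMonitor.swift -o MiniMonitor` produces a runnable binary, with no project scaffolding involved.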
Bandwidther: Born From Dropbox Rage
The first app is called Bandwidther, and the motivation is beautifully mundane: Simon just switched laptops, Dropbox was syncing like crazy, and he wanted to know one thing — is this traffic coming from my old machine over LAN, or is Dropbox re-downloading the entire universe from the cloud?
It’s like moving to a new apartment and seeing your water bill triple. You crouch by the meter, trying to figure out if your toilet is leaking or if your neighbor secretly tapped into your pipes. You don’t need “you are using water” — you need “who is using how much.”
Simon’s first message to the AI wasn’t “build me an app.” It was a recon question: “Can you distinguish between internet traffic and LAN traffic?” Only after getting a yes did he say: “Then build a SwiftUI app in /tmp/bandwidther that shows this in real time.”
The first version appeared in minutes. But here’s where the story gets good.
Instead of specifying features one by one, he threw out an open-ended prompt: “Suggest features to add, with the goal of showing as much detail as possible about each app’s network usage.”
Clawd goes off on a tangent:
Pause. This prompt deserves to be framed. Most people use AI like a typewriter: “I want X, build X.” Simon uses AI like a consultant: “I want to go in the direction of X — you tell me what’s possible.” The difference? You probably don’t know which macOS network APIs exist, but the AI has read the entire Apple documentation. It’s like going to a sushi restaurant — you can order off the menu, or you can say “omakase.” The omakase is almost always better, because the chef knows what’s freshest today. And you didn’t even know that fish existed ╰(°▽°)╯
The AI poured out suggestions. Simon cherry-picked the ones he liked and kept iterating. Per-process bandwidth, reverse DNS lookups, dual-column layout — each feature got a swift build, a quick look, a commit if good, another prompt if not. The final touch was converting it from a regular window app to a menu bar icon app. Click the icon, get a full network monitoring panel.
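One of the features mentioned above, reverse DNS lookup, is a good example of how small these pieces are: it boils down to a single POSIX call. The sketch below is illustrative, not Bandwidther's actual implementation, and the function name is invented.

```swift
import Foundation

// Hedged sketch: one way a tool like Bandwidther might map an IPv4
// address back to a hostname, via POSIX getnameinfo.
func reverseDNS(_ ip: String) -> String? {
    var addr = sockaddr_in()
    addr.sin_family = sa_family_t(AF_INET)
    // Parse the dotted-quad string; bail out if it isn't a valid IPv4 address.
    guard inet_pton(AF_INET, ip, &addr.sin_addr) == 1 else { return nil }

    var host = [CChar](repeating: 0, count: Int(NI_MAXHOST))
    let status = withUnsafePointer(to: &addr) { ptr in
        ptr.withMemoryRebound(to: sockaddr.self, capacity: 1) { sa in
            // NI_NAMEREQD: fail instead of echoing the IP back
            // when no PTR record exists.
            getnameinfo(sa, socklen_t(MemoryLayout<sockaddr_in>.size),
                        &host, socklen_t(host.count), nil, 0, NI_NAMEREQD)
        }
    }
    return status == 0 ? String(cString: host) : nil
}
```

In a live monitor you would cache these results, since a blocking lookup per packet would stall the UI.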
From “what is Dropbox doing” to “per-process network monitor.” One person who doesn’t know Swift. One lunch break.
Gpuer: Two Burners, One Chef
The second app is Gpuer — GPU and RAM monitoring. But the interesting part isn’t the app itself. It’s how Simon built it.
He opened a second terminal. While Bandwidther was still being developed in one window, he kicked off Gpuer in another.
Picture a cook standing between two stoves. Left hand flipping beef and peppers, right hand stirring miso soup. This used to be called “panic.” Now it’s called “having an AI sous chef on each burner.” You just swing by every few minutes, taste the broth, and say “needs more salt.”
And then he did something really clever — he pointed directly at Bandwidther as a reference.
The conversation went something like: “Go look at /tmp/bandwidther, then build something similar in /tmp/gpuer for GPU stats.” A few minutes later: “Bandwidther just got a menu bar icon. Do the same.”
Clawd highlights the key point:
“Look at that project and build one like it” — that one sentence might be the most powerful coding agent technique of 2026. You don’t need to write a spec explaining what a menu bar app should feel like, or how system tray icons should behave. You point at a living example and the AI just gets it. In traditional development, this is called a “reference architecture” — usually a 50-page PDF that nobody reads. Now you point at a directory and you’re done. Defining specs went from “write documents” to “just build it once” — document by doing, not by writing (⌐■_■)
Simon himself says this is one of his favorite coding agent tricks: letting the AI remix elements from existing projects. You’re not providing instructions. You’re providing a living reference.
You Shouldn’t Trust These Apps
Okay, up to this point, the story sounds wonderful. Now comes the paragraph Simon bolded himself: You shouldn’t trust these apps.
Scene change. Imagine your friend 3D-printed a bathroom scale. It looks gorgeous — LED display, Bluetooth, sleek design. You step on it. It says 65 kg.
You ask: “Is this accurate?”
Your friend says: “No idea, I never calibrated it.”
Would you trust that 65?
Simon’s two apps are that bathroom scale. He doesn’t know Swift, he barely looked at the AI-generated code, and he’s not familiar with how macOS calculates memory usage under the hood. So whether those pretty charts and numbers are accurate — he genuinely cannot tell.
He gives a concrete example: one morning Gpuer showed only 5GB of RAM available, but Activity Monitor said there was plenty. He sent a screenshot to Claude Code, which adjusted the calculation logic. The new numbers “looked right.” But Simon admits he still has no confidence — because even his judgment that it “looks right” might itself be wrong.
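The RAM discrepancy isn't surprising once you see why “available memory” is ambiguous: the macOS kernel reports pages in several buckets (free, active, inactive, wired, compressed, purgeable), and different formulas over those buckets give wildly different answers. The sketch below uses made-up page counts and invented function names to show the gap; it is not the actual logic in Gpuer or Activity Monitor.

```swift
// Why "available RAM" is ambiguous on macOS: different formulas over the
// kernel's page buckets give very different numbers. Hedged sketch with
// illustrative page counts -- NOT Gpuer's or Activity Monitor's real logic.
struct VMStats {
    let free, active, inactive, wired, compressed, purgeable: UInt64  // page counts
}

let pageSize: UInt64 = 16_384  // page size on Apple silicon

// Strict reading: only genuinely free pages count as available.
func strictlyFree(_ s: VMStats) -> UInt64 {
    s.free * pageSize
}

// Looser reading: pages the kernel can reclaim on demand also count.
func reclaimable(_ s: VMStats) -> UInt64 {
    (s.free + s.inactive + s.purgeable) * pageSize
}

// Illustrative numbers for a busy 128 GB machine:
let sample = VMStats(free: 300_000, active: 3_000_000, inactive: 2_500_000,
                     wired: 800_000, compressed: 1_200_000, purgeable: 200_000)

print(strictlyFree(sample))  // 4915200000  (~4.9 GB "available")
print(reclaimable(sample))   // 49152000000 (~49 GB "available")
```

Two defensible formulas, a tenfold difference — which is exactly the kind of gap that can make one tool say “5GB left” while another says “plenty.”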
Clawd goes off on a tangent again:
Put your mouse down and think about this for three seconds. The core trade-off of vibe coding isn’t “is the code good or bad.” It’s that you lose the ability to judge whether the code is good or bad. A traditional engineer’s code might also have bugs — but at least they know where to look. A vibe coder doesn’t even know where the blind spots are. Your blind spots become your entire field of vision. Simon is honest enough to say this out loud. A lot of people vibe code something and ship it straight to the App Store without ever asking “wait, are these numbers real?” The biggest risk was never that AI code has bugs. It’s that you have no idea whether it does ┐( ̄ヘ ̄)┌
He added big warnings to both GitHub repos: he’s sharing these apps not because they’re reliable, but because they demonstrate what Claude can do with SwiftUI.
So What Did He Actually Learn?
Simon admits he doesn’t trust the numbers. But the experiment wasn’t wasted.
The real takeaway isn’t two apps. It’s a shift in understanding: SwiftUI is way more flexible than he thought.
One .swift file. Under 1,100 lines. No Xcode. No Interface Builder. No wrestling with .xcodeproj folder hierarchies. That’s a complete, running, good-looking macOS app.
Before this, if you wanted to build a small tool that wraps a terminal command in a nice GUI, just the setup process was enough to send you running back to shell scripts. Now the flow is: open terminal, tell the AI what you want, swift build five minutes later, and it runs.
And Claude’s design taste with SwiftUI is surprisingly good. Simon’s exact words: “surprisingly good design taste.” The output doesn’t just work — it actually looks like something you’d want to use.
Clawd mutters:
Let me translate what Simon is really saying: “building a macOS app” is shifting from a skill to a task. Skills require you to invest time learning. Tasks just require you to describe what you want. The old barrier was “first understand the Xcode ecosystem” — like wanting to fry an egg but being told to get a cooking license first. Now? You tell the AI “I want a fried egg” and it fries one for you, then asks if you want cheese on top. That gap is wide enough that some engineers who spent three years learning iOS development might be quietly re-evaluating their career plans right about now ( ̄▽ ̄)/
Closing: You Can Close Activity Monitor Now
Back to the opening scene. You open Activity Monitor, see a wall of things you don’t understand, quietly close it. Simon started in the exact same place.
The difference is he took one extra step: he asked AI to build him two custom monitoring tools. Not perfect, not guaranteed accurate, and Simon himself says don’t trust them too much. But they exist — built by someone who doesn’t know Swift, in a few hours.
That’s the part worth remembering. Not “AI is amazing” or “vibe coding forever,” but a very sober observation: the definition of capability is being rewritten. “Building a macOS app” used to be a skill you needed to learn. Now it’s more like a task you can delegate. Your value isn’t “I can write Swift” anymore — it’s “I know what I need” plus “I know when to doubt what AI gives me.”
Simon nailed both. You can close Activity Monitor now — but not your judgment.