Building a Swift Button
29 Mar 2025

I really like the idea of interacting with computers in ways other than the keyboard and mouse, especially in ways that can control applications without concern for focus. That's a big part of why I like the Stream Deck and built a plugin for Capture One. Through happenstance I needed to build an input device to emulate a barcode scanner, so I rummaged around the parts bin and grabbed a Raspberry Pi Pico and a clicky arcade button and tossed a sample project together. Of course it didn't stop there.
Once I had the button put together I wondered what else I could do with it.
The Hardware
The sensible thing to do would have been to grab an off-the-shelf RP2040 board and call it a day. However, I've been designing a custom PCB for my GPI Controller and, if you squint, the two projects are basically the same. All of the support electronics for the RP2040 (flash, crystal, decoupling capacitors, etc.) could be shared; the only difference is what happens on the GPIO lines. A quick copy and paste later and I was on my way to a fresh new PCB.
While the current prototype only had a single button I wanted to design the board to be a little more modular so I could use the design for more projects in the future. I landed on adding headers for four button/LED pairs, each on a JST connector for ease of use. This also opened the door for users designing their own enclosures and adding more buttons if they wanted.
Surprisingly enough, the enclosure turned out to be more work than the PCB. The buttons I chose are long once the microswitch is installed, and I wanted to keep the overall size down. This introduced a lot of constraints on the overall dimensions and, critically, on the maximum angle of the front face needed for the button to clear the PCB.
The initial versions technically fit everything inside, but at the expense of being impossible to assemble: there was no way to insert the LED and microswitch without disabling clipping in the physics engine of reality. Unhappy with the 3D printed enclosure I looked into designing a sheet metal enclosure which would be laser cut and bent into shape.
I still really like how it looks, but there are limitations to what can be easily (and affordably) manufactured. Initially I designed a full wrap-around part, but metal brakes can't form parts into enclosed shapes. Next I broke the bottom part out, but in order to use an appropriate thickness of material the bends would interfere with the tapped holes. Then there were the tapped holes themselves: lots of them, which pushed the price beyond what was reasonable.
Eventually I went back to the "unibody" 3D printed enclosure, iterating on the design to make the smallest practical enclosure. At the end of the day I settled on a three-part design: the main enclosure, a back plate, and a special keyed spacer that includes ARRI anti-twist pins. It's only 66mm x 86mm x 85mm with the face at a 60° angle and there's a slight taper along the sides.
The main enclosure prints face-down without supports and gets four M3 threaded inserts to hold the back plate in place. The spacer clicks into place from the inside and is doubly secured once clamped between the enclosure and PCB. Finally the button is threaded in, sitting just above the USB-C port.




The Embedded Software
While developing the GPI Controller and GB Link I built a common USB data protocol to communicate with my embedded projects. The GPI Controller was programmed in C, but for the GB Link I used Swift for the embedded side as well. Embedded Swift is still fairly new, so there are some hurdles to clear.
One big improvement in the button codebase over the previous iteration was moving the shared models into a separate library inside the Swift package. This way the files could be imported as a library by the macOS client code and referenced by the makefile to build the embedded version. While this is better, especially using globs to find the files instead of manually referencing each one, I'd still like to figure out how to move more of the build step into the Swift environment.
A key place this comes up: a number of shared models, like the data header, need to be copied from the USB protocol library into the embedded codebase because the library can't be imported. If the embedded codebase could be built with swift build, it could, theoretically, import libraries as long as they've been annotated with the appropriate availability statements. This would mean both the embedded binary and the client library could fully share common models, which would stay in sync as the API changes.
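To illustrate the kind of model that can be shared, a type that avoids Foundation and sticks to fixed-width integers with manual byte packing will build under Embedded Swift and on macOS alike. This is a minimal sketch with hypothetical field names, not the project's actual wire format:

```swift
// A shared wire-format header, sketched with assumed fields. No
// Foundation, no existentials, no Codable: just fixed-width integers
// and explicit little-endian byte packing, which keeps the type
// buildable for both the embedded firmware and the macOS client.
struct PacketHeader: Equatable {
    var messageType: UInt8
    var payloadLength: UInt16

    // Serialize to a little-endian byte buffer.
    var bytes: [UInt8] {
        [messageType,
         UInt8(truncatingIfNeeded: payloadLength),
         UInt8(truncatingIfNeeded: payloadLength >> 8)]
    }

    init(messageType: UInt8, payloadLength: UInt16) {
        self.messageType = messageType
        self.payloadLength = payloadLength
    }

    // Parse from a byte buffer; fails if the buffer is too short.
    init?(bytes: [UInt8]) {
        guard bytes.count >= 3 else { return nil }
        messageType = bytes[0]
        payloadLength = UInt16(bytes[1]) | (UInt16(bytes[2]) << 8)
    }
}
```

Because both sides round-trip through the same byte layout, a protocol change only has to be made in one place.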
The Client App

The companion app for the GPI Controller, ControlRoom, was the natural home for the new hardware project. Adding initial support was simple because most of the infrastructure had already been built for the previous projects. The major new task was adding configurable actions. Easy enough, right?
As it turns out, not entirely. The basic actions aren't difficult, but ensuring they work outside of being instantiated inside the SwiftUI view hierarchy proved more difficult. I ended up having to partially rewrite my configuration storage system so it wasn't tightly coupled to the hardware types, revise how I handled USB connection events so that actions could be wired up when a button connects, and finally move everything into a separate agent binary which can be run by launchd independently from the main app.
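An agent like this is typically kept alive by launchd with a standard property list. A minimal sketch, where the label and program path are placeholders rather than the app's real values:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <!-- Hypothetical label and path; substitute the real agent binary. -->
    <key>Label</key>
    <string>com.example.buttonagent</string>
    <key>Program</key>
    <string>/Applications/ControlRoom.app/Contents/MacOS/ButtonAgent</string>
    <key>RunAtLoad</key>
    <true/>
    <key>KeepAlive</key>
    <true/>
</dict>
</plist>
```

With RunAtLoad and KeepAlive set, the agent starts at login and is relaunched if it ever exits, so the buttons keep working even when the main app isn't running.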
The end result is quite nice though, and works a little like the GPI Controller Stream Deck plugin. Each binary uses File Presenters to coordinate reading and writing the configuration files. In practice this means that when the user changes an action or setting, the change is picked up by the agent and the button is reconfigured; even editing the JSON in another application will similarly update the UI.
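The File Presenter side of that coordination can be sketched as follows. The class name and callback shape are assumptions, not the app's actual types; the key pieces are the NSFilePresenter requirements and the presentedItemDidChange() notification:

```swift
import Foundation

// A minimal sketch of how the agent might watch the shared
// configuration file. NSFilePresenter receives a callback whenever
// another process writes the presented file through an
// NSFileCoordinator.
final class ConfigurationPresenter: NSObject, NSFilePresenter {
    let presentedItemURL: URL?
    let presentedItemOperationQueue = OperationQueue()
    private let onChange: () -> Void

    init(url: URL, onChange: @escaping () -> Void) {
        self.presentedItemURL = url
        self.onChange = onChange
        super.init()
    }

    // Begin receiving change notifications for the presented file.
    func activate() {
        NSFileCoordinator.addFilePresenter(self)
    }

    // Stop receiving notifications (pairs with activate()).
    func deactivate() {
        NSFileCoordinator.removeFilePresenter(self)
    }

    // Called when another presenter (the client app, or even a text
    // editor saving through file coordination) modifies the file.
    func presentedItemDidChange() {
        onChange()
    }
}
```

Both the app and the agent would register a presenter on the same file, which is what lets a change made in either place propagate to the other.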
The configuration UI, while a little basic, is particularly fun because the large squares on the left show whether the button is physically pressed and whether the LED is on. Both states even combine into a pressed-and-illuminated visual.
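One natural way to model that combinable state is an OptionSet; this is a guess at the shape, not the app's real type:

```swift
// Hypothetical model for the indicator squares: pressed and
// illuminated are independent flags that can combine freely.
struct ButtonVisualState: OptionSet {
    let rawValue: Int
    static let pressed     = ButtonVisualState(rawValue: 1 << 0)
    static let illuminated = ButtonVisualState(rawValue: 1 << 1)
}

// A button that is held down while its LED is lit.
let state: ButtonVisualState = [.pressed, .illuminated]
```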
Bonus: AppleScript
I wanted the buttons to be scriptable, both in that they could trigger AppleScripts and in that they could be configured via AppleScript. Using AppleScriptBridge I was able to easily implement custom handlers in a user's script, so they could receive keyDown and keyUp events with parameters for the device and the number of the button that was pressed.
Inside those handlers the user can perform whatever actions are needed, but they can also control the LED. It's kind of a small thing, but also very cool that you can use AppleScript to control external hardware.
Wrapping Up
Overall I'm really proud of the outcome. It combines nearly all of my hobbies into one project, and with all of the tangents (like sheet metal and embedded Swift) I ended up learning a lot about things I'd only dabbled in before. I'm not entirely sure what the plans are for the project; I'll probably sell it as a finished product as well as a kit.
I want to add more actions, although the scripting support can cover some of that. I can also see using this as a good excuse to try out ExtensionKit to allow for third-party actions, but maybe I'm getting ahead of myself.