ScriptSetter 2022.1

18 Jan 2022 ∞

The main window

After (*checks notes*) three years without any updates, I scrapped my original background script plugin and replaced it with a proper application. In short: Capture One's plugin API isn't great, I wasn't really using it as designed anyway, and recent macOS updates broke it the rest of the way.

The new app version of ScriptSetter also gains some new abilities: it can set scripts for any installed version of Capture One (whether it's running or not), it can actually be updated, and it's properly signed.

You can check out the update on the project page.

applescript captureone scripting digital-tech

SwiftOccupancy

09 Jan 2022 ∞

A project that I've been working on for quite some time is figuring out a good way to do room occupancy using thermopile sensors. I tried lots of different combinations of hardware and software: Wemos D1 boards, MQTT, web sockets, Swift, Raspberry Pis, etc. After each attempt I'd be disappointed with the results and shelve it for a while.

Over the last couple of weeks, however, I feel like I've finally cobbled together a system that's been accurate enough to deploy to a couple rooms in my house. The current solution uses Raspberry Pis, Swift, and MQTT to tie everything together.

The Hardware

A Raspberry Pi Zero and AMG8833 thermopile sensor shoved in a small wooden box, and taped to the top of a door

The hardware is the most straight-forward. It uses an AMG8833 sensor and a Raspberry Pi. Placing the thermopile sensor above a door frame lets it watch people walk through the door. The plan is to eventually make some PCBs to clean up the installation.

The Software

This is, of course, the hard part. Watching a video, it's really easy to tell when someone walks under the sensor. Convincing a computer to do the same is less so. Luckily, there's a lot of prior art I was able to leverage for examples.

The principles are pretty easy:

  1. The sensor is polled 10 times a second for new data
  2. A cluster detection algorithm is run to find a person in frame
  3. The OccupancyCounter is given that data and determines whether someone is walking from top to bottom or vice versa
  4. The counts are updated and published via MQTT
  5. Other sensors watch MQTT and update their counts for each room
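Step 3 is the heart of it. A minimal sketch of that counting logic, assuming hypothetical names (`OccupancyCounter` here is illustrative, not the actual SwiftOccupancy API): remember which side of the 8×8 frame a person appeared on, and count a crossing when they reach the far edge.

```swift
// Which side of the door frame a person entered from.
enum Direction { case topToBottom, bottomToTop }

struct OccupancyCounter {
    var topRoomCount = 0
    var bottomRoomCount = 0
    private var entrySide: Direction?

    // Feed the detected cluster's centre row (0 = top of the 8x8 frame,
    // 7 = bottom) for each polled frame; nil means no one is in frame.
    mutating func update(clusterRow: Int?) {
        guard let row = clusterRow else {
            entrySide = nil        // person left the frame, reset
            return
        }
        if entrySide == nil {
            // Person just appeared: remember which side they came from.
            entrySide = row < 4 ? .topToBottom : .bottomToTop
        } else if entrySide == .topToBottom && row >= 6 {
            // Entered at the top, reached the bottom edge: they crossed.
            bottomRoomCount += 1
            topRoomCount = max(0, topRoomCount - 1)
            entrySide = nil
        } else if entrySide == .bottomToTop && row <= 1 {
            topRoomCount += 1
            bottomRoomCount = max(0, bottomRoomCount - 1)
            entrySide = nil
        }
    }
}
```

Feeding it a sequence of cluster positions moving from the top of the frame to the bottom bumps the count for the bottom room, which is then what gets published over MQTT in step 4.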

All of this is encapsulated in SwiftOccupancy. It's meant to be run locally on each Raspberry Pi to build a little network of distributed sensors around your house.

Unlike most of the projects I work on, which are apps or basic local command-line utilities, debugging was difficult, if for no other reason than I had to get up and walk back and forth through my office door over and over again. This did lead to probably the most comprehensive logging I've done: straight logging to a file, several ways to publish raw sensor data to MQTT, a heat map, and a companion app to pull it all together.

One of the little features I'm proud of is the concept of the æther as a room. If you don't want to track occupancy in a room, for instance a door leading outside, you can leave that side empty and internally people will just disappear when they walk into it.

Other Thoughts

The biggest limitation right now is the availability of Raspberry Pi Zero 2s. I bought a bunch of sensors but can't deploy them all because I can only get one Pi at a time.

Another, smaller, issue is power. Any door you want to place a sensor on needs to be close to an outlet. Using 15-foot USB cables helps, but doesn't solve everything. For instance, I want to place a sensor at the bottom of my stairs, which would mean running a cable from the other side of the hall, up and around the ceiling, then back down.

The other issue that isn't as easily resolved is cost. A Pi Zero 2 is $15 and the AMG8833 sensor is ~$25; with all of the other bits and bobs, each assembled sensor ends up costing around $80, and probably closer to $100 after shipping and taxes. On the plus side, each sensor covers two rooms, but they add up quickly for a whole house.

All that being said, it's been a fun project and there's still more to work on. If you want to try it out everything's over on GitHub.

homeautomation swift raspberrypi

Camera Setting Monitor Script

08 Jan 2022 ∞

A sample notification

infomercial black & white

Has this ever happened to you: you're in the middle of a shoot, images coming in faster than the computer can keep up with, and suddenly you notice something's wrong. Everything's coming in overexposed! 😫

fade to color

Introducing another script! Simply place the camera_settings_check script in Background Scripts, run it from Capture One's Scripts menu to save your current settings, and you'll be notified after each new capture if the camera's exposure settings have changed.

The script is available as part of my suite of Capture One scripts, for only six easy payments of free! However, if you do find my scripts useful, consider supporting me on GitHub.

applescript captureone scripting digital-tech

Introducing ScreeningRoom

03 Jan 2022 ∞

The main window

A common occurrence on photo sets is running multiple monitors: for the AD, the client, the stylist, etc. Another common occurrence is not being able to see all of those extra monitors, and it's difficult to work when you can't see the display.

Another digital tech, Josh Marrah, had the original idea and asked if I might be able to build an app to let you monitor your monitors. A little bit of playing around with some low-level system APIs and ScreeningRoom was born.

ScreeningRoom lives in your menubar and gives you an easy way to glance at all your connected displays, matching their physical layout. You can also create separate windows for any given display for more active monitoring.

It's out now and just $25 with a two-week trial.

captureone digital-tech

Plotting Polynomials

16 Oct 2021 ∞

The first render that worked

Earlier this week I got nerd-sniped by a video about Newton's Fractal, which covered fun facts about how finding the roots of polynomials results in a fractal pattern. While watching the video I got caught up in the movement of how a point finds its root, because where there's movement there's something to plot.

One of the nice things about 3Blue1Brown is that all of the videos are generated from code, and that code is open source, so I figured I could probably implement the algorithm myself. It turned out I was right, but it took a while to make it all work.

The image to the right is the first render that behaved as expected. I was reimplementing a bunch of methods from JavaScript, most of which used odd syntax I wasn't familiar with. I managed to get most of them working, but the one that caused the most trouble was a method that returned all combinations of the roots. My version never worked correctly; however, someone pointed out that what I was trying to do is implemented in Swift Algorithms, so I used that instead and everything came together.
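The core iteration itself is just Newton's method on the complex plane. A minimal sketch of the idea, not the video's code or my actual implementation, using a hand-rolled `Complex` type and the polynomial z³ − 1:

```swift
// Minimal complex number type, just enough for Newton's method.
struct Complex {
    var re, im: Double
    static func + (a: Complex, b: Complex) -> Complex { .init(re: a.re + b.re, im: a.im + b.im) }
    static func - (a: Complex, b: Complex) -> Complex { .init(re: a.re - b.re, im: a.im - b.im) }
    static func * (a: Complex, b: Complex) -> Complex {
        .init(re: a.re * b.re - a.im * b.im, im: a.re * b.im + a.im * b.re)
    }
    static func / (a: Complex, b: Complex) -> Complex {
        let d = b.re * b.re + b.im * b.im
        return .init(re: (a.re * b.re + a.im * b.im) / d, im: (a.im * b.re - a.re * b.im) / d)
    }
    var magnitude: Double { (re * re + im * im).squareRoot() }
}

// f(z) = z^3 - 1 and its derivative f'(z) = 3z^2.
func f(_ z: Complex) -> Complex { z * z * z - Complex(re: 1, im: 0) }
func fPrime(_ z: Complex) -> Complex { Complex(re: 3, im: 0) * z * z }

// Iterate z -> z - f(z)/f'(z), recording every intermediate point
// so the path toward the root can be plotted as a line.
func newtonPath(from start: Complex, maxSteps: Int = 50) -> [Complex] {
    var z = start
    var path = [z]
    for _ in 0..<maxSteps {
        let next = z - f(z) / fPrime(z)
        path.append(next)
        if (next - z).magnitude < 1e-9 { break }
        z = next
    }
    return path
}
```

Tracing `newtonPath` for a grid of starting points is what produces the renders below: each path is a line from a starting point to whichever of the three cube roots of unity it converges on.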


After I had the math part done I could turn my attention to the graphics part. I started off with simple black lines, but one of the more interesting elements of the fractal is which root a point converges on. With this in mind I added colors to the lines, tracing the paths to each root value.

Drawing both the paths and just the pixels at a high enough density shows the fractal emerging, although the path version gets a little too bright with that many lines.

A 6k render

A more reasonable density of lines creates a really nice nebula effect, with most of the paths going towards their closest root, but not all. The overlap produces striking bolts of light between roots.

I'm planning on rendering some for plotting, but first I need to decide whether I'm going to plot them in one color, or try to follow the digital versions and use different colors for each root.

art generative swiftgraphics swift

ISO Cube Plots

08 Jun 2021 ∞

A while back I bought a genuine Art Cube1 on a whim, as I'm wont to do. I knew I wanted to somehow plot on it, but didn't really have any concrete ideas of what to do with it. I wanted to do something in 3D, where the plot was continuous around the cube; however, SwiftGraphics is a (mostly) 2D library, so plotting anything with continuous lines would require some cheating: either making lines jump from one face on the net to the next or doing some complex cube projections.

While I was thinking about how I might even implement something like that I thought about other ways to visualize the cube in two dimensions. What came to mind was an isometric projection. Two isometric cubes would allow me to see both sides of the whole cube. However I didn't want to jump straight into 3D, and wanted to explore the isometric projection further, so I simplified a bit.

As with all things in SwiftGraphics, I'm not just throwing lines on a canvas2; everything is driven by objects. So in this case I needed an isometric cube. Drawing the cube isn't difficult; making it smart, on the other hand, took more time. SwiftGraphics has lots of shapes, but they're limited. A cube is made up of many faces, each defined by four vertices, and a big part of being able to draw anything on the cube is knowing which cube and which face a point is on.

This is, luckily, a well-trodden area and there are several options for determining whether a point is inside an arbitrary polygon. I opted for the winding number method by Dan Sunday. By far the most complex part was ensuring that a line would only draw in valid directions for a given face, and would change direction correctly when moving to a new face. I was initially going to reuse some of my ray tracing logic, but unfortunately it wasn't quite compatible with what I wanted to do, for instance having lines change direction on a face, not just at boundaries.
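A sketch of the winding number test (after Dan Sunday's formulation), which is how a face can answer "is this point on me?" The `Point` type and face polygons here are simplified stand-ins for the SwiftGraphics types:

```swift
struct Point { var x, y: Double }

// Cross product test: > 0 if p is left of the directed line a -> b,
// < 0 if right, 0 if collinear.
func isLeft(_ a: Point, _ b: Point, _ p: Point) -> Double {
    (b.x - a.x) * (p.y - a.y) - (p.x - a.x) * (b.y - a.y)
}

// Winding number of `point` around the closed polygon `vertices`.
// Non-zero means the point is inside the polygon.
func windingNumber(of point: Point, in vertices: [Point]) -> Int {
    var wn = 0
    for i in 0..<vertices.count {
        let a = vertices[i]
        let b = vertices[(i + 1) % vertices.count]   // wrap to close the polygon
        if a.y <= point.y {
            // Upward crossing with the point strictly to the left.
            if b.y > point.y && isLeft(a, b, point) > 0 { wn += 1 }
        } else if b.y <= point.y && isLeft(a, b, point) < 0 {
            // Downward crossing with the point strictly to the right.
            wn -= 1
        }
    }
    return wn
}
```

For convex quadrilateral faces a simpler test would also work, but the winding number method handles any polygon and doesn't care about vertex order, which makes it a safe default.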



Eventually I got the bugs worked out well enough to produce some nice looking plots. The debug view on this plot is particularly nice looking, to the point I considered just leaving it in, but eventually realized it would be too busy. I made a bunch of variations with different cube sizes and configurations. All of them are over in their gallery and for sale in the shop.

  1. It's just a wooden cube, but from an art supply store.

  2. I mean, I am.

art generative swiftgraphics swift plotter

Capturebot 2021.1

31 May 2021 ∞

Capturebot

It's been a little while since I've released an update to Capturebot, but after a couple of betas earlier this year version 2021.1 is ready to go. There are a bevy of bug fixes and some important new features that have been a long time coming and bring Capturebot closer to what it's meant to be.

The ability to create capture folders from the datasource's filenames and a new and improved mini-shotlist should really help techs out on set. Plus, the option to write metadata directly via Capture One, which has been disabled since the first beta, is now fully operational. XMP files can be a relic of the past.

There are also four new mapping options, including date conversion, to give even more control over the source metadata.

A handful of bug fixes, like Capturebot no longer freezing while writing metadata and correctly loading some saved datasource settings, make this one of the biggest releases to date.

Go check out all of the details over at the release page.

captureone digital-tech capturebot

Voigtländer Brillant

20 May 2021 ∞

A photo print of the back stock room of a grocery store with sunlight filtering in. There’s light leak in the corner and the film emulsion was horribly scratched

A while ago I picked up a Voigtländer Brillant from Seawood Photo. It sat in my collection for ages before I finally decided to put some film through it last month. The ground glass is, as the name implies, brilliant, and despite having only three shutter speeds, three aperture stops, and zone focusing, the camera is a joy to shoot with. It's very much the point & shoot of its era.

While shooting with it is fun, the imagery it made had some issues. I didn't load the first roll correctly (in a painfully obvious way), so the whole roll was scratched, full of light leaks, and nearly every frame overlapped another. As for the other two: one was entirely blank and the other was mostly motion blur.

The roll that did survive, though, is quite fun. The photos are extra analogue with all of the imperfections. With my website refresh I've been trying to share more photos here, rather than just on Instagram, and with these ones I wanted to make prints instead of leaving them as just digital files. The whole series is over in the gallery.

art photography film

A couple of days ago I completed a new series of plots based on spirals radiating out from a central circle. It took a little bit of tuning to get the arms to increase in radius at a nice rate, but using an offset of the starting position ensured that the outer arm had a slow start and a gradual ramp up, instead of all arms starting at the same position.

After I had a bunch of "normal" spirals I added a decay to the radius, which created some oddly loopy spirals with a point at one side.
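The idea can be sketched roughly like this, assuming illustrative names and constants rather than the actual plot code: each arm is an Archimedean spiral, the per-arm angular offset staggers where the radius starts ramping up, and an exponential decay term pulls the radius back in to produce the loopy variants.

```swift
import Foundation

// Generate one arm of an n-arm spiral as a list of plottable points.
// r = base + growth * (theta + offset), optionally damped by a decay.
func spiralArm(arms: Int, arm: Int, steps: Int = 500,
               base: Double = 20, growth: Double = 4,
               decay: Double = 0) -> [(x: Double, y: Double)] {
    let armOffset = Double(arm) / Double(arms) * 2 * .pi
    return (0..<steps).map { i in
        let theta = Double(i) / 50              // angle travelled along the arm
        // The offset staggers each arm's starting radius and ramp-up.
        var r = base + growth * (theta + armOffset)
        r *= exp(-decay * theta)                // decay > 0 gives the loopy spirals
        let angle = theta + armOffset
        return (x: r * cos(angle), y: r * sin(angle))
    }
}
```

With `decay` at zero the arms grow steadily outward from the central circle; a small positive value makes the radius peak and then collapse, which is roughly the "oddly loopy spirals with a point at one side" effect.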

There are six plots in the series, including one happy accident while I was trying to crop the paths of a giant spiral in Inkscape.

New Site & Store

Today I also published a new version of my website, featuring an all new gallery. There's also a new version of my shop where the spiral plots are available.

art generative swiftgraphics swift

Ray Tracing Redux

19 Jan 2021 ∞

I’ve been meaning to revisit my ray tracing system for a while, but never really had the motivation to dive back in. The way I’d built it was fragile and prone to infinite loops. The infinite loops are what finally got me to take another look: recent changes I’d made caused several sketches to never resolve.

My original ray tracing system simply collected line segments from each object it intersected. One of the big "improvements" I made was allowing objects to be reevaluated so a single ray could interact with the same object multiple times. Unfortunately I accomplished this with a recursive method. The closest intersection was found, then the method would be called on the next object. All of the intersections would be collected at the bottom of the stack and passed back up when the ray finished its journey. If it finished its journey.

A Formal Ray

For the second (major) attempt I switched over to a real Ray object to keep track of intersections. Another part of the code I updated was how a ray could be modified. A new RayTracable protocol gives an object the chance to modify a Ray directly, with the default to simply terminate it.
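A sketch of that shape, with simplified stand-ins for the SwiftGraphics types: the `Ray` carries its own path, and conforming objects get the chance to modify it, with termination as the default.

```swift
struct Vector { var x, y: Double }

// A ray that records the points it has visited as it travels.
struct Ray {
    var origin: Vector
    var direction: Vector
    var path: [Vector] = []     // intersections collected so far
    var isTerminated = false

    mutating func terminate(at point: Vector) {
        path.append(point)
        isTerminated = true
    }
}

protocol RayTracable {
    // The nearest intersection point, if the ray hits this object.
    func intersection(of ray: Ray) -> Vector?
    // Modify the ray at the intersection: deflect, refract, or terminate.
    func deflect(_ ray: inout Ray, at point: Vector)
}

extension RayTracable {
    // Default behaviour: simply terminate the ray at the hit point.
    func deflect(_ ray: inout Ray, at point: Vector) {
        ray.terminate(at: point)
    }
}
```

Because the ray accumulates its own path, the tracing loop can be a plain iteration with a bounce limit rather than a recursive call stack, which is what kills the infinite-loop failure mode of the original design.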

Once the new Ray was set up I began testing and quickly noticed that my circle intersections were wrong. The point of deflection wasn’t actually on the perimeter of the circle. I eventually discovered that I hadn’t understood the t value, which is used to find the point of intersection on the ray, correctly. I found an article which explained what I was missing and managed to fix the method.
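For context, t parameterizes the point along the ray, p(t) = origin + t·direction, and a ray-circle intersection is the smallest *positive* root of a quadratic in t. A sketch of that calculation (names are illustrative, not the actual SwiftGraphics code):

```swift
struct Vec { var x, y: Double }

// Returns the t value of the nearest intersection ahead of the ray's
// origin, or nil if the ray misses the circle entirely.
func rayCircleT(origin: Vec, direction: Vec, center: Vec, radius: Double) -> Double? {
    // Substitute p(t) = origin + t*direction into |p - center|^2 = r^2
    // to get the quadratic a*t^2 + b*t + c = 0.
    let ox = origin.x - center.x, oy = origin.y - center.y
    let a = direction.x * direction.x + direction.y * direction.y
    let b = 2 * (ox * direction.x + oy * direction.y)
    let c = ox * ox + oy * oy - radius * radius
    let disc = b * b - 4 * a * c
    guard disc >= 0 else { return nil }          // no real roots: ray misses
    let sqrtDisc = disc.squareRoot()
    let t1 = (-b - sqrtDisc) / (2 * a)
    let t2 = (-b + sqrtDisc) / (2 * a)
    // The deflection point must be ahead of the origin: take the
    // smallest t that is still positive.
    return [t1, t2].filter { $0 > 1e-9 }.min()
}
```

The subtlety is exactly the one that bit me: both roots are valid points on the line, but only a positive t is in front of the ray, and only the smaller positive t lands on the perimeter the ray actually reaches first.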

Each time I conformed a shape to RayTracable I’d find similar issues. After a spot of pest control I finally had a system that behaved correctly.

Putting it All Together

Once the math was taken care of I revisited my largest ray tracing sketch. I’d never set out to necessarily make a performant ray tracer, but it was exceptionally slow. Looking into the drawing code I found I was generating the paths for each drawn layer. A quick change to separate the calculations and the on-screen drawing helped a lot.

The final tweak I made was how I was rendering the layers. Before I’d updated the drawing code I ended up with a particularly nice sample render where the rays would pool into bright spots, which I then accidentally "fixed" and couldn’t replicate. Eventually I figured out the correct blend mode and drawing order to be able to control the effect to great success.

art generative swiftgraphics swift raytracing