Plotting Polynomials

16 Oct 2021 ∞

The first render that worked

Earlier this week I got nerd-sniped by a video about Newton's Fractal, which covered Fun Facts about finding the roots of polynomials that result in a fractal pattern. While watching the video I got caught up in the movement of how a point finds its root, because where there's movement there's something to plot.

One of the nice things about 3Blue1Brown is all of the videos are generated from code, and that code is open source, so I figured I could probably implement the algorithm myself. Eventually I was right, but it took a while to make it all work.
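The heart of the algorithm is Newton's method iterated over the complex plane: each point repeatedly moves to z − p(z)/p′(z) until it settles on a root. A minimal sketch of that idea, using a hand-rolled complex type and the polynomial z³ − 1 (the names here are mine, not 3Blue1Brown's code):

```swift
// Minimal complex number, just enough for Newton's method.
struct Complex {
    var re: Double
    var im: Double

    static func + (a: Complex, b: Complex) -> Complex {
        Complex(re: a.re + b.re, im: a.im + b.im)
    }
    static func - (a: Complex, b: Complex) -> Complex {
        Complex(re: a.re - b.re, im: a.im - b.im)
    }
    static func * (a: Complex, b: Complex) -> Complex {
        Complex(re: a.re * b.re - a.im * b.im,
                im: a.re * b.im + a.im * b.re)
    }
    static func / (a: Complex, b: Complex) -> Complex {
        let d = b.re * b.re + b.im * b.im
        return Complex(re: (a.re * b.re + a.im * b.im) / d,
                       im: (a.im * b.re - a.re * b.im) / d)
    }
    var magnitude: Double { (re * re + im * im).squareRoot() }
}

// One Newton step: z' = z - p(z) / p'(z), here for p(z) = z³ - 1.
func newtonStep(_ z: Complex) -> Complex {
    let p = z * z * z - Complex(re: 1, im: 0)
    let dp = Complex(re: 3, im: 0) * z * z
    return z - p / dp
}

// Iterate from a starting point, recording the path the point takes
// on its way to a root — the movement that's worth plotting.
func trace(from start: Complex, steps: Int = 30) -> [Complex] {
    var path = [start]
    var z = start
    for _ in 0..<steps {
        z = newtonStep(z)
        path.append(z)
    }
    return path
}
```

Color each path by which of the three roots it lands on and the fractal basins appear on their own.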

The image to the right is the first render that behaved as expected. I was reimplementing a bunch of methods from JavaScript, most of which used odd syntax I wasn't familiar with. I managed to get most of them working, but the one that caused the most trouble was a method that returned all combinations of the roots. My version never worked correctly; eventually someone pointed out that what I was trying to do is already implemented in Swift Algorithms, so I used that instead and everything came together.
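Those combinations are what turn a set of roots back into polynomial coefficients via Vieta's formulas: the coefficient of each power is a signed sum over products of k roots. A sketch of that idea, with a hand-rolled stand-in for Swift Algorithms' `combinations(ofCount:)`:

```swift
// All k-element combinations of an array. Swift Algorithms provides
// this as `combinations(ofCount:)`; hand-rolled here so the sketch
// has no dependencies.
func combinations<T>(_ items: [T], ofCount k: Int) -> [[T]] {
    if k == 0 { return [[]] }
    guard let first = items.first, items.count >= k else { return [] }
    let rest = Array(items.dropFirst())
    let withFirst = combinations(rest, ofCount: k - 1).map { [first] + $0 }
    return withFirst + combinations(rest, ofCount: k)
}

// Vieta's formulas: for a monic polynomial with the given roots, the
// coefficient of x^(n-k) is (-1)^k times the sum of all products of
// k roots. Returned highest power first.
func coefficients(fromRoots roots: [Double]) -> [Double] {
    let n = roots.count
    return (0...n).map { k in
        let sum = combinations(roots, ofCount: k)
            .map { $0.reduce(1, *) }
            .reduce(0, +)
        let sign: Double = k % 2 == 0 ? 1 : -1
        return sign * sum
    }
}
```

For example, roots 1 and 2 give (x − 1)(x − 2) = x² − 3x + 2, i.e. the coefficients [1, −3, 2].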

After I had the math part done I could turn my attention to the graphics part. I started off with simple black lines, but one of the more interesting elements of the fractal is which root a point converges on. With this in mind I added colors to the lines, tracing the paths to each root value.

Drawing both the paths and just the pixels at a high enough density shows the fractal emerging, although the path version gets a little too bright with that many lines.

A 6k render

A more reasonable density of lines creates a really nice nebula effect, with most of the paths going towards their closest root, but not all. The overlap produces striking bolts of light between roots.

I'm planning on rendering some for plotting, but first I need to decide whether I'm going to plot them in one color, or try to follow the digital versions and use different colors for each root.

art generative swiftgraphics swift

ISO Cube Plots

08 Jun 2021 ∞

A while back I bought a genuine Art Cube1 on a whim, as I'm wont to do. I knew I wanted to somehow plot on it, but didn't have any concrete ideas of what to do with it. I wanted to do something in 3D, where the plot was continuous around the cube. However, SwiftGraphics is a (mostly) 2D library, so plotting anything with continuous lines would require some cheating: either making lines jump from one face of the net to the next or doing some complex cube projections.

While I was thinking about how I might even implement something like that I thought about other ways to visualize the cube in two dimensions. What came to mind was an isometric projection. Two isometric cubes would allow me to see both sides of the whole cube. However I didn't want to jump straight into 3D, and wanted to explore the isometric projection further, so I simplified a bit.
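The projection itself is compact: the x and y axes run 30° above horizontal in opposite directions and z maps straight up. A sketch of a true isometric projection (the types and names here are mine, not the SwiftGraphics API):

```swift
import Foundation

struct Vector3 { var x, y, z: Double }
struct Point2 { var x, y: Double }

// True isometric projection: x and y spread out at 30° from the
// horizontal, z maps to vertical (subtracted because screen y
// typically grows downward).
func isometricProject(_ p: Vector3) -> Point2 {
    let angle = Double.pi / 6  // 30°
    return Point2(x: (p.x - p.y) * cos(angle),
                  y: (p.x + p.y) * sin(angle) - p.z)
}
```

Projecting the eight corners of a unit cube this way produces the familiar hexagonal silhouette with three visible faces.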

As with all things in SwiftGraphics I'm not just throwing lines on a canvas2; everything is driven by objects. So in this case I needed an isometric cube. Drawing the cube isn't difficult; making it smart, on the other hand, took more time. SwiftGraphics has plenty of shapes, but they're simple ones. A cube is made up of many faces, each defined by four vertices, and a big part of being able to draw anything on the cube is knowing which cube and which face a point is on.

This is, luckily, a well trodden area and there are several options for determining whether a point is inside an arbitrary polygon. I opted for the winding number method by Dan Sunday. By far the most complex part was ensuring that a line would only draw in valid directions for a given face, and would change direction correctly when moving to a new face. I was initially going to reuse some of my ray tracing logic; unfortunately, it wasn't quite compatible with what I wanted to do, for instance having lines change direction on a face, not just at boundaries.
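The winding number method counts how many times the polygon's boundary wraps around the point as you walk its edges; nonzero means inside. A self-contained sketch following Dan Sunday's formulation (the naming is mine):

```swift
struct Point { var x, y: Double }

// Cross product test: > 0 if p is left of the line from a to b,
// < 0 if right, 0 if on the line.
func isLeft(_ a: Point, _ b: Point, _ p: Point) -> Double {
    (b.x - a.x) * (p.y - a.y) - (p.x - a.x) * (b.y - a.y)
}

// Winding number of `polygon` around `p`: each upward edge crossing
// p's horizontal increments, each downward crossing decrements.
// Nonzero means the point is inside.
func windingNumber(_ p: Point, polygon: [Point]) -> Int {
    var wn = 0
    for i in 0..<polygon.count {
        let a = polygon[i]
        let b = polygon[(i + 1) % polygon.count]
        if a.y <= p.y {
            if b.y > p.y && isLeft(a, b, p) > 0 { wn += 1 }
        } else {
            if b.y <= p.y && isLeft(a, b, p) < 0 { wn -= 1 }
        }
    }
    return wn
}
```

Unlike the even-odd rule, the winding number stays correct for self-intersecting outlines, which is handy when faces share edges.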

Eventually I got the bugs worked out well enough to produce some nice looking plots. The debug view on this plot is particularly nice looking, to the point I considered just leaving it in, but eventually realized it would be too busy. I made a bunch of variations with different cube sizes and configurations. All of them are over in their gallery and for sale in the shop.

  1. It's just a wooden cube, but from an art supply store.

  2. I mean, I am.

art generative swiftgraphics swift plotter

Capturebot 2021.1

31 May 2021 ∞


It's been a little while since I've released an update to Capturebot, but after a couple betas earlier this year version 2021.1 is ready to go. There are a bevy of bug fixes and some important new features that have been a long time coming and bring Capturebot closer to what it's meant to be.

The addition of being able to create capture folders from the datasource's filenames and a new and improved mini-shotlist should really help techs out on set. Plus, the option to write metadata directly via Capture One, which has been disabled since the first beta, is now fully operational. XMP files can be a relic of the past.

There are also four new mapping options, including date conversion, to give even more control over the source metadata.

A handful of bug fixes, like Capturebot no longer freezing while writing metadata and correctly loading some saved datasource settings, makes this one of the biggest releases to date.

Go check out all of the details over at the release page.

photography capturebot

Voigtländer Brillant

20 May 2021 ∞

A photo print of the back stock room of a grocery store with sunlight filtering in. There's a light leak in the corner and the film emulsion is horribly scratched

A while ago I picked up a Voigtländer Brillant from Seawood Photo. It sat in my collection for ages before I finally decided to put some film through it last month. The ground glass is, as the name implies, brilliant, and despite having only three shutter speeds, three aperture stops, and zone focusing, the camera is a joy to shoot with. It's very much the point & shoot of its era.

While shooting with it is fun, the imagery it made had some issues. The first roll I shot I didn't load correctly (in a painfully obvious way) and the whole roll was scratched, full of light leaks, and nearly every frame overlapped another. As for the other two: one was entirely blank and the other mostly motion blur.

The roll that did survive, though, is quite fun. The photos are extra analogue with all of the imperfections. With my website refresh I've been trying to share more photos here, rather than just on Instagram, and with these ones I wanted to make prints instead of leaving them as just digital files. The whole series is over in the gallery.

art photography film

A couple days ago I completed a new series of plots based on spirals radiating out from a central circle. It took a little bit of tuning to get the arms to increase in radius at a nice rate, but offsetting each arm's starting position ensured that the outer arms had a slow start and a gradual ramp up, instead of all arms starting at the same position.

After I had a bunch of "normal" spirals I added a decay to the radius, which created some oddly loopy spirals with a point at one side.
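The arms can be generated as points in polar form, with a per-arm phase offset and an optional exponential decay on the radius. Something like the following captures the idea, though the parameter names and exact ramp are my guesses rather than the plotted code:

```swift
import Foundation

struct Point { var x, y: Double }

// One spiral arm: radius grows linearly with angle, offset by the
// arm's starting phase. A decay > 0 shrinks the radius back down,
// producing the loopy spirals that pinch to a point at one side.
func spiralArm(startAngle: Double,
               growth: Double,
               decay: Double = 0,
               turns: Double = 4,
               samples: Int = 400) -> [Point] {
    (0..<samples).map { i in
        let t = Double(i) / Double(samples - 1) * turns * 2 * Double.pi
        let theta = startAngle + t
        let r = growth * t * exp(-decay * t)
        return Point(x: r * cos(theta), y: r * sin(theta))
    }
}

// Several arms spaced evenly around a central circle.
let arms = (0..<6).map { i in
    spiralArm(startAngle: Double(i) / 6 * 2 * Double.pi, growth: 2)
}
```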

There are six plots in the series, including one happy accident while I was trying to crop the paths of a giant spiral in Inkscape.

New Site & Store

Today I also published a new version of my website, featuring an all new gallery. There's also a new version of my shop where the spiral plots are available.

art generative swiftgraphics swift

Ray Tracing Redux

19 Jan 2021 ∞

I’ve been meaning to revisit my ray tracing system for a while, but never really had the motivation to dive back in. The way I’d built it was fragile and prone to infinite loops. The infinite loops are what finally got me to take another look: recent changes I’d made caused several sketches to never resolve.

My original ray tracing system simply collected line segments from each object it intersected. One of the big "improvements" I made was allowing objects to be reevaluated so a single ray could interact with the same object multiple times. Unfortunately I accomplished this with a recursive method. The closest intersection was found, then the method would be called on the next object. All of the intersections would be collected at the bottom of the stack and passed back up when the ray finished its journey. If it finished its journey.

A Formal Ray

For the second (major) attempt I switched over to a real Ray object to keep track of intersections. Another part of the code I updated was how a ray could be modified. A new RayTracable protocol gives an object the chance to modify a Ray directly, with the default to simply terminate it.
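Roughly, the shape of that design (a reconstruction for illustration, not the actual SwiftGraphics declarations):

```swift
struct Vector { var x, y: Double }

// A ray accumulates the path it has traveled.
final class Ray {
    var origin: Vector
    var direction: Vector
    var path: [Vector]
    var isTerminated = false

    init(origin: Vector, direction: Vector) {
        self.origin = origin
        self.direction = direction
        self.path = [origin]
    }

    func terminate() { isTerminated = true }
}

// Objects get a chance to modify a ray that hits them — deflect it,
// redirect it, or (by default) simply stop it.
protocol RayTracable {
    func modify(_ ray: Ray, at intersection: Vector)
}

extension RayTracable {
    func modify(_ ray: Ray, at intersection: Vector) {
        ray.path.append(intersection)
        ray.terminate()
    }
}

// A wall relies on the default behavior: rays stop when they hit it.
struct Wall: RayTracable {}
```

Because the ray carries its own state, the tracing loop can iterate until termination instead of recursing, which is what kills the infinite loops.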

Once the new Ray was set up I began testing and quickly noticed that my circle intersections were wrong. The point of deflection wasn’t actually on the perimeter of the circle. I eventually discovered that I hadn’t understood the t value, which is used to find the point of intersection on the ray, correctly. I found an article which explained what I was missing and managed to fix the method.
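For reference, substituting the ray equation origin + t·direction into the circle equation gives a quadratic in t, and the point of deflection is the ray evaluated at the smallest positive t. A standalone sketch (my naming, not the SwiftGraphics method):

```swift
struct Vec { var x, y: Double }

// Nearest point where the ray origin + t·direction crosses the
// circle, or nil if it misses. Substituting the ray equation into
// |p - center|² = r² yields a·t² + b·t + c = 0.
func circleIntersection(origin: Vec, direction: Vec,
                        center: Vec, radius: Double) -> Vec? {
    let ox = origin.x - center.x
    let oy = origin.y - center.y
    let a = direction.x * direction.x + direction.y * direction.y
    let b = 2 * (ox * direction.x + oy * direction.y)
    let c = ox * ox + oy * oy - radius * radius
    let disc = b * b - 4 * a * c
    guard disc >= 0 else { return nil }

    // Take the nearest intersection in front of the origin.
    let t0 = (-b - disc.squareRoot()) / (2 * a)
    let t1 = (-b + disc.squareRoot()) / (2 * a)
    guard let t = [t0, t1].first(where: { $0 > 1e-9 }) else { return nil }

    // Evaluating the ray at t is what puts the deflection point
    // on the circle's perimeter.
    return Vec(x: origin.x + t * direction.x,
               y: origin.y + t * direction.y)
}
```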

Each time I conformed a shape to RayTracable I’d find similar issues. After a spot of pest control I finally had a system that behaved correctly.

Putting it All Together

Once the math was taken care of I revisited my largest ray tracing sketch. I’d never set out to necessarily make a performant ray tracer, but it was exceptionally slow. Looking into the drawing code I found I was generating the paths for each drawn layer. A quick change to separate the calculations and the on-screen drawing helped a lot.

The final tweak I made was how I was rendering the layers. Before I’d updated the drawing code I ended up with a particularly nice sample render where the rays would pool into bright spots, which I then accidentally "fixed" and couldn’t replicate. Eventually I figured out the correct blend mode and drawing order to be able to control the effect to great success.

art generative swiftgraphics swift raytracing

Air Quality Monitors

20 Dec 2020 ∞

Over the summer, while California was in the middle of fire season, I decided, mostly out of a morbid curiosity I suppose, to monitor the air quality inside my house. There are a number of commercial solutions available; however, they're all either complex, costly, or both. I was looking for something that could take the place of my existing Wemos & ESPHome-based temperature and humidity sensors.

After a bit of research I found the PMS5003T, which measures PM 2.5, temperature, and humidity. The sensor uses a tiny ribbon cable, so they can't be connected directly to the Wemos' pins. I designed a simple circuit board for the sensor, with additional pins so it can be extended for future use. Unfortunately I wasn't paying attention to the conversion of mils and didn't make the mounting holes anywhere near large enough.
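For reference, a mil is a thousandth of an inch, and an inch is exactly 25.4 mm, so the conversion is a one-liner:

```swift
// A mil is 1/1000 inch; 1 inch = 25.4 mm exactly, so 1 mil = 0.0254 mm.
func milsToMillimeters(_ mils: Double) -> Double {
    mils * 0.0254
}
```

Reading a hole size in mils as if it were millimeters undershoots by a factor of nearly forty.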

I still need to find some little boxes for them; for the most part they're just tucked into corners.

The software is pretty straightforward. The boards run ESPHome with the stock pmsx003 integration. The only real customization is a sliding window moving average to reduce jitter.

  - platform: pmsx003
    type: PMS5003T
    pm_2_5:
      name: "Office Emory Particulate Matter"
      filters:
        - sliding_window_moving_average:
            window_size: 15
            send_every: 15
    temperature:
      name: "Office Emory Temperature"
      accuracy_decimals: 0
      filters:
        - sliding_window_moving_average:
            window_size: 15
            send_every: 15
    humidity:
      name: "Office Emory Humidity"
      accuracy_decimals: 0
      filters:
        - sliding_window_moving_average:
            window_size: 15
            send_every: 15
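The filter itself is simple: keep the last N readings and emit their mean every M samples. A simplified Swift model of what ESPHome's filter does:

```swift
// Sliding window moving average: keeps the last `windowSize`
// readings and emits their mean every `sendEvery` samples,
// returning nil in between — which is what smooths out jitter.
struct SlidingWindowAverage {
    let windowSize: Int
    let sendEvery: Int
    var window: [Double] = []
    var counter = 0

    mutating func process(_ value: Double) -> Double? {
        window.append(value)
        if window.count > windowSize { window.removeFirst() }
        counter += 1
        guard counter % sendEvery == 0 else { return nil }
        return window.reduce(0, +) / Double(window.count)
    }
}
```

With a window of 15 and send_every of 15, one smoothed reading is published for every fifteen raw samples from the sensor.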

I also have a handful of spares, so if you're interested I can send either a DIY kit or a fully assembled version.

electronics homeautomation homeassistant software sonoff

Forgotten Plots

09 Nov 2020 ∞

I've been having a little bit of "plotter's block" recently, but I finally started making some new plots today. While doing that I remembered a series of plots I'd forgotten to add to the gallery. These plots are built around the addition of geometric boolean operations to SwiftGraphics.

There are also some fun time lapses of some of the plots.

art plotter generative swiftgraphics swift

Placeholder Art

28 Sep 2020 ∞

Last week I was listening to the latest episode of Make Do and near the end the conversation turned to generic art from a department store.

The mass-produced prints from IKEA, Target, etc. remind me of bringing store-bought baked goods to a party versus making a cake. Both fulfill the function of being a dessert, but they aren’t equal. There’s an intentionality to baking a cake or finding a unique print for your wall that the store-bought version doesn’t have.

When you bake a cake or hang “real” art you’re asking other people to ask you about it. Not just in the sense of fishing for compliments (though who doesn’t want people to say you have great taste?), but about the journey you went through for it to be here, now. How you baked the cake, or finally bought a piece by an artist you admired, or how you stumbled upon the piece at a local farmers' market.

A generic photo of pebbles or the Golden Gate Bridge is there to be, effectively, a placeholder in your room. It fills space in an aesthetically pleasing way, but doesn’t prompt a conversation. Someone might say they like it, but the only conversation to be had is “thanks, I picked it up at Safeway on my way here.”

art prints baking

Sonoff L1 & Home Assistant

10 Aug 2020 ∞

Recently I bought a bunch of new Sonoff IoT devices, among them was the Sonoff L1: a WiFi connected LED strip. Like all of the other Sonoff equipment I figured it would be supported by ESPHome for easy integration with Home Assistant. As it happens, this wasn't the case. The whole project ended up being far more complicated than I expected.

Flashing the Controller

The L1 doesn't expose any headers, so you'll need to solder wires to the UART pads. Luckily this only requires a few minutes of soldering and you're good to go.

Flashing the ESP8285 proves a little more difficult as there's no exposed GPIO0 pin. The Tasmota guide has a good illustration showing which pin on the chip itself needs to be shorted to ground. While this seems like it might be tricky, you get the hang of it.

Once the ESP is booted into programming mode it's an easy matter to flash ESPHome which provides OTA updates for future changes.

The Software

I'd initially followed some guides for a similar LED controller which used PWM to control the RGB channels; however, the L1 uses a Nuvoton N76E003 to drive the LEDs, with the ESP communicating with the Nuvoton over serial. Some clever people over on the Home Assistant forums managed to figure out how the board works, everything from the serial protocol to the commands.

After several rounds of "I don't know enough to make this work" I eventually started looking at the Tasmota implementation along with the ESPHome custom light docs and began coding up a custom component. Eventually I managed a basic, but working, implementation which parsed Home Assistant's input and passed it along to the LED strip.

I would like to figure out how to support some of the other modes in the UI. The music sync mode has parameters for both sensitivity and speed. There are also several "DIY" modes which allow the user to specify colors for the controller to cycle through.

Putting It All Together

I've uploaded the custom component and an example config on GitHub. All you need to do is download the sonoff_l1.h into your ESPHome directory and update led_strip_1.yaml with the specifics of your installation and you should be off to the races.

There are some limitations at the time of writing. While Home Assistant can send changes to the L1, the remote won't push changes back to Home Assistant. Additionally, not all of the L1 modes are fully supported yet.

electronics homeautomation homeassistant software sonoff