These are chat archives for brunchboy/afterglow

11th Dec 2017
James Elliott
@brunchboy
Dec 11 2017 03:59
I love that you’re exploring this stuff! I built afterglow-max as a way to try to let people who are max patchers but who don’t know Clojure experiment with the software. I don’t think anyone actually ended up using that, though. A few people have built shows using afterglow itself, but it has such a steep learning curve that nobody has gone very deep yet.
I’ve been surprisingly busy (and caught an annoying virus) since traveling, so I’ve not been able to share the thoughts I was hoping to. I do think that getting good performance with 500+ pixels will require designing a slightly less flexible Show and Effect variant, so that less code runs on every frame, but which perhaps allows it to be compiled into OpenGL code that can be spread across a GPU. You could use the existing show and effect code for more complex lights, like moving heads, where you actually want to be able to do trigonometry and other fancy math on an arbitrary basis, have the pixels handled more simply (and with brute speed) by the GPU-based shows and effects, and tie the cues to each other.
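To make the two-tier idea concrete, here's a minimal sketch (in Python, not Afterglow's actual Clojure API) of what the "less flexible" pixel-effect variant might look like: restrict pixel effects to pure functions of position and time, so an entire strip can be evaluated in one vectorized pass per frame, or later compiled to a GPU shader. The function name and shapes here are illustrative assumptions, not anything from the Afterglow codebase.

```python
import numpy as np

def sine_wave_effect(xs, t):
    """Brightness 0..1 for every pixel at once.
    xs: array of normalized pixel positions along the strip (0..1).
    t:  show time in seconds.
    Because this is a pure function of (position, time), the whole
    frame is computed in one vectorized call with no per-pixel code."""
    return 0.5 + 0.5 * np.sin(2 * np.pi * (xs - 0.25 * t))

xs = np.linspace(0.0, 1.0, 500)      # 500 pixels along the strip
frame = sine_wave_effect(xs, t=1.0)  # one full frame of brightness values
```

The same restricted form maps directly onto a fragment shader, where `xs` becomes the pixel coordinate and `t` a uniform, which is what would let the GPU carry the brute-force load while the full Clojure effect machinery handles the moving heads.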
Joen Tolgraven
@tolgraven
Dec 11 2017 08:31
True! What I'm working on atm is a much more robust firmware on the ESP that doesn't just push straight pixel updates but has traditional DMX channels as well, like strobe, dimmer etc, and also keeps track of its own state somewhat, so it can do basic sorta halogen emulation, or at least expose the A and R in ADSR. And dithering.
Already got things looking unexpectedly awesome so def on the right track :) https://instagram.com/p/BbvWjZtld1O/
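The attack/release ("A and R in ADSR") halogen emulation mentioned above can be sketched as first-order exponential smoothing with separate time constants for rising and falling levels, which mimics a halogen filament's thermal lag. This is a Python sketch with made-up constants, not the actual ESP firmware (which would be C/C++ running per frame):

```python
import math

def smooth_dimmer(current, target, dt, attack=0.15, release=0.5):
    """Move `current` toward `target` with separate attack/release
    time constants (in seconds), emulating a halogen lamp's thermal lag.
    dt is the frame interval in seconds. Constants are illustrative."""
    # A filament heats up faster than it cools, so pick the time
    # constant based on the direction of travel.
    tau = attack if target > current else release
    # Fraction of the remaining distance covered in dt seconds.
    alpha = 1.0 - math.exp(-dt / tau)
    return current + (target - current) * alpha
```

Called once per output frame with the raw DMX dimmer value as `target`, this turns hard on/off edges into the soft ramp-up and slow fade-out of an incandescent source.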
Joen Tolgraven
@tolgraven
Dec 11 2017 08:56
Also quite hyped about hooking up my Myo over OSC; feels like the combo of general spatial control and the muscle sensors should allow for some very cool (and generally less jerky-robotic-feeling) effects. Playing around a bunch with the Push-as-MPC-for-lights idea, with full aftertouch etc., has really opened my eyes to how much more natural a more physical (or simply musical) workflow can be, and what seems quite possible on this front right the fuck now, even with little or no resources.
Joen Tolgraven
@tolgraven
Dec 11 2017 09:19
A more specific question: I want to get some simple coherence between the lights and the VJ projection by basically tapping the video palette in realtime, DIY-ambilight style. I'm guessing the first step would be an off-the-shelf open-source solution bounced down to just a few HSL "pixels" (say 10-20) coming in over OSC, feeding a show-bound object of positions across the rig (just X-wise to start), and then a global effect calculating bounds for each point (a la bloom) and pushing the right colors across the rig. Seems sensible, yeah?
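One way to sketch the "few HSL pixels spread across the rig" step: treat the incoming OSC samples as sparse color stops along a normalized X axis and linearly interpolate to each fixture's position. This is a standalone Python illustration, not Afterglow code; the function name and data shapes are assumptions, and note that for real HSL you'd want hue interpolation to wrap around 360°, which this simple version skips.

```python
import bisect

def colors_for_rig(samples, fixture_xs):
    """Map a sparse palette onto fixtures by linear interpolation.
    samples:     list of (x, (h, s, l)) tuples, sorted by x in 0..1,
                 e.g. the 10-20 downsampled video "pixels" from OSC.
    fixture_xs:  normalized X position of each fixture in the rig.
    Returns one interpolated (h, s, l) per fixture. Naive: hue does
    not wrap around 360, so reds on either side of 0° will sweep the
    long way round."""
    xs = [x for x, _ in samples]
    out = []
    for fx in fixture_xs:
        i = bisect.bisect_left(xs, fx)
        if i == 0:                       # before the first sample
            out.append(samples[0][1])
        elif i == len(xs):               # past the last sample
            out.append(samples[-1][1])
        else:                            # blend the two nearest stops
            (x0, c0), (x1, c1) = samples[i - 1], samples[i]
            t = (fx - x0) / (x1 - x0)
            out.append(tuple(a + (b - a) * t for a, b in zip(c0, c1)))
    return out
```

For example, a red stop at the left edge and a green stop at the right would give a fixture at X = 0.5 the halfway hue, which is roughly the "bounds for each point" behavior a bloom-style global effect would compute.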