James Elliott
@brunchboy
That is the waveform summary, which for you is 300 bytes, right? So we need to figure out how to interpret the extra color information. Here are the actual bytes of the waveform blob:
blob: 02 05 02 05 02 05 02 05 02 05 02 05 02 04 02 04 02 05 02 05 02 04 03 04 04 05 02 05 04 04 05 04 03 04 02 04 04 04 02 04 03 04 02 04 03 04 03 04 03 04 04 04 04 04 02 04 04 04 03 04 02 05 03 05 02 04 04 04 03 04 04 04 05 04 04 04 04 04 06 04 06 04 09 04 06 04 06 04 09 04 06 04 05 04 05 04 09 04 0d 04 08 04 09 04 08 00 0c 00 0d 03 16 03 12 03 15 03 18 03 18 03 16 03 11 03 17 03 16 03 13 03 15 03 16 03 10 03 18 03 18 03 18 02 18 02 13 01 13 01 18 03 19 03 10 03 16 03 13 03 15 03 18 02 18 02 16 03 13 03 17 03 18 03 16 03 14 03 18 03 19 03 19 03 19 03 19 03 19 03 0d 05 0d 05 17 03 17 03 18 03 18 03 0f 05 02 05 16 03 17 03 18 03 13 03 14 03 11 03 16 03 18 03 19 03 19 03 15 05 16 05 17 03 18 03 09 05 0a 05 11 03 16 03 19 02 17 02 19 03 19 03 19 03 19 03 18 03 11 03 14 03 10 03 16 03 18 03 19 03 19 03 12 05 17 05 0c 05 0f 05 17 03 14 03 11 03 16 03 19 00 18 00 0c 05 05 05 16 03 16 03 17 03 17 03 13 03 12 03 0d 05 0c 05 19 03 19 03 12 05 17 05 0c 05 0f 05 17 03 13 03 12 03 16 03 19 03 16 03 19 03 19 03 17 03 16 03 08 03 17 03 13 03 11 03 16 03 18 03 17 03 17 03 18 03 19 03 0f 05 0c 05 17 03 16 03 11 03 14 03 19 00 18 00 19 02 15 02 16 03 18 03 19 02 16 02 18 02 19 02 13 02 18 02 16 03 18 03 18 03 18 03 18 03 17 03 19 03 18 03 19 03 17 03 17 03 17 03 0f 05 0c 05 18 03 15 03 17 03 17 03 0d 05 0f 05 17 03 17 03 19 03 17 03 17 03 17 03 0b 05 0f 05 19 03 19 03 19 00 19 00 19 03 19 03 0c 05 0f 05 17 03 15 03 17 03 17 03 0d 05 0f 05 19 03 18 03 19 03 17 03 18 03 18 03 18 03 11 03 19 02 19 02 19 00 19 00 19 03 19 03 08 05 09 05 0b 05 0a 05 0c 05 09 05 0b 05 09 05 15 05 0b 05 14 05 0f 05 0b 05 09 05 0e 05 0e 05 13 04 0e 04 12 04 14 04 0e 05 12 05 17 03 16 03 17 00 11 00 16 01 13 01 17 00 18 00 14 00 16 00 16 03 16 03 17 00 14 00 15 03 17 03 13 00 16 00 16 04 12 04 12 03 17 03 11 03 16 03 16 03 15 03 18 02 18 02 16 03 14 03 18 03 19 03 11 03 16 03 18 03 18 03 13 02 16 02 16 01 13 01 18 02 18 02 14 03 16 03 18 03 19 03 16 03 14 03 18 02 18 02 14 03 16 03 18 03 18 03 18 02 19 02 19 00 19 00 19 03 18 03 19 03 19 03 18 02 19 02 12 05 03 05 17 02 17 02 14 03 15 03 18 03 18 03 0f 05 0d 05 19 03 19 03 16 03 11 03 19 03 19 03 12 03 18 03 17 03 17 03 16 00 17 00 18 03 16 03 17 00 18 00 14 03 12 03 17 03 17 03 0c 05 0f 05 19 03 19 03 10 01 16 01 19 02 19 02 12 00 18 00 17 03 16 03 0f 04 0c 04 0e 05 0b 05 12 05 15 05 15 03 17 03 18 03 15 03 17 03 18 03 14 03 16 03 16 05 05 05 02 05 02 05 02 02 02 04 02 02 04 04 04 07 08 06 0b 0d 0e 0f 0d 0f 0e 0e 0f 0f 0f 0e 0d 0a 0d 0e 08 0d 0f 0e 0d 0e 0a 0f 0a 0d 0a 0c 0d 0f 0d 0a 0f 0a 0f 0f 0f 0f 0f 0f 0b 0d 0a 0e 0e 0f 0c 0a 0f 0e 0f 0f 02 06 07 08 09 0e 0d 0e 0c 0d 0c 0e 0c 0e 0e 0e 0e 0d 0f 0f 0f 0f 0d 0c 0e 0e 0c 0e 0a 0e 0f 0a 02 0d 0c 09 [waveform bytes]
James Elliott
@brunchboy
And finally, here is what the same waveform summary looks like on the CDJ display, so evidently it does know how to interpret it, even though (as an original nexus player) it refuses to show the colors: waveform on CDJ
James Elliott
@brunchboy
But strangely, I just tried with a track on an SD card that was still in the blue format, and that came back with 900 bytes of preview waveform, and the same kind of distortions, so it may be a difference in the firmware version of our players or something?
James Elliott
@brunchboy
Well, I got it figured out, but I can’t explain why @awwright is seeing such different results. For my CDJs, the waveform preview is 400 pixels wide, and the first 800 bytes returned by the CDJ are alternating height and whiteness bytes. I don’t know what the remaining 100 bytes are; perhaps color information. I will investigate that later. But when I ignore all but the high bit of the whiteness information (which is what the preview does on my players), the result I get matches the above photo: Fixed waveform display
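In rough terms, the interpretation works out to something like the sketch below. This is only an illustration of the two-bytes-per-column layout, not actual Beat Link code; the function name is made up, and the 0x04 mask is my guess that whiteness is a three-bit value (the observed values stay below 8), so its high bit is bit 2.

```clojure
;; Sketch only: decode one column of the 400-column waveform preview.
;; `preview` is assumed to be a byte array whose first 800 bytes
;; alternate height and whiteness values, as described above.
(defn preview-column
  "Return [height white?] for column i (0-399)."
  [^bytes preview i]
  (let [height    (bit-and (aget preview (* 2 i)) 0xff)
        whiteness (bit-and (aget preview (inc (* 2 i))) 0xff)]
    ;; Keep only the high bit of a (presumed) three-bit whiteness value.
    [height (pos? (bit-and whiteness 0x04))]))
```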
Austin Wright
@awwright
@brunchboy I suspect there's a different command to retrieve an RGB waveform; could that be the case?
James Elliott
@brunchboy
That might be, but since I don’t have any nexus 2 players, I won’t be able to sniff it out. We will have to wait until someone can report that. :smile: Are you getting two-byte-per-column, 400-pixel waveform summaries like I have described here, or are you getting the one-byte-per-column, 300-pixel versions you described? If the latter, what model of players and what firmware are you using? I am using the same request that was in dbstruct.js and seeing the results you see here. I am making huge progress in rewriting the beat-link library to use all this new knowledge, though! The code is getting shorter, clearer, much more understandable and robust, and can already do far more than it could before!
Austin Wright
@awwright
Hmm I'll have to take a look. I do recall the 400px summary being a slightly different format
James Elliott
@brunchboy
Yeah, anything you can share about the various preview and waveform formats and how to get them would be interesting. I need to figure out cue grids soon too.
Austin Wright
@awwright
All the stuff for decoding the hot cue/cue point and beat grid data should be coded somewhere
Unlike the waveform, which I think I coded somewhere else
James Elliott
@brunchboy
Yeah, I got the beat grid down, I just haven’t tried the cue grid yet. I saw some stuff there, and I am hoping I will be able to figure out the difference between hot cues A, B, C, and ordinary memory point cues.
I’ve been on a bit of a detour to implement a nice metadata cache file system, for use in production environments where all four players are in use. That has come along very well.
Your code has been a huge treasure trove. It’s been very bad for my sleep schedule, though! I am going to go back in to work pretty tired tomorrow. ;^)
Austin Wright
@awwright
Heheh
I'm in Arizona, it's sunset right now
And yeah, for my lighting program I've got a button I hit to say "Create scene from loaded track" and it downloads/syncs all the waveform data
James Elliott
@brunchboy
Neat! I am spending so much time building these libraries and tools for other people to use I haven’t even had a chance to think about playing with this stuff in afterglow yet.
August
@Augusto__Solis_twitter
Hi guys! First, you're doing a tremendous job. Also, I have a nexus 2 to play around with sometimes. I know almost nothing about code, but this project blows my mind.
James Elliott
@brunchboy
Thanks, August! It is blowing my mind too, at the moment. I am not getting enough sleep, but soon there will be some really cool features for people to try out.
James Elliott
@brunchboy
All right, I have cue lists added to the packet analysis document and Beat Link library now. @awwright, you might be interested to take a look because I figured out how loops are represented and stored too. And were you able to find anything out about the different sizes and formats of waveform previews? I imagine they might use different requests and responses.
James Elliott
@brunchboy
And things are coming together! Here’s the Swing component Beat Link now offers for drawing track waveform previews, with minute marks, playback position, and cue markers: annotated waveform
James Elliott
@brunchboy
And the new player status interface in Beat Link Trigger is nearly done. It uses a ton of this new information! Player Status
Joen Tolgraven
@tolgraven
Hey, I want to thank you for this amazing software framework, proper legend. The fact that it's making me (ever so slowly) learn clojure is an awesome bonus, though it's sad (but not unexpected) that it keeps... basically everyone, I guess, from getting into it.
Joen Tolgraven
@tolgraven
Anyways, question. I'm (mainly) using it rather unorthodoxly, rendering LED strips I've set up to receive Art-Net over wifi. I'm quickly realizing that doesn't scale too well, since even 600 or so pixels plus some regular fixtures means ~50% CPU usage on a water-cooled, overclocked 8-core i7. What steps, if any, might be available for me to make the rendering loop more efficient?
Looking forward to contributing properly to the project once my Clojure is less shit! :smile:
James Elliott
@brunchboy
Hello Joen! I’m traveling at the moment so will respond more meaningfully later, but I am delighted to hear you are exploring the project. I have a few ideas, but I am not surprised the current code bogs down after a few hundred lights. Thanks for writing, and more to come!
Joen Tolgraven
@tolgraven
Aight! I’m gearing up something fierce haha, bought like 1500W worth of RGBW LED fixtures off AliExpress, including a monster 400W moving head. Crazy what you get for a grand and change these days :O I use Live for music and am already fairly into Max, so I’m planning on trying to get an entire ecosystem built on top of your shit, basically. I’ve been thinking how, like, oh I need to add buttons to solo and mute effects and on-push macros and ways to compose chases and and and, but then realising: why reinvent the wheel when I can use Live to capture/modulate/sequence both afterglow state changes and the resultant raw DMX output. What sort of stuff do you use the afterglow-max binding for?
James Elliott
@brunchboy
I love that you’re exploring this stuff! I built afterglow-max as a way to try to let people who are max patchers but who don’t know Clojure experiment with the software. I don’t think anyone actually ended up using that, though. A few people have built shows using afterglow itself, but it has such a steep learning curve that nobody has gone very deep yet.
I’ve been surprisingly busy (and caught an annoying virus) since traveling, so I’ve not been able to share the thoughts I was hoping to. I do think that getting good performance with 500+ pixels will require designing a slightly less flexible Show and Effect variant, so that less code runs on every frame, but which perhaps allows it to be compiled into OpenGL code that can be spread across a GPU. You could use the existing show and effect code for more complex lights, like moving heads, where you actually want to be able to do trigonometry and other fancy math on an arbitrary basis, have the pixels handled more simply (and with brute speed) by the GPU-based show and effects, and tie the cues to each other.
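To make that a little more concrete, the simplified variant might boil a pixel effect down to a pure function of pixel index and time that gets evaluated in one tight loop (and could eventually be translated into a shader). Here’s a very rough sketch of the shape of the idea; none of these names exist in afterglow:

```clojure
;; Sketch of a "less flexible" pixel effect: a pure function of pixel
;; index and time, so rendering a frame is just one tight loop (or,
;; eventually, GPU code). Invented names, nothing from afterglow itself.
(defn moving-rainbow
  "Return a packed 0xRRGGBB color for pixel i at time t (milliseconds)."
  [i t]
  (let [hue     (mod (+ (* i 3) (quot t 10)) 360)
        ;; crude hue->RGB at full saturation and value, fine for a sketch
        x       (int (* 255 (- 1.0 (Math/abs (- (mod (/ hue 60.0) 2.0) 1.0)))))
        [r g b] (nth [[255 x 0] [x 255 0] [0 255 x]
                      [0 x 255] [x 0 255] [255 0 x]]
                     (long (quot hue 60)))]
    (bit-or (bit-shift-left r 16) (bit-shift-left g 8) b)))

(defn render-frame!
  "Fill frame (an int array, one entry per pixel) for time t."
  [^ints frame t effect-fn]
  (dotimes [i (alength frame)]
    (aset-int frame i (effect-fn i t))))
```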
Joen Tolgraven
@tolgraven
True! What I'm working on atm is a much more robust firmware on the ESP that doesn't just push straight pixel updates but has traditional DMX channels as well, like strobe, dimmer etc, and also keeps track of its own state somewhat, so it can do basic sorta halogen emulation, or at least expose the A and R in ADSR. And dithering.
Already got things looking unexpectedly awesome so def on the right way :) https://instagram.com/p/BbvWjZtld1O/
Joen Tolgraven
@tolgraven
Also quite hyped on hooking up my Myo over OSC, feels like the combo of general spatial control and the muscle sensors should allow for some very cool (and generally less jerky-robotic-feeling) effects - playing around a bunch with the push-as-mpc-for-lights with full aftertouch etc has really opened my eyes to how much more natural a more physical (or simply musical) workflow can be, and what seems quite possible on this front right the fuck now even with little or no resources.
Joen Tolgraven
@tolgraven
A more specific question: I want to get some simple coherence between lights and VJ projection by basically tapping the video palette in realtime, DIY ambilight style. I'm guessing the first steps would be an off-the-shelf open source solution bounced down to just a few HSL "pixels" (say 10-20) coming in over OSC, feeding a show-bound object of positions across the rig (just X-wise to start), and then a global effect calculating bounds for each point (a la bloom) and pushing the right colors across the rig. Seems sensible, yeah?
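For the mapping step I'm picturing something like this rough sketch (made-up names, nothing tied to afterglow's API yet; the OSC receiving side is assumed to happen elsewhere):

```clojure
;; Sketch: spread a handful of palette "pixels" sampled from the video
;; (each an x position 0.0-1.0 plus an HSL color) across the rig by
;; nearest palette entry. All names here are made up for illustration.
(defn nearest-palette-color
  "Pick the palette entry whose :x is closest to the fixture's x."
  [palette fixture-x]
  (:color (apply min-key #(Math/abs (double (- (:x %) fixture-x))) palette)))

(defn colors-for-rig
  "Map each fixture x position (0.0-1.0) to its nearest palette color."
  [palette fixture-xs]
  (mapv #(nearest-palette-color palette %) fixture-xs))

;; e.g. two palette entries ([hue saturation lightness]) across five fixtures:
(colors-for-rig [{:x 0.1 :color [30 0.8 0.5]}
                 {:x 0.6 :color [200 0.7 0.4]}]
                [0.0 0.25 0.5 0.75 1.0])
;; => [[30 0.8 0.5] [30 0.8 0.5] [200 0.7 0.4] [200 0.7 0.4] [200 0.7 0.4]]
```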
James Elliott
@brunchboy
Wow, that does look awesome indeed! I can’t wait to see what you end up doing with this. Thanks so much for sharing the fact that you are tinkering with this stuff. I’m hoping to find a little time at the end of this month when I am taking time off from work to dust off afterglow a bit and implement a thing or two that have been on the back burner. It’s been too long, and having someone else out there using it is wonderful motivation. :grinning:
That’s an interesting idea, with respect to having the lights tie into the video palette. If I were going to tackle that, I would probably start with Max/MSP’s Jitter objects to work with the video stream, if you have it available (and it sounded as though you might?) just because it’s designed for exactly that sort of tinkering. Then when you can identify a palette (and perhaps vague spatial relationships), yeah, turn it into some blooms or something. Another place you might want to look for inspiration in this regard, outside of Max, would be some of those open-source kits that use Raspberry Pi to create a palette of LED lights behind a television set which tie into the scene being displayed. There is probably some great code you could borrow from that.
James Elliott
@brunchboy
And, duh, I just realized that “ambilight” in your original question is probably talking about the exact same thing—I didn’t realize that was what they were called, so you are already aware of them.
Joen Tolgraven
@tolgraven
Cheers! Yeah felt about time I said something, been overwhelmingly impressed with your project for a good while now, discovered it way back last year while I was looking for (foss) solutions for custom push2 stuff, but only past few months I've gotten to the point of realizing like, yeah this is it, this is the platform. Basically tore the example file down to something very minimal, reintroducing everything bit by bit, trying to not put stuff back until I actually know how and why it does what it does. Just really adore the quasi-repl workflow I get using Fireplace (too late for me to pick up emacs at this point haha, got a 3k LOC vimrc). But yeah got heaps of ideas and expect to make decent progress towards getting into the actual source in the coming months, at least if I manage to not get bogged down on firmware stuff. Fuuuck embedded programming seriously haha, even with OTA flashing and non-hardcoded mqtt-updatable params it's still like, the exact opposite of jumping around a file magically evaling on the fly.
James Elliott
@brunchboy
I am so delighted to have someone sharing this playground, whenever you have time. Do please feel free to ask questions and share ideas and frustrations! And I can quite see how this is the opposite of embedded programming. I have wanted to write something like Afterglow since I was a teenager tinkering with 8 bit, 1 MHz processors. But they just didn’t have the horsepower to support it. Now we have cycles and memory to spare, and can use lovely abstractions like Clojure and REPLs. I’m glad Fireplace works well! Trying to change from a deeply comfortable vim environment to emacs would be insane, just as the reverse would be for me, with my decades of emacs experience and configuration context.
Joen Tolgraven
@tolgraven
So I've taken the launch of Live 10 and the promised M4L efficiency improvements as a cue to dig into the Max side of things. Got afterglow-max set up from snapshot and running in Max 8 without too much hassle, awesome! Got two issues right away tho. Afterglow can't connect to the Push 2 while the script is active in Ableton, nor will the script function if activated after afterglow has connected. I'm guessing it's due to the updates somehow, but it's weird considering they're on separate ports, no?
Joen Tolgraven
@tolgraven
And the other snafu is in .Cue, where certain bangs (originating from e.g. using sel on incoming messages, as opposed to clicks) make it throw, due to usage of the com.cycling74.max MaxObject Atom, which can't be found. Both in Max 8/snapshot and Max 7/0.2.1 precompiled. Any clue why?
James Elliott
@brunchboy
I’m not sure what you mean by separate ports, but I haven’t tried combining Max, afterglow, and Live at the same time, so there may be some contention. I haven’t seen that exception either, but I have Max 7.3.4. Cycling ’74 deprecated the Max/Java binding even before Ableton bought them, so perhaps you are starting to run into breakages?
Joen Tolgraven
@tolgraven
Oh I mean like it's trying and failing to connect to the User port; the Live script uses the Live port. Hmm, yeah, was thinking it was something about the new Max version/binding or a paths issue, but like I said, I tested with Max 7 and your precompiled jar and had the same problem
Joen Tolgraven
@tolgraven
So looking a bit closer at the stacktrace I think what’s happening is that type of bang actually results in the max Atom being passed to afterglow.controllers, which obviously can't resolve it
Joen Tolgraven
@tolgraven
And the push2 error isn't actually the midi port (which makes sense) but from libusb in .openPushDisplay, "USB error 3: Unable to claim interface 0 of Push 2 device: Access denied (insufficient permissions)". Max console errors are a bit scrambled so didn't see the full exception until I ran my regular lein setup, sorry. Any hunch? Might get in touch with ableton whether they've made any breaking changes, because I never tested running ableton+afterglow before updating I can't say whether it was actually working earlier. Happens even in live9 and with a clean set/no M4L loaded so it's definitely either something in the new firmware, or somehow related to my setup...
Actually I should get in touch with them either way, since the failure to claim goes both ways, and unlike Afterglow, the Live log doesn't show any errors when it happens, which definitely ain't right no matter what the root cause is
Joen Tolgraven
@tolgraven
On a less aargh note, I'm looking at using libmapper for signal passing on multiple fronts in my larger project, but seems to me it'd be just as spot on for more robust OSC support in core afterglow. Has Java and max bindings and everything, looks like it's been pretty dead for a long time but just started seeing active development again in the past few months. What do you reckon?
James Elliott
@brunchboy
Oh… trying to share a Push 2 with Ableton running probably won’t work, you’re right, because Ableton has the Push display open, and that’s not something that can be shared like a MIDI port can. I don’t think Ableton has really thought about trying to allow user mode applications to directly write to the display.
When I was testing Push in Max was back before the Push 2 had come out, and was without Ableton Live in any case.
As for libmapper, I don’t know anything about it, I am afraid!
Joen Tolgraven
@tolgraven
Ah ok, my bad. Tho I'm fairly sure I saw a (paid) lib and M4L projects using it that could share the screen and temporarily take over with, like, basically the clip note view Ableton formally implemented in Live 10
James Elliott
@brunchboy
If you can find an API that supports that, please let me know, and I will see about adding it to Wayang!