
James Elliott
@brunchboy
I am starting to document and prepare for the next release, so I am going to assume that the Mk2 is working properly unless I hear otherwise in the next few days. Thanks for your help in figuring it out, @hastebrot
James Elliott
@brunchboy
Actually, @hastebrot , I just greatly changed the way all controller bindings work, so that Afterglow figures out what code to call based on the device identification response it gets back from MIDI, and you no longer need to know which namespace to load. I have been able to test it with all the other controllers I support, but I still lack a Launchpad Mk2. If you could test that the new instructions work for you with the latest 0.2.1-SNAPSHOT code in the next few days I would greatly appreciate it.
Benjamin Gudehus
@hastebrot
I'll have a look at it. :) Give me a day.
James Elliott
@brunchboy
Great, thanks. I have several days of documentation and screen shots and movies to capture, so there is time. I will be splitting the Launchpad (and Ableton Push and now Push 2) instructions into their own pages, so I will post an updated link here if the location changes before you get to it.
Here is an example of what the Afterglow interface on the new Push 2 controller’s color display looks like: https://raw.githubusercontent.com/brunchboy/wayang/master/assets/Example.gif
James Elliott
@brunchboy
The updated Launchpad instructions are now on a separate page. You can find the current Mk2 instructions here.
James Elliott
@brunchboy
@hastebrot , I’ve changed things around again, and made it even easier. Now once you use the sample show, your Mk2 should simply light up and start working on its own, as described here. I just have a few interface movies to capture, and I will be ready to cut this release.
boriskalkex
@boriskalkex
Hello guys, I'm hiring a coder for a quite simple light installation, working with LEDs. I'm wondering if Afterglow is really appropriate for what I have in mind. Can we discuss it?
James Elliott
@brunchboy
Hello, @boriskalkex, without knowing more details about what you have in mind, I can’t really say if afterglow would be appropriate. It might be overkill, but of course, you don’t need to use all of afterglow. The answer depends both on the details of your goals, and the skills and preferences of your coder.
Moritz Bust
@Busti
Hello,
I was wondering if I could get some help setting up my afterglow project.
I have an event coming up on Sunday and I still have no idea what I am doing.
And what I should be doing.
James Elliott
@brunchboy
Have you verified that your DMX interface is compatible with OLA, and confirmed end-to-end communication is working? I would say that is the first step. Then you will need to find or create fixture definitions for any of the lights you want to use, measure out where you are going to hang them, and create your show definition which patches the right fixtures at the right addresses and physical locations and orientations. Then you can start creating cues and assigning them to the grid. Are you using any physical grid controllers, or just the web interface? And are you trying to do any kind of synchronization with Pioneer Pro DJ Link, Ableton Link, or MIDI clock? Trying to do all this for the first time by Sunday may be a bit ambitious, but I will help as much as I can (I am busy at work this week though).
Moritz Bust
@Busti
I currently only have LED pixels connected to OLA, which I tested through the web interface. Since I do not have any fixtures for testing, I can only use my oscilloscope to verify that DMX is working, but it seems fine. I am using a USB serial adapter and a MAX485.
Since we rent the fixtures only for Sunday, I can't test anything but the LED strips before that day, but I might be able to get my hands on some PAR 64 cans for testing tomorrow.
We are using a Pioneer CDJ for DJing, and I would like to use an OSC client on the tablet that sits in my closet all year.
I am doing this show every year, but we were using some proprietary old software before, and I wanted to find something better. I can always use that as a fallback.
I am actually doing another event in July where I seriously want to use OLA and maybe Afterglow to drive ~5000 pixels, and the event on Sunday is a bit of a test to experiment while I have the equipment.
I wanted to use the pay from that event to buy a small moving head that I can use for testing.
I could not get Afterglow to send data to OLA yet, but that's mainly because I have no idea what I am doing.
I also tried setting up a Leiningen project in IntelliJ using Afterglow as a dependency, but there is something wrong with it.
Moritz Bust
@Busti
Also, I followed this guide to get into Clojure a little:
https://learnxinyminutes.com/docs/clojure/
James Elliott
@brunchboy

Those are a lot of things to work on! It would be probably best to tackle them one at a time, but you should definitely have a backup plan ready for Sunday. As long as you are running the OLA daemon on the standard port, on the same machine as Afterglow, it should find it just fine. Otherwise you can specify a host and port with -H and -P.

You will need to configure your show to use the proper universes in OLA. When creating the show, you specify those using the :universes keyword argument, followed by a vector of the OLA universe IDs you want the show to control. For example, if you want your show to work with OLA universes 1, 2, and 5, you would create it like: (show/show :universes [1 2 5] :description "My Show")

Do the LED pixels show up as universes as well? I have not ever tried working with them. Mapping some of your pixels as individual RGB LED fixtures would be a good baby step to start with. But even if you have no lights hooked up, as long as Afterglow is talking to OLA, you can open up the OLA web interface and watch the channel values while you are trying to run cues in Afterglow which affect them. (I just realized that my Open OLA button on the home page will only work if you are running olad on the same machine as Afterglow, but you can manually point a browser to port 9080 on the right machine if you are running it somewhere else.)

Anyway, rather than writing a huge amount here, it would help to know more specifically where you are stuck. I have already written a huge amount of documentation on the Afterglow project page and in the online manual which will be a better reference than what I type here.

Did you install Cursive in IntelliJ IDEA? That is the best way to use Clojure in IntelliJ, according to my colleagues who use IntelliJ. I’m afraid I’ve never tried using Afterglow in Cursive myself, though: I do my Clojure development in Emacs with CIDER, so I won’t be able to be of too much help there.

James Elliott
@brunchboy
I couldn’t see anything that looks like MAX485 in the list of USB/Serial devices that OLA supports, so that is probably the first thing to confirm. If you can’t get your DMX interface working in OLA directly without Afterglow in the mix, then there is no point trying to go beyond that point, except perhaps with your pixel strips, as long as those show up as universes in OLA.
Moritz Bust
@Busti

The MAX485 is just a differential driver IC. It only generates the electrical signal, not the digital part.
It works through the UART pins of the Pi.

The show worked out in some ways for me. The biggest problem is that I just can't warm up to Clojure. It would probably take half a year of getting into it before I could do anything with it.

It was nice looking into Afterglow, though. It seems really great, I just don't understand it all that well.
I have a lot of stress at university right now, so I can't spend that much time digging further.
But thank you very much for all the help :smile:
James Elliott
@brunchboy
That’s quite understandable! Getting deep into Afterglow is a significant commitment, and Clojure is only a part of that. Good luck in your future explorations, wherever they take you, and thanks.
Austin Wright
@awwright
Hey, I just stumbled across this and it looks like I'm doing something extremely similar except in Node.js; I've got a networkable lighting controller I've been designing since 2015 that does timecode sync to PDJL, layer-based effects, and support for MIDI controllers and Art-Net output. I've used it for several events of 50-200 people.
I can watch the CDJ waveform from the lighting control and see ahead of time when the drop really hits, etc.
Hardware is a Linksys router, 2x CDJ-2000NXS, DJM-2000NXS, 2x Lenovo X220 tablet, Behringer X-Touch, Launchpad Mini, an Art-Net to DMX box, and a few other MIDI devices and Ethernet switches.
James Elliott
@brunchboy
That does sound very cool and like we are exploring a lot of similar ideas! I’d love to know more about how you get at the waveforms. Is your source code and documentation on GitHub?
Austin Wright
@awwright
@brunchboy I have it on a private BitBucket. It looks like the DBServer protocol has been more or less figured out by some other people; it's a simple RPC protocol over TCP with typed arguments and responses.
Getting the waveform is one such call only available to clients registered on CDJ channels 1-4
You ask for the waveform of a track ID and it gives back a series of bytes, 5 bits for amplitude and 3 bits for color
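The byte layout described above (5 bits of amplitude, 3 bits of color per waveform byte) can be sketched as a small decoder. Note this is only an illustration of that description: which bits hold which field is an assumption here (amplitude in the low 5 bits), not something confirmed in this conversation.

```python
# Decode waveform bytes as described: each byte packs 5 bits of
# amplitude and 3 bits of color. The bit ordering (amplitude in
# the low 5 bits, color in the high 3) is an assumption.

def decode_waveform_byte(b: int) -> tuple[int, int]:
    """Split one waveform byte into (amplitude, color)."""
    amplitude = b & 0x1F         # low 5 bits: 0-31
    color = (b >> 5) & 0x07      # high 3 bits: 0-7
    return amplitude, color

def decode_waveform(data: bytes) -> list[tuple[int, int]]:
    """Decode a whole waveform response, one (amplitude, color) per byte."""
    return [decode_waveform_byte(b) for b in data]
```

For example, `decode_waveform_byte(0xFF)` yields `(31, 7)` under this assumed layout.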
Another call lets you get the beat grid, with measure beat number and ms offset, which you can use to figure out where in the track the CDJs are playing
It's the only way available to get the position, and it's what the CDJs use internally when you double the playing track from the Master onto another player using the Sync button
Austin Wright
@awwright
I suspect there's an alternate way to get at the metadata that doesn't require you be on channel 1-4; Armin van Buuren travels with a Macbook Air that lets his LD and Resolume techs see what he's playing, and it works on all four channels
It's probably the case that the beatgrid and waveform is pre-downloaded onto that laptop
James Elliott
@brunchboy
Yes, that is what I have always assumed in my own metadata research, that people will need to pre-analyze tracks and cache all the metadata they are going to want to use during a performance if they can’t use a player number from 1 to 4. I have figured out how to get the track text metadata, but not yet the wave form or beat grid. The beat grid especially would be incredibly useful for my users who want timecode. If you ever decide you are willing to contribute this information to the open source world, please let me know, I would love to update the protocol analysis document and my own reference implementation!
The place where I have gone furthest in working with CDJs is in beat-link-trigger which some producers and artists have been using to sync video walls with CDJ tracks in Resolume, and also to sync Ableton Live improv performances with CDJs using the Ableton Link protocol at music festivals.
Austin Wright
@awwright
Also, I suppose there is a way to get at least the track title without being on channels 1-4, because the Kuvo box requests the data from the CDJs and mixer for up to all four channels.
Either it's doing that, or Pioneer is storing track IDs. I tend to doubt it, but I haven't sniffed what Rekordbox is sending to Pioneer.
It'd be helpful if there was some way to tear apart the firmware image, but it doesn't seem to match any known formats. It can't be too esoteric though, it contains a lot of plain text strings.
James Elliott
@brunchboy
I have heard that about Kuvo, but I have never had access to a unit, none of the clubs in Madison have them, so I am puzzled about how it works. I hope someone can sniff one and share the details soon! I would also love to know the format of the packets that you use to request wave forms and the beat grid (and the format of the responses). Can you share that? Is the information people have figured out about DBServer organized and published anywhere, beyond what I have done in the dysentery protocol analysis paper?
Austin Wright
@awwright
I suppose I could publish what I've got
Unfortunately by the looks of it this would be a hard protocol to fuzz
In summary though, there's a 2-byte field for (what I presume to be) the method ID, and there are different methods for things like album art, waveform summary 0x2004 (the 300 px wide version, one byte per pixel), waveform detail 0x2904 (300 pixels per second, one byte per pixel), and beat grid info 0x2204 (uint8 measure_beat, 0x000000, uint32 beat_ms_offset)
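The beat-grid entry layout mentioned above (a uint8 beat-within-measure, three zero bytes, then a uint32 millisecond offset) can be sketched with Python's `struct` module. The 8-byte entry size and little-endian byte order are assumptions for illustration, not confirmed details of the protocol.

```python
import struct

# Parse beat-grid entries in the layout described above:
# uint8 measure_beat, three 0x00 padding bytes, uint32 offset in ms.
# Entry size (8 bytes) and little-endian order are assumptions.
ENTRY = struct.Struct("<B3xI")

def parse_beat_grid(data: bytes) -> list[tuple[int, int]]:
    """Return (beat_within_measure, offset_ms) pairs from a response body."""
    return [ENTRY.unpack_from(data, off)
            for off in range(0, len(data), ENTRY.size)]
```

With a parsed grid like this, matching the player's current beat number against the offsets is what lets software work out where in the track a CDJ is playing, as described earlier in the conversation.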
Austin Wright
@awwright
Eh, I don't have it documented very well, it's all in code
about 3500 lines of it
I'll figure something out over the weekend, I don't get a lot of time to work on this anymore
I also invented a lot of names for things. Method calls can be fragmented into multi-message responses, and each response can have multiple fields, which I called Kibbles.
Austin Wright
@awwright
It's also fun to note that all of the menu drawing for the Link functionality is very low-level and done over the network. The CDJ completely re-requests all the menu items every time it has to scroll the menu.
I call the handshake packets Hello and Sup
Austin Wright
@awwright
The other thing to keep in mind about the protocol is that it's backward compatible with every Pro DJ Link product they've ever put out, so there are kludges upon kludges.
James Elliott
@brunchboy
I noticed that about the menu drawing when I was looking at network captures a few months ago too, and laughed. Thanks so much for sharing this! Skimming your protocol.txt notes, it looks like we went through incredibly similar research paths. I can see I have a bunch more experimenting ahead of me. I have pulled together my own notes here: https://github.com/brunchboy/dysentery/blob/master/doc/Analysis.pdf
James Elliott
@brunchboy
It’s crazy how months can go by with no activity in my open source projects, and then everything happens at once. I have not had a chance to dig into your insights yet because I just received some breakthrough help that is going to enable a version 1.0 release of CoreMidi4J, which lets MIDI work properly in Java on the Mac. But I am so impatient to do new tricks with my CDJs!
James Elliott
@brunchboy
Just looking at dbstruct.js now that I’ve been able to release CoreMidi4J 1.0, and starting to see the Kibble boundaries you identified, I can see there are going to be some great improvements to the packet analysis PDF! How would you like to be credited? :grin: