
Benjamin Gudehus
@hastebrot
I'll make a PR :)
James Elliott
@brunchboy
Great!
Benjamin Gudehus
@hastebrot
Do you have a contributors guideline? Or I'll just put these changes into two commits in a new branch and create the PR.
James Elliott
@brunchboy
Yeah, these changes are simple and clear, so that sounds like a good approach.
I do not yet have contributor guidelines, but we both know what these changes are about. In the commit messages, say something like “correct error in reference manual” for the first, and “correct copy/paste error” for the second.
I have a house full of people practicing LED glove performances, and the pizza has just arrived. An interesting Thursday. :grinning:
Benjamin Gudehus
@hastebrot
Bon appetit!
Benjamin Gudehus
@hastebrot
So the PR was sent. Have fun, and thanks for the help. I learned a lot.
James Elliott
@brunchboy
I just merged it! Thank you for helping figure this out. I’ll be happy to continue to help as you explore other areas of Afterglow. I am also happy to see how well it seems to work in Linux.
I will look at photos of the Mk2 and think about where to move the non-arrow round buttons. If you have any thoughts, please let me know. They are the tap-tempo, shift, stop (starts and stops sending DMX for the show) and user-mode buttons (suspends the mapping to let you switch between it and Ableton Live).
James Elliott
@brunchboy
Ok, having found a photo of the Mk2, I have moved some round buttons around. Other than the arrows, the only button active along the top is now the User 2 button, which suspends and reactivates the mapping. There was not a great place to put Click (Tap Tempo) or Shift, so I have made Volume be Tap Tempo, and Pan be Shift. Stop has moved to the button which is actually labeled Stop.
Since I can’t actually test these things, I am a bit nervous about having broken something. If you have a chance to pull the latest code and give it a try, I would appreciate it!
James Elliott
@brunchboy
I guess I should mention @hastebrot to increase the chances of you seeing my question.
James Elliott
@brunchboy
I have moved the buttons again slightly because after studying the Mini and S decals and photos, I can make the Mk2, Mini, and S all use the same round button mapping, and it makes slightly more sense than I described above. When you have time to test, it should be as documented here (follow the link to the photo of the Mini).
Benjamin Gudehus
@hastebrot
I hope to find time this evening to test this out.
James Elliott
@brunchboy
Thanks! There is no hurry; I just wanted to be sure that you eventually saw the question.
James Elliott
@brunchboy
I am starting to document and prepare for the next release, so I am going to assume that the Mk2 is working properly unless I hear otherwise in the next few days. Thanks for your help in figuring it out, @hastebrot
James Elliott
@brunchboy
Actually, @hastebrot , I just greatly changed the way all controller bindings work, so that Afterglow figures out what code to call based on the device identification response it gets back from MIDI, and you no longer need to know which namespace to load. I have been able to test it with all the other controllers I support, but I still lack a Launchpad Mk2. If you could test that the new instructions work for you with the latest 0.2.1-SNAPSHOT code in the next few days I would greatly appreciate it.
Benjamin Gudehus
@hastebrot
I'll have a look at it. :) Give me a day.
James Elliott
@brunchboy
Great, thanks. I have several days of documentation and screen shots and movies to capture, so there is time. I will be splitting the Launchpad (and Ableton Push and now Push 2) instructions into their own pages, so I will post an updated link here if the location changes before you get to it.
Here is an example of what the Afterglow interface on the new Push 2 controller’s color display looks like: https://raw.githubusercontent.com/brunchboy/wayang/master/assets/Example.gif
James Elliott
@brunchboy
The updated Launchpad instructions are now on a separate page. You can find the current Mk2 instructions here.
James Elliott
@brunchboy
@hastebrot , I’ve changed things around again, and made it even easier. Now once you use the sample show, your Mk2 should simply light up and start working on its own, as described here. I just have a few interface movies to capture, and I will be ready to cut this release.
boriskalkex
@boriskalkex
Hello guys, I'm hiring a coder for a quite simple light installation, working with LEDs. I'm wondering if Afterglow is really appropriate for what I have in mind. Can we discuss it?
James Elliott
@brunchboy
Hello, @boriskalkex, without knowing more details about what you have in mind, I can’t really say if afterglow would be appropriate. It might be overkill, but of course, you don’t need to use all of afterglow. The answer depends both on the details of your goals, and the skills and preferences of your coder.
Moritz Bust
@Busti
Hello,
I was wondering if I could get some help setting up my afterglow project.
I have an event coming up on Sunday and I still have no idea what I am doing, or what I should be doing.
James Elliott
@brunchboy
Have you verified that your DMX interface is compatible with OLA, and confirmed end-to-end communication is working? I would say that is the first step. Then you will need to find or create fixture definitions for any of the lights you want to use, measure out where you are going to hang them, and create your show definition which patches the right fixtures at the right addresses and physical locations and orientations. Then you can start creating cues and assigning them to the grid. Are you using any physical grid controllers, or just the web interface? And are you trying to do any kind of synchronization with Pioneer Pro DJ Link, Ableton Link, or MIDI clock? Trying to do all this for the first time by Sunday may be a bit ambitious, but I will help as much as I can (I am busy at work this week though).
Moritz Bust
@Busti
I currently only have LED pixels connected to OLA, which I tested through the web interface. Since I do not have any fixtures for testing, I can only use my oscilloscope to verify that DMX is working, but it seems fine. I am using a USB serial adapter and a MAX485.
Since we rent the fixtures only for Sunday, I can't test anything but the LED strips before that day, but I might be able to get my hands on some PAR 64 cans for testing tomorrow.
We are using a Pioneer CDJ for DJing, and I would like to use an OSC client on the tablet that sits in my closet all year.
I am doing this show every year, but we were using some proprietary old software before and I wanted to find something better. I can always use that as a fallback.
I am actually doing another event in July where I seriously want to use OLA and maybe Afterglow to drive ~5000 pixels, and the event on Sunday is a chance to experiment a bit while I have the equipment.
I wanted to use the pay from that event to buy a small moving head that I can use for testing.
I could not get Afterglow to send data to OLA yet, but that's mainly because I have no idea what I am doing.
I also tried setting up a Leiningen project in IntelliJ using Afterglow as a dependency, but there is something wrong with it.
Moritz Bust
@Busti
Also, I followed this guide to get into Clojure a little:
https://learnxinyminutes.com/docs/clojure/
James Elliott
@brunchboy

Those are a lot of things to work on! It would be probably best to tackle them one at a time, but you should definitely have a backup plan ready for Sunday. As long as you are running the OLA daemon on the standard port, on the same machine as Afterglow, it should find it just fine. Otherwise you can specify a host and port with -H and -P.

You will need to configure your show to use the proper universes in OLA. When creating the show, you specify those using the :universes keyword argument, followed by a vector of the OLA universe IDs you want the show to control. For example, if you want your show to work with OLA universes 1, 2, and 5, you would create it like: (show/show :universes [1 2 5] :description "My Show")
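Expanded into a form you could paste into a REPL, that inline example looks like this (the require line and the var name are illustrative; only show/show and its :universes and :description arguments come from the explanation above):

```clojure
;; Create a show which controls OLA universes 1, 2, and 5.
;; The namespace alias follows the usual Afterglow convention.
(require '[afterglow.show :as show])

(def my-show
  (show/show :universes [1 2 5]
             :description "My Show"))
```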

Do the LED pixels show up as universes as well? I have not ever tried working with them. Mapping some of your pixels as individual RGB LED fixtures would be a good baby step to start with. But even if you have no lights hooked up, as long as Afterglow is talking to OLA, you can open up the OLA web interface and watch the channel values while you are trying to run cues in Afterglow which affect them. (I just realized that my Open OLA button on the home page will only work if you are running olad on the same machine as Afterglow, but you can manually point a browser to port 9080 on the right machine if you are running it somewhere else.)

Anyway, rather than writing a huge amount here, it would help to know more specifically where you are stuck. I have already written a huge amount of documentation on the Afterglow project page and in the online manual which will be a better reference than what I type here.

Did you install Cursive in IntelliJ IDEA? That is the best way to use Clojure in IntelliJ, according to my colleagues who use IntelliJ. I’m afraid I’ve never tried using Afterglow in Cursive myself, though: I do my Clojure development in Emacs with CIDER, so I won’t be able to be of too much help there.

James Elliott
@brunchboy
I couldn’t see anything that looks like MAX485 in the list of USB/Serial devices that OLA supports, so that is probably the first thing to confirm. If you can’t get your DMX interface working in OLA directly without Afterglow in the mix, then there is no point trying to go beyond that point, except perhaps with your pixel strips, as long as those show up as universes in OLA.
Moritz Bust
@Busti

The MAX485 is just a differential driver IC: it generates the electrical signal, not the digital part. It works through the UART pins of the Pi.

The show worked out in some ways for me. The biggest problem is that I just can't get comfortable with Clojure. It would probably take half a year of getting into it before I could do anything with it.

It was nice looking into Afterglow, though. It seems really great; I just do not understand it all that much.
I have a lot of stress at university right now, so I can't spend that much time digging further.
But thank you very much for all the help :smile:
James Elliott
@brunchboy
That’s quite understandable! Getting deep into Afterglow is a significant commitment, and Clojure is only a part of that. Good luck in your future explorations, wherever they take you, and thanks.
Austin Wright
@awwright
Hey, I just stumbled across this and it looks like I'm doing something extremely similar except in Node.js; I've got a networkable lighting controller I've been designing since 2015 that does timecode sync to PDJL, layer-based effects, and support for MIDI controllers and Art-Net output. I've used it for several events of 50-200 people.
I can watch the CDJ waveform from the lighting control and see ahead of time when the drop really hits, etc.
Hardware is a Linksys router, 2x CDJ-2000NXS, DJM-2000NXS, 2x Lenovo X220 tablet, Behringer X-Touch, Launchpad Mini, an Art-Net to DMX box, and a few other MIDI devices and Ethernet switches.
James Elliott
@brunchboy
That does sound very cool and like we are exploring a lot of similar ideas! I’d love to know more about how you get at the waveforms. Is your source code and documentation on GitHub?
Austin Wright
@awwright
@brunchboy I have it in a private Bitbucket repository. It looks like the DBServer protocol has been more or less figured out by some other people; it's a simple RPC protocol over TCP with typed arguments and responses.
Getting the waveform is one such call only available to clients registered on CDJ channels 1-4
You ask for the waveform of a track ID and it gives back a series of bytes, 5 bits for amplitude and 3 bits for color
Another call lets you get the beat grid, with measure beat number and ms offset, which you can use to figure out where in the track the CDJs are playing
It's the only way available to get the position, and it's what the CDJs use internally when you double the playing track from the Master onto another player using the Sync button
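To make that byte layout concrete, here is a hypothetical sketch in Clojure (Afterglow's language) of unpacking one waveform byte. Which bits hold the amplitude versus the color is an assumption, since the description above does not specify the order:

```clojure
;; Hypothetical decoder for one waveform byte: 5 bits of amplitude and
;; 3 bits of color. Assumes the high five bits are amplitude; the real
;; bit order would need to be confirmed against captured data.
(defn decode-waveform-byte
  [b]
  {:amplitude (bit-shift-right (bit-and b 0xF8) 3)  ; high 5 bits, 0-31
   :color     (bit-and b 0x07)})                    ; low 3 bits, 0-7
```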
Austin Wright
@awwright
I suspect there's an alternate way to get at the metadata that doesn't require you to be on channels 1-4; Armin van Buuren travels with a MacBook Air that lets his LD and Resolume techs see what he's playing, and it works on all four channels.
It's probably the case that the beat grid and waveform are pre-downloaded onto that laptop.
James Elliott
@brunchboy
Yes, that is what I have always assumed in my own metadata research, that people will need to pre-analyze tracks and cache all the metadata they are going to want to use during a performance if they can’t use a player number from 1 to 4. I have figured out how to get the track text metadata, but not yet the wave form or beat grid. The beat grid especially would be incredibly useful for my users who want timecode. If you ever decide you are willing to contribute this information to the open source world, please let me know, I would love to update the protocol analysis document and my own reference implementation!
The place where I have gone furthest in working with CDJs is in beat-link-trigger which some producers and artists have been using to sync video walls with CDJ tracks in Resolume, and also to sync Ableton Live improv performances with CDJs using the Ableton Link protocol at music festivals.