    Chinmay Pendharkar
    @notthetup
    so you want to stream from the browser to icecast using the webcast
    Jefferson de Andrade Santos
    @jersobh
    Many people said it was impossible at irc@freenode
    but there it is :)
    Chinmay Pendharkar
    @notthetup
    so browser (webcast) -> websockets -> node server -> icecast
    Jefferson de Andrade Santos
    @jersobh
    now I want browser -> nodeserver || nodeserver (alone) -> icecast
    But I'm just about to switch to Python
    It has a more mature ecosystem for this, I guess
    Chinmay Pendharkar
    @notthetup
    i think it's not a question of maturity
    i think you're looking in the wrong place
    webaudio API wasn't designed for streaming audio
    webcast is kind of a hack on top of ScriptProcessorNode
    Jefferson de Andrade Santos
    @jersobh
    but is there another way besides the web audio api?
    Chinmay Pendharkar
    @notthetup
    on which side? browser or server?
    Jefferson de Andrade Santos
    @jersobh
    server's
    Chinmay Pendharkar
    @notthetup
    on browser side there isn't much you can do.. but I believe your issue isn't on browser side
    on server side there is no real webaudio api
    Jefferson de Andrade Santos
    @jersobh
    nope, it works well
    Chinmay Pendharkar
    @notthetup
    webaudio api is a browser api. it's supported by Chrome, FF, Edge
    Jefferson de Andrade Santos
    @jersobh
    no problem if I can play audio on icecast :)
    Chinmay Pendharkar
    @notthetup
    it's a browser API, NOT part of JS itself, so Node doesn't have it
    Jefferson de Andrade Santos
    @jersobh
    with node
    Chinmay Pendharkar
    @notthetup
    so on server side, all you need is websockets -> icecast right?
    just use some basic node streams and you should be good
    Jefferson de Andrade Santos
    @jersobh
    yep, usually I receive the mic streams and mix them together
    Chinmay Pendharkar
    @notthetup
    you can do mixing and volume changing using things like https://www.npmjs.com/browse/keyword/audio-mixer
    I have a fork that does some extra things like gain https://github.com/notthetup/audio-mixer
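For reference, the core of what such a mixer does is just summing interleaved 16-bit PCM samples with a per-input gain and clamping. A hand-rolled sketch of that idea, not the audio-mixer package's actual API:

```javascript
// Mix several buffers of interleaved signed 16-bit little-endian PCM.
// `gains` is a parallel array of per-input gain factors (defaults to 1).
function mixPcm16(buffers, gains = []) {
  const len = Math.max(...buffers.map((b) => b.length));
  const out = Buffer.alloc(len);
  for (let i = 0; i < len; i += 2) {
    let sum = 0;
    buffers.forEach((buf, n) => {
      if (i + 1 < buf.length) sum += buf.readInt16LE(i) * (gains[n] ?? 1);
    });
    // Clamp to the signed 16-bit range to avoid wrap-around distortion.
    out.writeInt16LE(Math.max(-32768, Math.min(32767, Math.round(sum))), i);
  }
  return out;
}
```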
    Jefferson de Andrade Santos
    @jersobh
    I see
    I'll take a look, thank you bro
    Chinmay Pendharkar
    @notthetup
    no problem
    JomoPipi
    @JomoPipi
    are people still here
    Deschutter
    @Deschutter
    Hi!
    I need to control several Web Audio API parameters live from another location
    not audio streaming, just parameter data
    so for instance I could move a slider in one location and the synth on the website that holds that Web Audio API graph changes its parameters in real time
    how could I do that?
    JomoPipi
    @JomoPipi
    @Deschutter What have you tried?
    Steve Mason
    @hysteve
    @Deschutter I'm trying to build something similar, using the npm midi library to create a virtual MIDI instrument whose CC values other DAWs can use. The lib runs on a Node.js server, so a UI requires the server to serve a webpage and handle events. For this, I built an express server that hosts the MIDI instrument and a single-page web app, with a websocket connection. This allows the server and client to message each other over an open stream.
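A sketch of that server-side hop: a slider value arriving over the websocket is turned into a MIDI Control Change message and sent out a virtual port. The controller/channel numbers are made-up placeholders, and the node-midi wiring is left in a comment since it needs the native module:

```javascript
// Map a normalized slider value (0..1) to a 3-byte MIDI CC message.
function sliderToCC(value, controller = 1, channel = 0) {
  // Status byte 0xB0 = Control Change on channel 0; data bytes are 7-bit.
  const data = Math.max(0, Math.min(127, Math.round(value * 127)));
  return [0xb0 | channel, controller, data];
}

// Wiring it up (requires the native `midi` module and a WebSocket server):
//   const midi = require('midi');
//   const out = new midi.Output();
//   out.openVirtualPort('web-synth');   // DAWs see this as a MIDI device
//   sock.on('message', (msg) =>
//     out.sendMessage(sliderToCC(JSON.parse(msg).value)));
```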
    Steve Mason
    @hysteve

    The most difficult part of this is signal processing on the Node server, which is what the Web Audio API is designed for. Doing this yourself would be extremely complicated: you have to manage node connections and then resolve the values of all the node outputs using some kind of efficient dependency resolution. That needs to happen at some sample rate, where each pass produces just a slice of signal output in the stream, and the sample rate (or final quality) of the signal depends on how quickly each pass can run on your hardware. This is what the Web Audio API does, with hardware acceleration and all that.
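That dependency resolution can be sketched as a tiny pull-based graph: each node asks its inputs for one sample, with per-tick memoization so a node feeding several others isn't evaluated twice. A toy illustration only; a real engine processes blocks of samples, not one at a time, and all names/values below are invented:

```javascript
class GraphNode {
  constructor(fn, inputs = []) { this.fn = fn; this.inputs = inputs; }
  // Resolve this node's output for one sample tick, memoizing per tick.
  pull(tick, cache = new Map()) {
    if (!cache.has(this)) {
      const ins = this.inputs.map((n) => n.pull(tick, cache));
      cache.set(this, this.fn(tick, ins));
    }
    return cache.get(this);
  }
}

// Toy graph: two oscillators summed through a gain stage.
const sampleRate = 8000;
const oscA = new GraphNode((t) => Math.sin(2 * Math.PI * 440 * t / sampleRate));
const oscB = new GraphNode((t) => Math.sin(2 * Math.PI * 220 * t / sampleRate));
const gain = new GraphNode((t, [a, b]) => 0.5 * (a + b), [oscA, oscB]);

// One "pass" per sample: render a 128-sample block.
const block = Array.from({ length: 128 }, (_, t) => gain.pull(t));
```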

    In a perfect world, the Node server would be processing the audio signal chain and streaming those values out through the MIDI stream. The client UI would just receive messages on the websocket about the state of the machine on the server and update the UI accordingly. The websocket introduces potential latency, even though it's probably <1ms with everything running locally. With the client disconnected from the MIDI stream source on the server, you don't want the client producing the signals that get streamed out to other DAWs, etc., because then the sample rate is limited by the websocket's network transfer latency. That's terrible.

    So in this perfect world, there's a Node.js implementation of the Web Audio API that can process and stream complex, node-based audio signal chains with hardware acceleration.

    The node-midi library is a js wrapper for a C++ library called RtMidi. I might look for other C++ libraries that similarly provide audio signal chain processing and see about js wrappers.