Hmm, perhaps I'll create a UI library for NAudio. Another question: is it possible to read a Speex file from a path with NSpeex? I've only seen implementations that compress or decompress raw data with NSpeex.
So how does the build/release process work? I'm trying to investigate #176
Nevermind. Looks like I figured it out. Thanks.
Hi everyone! Is NAudio capable of microtonal pitch-shifting? I have an old recording I want to add accompaniments to, and I'd like to fix its pitch first.
Hi, I am new to NAudio. I saw a tutorial on YouTube on using it. It plays MP3 files perfectly but throws an exception when playing a .wav file: [An unhandled exception of type 'System.ApplicationException' occurred in NAudio.dll. Additional information: Only PCM supported]. It occurs on the line where the WaveChannel32 object is initialized from the WaveFileReader object.
Any help is highly appreciated.
WaveChannel32 expects 16-bit PCM input, so check the WaveFormat property of the WaveFileReader class. Also, it sounds like that tutorial is quite old; AudioFileReader may be a better choice, replacing both WaveFileReader and WaveChannel32.
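A minimal sketch of the suggested approach, assuming a hypothetical file path: AudioFileReader opens MP3 and WAV alike (converting non-PCM WAV internally) and exposes float samples with a built-in Volume property.

```csharp
using System;
using NAudio.Wave;

// "input.wav" is a placeholder path for this sketch
using var reader = new AudioFileReader("input.wav");
using var output = new WaveOutEvent();
output.Init(reader);
output.Play();
// Volume is available directly, no WaveChannel32 needed
reader.Volume = 0.8f;
Console.ReadLine(); // keep the process alive while audio plays
```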
@markheath Can you suggest a better tutorial, please? Thanks for your efforts.
Hi, I posted this on the NAudio/Vorbis Gitter, but since it seems only a few people still read it, I'm also posting it here. I'm having a problem with VorbisWaveReader: I have an ogg file and have confirmed several times that it is in fact an ogg file, but for some reason VorbisWaveReader throws "could not determine container type". Does anyone know what could be happening? The project compiles OK and doesn't give any errors. I'm trying to convert the ogg file to a wav file.
Hey, I have a problem. I play audio files with this solution: http://naudio.codeplex.com/wikipage?title=MP3. I want the next song to fade in as soon as the current song finishes. My first thought was to use two IWavePlayers and two AudioFileReaders, but then I ran into the problem that I can't react to the end of the song at runtime. Anybody got any ideas?
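One way to react to the end of a song is the PlaybackStopped event on the player; a sketch, assuming placeholder file names, that starts the next track from that event (a fade-in could then be layered on with FadeInOutSampleProvider):

```csharp
using System;
using NAudio.Wave;
using NAudio.Wave.SampleProviders;

var output = new WaveOutEvent();
var reader = new AudioFileReader("song1.mp3"); // placeholder path

// Fires when the reader reaches the end of the file (or Stop() is called)
output.PlaybackStopped += (s, e) =>
{
    reader.Dispose();
    var next = new AudioFileReader("song2.mp3"); // placeholder path
    var fade = new FadeInOutSampleProvider(next, true);
    fade.BeginFadeIn(2000); // 2-second fade-in
    output.Init(fade);
    output.Play();
};

output.Init(reader);
output.Play();
Console.ReadLine(); // keep the process alive
```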
Hi, I saw in the WaveFormat class that the format length value is a minimum of 18. I understand this is because of the additional extraSize field. However, in the case of WaveFormatEncoding.Pcm this value is always 0. Why is the extraSize field always added? Should a WAVE format always have this extra field even if it's not used?
Julio César Rocha
Hey guys. NAudio semi-related question. Does anyone know if UWP's AudioGraph can play AICF format?
I very much doubt it. It likely can play only the formats that MediaFoundationReader in NAudio supports
Julio César Rocha
Do you know if AudioGraph has some extension mechanism, so that it could be achieved via codecs?
Hi, can anyone explain how to get microphone device peak values using NAudio? I've tried several snippets and nothing worked perfectly.
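A sketch of one common approach, assuming 16-bit PCM capture: read each buffer from WaveInEvent's DataAvailable callback and track the largest absolute sample value.

```csharp
using System;
using NAudio.Wave;

// Capture from the default microphone at 44.1 kHz, 16-bit mono
var waveIn = new WaveInEvent { WaveFormat = new WaveFormat(44100, 16, 1) };

waveIn.DataAvailable += (s, e) =>
{
    float peak = 0;
    // Buffer holds little-endian 16-bit samples, 2 bytes each
    for (int i = 0; i < e.BytesRecorded; i += 2)
    {
        short sample = BitConverter.ToInt16(e.Buffer, i);
        peak = Math.Max(peak, Math.Abs(sample / 32768f));
    }
    Console.WriteLine($"Peak: {peak:F3}"); // 0.0 (silence) .. 1.0 (full scale)
};

waveIn.StartRecording();
Console.ReadLine(); // record until Enter is pressed
waveIn.StopRecording();
```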
I'm trying to load a SoundFont, but I'm not quite sure how to use it. I have to admit I know very little about the format. In my case I'd like to get the audio samples for each note, and which segment of the sample is the loopable part. I was able to load a SoundFont just fine, but the SampleData field is a byte array; what should I do with that to get the actual audio samples for each note?
PS: I do have pitch shifting working. I know the soundfont doesn't have an actual wav for each note, but if I can get the base notes then I can pitch-shift those easily.
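A sketch under the assumption (true for typical SF2 files) that SampleData is one contiguous block of 16-bit PCM, with each SampleHeader giving Start/End and StartLoop/EndLoop offsets in samples into that block:

```csharp
using System;
using NAudio.SoundFont;

var sf = new SoundFont("instrument.sf2"); // placeholder path

foreach (SampleHeader h in sf.SampleHeaders)
{
    int length = (int)(h.End - h.Start);
    var samples = new float[length];
    // Each 16-bit sample occupies 2 bytes in SampleData
    for (int i = 0; i < length; i++)
    {
        short s = BitConverter.ToInt16(sf.SampleData, ((int)h.Start + i) * 2);
        samples[i] = s / 32768f; // normalize to [-1, 1]
    }
    Console.WriteLine(
        $"{h.SampleName}: {length} samples, " +
        $"loop {h.StartLoop - h.Start}..{h.EndLoop - h.Start}, " +
        $"root key {h.OriginalPitch}, rate {h.SampleRate}");
}
```

OriginalPitch gives the MIDI note the sample was recorded at, which is the reference point for pitch-shifting to other notes.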
Is it possible to use NAudio with Vorbis to resample and convert .wav files to .ogg files?
Hello. Is it possible to record at a specific position with NAudio?
@markheath How do I write audio at a specific position? For example, I want to create an MP3 from two audio files, one starting at 5 seconds and the other at 10 seconds.
@markheath Never mind, I found the OffsetSampleProvider class.
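A sketch of the OffsetSampleProvider approach mentioned above, with placeholder file names; note this writes a WAV (producing an actual MP3 would additionally need an encoder such as NAudio.Lame or MediaFoundationEncoder), and MixingSampleProvider assumes all inputs share the same WaveFormat:

```csharp
using System;
using NAudio.Wave;
using NAudio.Wave.SampleProviders;

using var a = new AudioFileReader("a.mp3"); // placeholder paths
using var b = new AudioFileReader("b.mp3");

// DelayBy prepends silence, shifting each track's start time
var delayedA = new OffsetSampleProvider(a) { DelayBy = TimeSpan.FromSeconds(5) };
var delayedB = new OffsetSampleProvider(b) { DelayBy = TimeSpan.FromSeconds(10) };

// Inputs must have matching sample rate and channel count
var mix = new MixingSampleProvider(new[] { delayedA, delayedB });
WaveFileWriter.CreateWaveFile16("mixed.wav", mix);
```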
I see that we can use WaveOut in .NET Core 3.1, but WaveIn isn't an option? In other words, we can play back but not record... this seems like an oversight. To clarify, I specifically want to build a WinForms application using .NET Core 3.1 targeting Windows so that I can ultimately have a "single file executable" with a "faster startup time", where there's a reduced chance of some external force (Microsoft) breaking my "stuff" through an update that is out of my control... all in addition to simply being part of a continuing ecosystem. So to clarify, this has nothing to do with cross-platform development; that said, that's not to exclude it as a future possibility. ;-)
@markheath How do I write a "wma" format file with NAudio in C#?
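On Windows, NAudio's Media Foundation wrapper can encode WMA; a sketch with a placeholder input path (requires a Windows machine with the Media Foundation WMA encoder available):

```csharp
using NAudio.MediaFoundation;
using NAudio.Wave;

// Initialize Media Foundation before using its encoders
MediaFoundationApi.Startup();

using (var reader = new AudioFileReader("input.wav")) // placeholder path
{
    // Third argument is the desired bitrate in bits per second
    MediaFoundationEncoder.EncodeToWma(reader, "output.wma", 128000);
}

MediaFoundationApi.Shutdown();
```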
For anyone who may have the same problem I mentioned about a month ago... I resolved it by wedging the recording bits into my application (and in the process, as I couldn't help myself, I also converted them to VB; it still relies on the existing NAudio library for everything I didn't have to "wedge"). Things seem to be working pretty well thus far.
@markheath How do I use WaveInEvent (BufferMilliseconds, NumberOfBuffers, DataAvailable) for recording a UDP multicast stream? I am creating a desktop app using .NET Core, MumbleSharp, and Mumble Server, where I can record data from the sound device using WaveInEvent, but I need to send the stream from multicast to the Mumble server using WaveInEvent.
Hello all, how do I get my microphone output volume in decibels using NAudio in C#?
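A sketch of one way to do this, assuming 16-bit PCM capture: take the peak of each buffer and convert it to decibels relative to full scale (dBFS), where 0 dB is the loudest possible and quieter signals are negative.

```csharp
using System;
using NAudio.Wave;

var waveIn = new WaveInEvent { WaveFormat = new WaveFormat(44100, 16, 1) };

waveIn.DataAvailable += (s, e) =>
{
    float peak = 0;
    for (int i = 0; i < e.BytesRecorded; i += 2)
        peak = Math.Max(peak, Math.Abs(BitConverter.ToInt16(e.Buffer, i) / 32768f));

    // dBFS = 20 * log10(amplitude); silence maps to -infinity
    double db = peak > 0 ? 20 * Math.Log10(peak) : double.NegativeInfinity;
    Console.WriteLine($"{db:F1} dBFS");
};

waveIn.StartRecording();
Console.ReadLine(); // record until Enter is pressed
waveIn.StopRecording();
```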
Hi, I'd like to visualize a wav file as a chart in a web UI. How can I read the data points out of the wav file?
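One common sketch for this, with a placeholder file name: read the file as floats via AudioFileReader and reduce each fixed-size block to its peak value, giving a manageable number of points to serialize for the web chart.

```csharp
using System;
using System.Collections.Generic;
using NAudio.Wave;

using var reader = new AudioFileReader("input.wav"); // placeholder path

// One peak value per 1/100th of a second of audio
int blockSize = reader.WaveFormat.SampleRate / 100;
var buffer = new float[blockSize];
var points = new List<float>();

int read;
while ((read = reader.Read(buffer, 0, blockSize)) > 0)
{
    float peak = 0;
    for (int i = 0; i < read; i++)
        peak = Math.Max(peak, Math.Abs(buffer[i])); // samples are in [-1, 1]
    points.Add(peak);
}

// points can now be serialized (e.g. as JSON) and sent to the browser
Console.WriteLine($"{points.Count} data points");
```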
Julio César Rocha
Can someone please confirm whether WaveMixerStream32 is now cross-platform? I know it's in the Core assembly, but will it work at runtime on, say, Android?
Julio César Rocha
Never mind, I've confirmed it works.
By the way, if anyone ever needs an Android implementation of IWavePlayer, someone on GitHub made one: