Stream audio from a mobile microphone in Xamarin to an external system.

Phil 166 Reputation points
2022-09-05T09:10:31.94+00:00

Hi there
I want to stream audio from a device's microphone over a network (I was thinking of using SignalR to achieve this).

I have downloaded the "Plugin.AudioRecorder" NuGet package. This allows me to use the microphone to record sound and then play it back once the recording is completed.
It also lets you get the recorded sound as a stream, which it specifically says can be used for streaming. Details of the NuGet package can be found here: https://github.com/NateRickard/Plugin.AudioRecorder

I was hoping someone could point me in the correct direction on how to stream using this functionality.

My idea is to have a timer in the Xamarin form that would fire every second.

So every second it would get the stream (possibly converted to a byte array) and send that to the API. Clients connected to the API would then be able to load the data.
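To make the server side of that idea concrete, this is roughly the hub I have in mind (a sketch only; AudioHub, SendAudioChunk and ReceiveAudioChunk are placeholder names I made up, not anything from the plugin):

    using Microsoft.AspNetCore.SignalR;
    using System.Threading.Tasks;

    // Hypothetical hub: receives a chunk from the recording device and
    // rebroadcasts it to every other connected client.
    public class AudioHub : Hub
    {
        public Task SendAudioChunk(byte[] chunk)
        {
            // Forward the raw bytes to all listeners except the sender.
            return Clients.Others.SendAsync("ReceiveAudioChunk", chunk);
        }
    }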

As the audio records on the device the stream grows in size (as expected). Should I keep track of how much I have already sent so that I only send the new sections of the stream, or could I send the entire stream each time and have the player use timestamps to keep track of where it is?
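If only the new bytes should go up each tick, I assume the bookkeeping would look something like this (untested sketch; sendChunkAsync is just a placeholder for whatever eventually pushes the bytes over SignalR):

    long bytesSent = 0;  // how much of the recording has already been pushed

    async Task SendNewAudioAsync(System.IO.Stream audioStream, Func<byte[], Task> sendChunkAsync)
    {
        // Skip over the part of the stream that was sent on previous ticks.
        audioStream.Seek(bytesSent, System.IO.SeekOrigin.Begin);

        var remaining = (int)(audioStream.Length - bytesSent);
        if (remaining <= 0)
            return;

        var buffer = new byte[remaining];
        var read = await audioStream.ReadAsync(buffer, 0, remaining);
        bytesSent += read;

        // Trim the buffer if the read came up short.
        if (read < buffer.Length)
            Array.Resize(ref buffer, read);

        // sendChunkAsync stands in for the actual SignalR call.
        await sendChunkAsync(buffer);
    }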

The plugin also has a function for writing a WAV header. My understanding is that this wraps the audio with the data required to make it a valid WAV file. The result can be copied to a byte array, so I could send the audio data with a WAV header attached.
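For my own understanding of what that header holds, this is roughly the standard 44-byte PCM WAV header (a sketch of the RIFF layout only, not the plugin's implementation):

    static void WritePcmWavHeader(System.IO.Stream output, int channels, int sampleRate, int bitsPerSample, int dataLength)
    {
        var ascii = System.Text.Encoding.ASCII;
        int byteRate = sampleRate * channels * bitsPerSample / 8;
        short blockAlign = (short)(channels * bitsPerSample / 8);

        using (var w = new System.IO.BinaryWriter(output, ascii, leaveOpen: true))
        {
            w.Write(ascii.GetBytes("RIFF"));
            w.Write(36 + dataLength);              // total file size minus the first 8 bytes
            w.Write(ascii.GetBytes("WAVE"));
            w.Write(ascii.GetBytes("fmt "));
            w.Write(16);                           // size of the fmt chunk
            w.Write((short)1);                     // 1 = uncompressed PCM
            w.Write((short)channels);
            w.Write(sampleRate);
            w.Write(byteRate);
            w.Write(blockAlign);
            w.Write((short)bitsPerSample);
            w.Write(ascii.GetBytes("data"));
            w.Write(dataLength);                   // the raw samples follow immediately after
        }
    }

As far as I can tell, plain PCM WAV has no keyframes; it is just this header followed by raw samples, which is partly why I am asking whether one header at the start is enough.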

Does audio require certain keyframes to work?

Is this a sensible way to achieve this?

Does a timer make sense for sending the data from a stream?
Can I just send the raw data, or will the WAV header be needed to read it back?
Will devices receiving the data be able to play it, or is some additional keyframe/metadata required?

The question is a bit vague so I have tried to narrow it down.

I have created a GitHub repo where you can click a button and it will record the audio and play it back:
https://github.com/developerfiveneosoftware/AudioExample.git

In order to pass the stream to a server (this will be SignalR), I currently start a timer, grab the stream every second, and then send that data.

    Timer audioTimer;

    audioTimer = new Timer(GetMicrophoneStream, null, Timeout.Infinite, Timeout.Infinite);

    async Task UpdateRecording()
    {
        try
        {
            if (!audioRecorderService.IsRecording)
            {
                AudioText = "Record";
                AudioTextColor = Color.Red;

                // StartRecording returns a task that completes when the recording stops.
                var recordingTask = await audioRecorderService.StartRecording();

                // Start polling the audio stream once a second.
                audioTimer?.Change(1000, 1000);

                Device.BeginInvokeOnMainThread(async () =>
                {
                    var audioFile = await recordingTask;

                    if (audioFile != null)
                    {
                        // Recording has finished: stop polling and play the file back.
                        audioTimer?.Change(Timeout.Infinite, Timeout.Infinite);
                        audioPlayer.Play(audioRecorderService.GetAudioFilePath());
                        AudioText = "Stop";
                        AudioTextColor = Color.Green;
                    }
                });
            }
            else
            {
                await audioRecorderService.StopRecording();
            }
        }
        catch (Exception ex)
        {
            Console.WriteLine($"ERROR {ex.Message}");
        }
    }
  
    async void GetMicrophoneStream(object state)
    {
        try
        {
            if (audioRecorderService != null)
            {
                using (var stream = audioRecorderService.GetAudioFileStream())
                {
                    if (stream != null && stream.Length > 0)
                    {
                        using (var headerStream = new System.IO.MemoryStream())
                        {
                            // Copy the recording captured so far into a memory stream.
                            await stream.CopyToAsync(headerStream);
                            headerStream.Seek(0, System.IO.SeekOrigin.Begin);

                            // Write the WAV header so the payload can be played back as a .wav.
                            AudioFunctions.WriteWavHeader(headerStream,
                                audioRecorderService.AudioStreamDetails.ChannelCount,
                                audioRecorderService.AudioStreamDetails.SampleRate,
                                audioRecorderService.AudioStreamDetails.BitsPerSample);

                            byte[] buffer = headerStream.ToArray();
                            Console.WriteLine($"audio length = {buffer.Length}");
                            // This is where the data would be sent to SignalR to be broadcast.
                        }
                    }
                }
            }
        }
        catch (Exception ex)
        {
            Console.WriteLine($"ERROR {ex.Message}");
        }
    }

Does this seem like a sensible way to take the data from a stream and pass it to a server?
Is there a better/more standard way to handle sending streams to an API?
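One alternative I have been looking at (untested sketch, assuming the Microsoft.AspNetCore.SignalR.Client package; "UploadAudio" is a made-up hub method name) is SignalR's built-in client-to-server streaming, where chunks are pushed through a channel instead of being polled by a timer:

    using System.Threading.Channels;
    using Microsoft.AspNetCore.SignalR.Client;

    var connection = new HubConnectionBuilder()
        .WithUrl("https://example.com/audiohub")
        .Build();
    await connection.StartAsync();

    // The hub side would be: public async Task UploadAudio(ChannelReader<byte[]> stream)
    var channel = Channel.CreateUnbounded<byte[]>();
    await connection.SendAsync("UploadAudio", channel.Reader);

    // Each time a new chunk of microphone data becomes available
    // (audioChunk is a placeholder for the byte[] taken from the recorder's stream):
    await channel.Writer.WriteAsync(audioChunk);

    // When recording stops:
    channel.Writer.Complete();

The hub method would then read from the ChannelReader and could rebroadcast each chunk to listeners, which would avoid re-sending the whole recording every second.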

Many thanks
