Ionic displaying audio stream from an API - ionic-framework

I was given two URLs: the first is for the live radio stream and the second is for the API, as follows:
The actual stream link.
Here's the API for the stream:
{"apiVersion":"v1","data":{"kind":"Rotblau\\Model\\FcbLiveRadio\\Stream","items":[{"id":1,"url":"http:\/\/116.202.174.106:8100\/live","type":"LIVE","name":"Live-Studio"}]}}
I've tried all the audio packages available for Ionic, such as StreamingMedia, NativeAudio, and Media, but they all seem to be outdated: every time I call a method like play() on a test MP3, nothing happens and I get a pile of errors. So I need a working package, or a modern way to play that audio stream on the page, if that is possible.
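Since an Ionic app runs inside a WebView, a plain HTML5 audio element can usually play an Icecast/Shoutcast-style MP3 stream like this one directly, with no Cordova plugin at all. A minimal sketch, assuming the API response shape shown above (the interface names and function names are mine, not from any library):

```typescript
// Shape of the stream API response, inferred from the JSON in the question.
interface StreamItem {
  id: number;
  url: string;
  type: string;
  name: string;
}

interface StreamApiResponse {
  apiVersion: string;
  data: { kind: string; items: StreamItem[] };
}

// Pick the URL of the first LIVE stream out of the API response.
function extractLiveStreamUrl(json: string): string | undefined {
  const res: StreamApiResponse = JSON.parse(json);
  return res.data.items.find(item => item.type === "LIVE")?.url;
}

// Browser/WebView-only part: feed the URL to a plain HTML5 audio element.
function playStream(url: string): HTMLAudioElement {
  const audio = new Audio(url);
  audio.play().catch(err => console.error("Playback failed:", err));
  return audio;
}

// The raw API response from the question, as a string literal.
const sample =
  '{"apiVersion":"v1","data":{"kind":"Rotblau\\\\Model\\\\FcbLiveRadio\\\\Stream",' +
  '"items":[{"id":1,"url":"http://116.202.174.106:8100/live","type":"LIVE","name":"Live-Studio"}]}}';

const url = extractLiveStreamUrl(sample);
console.log(url); // → "http://116.202.174.106:8100/live"

if (typeof Audio !== "undefined" && url) {
  playStream(url); // only runs in a browser/WebView environment
}
```

If this plays during `ionic serve` in the browser, it should also play inside the app's WebView; native plugins only become necessary for things like background playback or lock-screen controls.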

Related

How can I handle multiple formats in a video player used in my flutter application for android/ios?

The backend API sends a URL to the video that can be used as the video player's source. The issue is that the source can vary between m4v, mp4, and other formats. I want to support the major formats, if not all of them, in the app.
The backend is in Django; I am open to modifying the API code as well.
If the video formats need conversion, please suggest the fastest way, with an ETA for a ~100 MB file.
I have tried the video_player plugin from pub.dev. At worst, I am thinking of integrating flutter_ffmpeg, but I don't want to make users wait and ruin their UX.

Flutter Video_Player - Play video from asp.net web API using stream

Flutter is new to me. I'm creating an app using the video_player library, where the video plays from a local resource as well as from the internet using a direct URL like https://flutter.github.io/assets-for-api-docs/assets/videos/bee.mp4.
However, I am trying to read a stream in chunks and feed them to video_player, and it does not seem to work.
I have created an API with the help of Asynchronously streaming video with ASP.NET Web API, which returns the Stream outputStream as its return value.
Note: the video is stored on a file server, which is separate from the API server; both are on the same network. The API server can read the data in chunks from the actual URL, which is not publicly accessible, and this works fine in an HTML video tag.
Does the video_player support such streaming or do I need to use any other player?
Please let me know whether I am on the right track or not.
I am sure the details provided are not enough, but as a Flutter beginner this is what I can provide here.
Best Regards,
Hi ten

How to get audio volume (as stream) from WebRTC's MediaStream in Flutter?

I am using flutter mediasoup client to stream video+audio from the server.
This works well in general.
However, I now want to measure the audio level (i.e., loud/soft) of the incoming audio stream so that I can display an audio level indicator widget.
The underlying stream is this webrtc class, but there doesn't seem to be any API to directly extract audio level.
I found this thread in flutter-webrtc repo, but it led to no concrete solution.
So, I wonder if anyone has had any success in extracting audio level from webrtc media stream.
Thank you.
I also posted this question in flutter-webrtc github repo and got a response that eventually led to a solution in my case:
https://github.com/flutter-webrtc/flutter-webrtc/issues/833
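For comparison, in a plain web/TypeScript context (this is not the flutter-webrtc API), the usual approach is to tap the MediaStream with the Web Audio API's AnalyserNode and compute an RMS level from time-domain samples. A hedged sketch, assuming a browser environment; only computeRms is pure:

```typescript
// Root-mean-square of a buffer of samples in [-1, 1]: 0 = silence, ~1 = loud.
function computeRms(samples: Float32Array): number {
  let sum = 0;
  for (let i = 0; i < samples.length; i++) {
    sum += samples[i] * samples[i];
  }
  return samples.length === 0 ? 0 : Math.sqrt(sum / samples.length);
}

// Browser-only: poll an analyser attached to the incoming stream and report
// the level roughly ten times a second. Returns a cleanup function.
function watchAudioLevel(
  stream: MediaStream,
  onLevel: (rms: number) => void
): () => void {
  const ctx = new AudioContext();
  const source = ctx.createMediaStreamSource(stream);
  const analyser = ctx.createAnalyser();
  analyser.fftSize = 2048;
  source.connect(analyser); // the analyser taps the stream without playing it

  const buf = new Float32Array(analyser.fftSize);
  const timer = setInterval(() => {
    analyser.getFloatTimeDomainData(buf);
    onLevel(computeRms(buf));
  }, 100);

  return () => {
    clearInterval(timer);
    void ctx.close();
  };
}
```

The RMS value can drive the indicator widget directly, or be smoothed first if the raw reading flickers too much.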

How to play, pause, skip to next, etc. with audio_service from local files

Please help me. I have a problem playing audio from local files: playing a single file at a time works, but, like audio applications in general, I want to be able to play, skip to the next track, stop, and so on using a background service. The audio file data is already displayed in the application in the form of a List, while the data audio_service expects is in the form of a MediaItem.
I suggest you have a look at the audio_service package. It is meant for playing audio in the background and covers the use case you are looking for (play, next, pause).

Is there a way to continuously send snippets of audio being recorded in realtime to backend server in Flutter.io?

I am creating an application that uses Mozilla's DeepSpeech API to transcribe the user's speech to text. The input requires audio files in a particular format, and for this app to work I will need to continuously send these audio files to my Flask server while the user is recording.
I have seen that most Flutter plugins only let you record, pause, and stop; however, I need a way to keep recording while also sending audio files. If anyone has found a way to accomplish this with Flutter's recording plugins, any guidance or information would help.
My backup plan is to use one of Flutter's speech-to-text plugins, like this one: https://pub.dev/packages/speech_to_text#-example-tab-, and then send the text to my backend server through a websocket. However, I don't know what kind of API it uses for transcription or how accurate the transcription is.
Any ideas on how to accomplish this? Or would anyone happen to know whether a different stack like Swift or React Native can accomplish it?
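For comparison, on the web this pattern is built into MediaRecorder: passing a timeslice to start() makes it emit a chunk every N milliseconds while recording continues uninterrupted. A TypeScript sketch of that pattern, assuming a browser environment and a WebSocket endpoint on the Flask side (the URL and the server endpoint are assumptions, not part of the question):

```typescript
// Browser-only: record the microphone and push a chunk to the server every
// sliceMs milliseconds. Returns a cleanup function that stops everything.
async function streamMicToServer(
  wsUrl: string,
  sliceMs = 1000
): Promise<() => void> {
  const socket = new WebSocket(wsUrl); // e.g. "ws://localhost:5000/audio" (hypothetical)
  const mic = await navigator.mediaDevices.getUserMedia({ audio: true });
  const recorder = new MediaRecorder(mic);

  // Fires every sliceMs with the audio recorded since the last event.
  recorder.ondataavailable = (e: BlobEvent) => {
    if (e.data.size > 0 && socket.readyState === WebSocket.OPEN) {
      socket.send(e.data); // binary frame; reassemble server-side
    }
  };

  recorder.start(sliceMs); // recording never pauses between chunks
  return () => {
    recorder.stop();
    mic.getTracks().forEach(t => t.stop());
    socket.close();
  };
}

// Fallback when a recorder only hands you one finished buffer: split it into
// fixed-size snippets yourself before uploading.
function splitIntoChunks(data: Uint8Array, chunkSize: number): Uint8Array[] {
  const chunks: Uint8Array[] = [];
  for (let off = 0; off < data.length; off += chunkSize) {
    chunks.push(data.subarray(off, off + chunkSize));
  }
  return chunks;
}
```

Note that MediaRecorder chunks are container fragments (e.g. WebM/Opus), so the server generally needs to concatenate them, or transcode, before feeding a recognizer that expects WAV input.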