Is it possible to stream endless audio in flutter? - flutter

I'm developing an online radio app in flutter and I'm looking for an audio player which supports endless audio streaming from a certain URL (e.g. http://us4.internet-radio.com:8258/stream?type=http). It is highly desirable for it to be supported both on iOS and Android.
Is there such an option in flutter?
From what I've found, there are no solutions that satisfy my needs. The closest one is fluttery_audio, but, apparently, it doesn't support endless audio.
I apologize for my jargon with 'endless audio streaming'; I'm not really sure what the technical name for an online radio player is.

Try with flutter_webview_plugin and hide it.
https://pub.dev/packages/flutter_webview_plugin
final flutterWebviewPlugin = FlutterWebviewPlugin();
flutterWebviewPlugin.launch(url, hidden: true);

You can also try the flutter_radio package. Check out the working app here...
Source code
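If neither of those fits, the just_audio plugin (used in an answer below) can also open a live stream URL directly. A minimal sketch, assuming just_audio is added to pubspec.yaml and using the stream URL from the question:

```dart
import 'package:just_audio/just_audio.dart';

Future<void> playRadio() async {
  final player = AudioPlayer();
  // A live stream has no fixed duration, so the returned duration is null.
  await player.setUrl('http://us4.internet-radio.com:8258/stream');
  player.play(); // keeps playing until you call stop() or dispose()
}
```

Remember to call player.dispose() when the player is no longer needed.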

Related

Flutter - Audio Player

Hello, I am new to Flutter.
I am trying to play audio files from a URL or the network, but I'm not sure which package to use;
I searched Google and it showed many options, but I don't know which one to pick.
If possible, can you show an example of how to create something like the image below?
I want to create an audio player like this.
Kindly help...
Thanks in advance!
An answer that shows how to do everything in your screenshot would probably not fit in a StackOverflow answer (audio code, UI code, and how to extract audio wave data) but I will give you some hopefully useful pointers.
Using the just_audio plugin you can load audio from these kinds of URLs:
https://example.com/track.mp3 (any web URL)
file:///path/to/file.mp3 (any file URL with permissions)
asset:///path/to/asset.mp3 (any Flutter asset)
You will probably want a playlist, and here is how to define one:
final playlist = ConcatenatingAudioSource(children: [
  AudioSource.uri(Uri.parse('https://example.com/track1.mp3')),
  AudioSource.uri(Uri.parse('https://example.com/track2.mp3')),
  AudioSource.uri(Uri.parse('https://example.com/track3.mp3')),
  AudioSource.uri(Uri.parse('https://example.com/track4.mp3')),
  AudioSource.uri(Uri.parse('https://example.com/track5.mp3')),
]);
Now to play that, you create a player:
final player = AudioPlayer();
Set the playlist:
await player.setAudioSource(playlist);
And then as the user clicks on things, you can perform these operations:
player.play();
player.pause();
player.seekToNext();
player.seekToPrevious();
player.seek(Duration(milliseconds: 48512), index: 3);
player.dispose(); // to release resources once finished
For the screen layout, note that just_audio includes an example which looks like this, and since there are many similarities to your own proposed layout, you may get some ideas by looking at its code:
Finally, for the audio wave display, there is another package called audio_wave. You can use it to display an audio wave, but the problem is that no plugin I'm aware of gives you access to the waveform data. If you really want a waveform, you could possibly use a fake one (if it's just meant to visually indicate position progress); otherwise, either you or someone else will need to write a plugin that decodes an audio file into a list of samples.

How to play a sound effect when the page changes in Flutter

I want to play a sound effect when I go to the next page. Is it possible to do in Flutter? If so, can I know how to do it?
If you are looking for a way to play sound in Flutter, head over here.
Code:
AssetsAudioPlayer.newPlayer().open(
  Audio("assets/audios/song1.mp3"),
);

Overlay text and images to YouTube live stream from Flutter app

I am looking into creating a Flutter mobile app that live streams to YouTube using the YouTube Live Streaming API. I have checked the API and found that it does not offer a way to overlay text and images onto the livestream. How would I achieve this using Flutter?
I imagine this involves using the Stack widget to overlay content on top of the user's video feed. However this would somehow need to be encoded into the video stream to be sent to YouTube.
This type of work is usually done with FFmpeg.
See this discussion for more info: https://video.stackexchange.com/questions/12105/add-an-image-overlay-in-front-of-video-using-ffmpeg
FFmpeg for mobile devices is made available by this project:
https://github.com/tanersener/mobile-ffmpeg
And then, as always, there is a Flutter package, flutter_ffmpeg, that exposes these features in Flutter:
https://pub.dev/packages/flutter_ffmpeg
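As an illustration, an image overlay with flutter_ffmpeg is just FFmpeg's overlay filter driven from Dart. A sketch, assuming the plugin is installed and that input.mp4 and logo.png are placeholder file names:

```dart
import 'package:flutter_ffmpeg/flutter_ffmpeg.dart';

Future<void> overlayImage() async {
  final ffmpeg = FlutterFFmpeg();
  // Draw logo.png at position (10,10) on every frame of input.mp4.
  final rc = await ffmpeg.execute(
      '-i input.mp4 -i logo.png -filter_complex "overlay=10:10" output.mp4');
  print('FFmpeg exited with $rc'); // 0 means success
}
```

Note that this post-processes a file; for a live stream you would need to feed FFmpeg the camera frames continuously rather than a finished input file.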
TLDR: You can use CameraController (from the camera package) and a Canvas in Flutter for drawing the text. Unfortunately, CameraController.startImageStream is not documented in the API docs and has been an open GitHub issue for over a year.
Every time the camera plugin gives you a video frame via controller.startImageStream((CameraImage img) { /* your code */ }), you can draw the image onto the canvas, draw the text, capture the result, and call the YouTube API. You can see an example of using the video buffer in the Tensorflow Lite package here, or read more info at this issue.
On this same canvas, you can draw whatever you want, like drawArc, drawParagraph, drawPoints. It gives you ultimate flexibility.
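The frame-stream part of that can be sketched as follows; this assumes the camera plugin is set up and that `cameras` comes from availableCameras():

```dart
import 'package:camera/camera.dart';

Future<void> startOverlayPipeline(List<CameraDescription> cameras) async {
  final controller = CameraController(cameras.first, ResolutionPreset.medium);
  await controller.initialize();
  controller.startImageStream((CameraImage frame) {
    // frame.planes holds the raw YUV/BGRA bytes. Convert them to an image,
    // draw your overlay text on a Canvas, then encode the composited frame
    // and push it to the YouTube ingest endpoint.
  });
}
```

The callback runs for every frame, so any per-frame work here must be fast enough to keep up with the camera's frame rate.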
A simple example of capturing the canvas contents is here, where I have previously saved the strokes in state. (You should use details about the text instead, and just pull the latest frame from the camera.):
import 'dart:ui' as ui;
import 'package:flutter/material.dart';
import 'package:image/image.dart' as img;

Future<img.Image> getDrawnImage() async {
  final ui.PictureRecorder recorder = ui.PictureRecorder();
  final Canvas canvas = Canvas(recorder);
  canvas.drawColor(Colors.white, BlendMode.src);
  final StrokesPainter painter = StrokesPainter(
      strokes: InheritedStrokesHistory.of(context).strokes);
  painter.paint(canvas, deviceData.size);
  final ui.Image screenImage = await recorder.endRecording().toImage(
      deviceData.size.width.floor(), deviceData.size.height.floor());
  final ByteData imgBytes =
      await screenImage.toByteData(format: ui.ImageByteFormat.rawRgba);
  return img.Image.fromBytes(deviceData.size.width.floor(),
      deviceData.size.height.floor(), imgBytes.buffer.asUint8List());
}
I was going to add a link to an app I made which allows you to draw and screenshot the drawing into your phone gallery (but which also uses Tensorflow Lite), but the code is a little complicated. It's probably best to clone it and see what it does if you are struggling with capturing the canvas.
I initially could not find the documentation on startImageStream and forgot I had used it for Tensorflow Lite, so I originally suggested using MethodChannel.invokeMethod and writing iOS/Android-specific code. Keep that in mind if you find any limitations in Flutter, although I don't think Flutter will limit you in this problem.

Is there a way to start buffering video before playing it in Flutter?

I am trying to make an app that can play multiple videos at once. The goal is to record multiple music voices and then play them together. I am using the video_player plugin, and each video has its own player. The problem is that when I start playing, the exact moment each player starts is unreliable. So I thought that if I loaded the files before starting playback, this difference could be reduced, but I couldn't figure out how to do that, or whether it is possible at all. So, is there a way to preload those files to reduce this difference? If there is another way to achieve my goal, I would be glad to know. Thank you!
I have a similar problem where I want to control the exact moment the video starts playing.
For now, the best way seems to be (assuming you have a _controller constructed already):
await _controller.initialize();
await _controller.play();
await _controller.pause();
After that, the video starts playing almost immediately. For my purposes it’s good enough but if you need to control this down to the millisecond for synchronization purposes it might not cut it.
You might also want to hide the video and mute it during those first calls.
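Putting those pieces together, a priming helper might look like this; a sketch only, assuming _controller is a VideoPlayerController for one of the voices:

```dart
import 'package:video_player/video_player.dart';

Future<void> primeController(VideoPlayerController controller) async {
  await controller.initialize();
  await controller.setVolume(0);   // mute so the priming play() is inaudible
  await controller.play();         // forces buffering to actually begin
  await controller.pause();
  await controller.seekTo(Duration.zero); // rewind to the real start
  await controller.setVolume(1);   // restore volume for the real playback
}
```

Run this for every controller before the synchronized play() calls, so all players start from an already-buffered state.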

Marmalade SDK: is there a way to skip/seek to a part of a video in s3e video play?

I am currently using s3eVideo in the Marmalade SDK to play a video in my project after a button event. I attempted to find a way to implement a slider bar (or something of the like) to go back and forth in the video. I am unsure if this feature is even supported, but I may be wrong. Otherwise, is there a way to open a native video player outside of the app and then play the video that way with the seek feature I need?
Any help would be greatly appreciated.
There doesn't seem to be a way of finding out the length of the video, but s3eVideoSetInt(S3E_VIDEO_POSITION, timeInMilliseconds) should do the trick.
I guess it will depend on whether the frame index in the file is OK and whether the specific platform supports it. I only really did play/stop video in Marmalade, so you may have to try using this function while playing, while paused, etc., and see what works and what errors.