I am looking into creating a Flutter mobile app that live streams to YouTube using the YouTube Live Streaming API. I have checked the API and found that it does not offer a way to overlay text and images onto the livestream. How would I achieve this using Flutter?
I imagine this involves using the Stack widget to overlay content on top of the user's video feed. However, this would somehow need to be encoded into the video stream sent to YouTube.
This type of work is usually done with FFmpeg.
See this discussion for more info: https://video.stackexchange.com/questions/12105/add-an-image-overlay-in-front-of-video-using-ffmpeg
FFmpeg for mobile devices is made available by this project:
https://github.com/tanersener/mobile-ffmpeg
And then, as always, there is a Flutter package, flutter_ffmpeg, that exposes these features in Flutter:
https://pub.dev/packages/flutter_ffmpeg
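As a hedged sketch (the file paths, overlay position, and output target are placeholders, not part of any real project), burning a PNG overlay into a video with flutter_ffmpeg and FFmpeg's overlay filter looks roughly like this:

```dart
import 'package:flutter_ffmpeg/flutter_ffmpeg.dart';

final FlutterFFmpeg _ffmpeg = FlutterFFmpeg();

/// Burns logo.png into input.mp4 at position (10,10).
/// Paths are placeholders; on a device you would use real file paths.
Future<void> overlayImage() async {
  final rc = await _ffmpeg.execute(
      '-i input.mp4 -i logo.png '
      '-filter_complex "overlay=10:10" output.mp4');
  print('FFmpeg exited with rc $rc');
}
```

For a live stream you would ultimately point the output at the RTMP ingest URL YouTube gives you (e.g. `-f flv rtmp://...`) instead of a local file.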
TL;DR: You can use CameraController (from the camera package) and a Canvas in Flutter to draw the text. Unfortunately, CameraController.startImageStream is not documented in the API docs and has been an open GitHub issue for over a year.
Every time the camera plugin gives you a video frame (controller.startImageStream((CameraImage img) { /* your code */ })), you can draw the image onto the canvas, draw the text, capture the result, and call the YouTube API. You can see an example of using the video buffer in the TensorFlow Lite package here, or read more at this issue.
On this same canvas you can draw whatever you want (drawArc, drawParagraph, drawPoints), which gives you ultimate flexibility.
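A minimal sketch of that frame loop, assuming a CameraController that is already initialized (the overlay and encode steps are placeholder comments, not a working encoder):

```dart
import 'package:camera/camera.dart';

CameraImage? latestFrame;

/// Starts the image stream and stashes the newest frame so it can be
/// composited with the text overlay on a Canvas (sketch only).
Future<void> startOverlayStream(CameraController controller) async {
  await controller.startImageStream((CameraImage img) {
    latestFrame = img;
    // 1. Convert img to a ui.Image.
    // 2. Draw it and the overlay text onto a Canvas.
    // 3. Encode the composited frame and send it to your stream endpoint.
  });
}
```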
A simple example of capturing the canvas contents is below; here I had previously saved the strokes in state. (You should use details about the text instead, and just pull the latest frame from the camera.)
// Requires: import 'dart:ui' as ui;
//           import 'dart:typed_data';
//           import 'package:flutter/material.dart';
//           import 'package:image/image.dart' as img;

Future<img.Image> getDrawnImage() async {
  ui.PictureRecorder recorder = ui.PictureRecorder();
  Canvas canvas = Canvas(recorder);
  canvas.drawColor(Colors.white, BlendMode.src);
  // Replay the saved strokes onto the recording canvas.
  StrokesPainter painter = StrokesPainter(
      strokes: InheritedStrokesHistory.of(context).strokes);
  painter.paint(canvas, deviceData.size);
  // Rasterize the recording at screen size.
  ui.Image screenImage = await (recorder.endRecording().toImage(
      deviceData.size.width.floor(), deviceData.size.height.floor()));
  ByteData imgBytes =
      await screenImage.toByteData(format: ui.ImageByteFormat.rawRgba);
  return img.Image.fromBytes(deviceData.size.width.floor(),
      deviceData.size.height.floor(), imgBytes.buffer.asUint8List());
}
I was going to add a link to an app I made which lets you draw and save a screenshot of the drawing to your phone gallery (it also uses TensorFlow Lite), but the code is a little complicated. It's probably best to clone it and see what it does if you are struggling with capturing the canvas.
I initially could not find the documentation on startImageStream (and had forgotten I had used it for TensorFlow Lite), and suggested using MethodChannel.invokeMethod and writing iOS/Android-specific code. Keep that in mind if you run into any limitations in Flutter, although I don't think Flutter will limit you in this problem.
Related
I'm working with on-device ML in Flutter, which requires a UIImage to feed into the model. The requirement is to use the live stream to detect objects in near real time.
I use the Flutter camera plugin's startImageStream function and get CameraImage objects from the stream. I ask the camera to return ImageFormatGroup.bgra8888 on iOS; no change is needed on Android, which already works fine.
In a spawned isolate I convert the bgra8888 frame to a JPG using the image library, send the image bytes to Swift via a Flutter method channel, rebuild them into a UIImage, and feed that into the model. I feed an image every 0.5 seconds (not every camera frame, since that would be too much data for the model).
Everything seemed to work fine until I tested on older devices: the iPhone 6s and iPhone 7 Plus. The iPhone X works fine; the model responds in around 0.3 seconds, faster than I feed it, while the iPhone 6s and iPhone 7 Plus take around 1.5-2 seconds.
I tested the model natively by creating a camera view and feeding it the UIImage from didOutputSampleBuffer, as in the sample code below. My iPhone 6s and iPhone 7 Plus respond in around 0.5-0.6 seconds, which is a lot faster.
After doing some research, I found that Flutter's camera plugin actually uses the same capture stream as native iOS, creates RGBA data, and sends it to Flutter:
https://github.com/flutter/plugins/blob/main/packages/camera/camera_avfoundation/ios/Classes/FLTCam.m
- (void)captureOutput:(AVCaptureOutput *)output
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection {
  if (output == _captureVideoOutput) {
    CVPixelBufferRef newBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CFRetain(newBuffer);
    // ...
If I could get the CMSampleBufferRef, create the UIImage, and feed it to the model directly, without sending data back and forth between Flutter and iOS, I wouldn't have to do the slow image conversion in Flutter. So if I can get the CMSampleBufferRef directly from the camera, I believe the iPhone 6s and iPhone 7 Plus should run faster.
The question is: is it possible to get the CMSampleBufferRef directly without having to go through Flutter? I've found that FLTCam.h has a function called copyPixelBuffer. I debugged it via Xcode and it returns the image that I want, but I can't find a way to use it.
I also found that the FlutterTexture protocol mentions that a texture can be shared with Flutter:
https://api.flutter.dev/objcdoc/Protocols/FlutterTexture.html
but I have no idea how to get that shared texture.
Does anyone have an idea how I can access the image before the Flutter camera plugin sends it to Flutter?
Another option is to fork the camera plugin and make copyPixelBuffer public so I can access it. I haven't tried that yet, but I want it to be a last resort, since other developers would then have to maintain two camera versions in the app.
Is it possible to add random videos from YouTube to my Flutter app using a specific title? For example, I want to display random "Football" videos in my app.
It is possible to add YouTube videos to Flutter applications through a variety of methods. Refer to this plugin.
For more info, refer to this Medium article.
The plugin supports a variety of features, like a custom buffering progress indicator, a custom progress bar, and standard audio/video control configurations.
Let's take an example URL: youtube.com/watch?v=123iamosum456
The value after "v=" is the ID of that particular video, which is what you need to provide.
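For example, assuming the plugin in question is youtube_player_flutter (an assumption on my part; the link text above doesn't name it), extracting the ID and building a player looks roughly like this:

```dart
import 'package:youtube_player_flutter/youtube_player_flutter.dart';

// Extract the video ID from the example URL above.
final videoId = YoutubePlayer.convertUrlToId(
    'https://www.youtube.com/watch?v=123iamosum456');

final controller = YoutubePlayerController(
  initialVideoId: videoId!,
  flags: const YoutubePlayerFlags(autoPlay: false),
);

// Then place YoutubePlayer(controller: controller) in your widget tree.
```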
Hello, I am new to Flutter.
I am trying to play audio files from a URL or the network, but I don't know which package to use: I searched Google and it showed many options.
If possible, can you show an example of how to create an audio player like the one in the image below?
Kindly help...
Thanks in advance!
An answer that shows how to do everything in your screenshot would probably not fit in a Stack Overflow answer (audio code, UI code, and how to extract audio wave data), but I will give you some hopefully useful pointers.
Using the just_audio plugin you can load audio from these kinds of URLs:
https://example.com/track.mp3 (any web URL)
file:///path/to/file.mp3 (any file URL with permissions)
asset:///path/to/asset.mp3 (any Flutter asset)
You will probably want a playlist, and here is how to define one:
final playlist = ConcatenatingAudioSource(children: [
  AudioSource.uri(Uri.parse('https://example.com/track1.mp3')),
  AudioSource.uri(Uri.parse('https://example.com/track2.mp3')),
  AudioSource.uri(Uri.parse('https://example.com/track3.mp3')),
  AudioSource.uri(Uri.parse('https://example.com/track4.mp3')),
  AudioSource.uri(Uri.parse('https://example.com/track5.mp3')),
]);
Now to play that, you create a player:
final player = AudioPlayer();
Set the playlist:
await player.setAudioSource(playlist);
And then as the user clicks on things, you can perform these operations:
player.play();
player.pause();
player.seekToNext();
player.seekToPrevious();
player.seek(Duration(milliseconds: 48512), index: 3);
player.dispose(); // to release resources once finished
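To keep the play/pause button in sync with what the player is actually doing, you can listen to the player's state stream (a sketch, using the player created above):

```dart
player.playerStateStream.listen((state) {
  if (state.processingState == ProcessingState.completed) {
    // Playlist finished: reset the UI.
  } else if (state.playing) {
    // Show a pause button.
  } else {
    // Show a play button.
  }
});
```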
For the screen layout, note that just_audio includes an example that looks like this; since there are many similarities to your proposed layout, you may get some ideas by looking at its code:
Finally, for the audio wave display, there is another package called audio_wave. You can use it to display an audio wave, but the problem is that no plugin I'm aware of gives you access to the waveform data. If you really want a waveform, you could use a fake one (if it's just meant to visually indicate progress); otherwise, you or someone else will need to write a plugin that decodes an audio file into a list of samples.
I have a book with pictures. The task is that each picture has a video attached, and when the camera hovers over a picture, the application should open another screen and play the video associated with that photo. I tried to use Teachable Machine, but it can't detect reliably when there are too many photos. Any ideas are highly appreciated. Thanks!
You could use Firebase's Object Detection and Tracking together with the camera plugin's image stream feature.
Basically, you would process each frame you get from camera plugin with Firebase's ML feature, and once you detect an object you can perform any action with it.
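A rough sketch of that loop, assuming the current google_mlkit_object_detection package (the original Firebase ML Kit API differs, and the CameraImage-to-InputImage conversion is omitted here because it is platform- and version-specific; `toInputImage` is a hypothetical helper):

```dart
import 'package:camera/camera.dart';
import 'package:google_mlkit_object_detection/google_mlkit_object_detection.dart';

final detector = ObjectDetector(
  options: ObjectDetectorOptions(
    mode: DetectionMode.stream, // tuned for a live camera feed
    classifyObjects: false,
    multipleObjects: false,
  ),
);

bool _busy = false;

void onFrame(CameraImage frame) async {
  if (_busy) return; // drop frames while a detection is in flight
  _busy = true;
  // Hypothetical helper: convert the CameraImage to an InputImage
  // (see the package docs for the per-platform conversion).
  final InputImage inputImage = toInputImage(frame);
  final objects = await detector.processImage(inputImage);
  for (final obj in objects) {
    // obj.boundingBox tells you where the detected picture is;
    // match it against your known pages and navigate to the video.
  }
  _busy = false;
}
```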
You can use TensorFlow Lite: https://www.tensorflow.org/lite
There are Flutter packages for it, for example: https://pub.dev/packages/tflite
I'm developing an online radio app in flutter and I'm looking for an audio player which supports endless audio streaming from a certain URL (e.g. http://us4.internet-radio.com:8258/stream?type=http). It is highly desirable for it to be supported both on iOS and Android.
Is there such an option in flutter?
From what I've found, there are no solutions that satisfy my needs. The closest one is fluttery_audio, but, apparently, it doesn't support endless audio.
I apologize for my jargon with 'endless audio streaming'; I'm not really sure what the technical term for an online radio player is.
Try with flutter_webview_plugin and hide it.
https://pub.dev/packages/flutter_webview_plugin
final flutterWebviewPlugin = new FlutterWebviewPlugin();
flutterWebviewPlugin.launch(url, hidden: true);
You can also try the flutter_radio package; check out a working app here...
Source code