Flutter API to record video - flutter

Flutter has no way to record video of the Flutter App.
I tried all of the following; all have major drawbacks, which is what brings me to ask this question.
Using repaintBoundary
This method has 2 major drawbacks:
It draws blank white boxes over PlatformViews and WebViews (https://github.com/flutter/flutter/issues/25306; update 30 June: https://github.com/flutter/flutter/issues/83856, in progress this quarter)
Bitmaps need to be sent back to the native platform for encoding into mp4 (using OpenGL)
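For reference, the RepaintBoundary approach referred to above typically looks like the following sketch (`boundaryKey` and `captureFrame` are illustrative names, not part of any official recording API; the key is assumed to be attached to a `RepaintBoundary` in the widget tree):

```dart
import 'dart:typed_data';
import 'dart:ui' as ui;

import 'package:flutter/material.dart';
import 'package:flutter/rendering.dart';

// Hypothetical key, assumed to be attached somewhere in the tree as:
// RepaintBoundary(key: boundaryKey, child: ...)
final GlobalKey boundaryKey = GlobalKey();

Future<Uint8List?> captureFrame() async {
  // Find the render object behind the RepaintBoundary.
  final boundary = boundaryKey.currentContext?.findRenderObject()
      as RenderRepaintBoundary?;
  if (boundary == null) return null;

  // Rasterize the subtree. This is the step that leaves PlatformViews
  // and WebViews as blank white boxes (flutter/flutter#25306).
  final ui.Image image = await boundary.toImage(pixelRatio: 1.0);

  // Encode to PNG. These bytes would still have to be shipped to the
  // native side and fed to an encoder to produce an mp4.
  final ByteData? bytes =
      await image.toByteData(format: ui.ImageByteFormat.png);
  return bytes?.buffer.asUint8List();
}
```

Calling this per frame illustrates the second drawback: you get a stream of encoded bitmaps in Dart, and the actual mp4 muxing still has to happen natively.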
Using native platform
Flutter exposes the getBitmap API that is meant to be used on the native platform to capture a screenshot of the running hosted Flutter app, but this has 2 major drawbacks:
It must run on the UI thread, so the Flutter app is blocked and the video ends up recording a blocked UI
This method is very slow, sometimes up to 100 ms per frame depending on the screen content
Recording the entire screen?
Using MediaRecorder this is possible, but we want to:
Record the app only, not the entire screen
Recording the entire screen also requires special permission from the user
Drawing a canvas of the FlutterView in the Native platform?
This draws a black screen because FlutterView is a SurfaceView, and a bitmap cannot be obtained from it since it was drawn on a different thread.
Use PixelCopy?
This API is only available on API level >= 24.
Does Flutter expose an API to record bitmaps continuously in the background, off the main UI thread? (Or a video-recording API, but that may be too much to ask.)
Keywords: getBitmap, screenshot, bitmap, uithread, background, record, recording, screen shot, main thread, flutterview, surfaceview, video, mp4, mediarecorder, opengl, mediamuxer, mediaencoder

Related

In Flutter, how can I check if the device is a mouse device or a touch device?

How can I check if the device is a touch device or a mouse device?
Using kIsWeb is not sufficient: when using the web version of the app on a mobile device, kIsWeb returns true, but I need it to return false because it is a touch device.
Checking the platform doesn't work either: the web version of the app running on an iOS device, for example, returns false for the iOS platform check.
Use case - I have two different types of video players for my app. One is suitable for touch devices (you tap to show and hide controls) and one is suitable for mouse devices (controls show when you mouse into the player and hide when you mouse out).
YouTube has the same idea. If I use the YouTube app or website on my iPhone I get touch controls. If I use the YouTube app or website on my iPad Pro I get touch controls. If I use the YouTube website on my Mac I get mouse controls at all screen sizes (even mobile screen sizes).
So I guess I really just need to know platform on the web. I can get platform if not on the web.
Great question #jwasher! I had the same issue: a touch- and swipe-based UI that was great as a native mobile app and as a single-page web app (SPA) on mobile web browsers, but that was weird and clunky for mouse-based interactions when the SPA was used in a desktop web browser.
The solution I have settled on is to wrap sensitive widgets in a MouseRegion (https://api.flutter.dev/flutter/widgets/MouseRegion-class.html) to detect mouse activity at runtime, then react by augmenting the UI with buttons that provide a mouse-focused way of triggering actions previously linked only to touch triggers.
You could use this to "mouse-enable" individual widgets, or you could wrap the whole app in a MouseRegion, trip a state field when activity is detected, then rebuild the widget tree in a substantially different way for point-and-click users.
This strategy may incur some minor complexity/CPU overhead on devices that will never have a mouse attached (like a smartphone), but it is more flexible than a build-time or configuration-time capability determination. If a user starts the app on a tablet, then puts it in a stand and attaches a Bluetooth mouse, the app will react appropriately.
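A minimal sketch of the whole-app variant of this idea (widget and field names like `InputAwareApp` and `_mouseDetected` are illustrative, not from the answer above):

```dart
import 'package:flutter/material.dart';

class InputAwareApp extends StatefulWidget {
  const InputAwareApp({super.key});

  @override
  State<InputAwareApp> createState() => _InputAwareAppState();
}

class _InputAwareAppState extends State<InputAwareApp> {
  // Tripped the first time any mouse hover activity is seen.
  bool _mouseDetected = false;

  @override
  Widget build(BuildContext context) {
    return MouseRegion(
      onHover: (_) {
        // Rebuild once with mouse-focused UI when a mouse shows up,
        // e.g. after the user attaches a Bluetooth mouse to a tablet.
        if (!_mouseDetected) setState(() => _mouseDetected = true);
      },
      child: _mouseDetected
          ? const Text('Player with hover-style controls')
          : const Text('Player with tap-style controls'),
    );
  }
}
```

Because the flag only flips once, the overhead after detection is a single boolean check per hover event.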
A device isn't "a mouse device" or "a touch device". Events have an input type (see PointerEvent.kind), but the device as a whole does not. A laptop can have a touch screen, and a tablet can have a stylus or an external mouse; an app running in those environments can receive both types of event.
Without knowing what you are trying to accomplish with this classification, it is hard to advise on how to accomplish it. Styling your UI based on a guess at the primary interaction mode, for instance, is a completely different problem from reacting to a mouse event differently than to a touch event.
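Reacting per event rather than classifying the device can be sketched with a Listener inspecting each event's kind (a sketch; `perEventControls` is a hypothetical helper):

```dart
import 'package:flutter/gestures.dart';
import 'package:flutter/widgets.dart';

// Wraps a player widget and branches on the kind of each pointer event,
// rather than trying to decide up front what kind of device this is.
Widget perEventControls(Widget player) {
  return Listener(
    onPointerDown: (PointerDownEvent event) {
      switch (event.kind) {
        case PointerDeviceKind.touch:
          // A tap: toggle the touch-style controls.
          break;
        case PointerDeviceKind.mouse:
          // A click: mouse users get hover-driven controls instead.
          break;
        default:
          // Stylus, trackpad, etc. can be handled separately if needed.
          break;
      }
    },
    child: player,
  );
}
```

This way a laptop with a touch screen gets the right behaviour per interaction, with no device-level guess.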

iPhone start screen latency

I'm wrapping an HTML5 website in a native app. The app splash screen appears for a second, then a white page appears. I decided to add an image view with the same image as the splash screen to cover this white screen, and I hide this intermediate image once the website has downloaded all the jQuery and CSS. However, this screen lasts a really long time (up to 30 sec.), so I thought it might be due to the download time of the large jQuery and CSS bundle. So I created a page containing an image and a plain (non-library) JavaScript redirect statement and found that it redirects very fast. I cannot understand why; does anybody have an explanation?
That's the expected behaviour when attaching a web view.
You may want to consider that your app plans may run afoul of App Store guidelines:
Apple can reject mobile web shell apps.

recording a conversation on tokbox for iphone/ipad

I tried using the code from http://codethink.no-ip.org/wordpress/archives/673, putting it into the OpenTokHello sample app from OpenTok, and it appears not to actually record the video as I thought it would.
I made the ScreenCaptureView the new "superview" of everything and then made sure that the video streaming views were added to that view. When I played the video, the place where the streaming video should have been was blank.
Any ideas on what I'm doing wrong?
Full disclosure: I wrote some of the OpenTok iOS SDK and work for TokBox.
The implementation of this ScreenCaptureView might not work with our SDK because all of our video rendering is done outside the context of UIView. You'd have to grab the rendering layer of the view in order to recover that part of the screen.
Depending on why you're trying to record the conversation, I recommend either
Using screen capture in QuickTime and running your app in the simulator (easier)
Waiting for OpenTok archiving support on iOS which will be available in a few months (also easy, but not for the impatient)
Capturing the rendering output of the subscriber from CoreGraphics (less easy)

Does using css sprite speed up performance in an html5 iphone app wrapped with phonegap?

I was wondering if creating a sprite for an HTML5 app, wrapped with PhoneGap for iPhone and Android, would increase performance at all. The app is offline, so it only fetches the image from local storage.
In theory, no. However, at least on iOS, images aren't pre-loaded for each page load, and you may sometimes see a 'flash' as an image is loaded into the page. A sprite is one way to handle that. Another way is to use some JS to load all of your app's images in the background on the home page. It takes a bit longer for the app to load, but then everything is cached and ready to go.
Where sprites are a convenience, though, is maintenance. It's so much easier to only have to edit a handful of image sprites than it is to maintain dozens of individual image files.

How will I have to update how I program for iPhones now with iOS4?

For example, preparing a launch screen of 320 x 480 would have to be changed....
How is that going to work for us? Are programmers always going to have to submit a high-res version that will be scaled down for old devices such as the iPhone 3G?
The screen has basically 4 times the pixels (twice the resolution in each dimension), so each pixel of your image gets expanded to cover 4 physical pixels.
What does this mean for you? You don't have to change your app: it will scale to the hi-res screen for you, and the same goes for your UI and the images within it. Of course, if you want to take advantage of the better screen quality, you will have to submit hi-res images.
I haven't looked at going the other way but I believe it would be a similar case.
One exception to this is text: it automatically scales to the higher resolution for free, so text will look super sharp. One wrinkle is that if your loading image contains text rendered at the original resolution, it won't look the same once the sharp high-res text appears.
Strictly speaking, anyone who's seen the documentation on how they're handling this is still under non-disclosure until Monday, when the new iOS ships.
Suffice to say, it's clever. You'll be able to put both high- and low-res versions of ALL your images into your app, and then load them in a way that's totally transparent from the code side. The device will make its own call about which version of the image is appropriate for the kind of screen it's got.
Now that the WWDC 2010 videos are available for free to any registered iPhone developer (or ADC member), I recommend watching Session 134: Optimize your iPhone App for the Retina Display for a full description of what you need to do to support the iPhone 4's new display.