reuse result of initialize method from video_player flutter package - flutter

Is there an option to reuse the result of the initialize method of the video_player package? It takes time to complete, so it would be great to cache it (e.g. in memory) and reuse it when returning to a previously used video, simply using the cached data instead of waiting for initialize again. I need this for intensive switching between videos.

There is a package called cached_video_player which may help resolve your problem. Check it out here.
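If adding a package is not an option, the in-memory idea from the question (keep already-initialized controllers around and reuse them) could be sketched roughly like this. This is only a sketch: the _controllerCache map and the obtainController/clearControllerCache helpers are made-up names, while VideoPlayerController.network, initialize and dispose are the standard video_player API (newer versions also offer VideoPlayerController.networkUrl).

import 'package:video_player/video_player.dart';

// Hypothetical in-memory cache of initialized controllers, keyed by URL.
final Map<String, VideoPlayerController> _controllerCache = {};

Future<VideoPlayerController> obtainController(String url) async {
  final cached = _controllerCache[url];
  if (cached != null) {
    // Already initialized earlier: reuse it instead of waiting again.
    return cached;
  }
  final controller = VideoPlayerController.network(url);
  await controller.initialize(); // the slow part, paid only once per URL
  _controllerCache[url] = controller;
  return controller;
}

// Dispose cached controllers once they are no longer needed.
Future<void> clearControllerCache() async {
  for (final controller in _controllerCache.values) {
    await controller.dispose();
  }
  _controllerCache.clear();
}

The trade-off is memory: every cached controller keeps its platform player alive, so for long lists of videos the cache should be bounded or evicted.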

I think you are asking about having the screen/page/widget pre-render. That is not currently supported by Flutter, according to this issue filed on GitHub:
https://github.com/flutter/uxr/issues/6#issuecomment-881918751
Sure, but this is not very scalable and will quickly turn into a mess. It's much simpler and more flexible to just give MyRoute some way to cache the next route, and then show that cached route when needed. But Flutter doesn't support this, as everything needs to be 'on-stage' before it can be initialized. In AIR or Unity, I could simply construct my new page, it would begin loading data, and I could then toss it on stage whenever I wanted.
PS. You probably already know you can pre-cache the video data/file itself.
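For the "pre-cache the video data/file itself" part, one possible sketch uses the flutter_cache_manager package (an assumption on my part, not something mentioned above); the downloaded file can then be fed to VideoPlayerController.file:

import 'dart:io';

import 'package:flutter_cache_manager/flutter_cache_manager.dart';
import 'package:video_player/video_player.dart';

// Download the video (or return the already-cached copy) ahead of time.
Future<File> precacheVideo(String url) {
  return DefaultCacheManager().getSingleFile(url);
}

// Later, build a controller from the local file instead of the network.
Future<VideoPlayerController> controllerFromCache(String url) async {
  final file = await precacheVideo(url);
  final controller = VideoPlayerController.file(file);
  await controller.initialize();
  return controller;
}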

Related

Is there a way to make Riverpod's StreamProvider go into loading state only on the initial call?

This is something I've wondered about for a while. Showing a loading icon whenever the stream changes hinders the UX when using the {StreamProvider's name}.when method.
I can already make it work by using a StreamBuilder instead, with the StreamProvider's .stream property as the stream. That way it only shows a loading icon when no data is present, and if new data arrives on top of old data, it keeps showing the old data until the new data is "loaded".
But that feels like a workaround (it is roughly sketched below), and I was wondering if there is a cleaner way.
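For reference, that workaround might look roughly like this with Riverpod 1.x (itemsProvider and ItemsView are placeholder names; the .stream getter on a StreamProvider is the Riverpod 1.x API):

import 'package:flutter/material.dart';
import 'package:flutter_riverpod/flutter_riverpod.dart';

// Placeholder provider; imagine it re-subscribes whenever its inputs change.
final itemsProvider = StreamProvider<List<String>>(
  (ref) => Stream<List<String>>.periodic(
    const Duration(seconds: 2),
    (i) => List.generate(i + 1, (n) => 'Item $n'),
  ),
);

class ItemsView extends ConsumerWidget {
  const ItemsView({super.key});

  @override
  Widget build(BuildContext context, WidgetRef ref) {
    // Watch the raw stream so StreamBuilder keeps showing the previous
    // data while the next event is loading, instead of a loading icon.
    return StreamBuilder<List<String>>(
      stream: ref.watch(itemsProvider.stream),
      builder: (context, snapshot) {
        if (!snapshot.hasData) {
          return const CircularProgressIndicator();
        }
        return Text(snapshot.data!.join(', '));
      },
    );
  }
}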
Version 2.0.0 changes how AsyncValue is handled, making what you're asking for the new default behaviour.
You can try it out now using version flutter_riverpod: ^2.0.0-dev.0

How to clear/invalidate ambient cache on iOS app

When I update tilesets on Mapbox, changes don't appear in the iOS app unless I re-install it. There appears to be documentation on this here: https://docs.mapbox.com/ios/api/maps/5.2.0/Classes/MGLOfflineStorage.html#/c:objc(cs)MGLOfflineStorage(im)setMaximumAmbientCacheSize:withCompletionHandler: but I can't figure out how exactly to implement it. I don't have an MGLOfflineStorage object because I'm not worried about offline map storage right now; I just want to refresh the cache in the app. There are good examples of how to do this on Android, but not on iOS. Any help is appreciated (preferably in Swift).
It appears to be correct to call the methods on the shared MGLOfflineStorage object. The method parameter should be a closure containing any code you want to execute upon completion.
MGLOfflineStorage.shared.invalidateAmbientCache { error in
    // Completion handler runs once invalidation finishes.
    if let error = error { print("Invalidation failed: \(error)"); return }
    print("Invalidated ambient cache")
}
Naturally, you should check the error in the usual 'safe' way, as above, rather than assuming success.

What should I use for i18n in Flutter: S.of(context) or S.current?

I'm using the i18n plugin for Flutter (I believe it's this one) that comes with Android Studio.
In every example I see, it says to use S.of(context).my_string to get the strings, but it always returns null.
If I use S.current.my_string, it seems to work.
So is S.current the right way to do it and every doc/tutorial out there wrong? Are they the same, or what?
What I'm basically asking is: what is the difference between them?
It seems S.of(context) is the originally available way to access a localised string.
But sometimes you need to use it without a BuildContext (in a ViewModel, for example), so S.current was added for those cases.
More info here
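A minimal sketch of the difference, assuming the S class generated by the Flutter Intl plugin (the import path below may differ in your project):

import 'package:flutter/material.dart';

import 'generated/l10n.dart'; // location of the generated S class may differ

class Greeting extends StatelessWidget {
  const Greeting({super.key});

  @override
  Widget build(BuildContext context) {
    // Inside the widget tree, where a BuildContext is available.
    return Text(S.of(context).my_string);
  }
}

// Outside the widget tree (e.g. in a ViewModel), where there is no context.
String greetingText() => S.current.my_string;

Note that S.of(context) only resolves when S.delegate is registered in the app's localizationsDelegates, which is often the reason it comes back null.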

How to get frame data in AppRTC iOS app for video modifications?

I am currently trying to make some modifications to the incoming WebRTC video stream in the AppRTC app for iOS in Swift (which in turn is based on this Objective-C version). To do so, I need access to the data stored in the frame objects of the RTCI420Frame class (a basic class for the Objective-C implementation of libWebRTC). In particular, I need an array of bytes ([UInt8]) and the size of the frames. This data is to be used for further processing and the addition of some filters.
The problem is that all the operations on RTCVideoTrack / RTCEAGLVideoView happen under the hood of the pre-compiled libWebRTC.a. It is compiled from the official WebRTC repository linked above, and it's fairly complicated to produce a custom build, so I'd prefer to stick with the build available in the example iOS project; as I understand it, that build is supposed to contain all the available functionality.
I looked into the RTCVideoChatViewController class, and in particular remoteView / remoteVideoTrack, but had no success in accessing the frames themselves. I have spent a lot of time researching the libWebRTC sources in the official repo, but I still can't wrap my head around how to access the frame data for my own manipulation. I would be glad for any help!
Just after posting the question, I had some luck finding the sneaky data!
You have to add the following property to the RTCEAGLVideoView.h file:
@property(atomic, strong) RTCI420Frame* i420Frame;
The i420Frame property already exists in the original implementation file, but it isn't exposed in the iOS project's header for the class. Declaring it in the header lets you read the view's current frame.
I'm still in search of a more elegant way of getting the stream data directly, without the need to look into the remoteView contents; I will update the answer once I find it.

How to empty cache for WebView?

I have a WebView that must load an image. When I upload a new version of the image, I keep seeing the same image as before, and I must restart my app to see the new one.
I think it is a cache problem. How can I solve that?
One quick and easy method would be to append the current timestamp to the URL whenever you load it.
So instead of loading:
http://www.myhost.com/myimg.jpg
you'd load
http://www.myhost.com/myimg.jpg?12345689
Using a cache breaker like this is a very common method in web development to force reloading of content.
I did some quick googling, and it appears that clearing out NSURL's cache won't do the trick. In 10.6 the API reloadFromOrigin: may do the trick, but I'm not aware whether it has made its way onto the iPhone yet.
Edit:
I found this page in the docs. It looks like you can use the preferences system to say whether or not to use caching. Not tested, but that'd be something to look at.