I created some complex 3D animations in Blender and I want to play them in my Flutter app. The problem is that when I export the animation as a video and put it in the app, the transparent background is gone. I'm using video_player to play videos in Flutter, and the extension I use is .webm, because it's the only one I know of that Flutter can read and that supports a transparent background, but the application renders a black background behind the video.
The conclusion I reached is that the video_player plugin is not the best tool for this. So I looked into how Flutter handles GIFs, and it handles the alpha channel perfectly, because it supports alpha in images. The next problem is how to control the GIF; for that I use the gifimage plugin, which works really well for driving an animation (a sketch follows below). The last problem is exporting from Blender to a GIF: Blender can't export GIF directly, so you have to export to QuickTime with the QT Animation codec and then convert that to a GIF. If for some reason the first frame of the GIF stays visible as the background of the GIF, edit the GIF (in Photoshop, for example) and change the first frame in the timeline from "do not dispose" to "dispose".
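For the "control the GIF" step, here is a minimal sketch of driving the exported GIF, assuming the flutter_gifimage package (its GifController/GifImage API may differ between versions); the asset path and the frame count are placeholders:

import 'package:flutter/material.dart';
import 'package:flutter_gifimage/flutter_gifimage.dart';

class BlenderAnimation extends StatefulWidget {
  @override
  State<BlenderAnimation> createState() => _BlenderAnimationState();
}

class _BlenderAnimationState extends State<BlenderAnimation>
    with SingleTickerProviderStateMixin {
  late GifController _controller;

  @override
  void initState() {
    super.initState();
    // GifController extends AnimationController, so it needs a vsync.
    _controller = GifController(vsync: this);
    // Loop the (placeholder) frame range once the first frame is laid out.
    WidgetsBinding.instance.addPostFrameCallback((_) {
      _controller.repeat(min: 0, max: 52, period: const Duration(seconds: 2));
    });
  }

  @override
  void dispose() {
    _controller.dispose();
    super.dispose();
  }

  @override
  Widget build(BuildContext context) {
    // The GIF keeps its alpha channel, so whatever is behind this widget
    // shows through the transparent pixels.
    return GifImage(
      controller: _controller,
      image: const AssetImage('assets/blender_animation.gif'),
    );
  }
}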
The video_player plugin currently uses ExoPlayer on Android. I found this issue on the plugin's repo.
Related
I was sent a video (made in After Effects) with an alpha channel (it has transparency). I need to add that video to an iOS app, like adding a transparent GIF.
There is a tutorial online on how to do just that; however, it uses a single mp4 that carries the alpha info in the bottom half and the colour info in the top half, whereas I am using QuickTime 444, which is already transparent by nature. Is there any way I can play a transparent video in Swift for iOS? I am using SwiftUI.
I am new to Flutter and am trying to play videos in my app. I followed this tutorial on using Chewie to play videos by copying and pasting the code from its main.dart and chewie_list_item.dart snippets on the website into a fresh project (the GitHub project provided by the website is too outdated for me to debug, so I copy-pasted instead).
I expected to get something like this, with the player "wrapped" around the video:
However, I get this (on my Android virtual device), with the video player taking up the entire screen regardless of the video dimensions. I did try setting an AspectRatio according to the video dimensions, but all that did was eliminate the stretching of the video; the main issue remains (a sketch of this attempt follows after the question).
Why does it behave this way, and how do I achieve the result shown in the first image? This is the test project I made: https://github.com/nathantew14/chewie_test Thanks!
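For reference, here is a minimal sketch of the AspectRatio attempt described in the question, assuming the chewie and video_player packages from the tutorial; the ChewieListItem name mirrors the tutorial's file, and the loading spinner is a placeholder of mine, not the tutorial's code:

import 'package:chewie/chewie.dart';
import 'package:flutter/material.dart';
import 'package:video_player/video_player.dart';

class ChewieListItem extends StatefulWidget {
  const ChewieListItem({super.key, required this.videoPlayerController});

  final VideoPlayerController videoPlayerController;

  @override
  State<ChewieListItem> createState() => _ChewieListItemState();
}

class _ChewieListItemState extends State<ChewieListItem> {
  ChewieController? _chewieController;

  @override
  void initState() {
    super.initState();
    widget.videoPlayerController.initialize().then((_) {
      setState(() {
        _chewieController = ChewieController(
          videoPlayerController: widget.videoPlayerController,
          // Tell Chewie the video's own aspect ratio...
          aspectRatio: widget.videoPlayerController.value.aspectRatio,
        );
      });
    });
  }

  @override
  void dispose() {
    widget.videoPlayerController.dispose();
    _chewieController?.dispose();
    super.dispose();
  }

  @override
  Widget build(BuildContext context) {
    if (_chewieController == null) {
      return const Center(child: CircularProgressIndicator());
    }
    // ...and also constrain the widget itself, so the player does not
    // expand to fill the whole screen.
    return AspectRatio(
      aspectRatio: widget.videoPlayerController.value.aspectRatio,
      child: Chewie(controller: _chewieController!),
    );
  }
}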
We're working with the HoloLens 2, and sadly, when recording a video, taking an image, or going into the live view, Unity UI elements are not shown in the image/video.
Does anybody know how we can make Unity UI elements appear in video & photo capture in Hololens?
Here is an example where there is text below the title, but it is not captured in the image.
The title part uses a TextMeshPro component, while the text part uses a TextMeshProUGUI component (due to the scrolling window of the text).
We're using Unity 2020.3.6f1, MRTK 2.7.2 with OpenXR backend.
Thanks for any help and recommendations.
To create mixed-reality photos and videos, you can use the Start gesture to go to Start and then select the Camera icon; for more information, please refer to this link: Create mixed reality photos and videos.
If you want to seamlessly integrate mixed reality capture and insertion into your apps, you need to enable the Windows Mixed Reality Camera Settings provider in your MRTK profile and check Render from PV Camera.
The issue was that our Unity version was not up to date, as hinted at in this GitHub issue. Simply updating to the newest Unity version solved the problem.
https://github.com/microsoft/MixedRealityToolkit-Unity/issues/10155
I have several .mov videos with an alpha channel that I want to include in my Unity project.
I use Unity 2018.4 LTS Plus for this project (it breaks if I upgrade to 2019).
In the Unity documentation, it looks like the format is both compatible and has an inspector walkthrough showing how to transcode it:
https://docs.unity3d.com/2018.4/Documentation/Manual/VideoSources-FileCompatibility.html
https://docs.unity3d.com/2018.4/Documentation/Manual/VideoTransparency.html
However, as seen in the image below, the video is not recognized as such, and the inspector attributes are not as shown in the documentation.
Do you know of an alpha supporting video format that is supported by Unity 2018.4LTS?
Thank you, all helpers.
The WebM format with VP8 does indeed support transparency; see:
Unity - make video background transparent - Mobile/PC
Check this link, it's working perfectly: https://styly.cc/tips/tomo-chromakey/#
Credit to the author
I am building an AR project using Unity and Meta 2 glasses, and I am using Vuforia to detect an object. When I add the Vuforia packages to the project and run it, I see something like a video in the background. It is very distracting and I could not manage to disable it.
See a screenshot
Any clue how to disable it?
Unless the Vuforia version you have recognizes your specific AR device, Vuforia assumes it is running on a phone and renders the camera feed as a video background. All you have to do is not render that video background.
Here you can find the exact instructions on how to disable the rendering of the background in Unity:
Disabling video background
The important code is this:
// Grab Vuforia's background plane on this GameObject and disable it.
BackgroundPlaneBehaviour bgPlane = GetComponent<BackgroundPlaneBehaviour>();
if (bgPlane != null && bgPlane.enabled) {
    bgPlane.enabled = false;
}