I found some tutorials about playing fullscreen video on the iPhone, but I wonder if it is possible to play an embedded movie in the background of an OpenGL-animated app?
The movie should loop seamlessly and should not show any controls (play/pause/volume) while it plays. If this is possible, what kind of performance can I expect?
1. Decode the next video frame.
2. Upload the frame bitmap to a texture.
3. Disable depth buffer reads/writes.
4. Draw a screen-sized textured quad using the frame texture.
5. Re-enable depth buffer reads/writes.
6. Render your regular scene.
7. Go back to step 1 (see the sketch below).
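A minimal sketch of that loop, written against OpenTK's C# OpenGL bindings since the calls map one-to-one onto OpenGL ES on the iPhone; DecodeNextFrame, DrawFullScreenQuad and DrawScene are hypothetical helpers standing in for your video decoder, quad renderer and scene renderer:

```csharp
using System;
using OpenTK.Graphics.OpenGL;

public static class VideoBackground
{
    // Renders one frame: video background first, then the regular 3D scene.
    // 'videoTexture' is a GL texture created once (glTexImage2D) at the video's size.
    public static void RenderFrame(int videoTexture, int width, int height)
    {
        // 1-2. Decode the next video frame and upload it into the existing texture.
        IntPtr pixels = DecodeNextFrame();                       // hypothetical helper
        GL.BindTexture(TextureTarget.Texture2D, videoTexture);
        GL.TexSubImage2D(TextureTarget.Texture2D, 0, 0, 0, width, height,
                         PixelFormat.Rgba, PixelType.UnsignedByte, pixels);

        // 3. Disable depth reads/writes so the background never occludes the scene.
        GL.Disable(EnableCap.DepthTest);
        GL.DepthMask(false);

        // 4. Draw a screen-sized textured quad using the frame texture.
        DrawFullScreenQuad(videoTexture);                        // hypothetical helper

        // 5. Re-enable depth reads/writes.
        GL.DepthMask(true);
        GL.Enable(EnableCap.DepthTest);

        // 6. Render the regular scene on top.
        DrawScene();                                             // hypothetical helper

        // 7. The caller loops back to step 1 on the next frame.
    }

    static IntPtr DecodeNextFrame() => throw new NotImplementedException();
    static void DrawFullScreenQuad(int texture) => throw new NotImplementedException();
    static void DrawScene() => throw new NotImplementedException();
}
```

Performance-wise, the expensive parts are the per-frame decode and the texture upload; allocating the texture once and only updating it with TexSubImage2D avoids reallocating video memory every frame.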
What happens when I use an 8K texture as the background in my 2D game and the device running the game doesn't support that texture size?
For instance, what if I have an 8K background texture and try to run the game on an iPhone 5SE (which only supports 4096x4096 textures)?
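I can't say what every engine does when the limit is exceeded, but if this is Unity (like the other questions here) you can at least detect the limit at runtime and pick a smaller asset; the field names below are placeholders:

```csharp
using UnityEngine;

public class BackgroundChooser : MonoBehaviour
{
    // Placeholder assets: an 8192x8192 background and a 4096x4096 fallback.
    public Texture2D background8k;
    public Texture2D background4k;
    public Renderer backgroundRenderer;

    void Start()
    {
        // SystemInfo.maxTextureSize is the largest texture dimension the GPU supports.
        bool supports8k = SystemInfo.maxTextureSize >= 8192;
        Texture2D chosen = supports8k ? background8k : background4k;

        Debug.Log($"Max texture size: {SystemInfo.maxTextureSize}, using {chosen.name}");
        backgroundRenderer.material.mainTexture = chosen;
    }
}
```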
I have a video player in Unity that loads a video from a server.
The loading time of the video can be long, so I decided to display a "loading" video while I load the video from the server.
I have tried to add another video player component to another object, but rendering two videos on the same texture is problematic.
Is there a way to display a default video while the real video is being loaded by the video player component?
I have the exact same problem.
I kind of solved it by using two video players as you said, but I also swapped the video players' textures.
The video player for the loading clip renders to textureA, which is the texture where I actually display the video, while the other one renders to an unused texture.
When the real video finishes loading, I swap the two video players' target textures.
This works, but it isn't efficient, and I am still looking for a more efficient solution.
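For reference, here is a rough sketch of that two-player setup (untested; the field names are placeholders). The loading clip plays into the visible RenderTexture while the remote video prepares against a spare one, and the targets are swapped once prepareCompleted fires:

```csharp
using UnityEngine;
using UnityEngine.Video;

public class LoadingVideoSwitcher : MonoBehaviour
{
    public VideoPlayer loadingPlayer;    // plays the local "loading" clip
    public VideoPlayer remotePlayer;     // streams the real video from the server
    public RenderTexture visibleTexture; // the texture your UI/material displays
    public RenderTexture spareTexture;   // unused placeholder target

    void Start()
    {
        // Show the loading clip immediately on the visible texture.
        loadingPlayer.renderMode = VideoRenderMode.RenderTexture;
        loadingPlayer.targetTexture = visibleTexture;
        loadingPlayer.isLooping = true;
        loadingPlayer.Play();

        // Prepare the remote video against the spare texture in the background.
        remotePlayer.renderMode = VideoRenderMode.RenderTexture;
        remotePlayer.targetTexture = spareTexture;
        remotePlayer.prepareCompleted += OnRemotePrepared;
        remotePlayer.Prepare();
    }

    void OnRemotePrepared(VideoPlayer source)
    {
        // Swap targets: the real video takes over the visible texture.
        loadingPlayer.Stop();
        loadingPlayer.targetTexture = spareTexture;
        remotePlayer.targetTexture = visibleTexture;
        remotePlayer.Play();
    }
}
```

It may also be possible to skip the spare texture and only assign the visible one in the prepareCompleted callback, which would avoid rendering the remote video anywhere until it is ready.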
I'm trying to build a "lights-off" feature in my ARKit app, where I basically turn off the camera video background and only show my 3D content, sort of like VR.
Could someone please help me figure out how to turn off the video background? I can see that a material called YuVMaterial is being used to render the camera texture, but setting it to a single color covers the entire screen and doesn't show my 3D content either.
You can uncheck the UnityEngine.XR.ARFoundation.ARCameraBackground component under the main AR camera; the camera will then just render a black background.
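If you want to toggle it from a script rather than the Inspector, something along these lines should work with AR Foundation (the clear-flags lines are just a safety net so the camera actually clears to black):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class LightsOffToggle : MonoBehaviour
{
    public Camera arCamera;                     // the main AR camera
    public ARCameraBackground cameraBackground; // component on the same camera

    public void SetLightsOff(bool lightsOff)
    {
        // Disabling ARCameraBackground stops the camera feed from being drawn.
        cameraBackground.enabled = !lightsOff;

        // Make sure the camera clears to solid black so only 3D content shows.
        arCamera.clearFlags = CameraClearFlags.SolidColor;
        arCamera.backgroundColor = Color.black;
    }
}
```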
In Unity, you can switch between cameras while using ARKit. The main camera has a runtime spherical video applied to it, so it's not actually your device camera but a rendering of what the device camera sees. By switching cameras, you can effectively "turn off" the background video image while still taking advantage of the ARKit properties. Have fun.
I am creating an application for the Samsung Smart TV that plays music clips in the native Samsung player. I want to support scrolling, so I have to move the video frame (the player window). Here are the problems I run into:
1) Moving the video around the screen while scrolling is not fluid (on each scroll event I get the position of the element where the video frame should be and apply it with SetDisplayArea). Do you have any experience with handling this?
2) When I scroll the video frame off the edge of the display, the top/bottom of the video cannot slide out of view.
Is it possible to show only part of the video frame (something like SetDisplayArea(0, -200, 400, 225))?
Thanks for any suggestions.
Which API version are you using?
For API 2.5, SetDisplayArea() is the only way to change the native player's position and size.
I haven't tried it, but if SetDisplayArea(0, -200, 400, 225) doesn't move the player off the screen, then it is impossible to do.
Do you want to move the player while the video is playing? You can't expect fluid animations on Smart TV platforms, especially when you are moving huge (memory-expensive) elements.
A possible workaround would be to take a still image of your video player and run the slide-in animation on that image. Afterwards, hide the image, show the real player in its place, and start playing the video.
I think it should be possible to use SetCropArea() for the off-screen case.
I would like to play back a video on iPhone/iPad in the background of my game, with OpenGL-rendered 3D on top. The video should speed up and slow down depending on the speed of the player. The playback speed should be anywhere between 0 and 2 times regular speed (an even higher maximum would be better, if the CPU can handle it).
Is it possible to adjust video playback speed like this? I also need to know which video frame is currently being rendered, so that certain events in the game can be synchronized to the video.
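I can't speak to a fully native AVFoundation/OpenGL pipeline, but for comparison, if the game were built on Unity's VideoPlayer, both requirements map onto existing properties: playbackSpeed for the variable rate (platform support permitting, see canSetPlaybackSpeed) and frame for the index of the frame currently being shown. A rough sketch, with the frame-300 trigger as a placeholder:

```csharp
using UnityEngine;
using UnityEngine.Video;

public class VideoSync : MonoBehaviour
{
    public VideoPlayer video;                  // the background video
    [Range(0f, 2f)] public float speed = 1f;   // driven by the player's speed

    void Update()
    {
        // Adjust the playback rate every frame; support varies per platform.
        if (video.canSetPlaybackSpeed)
            video.playbackSpeed = speed;

        // 'frame' is the index of the frame currently being displayed,
        // which can be used to trigger game events at specific points.
        long currentFrame = video.frame;
        if (currentFrame == 300)   // placeholder: an event tied to frame 300
        {
            // TriggerGameEvent();  // hypothetical hook
        }
    }
}
```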