Samsung Smart TV App: setting video frame position (native player) - samsung-smart-tv

I am creating an application for the Samsung Smart TV that plays music clips in the native Samsung player. I want the page to scroll, so I have to move the video frame (player window) along with it. Here are the problems I ran into:
1) Moving the video on screen while scrolling is not fluid (on each scroll event I read the position of the element where the video frame should sit and apply it with SetDisplayArea()). Does anyone have experience with handling this?
2) When I scroll the video frame off the display, the top/bottom of the video cannot slide out of view.
Is it possible to show only part of the video frame (something like SetDisplayArea(0, -200, 400, 225))?
Thanks for any suggestions.

Which API version do you use?
For API 2.5, SetDisplayArea() is the only way to change the native player's position and size.
I haven't tried it, but if SetDisplayArea(0, -200, 400, 225) doesn't move the player partly off-screen, then it isn't possible.
Do you want to move the player while the video is playing? You can't expect fluid animations on Smart TV platforms, especially when moving large (memory-expensive) elements.
A possible solution would be to take an image of your video player and run the slide-in animation on that image. Afterwards, hide the image, show the real player in its place, and start playing the video.
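Roughly like this (untested sketch; the element ids, animatePlaceholderTo() and doPlay() are placeholders for whatever your app already has, and SetDisplayArea() is the same native player call you are using now):

    // Untested sketch of the image-swap idea. The element ids, animatePlaceholderTo()
    // and doPlay() are placeholders for your own code; SetDisplayArea(x, y, width, height)
    // is the same native player call the question already uses.
    var playerPlugin = document.getElementById('pluginObjectPlayer'); // your player <object>
    var placeholder  = document.getElementById('videoPlaceholder');   // an <img> styled like the player

    function slidePlayerTo(x, y, width, height) {
        // 1) Show a static image where the player is and animate only that image with
        //    plain CSS/DOM - much cheaper than moving the video plane on every scroll event.
        placeholder.style.display = 'block';
        animatePlaceholderTo(x, y, width, height, function () {
            // 2) Animation finished: park the real player at the final position,
            //    hide the image and (re)start playback.
            playerPlugin.SetDisplayArea(x, y, width, height);
            placeholder.style.display = 'none';
            doPlay(); // e.g. playerPlugin.Play(url) in your existing playback code
        });
    }

This way only a lightweight <img> is redrawn during the animation and the expensive player surface is repositioned exactly once, at the end.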

I think it should be possible to use SetCropArea() to handle the off-screen case.
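Something along these lines (untested; I'm assuming SetCropArea(x, y, width, height) takes source coordinates in pixels - the exact parameter meaning depends on your API version, so check the player documentation), for a 400x225 frame whose top has scrolled off the screen:

    // Untested sketch: show only the lower part of a 400x225 video frame whose top
    // "overflowTop" pixels have scrolled above the visible area. Assumes
    // SetCropArea(x, y, width, height) crops the source video in pixels.
    function showPartiallyScrolledOut(playerPlugin, overflowTop) {
        var frameWidth = 400, frameHeight = 225;
        var visibleHeight = frameHeight - overflowTop;
        // Crop away the hidden strip at the top of the source...
        playerPlugin.SetCropArea(0, overflowTop, frameWidth, visibleHeight);
        // ...and display only the remaining strip at the top edge of the screen.
        playerPlugin.SetDisplayArea(0, 0, frameWidth, visibleHeight);
    }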

Related

How can I turn off camera video background in Unity ARKit

I'm trying to build a "lights-off" feature in my ARKit app, where I basically turn off the camera video background and only show my 3D content, sort of like VR.
Could someone please help me figure out how to turn off the video background? I can see that a material called YuVMaterial is used to render the camera texture, but setting it to a single color covers the entire screen and doesn't show my 3D content either.
You can uncheck the UnityEngine.XR.ARFoundation.ARCameraBackground component on the main AR camera; it will then just render a black background.
In Unity, you can switch between cameras while using ARKit. The main camera has a runtime spherical video applied to it, so it's not actually your device camera, but a rendering of what the device camera sees. By switching cameras, you can effectively "turn off" the background video image but still take advantage of the ARKit properties. Have fun.

Easy Movie Texture and Google Cardboard Buttons

I'm making an app for 360° video with Google Cardboard. I have the Easy Movie Texture plugin, but I can't find a way to make a UI; the controls just appear on the left side. Is there a way to make them follow the movement of the head?
Create a canvas and attach it as a child of CardboardMain.
I use this method along with a reticle canvas to allow selection via gaze.
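For reference, a rough UnityScript sketch of that setup (the uiCanvas and cardboardMain references are assumptions - drag in your own Canvas and the CardboardMain root from the SDK prefab; the same calls work in C#):

    // UnityScript sketch: put the controls on a world-space canvas under CardboardMain.
    // "uiCanvas" is the Canvas holding the controls, "cardboardMain" is the root transform
    // of the Cardboard SDK prefab - both are assumptions, assign your own in the Inspector.
    #pragma strict

    var uiCanvas : Canvas;
    var cardboardMain : Transform;

    function Start () {
        uiCanvas.renderMode = RenderMode.WorldSpace;                  // screen-space UI is not usable in VR
        uiCanvas.transform.SetParent(cardboardMain, false);           // attach as a child of CardboardMain
        uiCanvas.transform.localPosition = Vector3(0, 0, 2);          // roughly 2 units in front of the viewer
        uiCanvas.transform.localScale = Vector3(0.002, 0.002, 0.002); // shrink the pixel-sized canvas
        // If you raycast against the UI, also assign uiCanvas.worldCamera to the Cardboard camera.
    }

Gaze selection then typically needs a GraphicRaycaster on the canvas plus the SDK's gaze input module, which is what the reticle approach above relies on.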

Unity - Trying to fit a game in an android screen

I started developing a video game for the web, but I want it to run on an Android device. It chops off a lot of the game. How can I make the game fit the screen size of the device?
Thanks!
Are you using an orthographic or a perspective camera? With a perspective camera it might be an aspect-ratio issue, so you could just pull the camera back a bit. With an orthographic camera you need to calculate the orthographic size for each screen size:
    camera.orthographicSize = 640.0 / Screen.width * Screen.height / 2;
Put that in your Start or Awake function and change the normalized width (640 here) to something that fits your project.
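For example, a minimal UnityScript sketch (attach it to the orthographic camera; in C# the same one-liner goes inside void Start()):

    // UnityScript sketch - attach to the orthographic camera.
    // 640 is the placeholder design width from the answer; use whatever fits your project.
    #pragma strict

    function Start () {
        GetComponent.<Camera>().orthographicSize = 640.0 / Screen.width * Screen.height / 2.0;
    }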

How to detect a moving object and its position in camera in iOS?

I want to detect the movement of an object with the iOS camera. I am working on a project where an iOS device's camera is mounted somewhere, and when it detects any movement in the camera picture it should post a notification or fire a particular event.
Please suggest how I can achieve this.

How to make (or even fake) a ripple animation on iPhone/iPad with UIView

I'm building an app for the iPad, and I want to make a ripple effect on a background image, just the ripple animation of a water surface. Flash could solve this easily, but it's not available on the iPad. Does anyone know anything about this?
With iOS 5, Apple now has a sample project called GLCameraRipple that lets you tap/draw on the screen to create animated water ripples over incoming video from the camera... a nice example of using OpenGL ES 2.0 and GLKit.