Using a Raspberry Pi to rotate a video - raspberry-pi

I want to build a kind of photo booth. A normal booth is boring, so I decided to build a funky, fancy one.
I will use two Raspberry Pis: one to stream, shoot, and print the photos, and the other to display the live stream.
The streaming, shooting, and printing part is already done. Now I am building the video stream display part.
I will show the picture in 1:1 format, because every third shot should be displayed rotated by a random angle. The people in front of the TV then have to tilt their heads, which gives me strange and funny pictures. Maybe it is even possible to rotate continuously, like a hypnotic spiral.
On Windows, rotating the stream with VLC works very well. How can I do this on a Raspberry Pi?

I did it with HTML and a browser in fullscreen mode: rotate the video stream and crop it to 1:1.
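A minimal sketch of such a page; the stream URL, host name, and the random-angle trigger are placeholders, and it assumes the camera Pi serves a stream the browser can actually play:

```html
<!DOCTYPE html>
<html>
<head>
<style>
  /* Black fullscreen page with the stream centered */
  body   { margin: 0; height: 100vh; background: #000;
           display: flex; align-items: center; justify-content: center; }
  /* Square wrapper crops the widescreen stream to 1:1 and rotates it */
  .frame { width: 90vmin; height: 90vmin; overflow: hidden;
           transform: rotate(var(--angle, 0deg)); }
  .frame video { width: 100%; height: 100%; object-fit: cover; }
</style>
</head>
<body>
  <div class="frame" id="frame">
    <!-- Placeholder URL: point this at whatever stream the camera Pi serves -->
    <video src="http://camera-pi.local:8080/stream" autoplay muted></video>
  </div>
  <script>
    // Hypothetical trigger: call this on every third shot.
    function rotateRandomly() {
      const angle = Math.floor(Math.random() * 360);
      document.getElementById('frame').style.setProperty('--angle', angle + 'deg');
    }
    rotateRandomly();
  </script>
</body>
</html>
```

For the hypnotic-spiral idea, a CSS @keyframes animation on the same transform would spin the wrapper continuously instead of jumping to a fixed angle.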

Related

Unity: How to access Android stereo back cameras?

I want to get frames from both back cameras at the same time (to compute a disparity map) on an Android smartphone. Is there a way to do it?
On a smartphone with two rear cameras and a front camera, I tried to get the device list using "WebCamTexture.devices", and it shows only two cameras: one back and one front. I then tried to print "WebCamDevice.depthCameraName" and got nothing.
I know there is a Depth API that provides depth images, but I don't want to use it, since it's based on a SLAM algorithm.
I'd appreciate your help; I'm still a newbie in Unity.
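For reference, the enumeration described above boils down to something like this minimal sketch (the class name is made up):

```csharp
using UnityEngine;

// Logs every camera device Unity exposes, matching the check described above.
public class ListCameras : MonoBehaviour
{
    void Start()
    {
        foreach (WebCamDevice device in WebCamTexture.devices)
        {
            // On the phone described above this prints only one back camera
            // and one front camera, plus an empty depthCameraName.
            Debug.Log(device.name
                + " | front facing: " + device.isFrontFacing
                + " | depth camera: " + device.depthCameraName);
        }
    }
}
```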

Pupil Labs eye-tracking camera setup: is my video feed inverted?

I just got a Pupil Labs eye-tracking headset (just the eye-tracking cameras, no world-view camera; a regular headset, not VR). I am using the Pupil Labs Capture software to get a video feed of my eyes and track my pupils for use in a Unity app. I noticed that my right-eye video feed is inverted (upside down) by default, and I wanted to ask whether that is intended or whether my headset is somehow defective or incorrectly set up. I ask because I saw a video of someone setting it up, and both of their feeds were upright with the correct orientation. Thank you for your input!
According to Pupil Labs' release documentation:
The eye 0 (right eye) camera sensor is physically flipped, which results in an upside-down eye image for the right eye. This is by design and does not negatively affect pupil detection or gaze estimation. However, the upside-down eye 0 image repeatedly led users to believe that something was broken or incorrect with Pupil Core headsets. We flipped the eye 0 image now by default to better match user expectations with a right-side-up eye image.
That is, the hardware intentionally has one camera physically flipped relative to the other. If this is visible to you in the video output, then you should either upgrade the software (which now defaults to flipping the image as required), or apply that setting manually.
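If the flipped image still reaches your Unity app and you cannot update the capture software, one possible workaround (a sketch of a generic Unity technique, not part of the Pupil Labs API) is to mirror the eye texture vertically on whatever material displays it:

```csharp
using UnityEngine;

// Attach to the object whose material shows the upside-down eye feed.
public class FlipEyeImage : MonoBehaviour
{
    void Start()
    {
        Material mat = GetComponent<Renderer>().material;
        mat.mainTextureScale  = new Vector2(1f, -1f); // negative Y tiling mirrors the texture vertically
        mat.mainTextureOffset = new Vector2(0f, 1f);  // shift the mirrored image back into the 0..1 UV range
    }
}
```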

Minimising the shaking or jumpiness of AR video played over a marker image using Vuforia in Unity3D

I am playing a video over an image target using the Vuforia plugin in Unity3D. It is a simple green-screen video, and I am using a shader to key out the green.
Sometimes the video played over the image target becomes very shaky and jittery, which ruins the AR experience. How can I avoid or reduce this?
I tried multiple ways to get rid of it, with no success. Here is what I tried:
Previously I embedded the video in a Plane; then I switched to a Quad, but with no success.
I changed the AR camera's World Center Mode to different values, like FIRST_TARGET, CAMERA, and SPECIFIC_TARGET, but the problem remains.
Also, my Vuforia target image has a 5-star rating in the Vuforia database.
What could be the solution to this problem? Any help would be highly appreciated. Thanks!

How many screenshots in Unity to render a video in Blender, and with which settings?

I've gotten bad results trying to render a video from a Unity game.
I imagined (and I could easily be wrong) that I need to capture the screen every 33 ms to get roughly 30 images per second for Blender. I first tried to record while playing the game in Unity, with the Game view tab set to a 16:9 aspect ratio, and the result was far from what I expected (bad quality).
I then built the game and ran it at a resolution of 1280x1024. I wanted 1920x1080, but I don't understand how to get it: in the Unity documentation for CaptureScreenshot, the last parameter seems to mean "how much bigger" to make the capture, but I could not figure out relative to what exactly. I tried values of 4 and 8 and could not control the result.
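For what it's worth, that last parameter (superSize) multiplies the current screen resolution, so a 960x540 Game view captured with superSize 2 comes out at 1920x1080. A minimal capture sketch, assuming Unity 2017.1+ (the class name and file naming are made up):

```csharp
using UnityEngine;

// Saves one PNG per rendered frame on a fixed 30 fps game clock.
public class FrameCapture : MonoBehaviour
{
    public int frameRate = 30; // matches the intended Blender frame rate
    public int superSize = 2;  // multiplies the current screen resolution

    void Start()
    {
        // Locking game time to the capture rate samples motion evenly;
        // the game may run slower than real time, but consecutive frames
        // are always exactly 1/30 s of game time apart.
        Time.captureFramerate = frameRate;
    }

    void Update()
    {
        ScreenCapture.CaptureScreenshot("frame_" + Time.frameCount.ToString("D4") + ".png", superSize);
    }
}
```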
I recorded 90 images at 1280x1024 and added them to Blender in the Video Editing view. When adding the images, I was confused about what to choose in the left sidebar for Start Frame and End Frame. I don't know whether to count every picture as one frame; if so, 90 images at 30 fps must produce a 3-second video, which I failed to get!
Back to Blender, the settings I change:
File format: H.264
Encoding: Preset H.264 (format)
Resolution: X: 1920, Y: 1080, quality 100%
Start frame: 1
End frame: ??? (anything I try ends in an unexpected result)
Frame rate: 30
Other than the output path and file name, I didn't change anything else. What I get is a roughly 3-second video, but the character movements are not realistic; it looks like the video is being fast-forwarded.
How can I achieve a good result? Or can you point me to something I can read that will help me understand how to take the screenshots in Unity and which settings to use in Blender?
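For reference, the settings listed above can also be set through Blender's Python API; a sketch (property names as in Blender 2.8x; with one image per frame, 90 images at 30 fps mean the end frame is simply 90):

```python
import bpy

scene = bpy.context.scene

# One image = one frame: 90 frames at 30 fps give a 3 second clip.
scene.frame_start = 1
scene.frame_end = 90
scene.render.fps = 30

# Output resolution and H.264 encoding.
scene.render.resolution_x = 1920
scene.render.resolution_y = 1080
scene.render.resolution_percentage = 100
scene.render.image_settings.file_format = 'FFMPEG'
scene.render.ffmpeg.format = 'MPEG4'
scene.render.ffmpeg.codec = 'H264'

scene.render.filepath = '//render_output.mp4'  # '//' means relative to the .blend file
bpy.ops.render.render(animation=True)
```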

How to implement 360 video and sound in Unity3D for VR

Can someone please give me some insight into the pipeline for implementing 360 video and sound in VR? I know a thing or two about video editing, sound, and Unity3D, but I would like to know more about how to do all of this for VR. Let's say I want to shoot a 360 video and then put it in VR, but I also want to incorporate the captured sound. I would also like to have some interactive spots on it.
Edit: If I want to add interactive spots, does that mean I need different 360 cameras shooting from the spots where I want the interaction to happen, or does a single video shot with one camera allow for that?
Thank you
First you have to choose a target platform, e.g. iOS or Android. Then you have to find a video player that supports 360 video, like AVPro Media Player from the Unity Asset Store.
For interactive spots in the video, you have to make some kind of local database, e.g. an XML file that stores each trigger's position and the time at which it should fire an activity, as sketched below. Hope this helps.
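A sketch of what such a file could look like; every element, attribute, and value here is made up, with each trigger placed by yaw/pitch angles on the video sphere:

```xml
<hotspots>
  <!-- Each hotspot: where it sits, when it is active, and what it does. -->
  <hotspot id="door" yaw="120" pitch="-10" start="00:00:05" end="00:00:12" action="LoadScene:Lobby" />
  <hotspot id="sign" yaw="45"  pitch="20"  start="00:00:15" end="00:00:30" action="ShowPanel:Info" />
</hotspots>
```

A script in the scene would then parse this file, place a collider at each yaw/pitch position on the video sphere, and enable it only between its start and end times.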