create 360 from 2d video adding background - unity3d

I have a 2D video that I would like to turn into a 360° experience. I'm aware of the differences between the formats, but what I'd like is a 360° video of something like a cinema room, with that 2D video displayed on the main screen.
Is there any suggestion or automatic tool for this, or anything else that could be useful? I'm open to using Unity3D, Blender, or any video-editing software.

The usual approach is to use a computer-generated world or a 360° image for the cinema room, rather than a 360° video, and to display your 'flat' 2D video on a 'screen' or wall in that generated room.
You basically render the video onto a texture that you have set up in Unity. This is supported as standard by Google VR for Unity (https://developers.google.com/vr/develop/unity/video-overview):
Streaming Video Support
The GVR SDK for Unity includes an additional GVR video plugin that supports streaming flat and 360° videos in both mono and stereo formats by using the ExoPlayer library to handle decoding and rendering of video, audio, and related streams. ExoPlayer provides all the standard playback framework for both videos embedded in your application and streaming videos, including adaptive streaming support, such as DASH and HLS.
For example the Netflix version, VR Theatre, looks like this - the video plays on the 'screen' in front of the viewer:
If you look on the Unity Asset Store you can also find complete home or cinema 360° 'images' to use in Unity - for example (at the time of writing): https://assetstore.unity.com/packages/templates/vr-home-cinema-66863
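As a concrete illustration of "render the video onto a texture", here is a minimal sketch using Unity's built-in VideoPlayer and a RenderTexture rather than the GVR plugin; the class and field names are illustrative, not from the original answer:

```csharp
using UnityEngine;
using UnityEngine.Video;

// Attach to the 'screen' Quad inside the cinema room.
// Assumes a RenderTexture asset is assigned in the Inspector.
public class CinemaScreen : MonoBehaviour
{
    public VideoClip movie;             // the flat 2D video
    public RenderTexture screenTexture; // texture the screen material samples

    void Start()
    {
        var player = gameObject.AddComponent<VideoPlayer>();
        player.clip = movie;
        player.renderMode = VideoRenderMode.RenderTexture;
        player.targetTexture = screenTexture;
        player.isLooping = true;

        // Show the render texture on this quad's material.
        GetComponent<Renderer>().material.mainTexture = screenTexture;

        player.Play();
    }
}
```

The surrounding cinema room itself can then just be a static 360° image on an inverted sphere or a modelled environment; only the screen needs a video texture.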

Related

Unity video player - play a default video until another video is completely loaded

I have a video player in Unity that loads a video from a server.
The loading time of the video can be long, so I decided to display a "loading" video while I load the video from the server.
I have tried to add another video player component to another object, but rendering two videos on the same texture is problematic.
Is there a way to display a default video while the real video is being loaded by the video player component?
I have the exact same problem.
I partly solved it by using two video players, as you suggested, but I also switched the video players' textures.
The video player for the loading video renders to textureA, which is the texture where I display the video, and the other player renders to an unused texture.
When the real video finishes loading, I switch the video players' target textures.
This solution works, but it isn't efficient, and I am still looking for a better one.
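One way to script that texture swap is to let the main player buffer in the background with Prepare() and swap target textures in the prepareCompleted callback; the field names below are illustrative, not from the original post:

```csharp
using UnityEngine;
using UnityEngine.Video;

// Texture-swap sketch: play the 'loading' clip on the visible
// RenderTexture, prepare the real video in the background, and
// swap targets once it is ready.
public class LoadingVideoSwitcher : MonoBehaviour
{
    public VideoPlayer loadingPlayer;     // loops the local "loading" clip
    public VideoPlayer mainPlayer;        // streams the real video from the server
    public RenderTexture visibleTexture;  // the texture shown in the scene
    public RenderTexture offscreenTexture;

    void Start()
    {
        loadingPlayer.targetTexture = visibleTexture;
        loadingPlayer.isLooping = true;
        loadingPlayer.Play();

        mainPlayer.targetTexture = offscreenTexture;
        mainPlayer.prepareCompleted += OnMainVideoReady;
        mainPlayer.Prepare();             // buffers without playing
    }

    void OnMainVideoReady(VideoPlayer player)
    {
        loadingPlayer.Stop();
        player.targetTexture = visibleTexture;  // swap instead of re-rendering
        player.Play();
    }
}
```

Because the main player only starts once prepareCompleted fires, it never competes with the loading clip for the visible texture, which avoids the two-videos-on-one-texture problem described above.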

Unity Alternatives to "Snapshots" Controls with Resonance Audio

I'm wondering what other solutions people are using to control volume and activation of various Resonance Audio Sources in Unity.
I'm working on a mobile VR experience in Unity with interactable elements (audio loops until the player triggers the next step) and linear movement of the player between different-sounding spaces in one scene.
I've resorted to using the Animation features on the Timeline to turn Audio Sources on and off and to set volumes, because Snapshots and other audio controls are ignored by the Resonance Mixer.
I'd love to hear how other Resonance users are controlling their audio!
Thanks,
Anna
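One snapshot-free alternative is to fade AudioSource volumes directly from a script: Resonance spatializes through the AudioSource, so changing AudioSource.volume takes effect even when mixer snapshots are ignored. A minimal sketch (the class name and fade API are illustrative):

```csharp
using System.Collections;
using UnityEngine;

// Fades a single AudioSource toward a target volume over time,
// bypassing mixer snapshots entirely.
public class SourceFader : MonoBehaviour
{
    public AudioSource source;

    public void FadeTo(float targetVolume, float seconds)
    {
        StopAllCoroutines();
        StartCoroutine(Fade(targetVolume, seconds));
    }

    IEnumerator Fade(float target, float seconds)
    {
        float start = source.volume;
        for (float t = 0f; t < seconds; t += Time.deltaTime)
        {
            source.volume = Mathf.Lerp(start, target, t / seconds);
            yield return null;
        }
        source.volume = target;
        if (Mathf.Approximately(target, 0f))
            source.Stop();   // free the voice once fully faded out
    }
}
```

Trigger volumes or the player's progression logic can then call FadeTo(1f, 2f) / FadeTo(0f, 2f) as the player moves between spaces, instead of driving everything from Timeline animations.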

How to play an Alpha video inside an Unlit sphere in Unity?

I'm trying to play an alpha video inside a sphere that has a StereoPanoSphereMaterial using the shader "GoogleVR/Demo/VideoDemo InsideShader" and is used to play a 360° video.
I'm using the GoogleVR/Unlit/TransparentOverlay shader on a Quad to run my alpha video inside the sphere. It appears to run fine in the Editor, but on my device the Quad, which is supposed to be transparent, just shows as blank and the video cannot be seen either.
I've tried other built-in shaders as well, such as FX/Flare and Unlit/Transparent, to no avail.
Any tips on how to play an alpha video inside my 360° video sphere?
(Screenshots: Game View in the Editor, Scene View in the Editor, view on phone.)
For anybody following this thread: the problem is that the GVR SDK apparently does not support the .mov format at the moment, even though Unity does. We converted the video to .webm, and while the result is not quite what we hoped for, it does the job of playing the alpha video inside the sphere, on top of the stereoscopic video, for now.
Also note that of the shaders shipped with Unity, only the FX/Flare shader works well for playing videos in this format in VR.
P.S. I also tried placing a PNG image as a component inside the view; that isn't working on the device either, even though it works in the Editor.
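The workaround above can be sketched as a small script on the overlay Quad; the URL is a placeholder and the shader choice follows the answer's report that FX/Flare worked:

```csharp
using UnityEngine;
using UnityEngine.Video;

// Plays a .webm (VP8/VP9, which can carry alpha) on a Quad placed
// inside the 360° sphere. A .mov source reportedly fails under the
// GVR SDK, so the clip is converted to .webm first.
public class AlphaVideoQuad : MonoBehaviour
{
    public string webmUrl = "https://example.com/overlay.webm"; // placeholder

    void Start()
    {
        // The Quad's material should use a transparency-capable shader
        // (the answer above found FX/Flare worked on device).
        var player = gameObject.AddComponent<VideoPlayer>();
        player.url = webmUrl;
        player.renderMode = VideoRenderMode.MaterialOverride;
        player.targetMaterialRenderer = GetComponent<Renderer>();
        player.targetMaterialProperty = "_MainTex";
        player.isLooping = true;
        player.Play();
    }
}
```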

Video player in google daydream VR sdk

I am trying to render 2D video content using the Google VR SDK. I tried to follow the documentation on video viewports on the dev page, but it seems obsolete and incomplete, and the samples on GitHub give no reference on how to create 2D quads and render an external texture onto them; only a 360° video sample is available. Any reference or help would be appreciated, thanks!
Playing a video is straightforward with the GoogleVR SDK. All you need to do is create a Quad (from 3D Objects), drag the GvrVideoTexturePlayer script onto it (from the downloaded SDK), and set the material of the mesh to VideoMaterial (from the downloaded SDK). Then just select the type of video (DASH/HLS) and enter the URL, and the video should play.
Make sure you test on a device, as the video won't play in the emulator. For 360° videos, replace the Quad with a sphere; the rest is the same.
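The GVR steps above are Inspector-only. If you don't need the plugin's ExoPlayer streaming, a flat 2D quad can also be driven entirely in script with Unity's built-in VideoPlayer; the URL below is a placeholder:

```csharp
using UnityEngine;
using UnityEngine.Video;

// Creates a Quad at runtime and plays a video directly onto its material.
public class QuadVideo : MonoBehaviour
{
    void Start()
    {
        var quad = GameObject.CreatePrimitive(PrimitiveType.Quad);
        var player = quad.AddComponent<VideoPlayer>();
        player.url = "https://example.com/video.mp4";  // placeholder URL
        player.renderMode = VideoRenderMode.MaterialOverride;
        player.targetMaterialRenderer = quad.GetComponent<Renderer>();
        player.Play();
    }
}
```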

360 stereo panorama viewer with Unity

I have two panoramic images that I stitched together using some proprietary software and my stereo camera. I would like to view them in VR mode inside my app.
Is there a tutorial for doing this with Unity?
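One common approach (a sketch, not from the question) is to map each panorama onto its own inward-facing sphere and show each sphere to only one eye via layers. This assumes two layers named "LeftEye" and "RightEye" exist and that the per-eye cameras' culling masks each exclude the opposite layer:

```csharp
using UnityEngine;

// Assigns each equirectangular panorama to its own inside-out sphere
// and places the spheres on per-eye layers for stereo viewing.
public class StereoPanorama : MonoBehaviour
{
    public Texture leftPanorama;    // equirectangular left-eye image
    public Texture rightPanorama;   // equirectangular right-eye image
    public GameObject leftSphere;   // inward-facing sphere around the camera
    public GameObject rightSphere;  // inward-facing sphere around the camera

    void Start()
    {
        leftSphere.GetComponent<Renderer>().material.mainTexture = leftPanorama;
        rightSphere.GetComponent<Renderer>().material.mainTexture = rightPanorama;

        leftSphere.layer = LayerMask.NameToLayer("LeftEye");
        rightSphere.layer = LayerMask.NameToLayer("RightEye");
    }
}
```

The spheres need a shader that renders their inside faces (or inverted normals); the per-eye cameras then each see only their own panorama, which is what produces the stereo effect.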