Vuforia ARCamera video background is black - Unity3D

I have created an asset bundle scene with a Vuforia ARCamera and an ImageTarget. After loading the asset bundle scene, the scene starts with a black screen. I noticed that the shader on ARCamera -> Camera -> BackgroundPlane -> VideoMaterial (Instance), "Custom/VideoBackground", is not enabled. When I enable it manually, the camera turns on and shows the live feed. Is there any way to make that shader enabled after the scene loads?
Picture 1: before enabling the shader, with the scene opened from the asset bundle.
Picture 2: after enabling the shader option.

Here is a more complete answer for future reference:
You can solve this issue by attaching a script to said GameObject that enables the shader on Awake; it would look something like this:
void Awake()
{
    // Get the Renderer that holds the video background material
    Renderer videoRenderer = GetComponent<Renderer>();
    // Look for the shader called "Custom/VideoBackground" and assign it to the renderer's material
    videoRenderer.material.shader = Shader.Find("Custom/VideoBackground");
    Destroy(this); // Removes this script once it has run; just looks a bit cleaner in my opinion, but not necessary
}
More information about material shaders can be found in the Unity docs here.
More information about Shader.Find can be found in the docs here.
This is assuming that you already have a reference to the shader from a material somewhere in your scene. If you do not, you can, as per Gowthy's comment, add the shader to the "Always Included Shaders" list. This can be found by going to the Graphics menu under Project Settings and scrolling down to the "Always Included Shaders" section. Alternatively, you can add the shader to a "Resources" folder that gets included in the player build.
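If you go the Resources route, here is a minimal sketch, assuming the shader asset has been placed under a Resources folder (e.g. Assets/Resources/VideoBackground.shader, which is an assumption, not part of the Vuforia package):

void Awake()
{
    // Assumes the shader asset lives in a Resources folder so it is included in the build
    Shader videoShader = Resources.Load<Shader>("VideoBackground");
    if (videoShader != null)
    {
        // Apply it to this GameObject's material (e.g. the BackgroundPlane)
        GetComponent<Renderer>().material.shader = videoShader;
    }
}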

Delete the Vuforia folder from the assets directory.
Open player settings and uncheck Vuforia support from XR settings.
Choose the remove files option when prompted.
Then check the Vuforia support again.
Select the Vuforia camera in your scene.
Add the license key.
That's it.

Related

Why did my skybox disappear after I added the URP package?

I want to add Light2D to my project, but it's a decision I made months after starting the project, so I added the URP package.
But something strange happened: my skybox disappeared. It doesn't display in either my scene view or my game view; it seems to be completely transparent rather than showing a material error.
I have checked my lighting settings and camera settings, but there's no problem there.
I tried restarting the Unity editor, but that didn't solve the problem.
Sorry for my poor English; I hope you can read this.
If you use a "URP Asset (with 2D Renderer)", it won't use the Skybox material set in Lighting -> Environment.
It's probably by design, I guess (2D doesn't need a "box"; a rect will suffice).
Use the other asset (with Universal Renderer) or set your "Skybox" as a Sprite material in the scene.

Unity 3D shader doesn't appear in VR mode

We have two scenes: in one scene we create the level design and mechanics for VR at runtime, and in the other scene we use SteamVR for VR mode. The shader shows up in the editor scene but doesn't work in the VR scene.
Thank you for your answers.
Actually it was a save/load problem. We use a third-party asset from the Unity Asset Store (Runtime Save & Load), and the SceneAssetLibrary.asset that holds the scene asset references was overwritten while we worked on the scene. So we reverted that file and it worked again.

Unity UI canvas not working with VR

I have been trying to get a very simple demo of a native Unity UI canvas working with VR.
I have read the Oculus blog post here: https://developer3.oculus.com/blog/unitys-ui-system-in-vr/ but I need to use the native Unity UI, as I want to redistribute the code without license worries. I followed this tutorial https://unity3d.com/learn/tutorials/topics/virtual-reality/interaction-vr?playlist=22946 and downloaded the Unity VR samples project from the Asset Store. In it they provide some scripts to place on the camera (VRInput and VREyeRaycaster) and some scripts to place on the target object (VRInteractiveItem and ExampleInteractiveItem).
When I apply the target scripts to a regular GameObject in the scene (e.g. a cube), the raycast works fine and the appropriate calls are made when Fire1 is activated. When I try to do this for a canvas object (e.g. a button), no hit is detected. I have tried placing the two target scripts (VRInteractiveItem and ExampleInteractiveItem) on the canvas, on the image containing the button, and on the button itself, and none of them work. What am I doing wrong? Why would it work on a regular GameObject and not on a UI canvas? I have made sure all my canvas elements have their Raycast Target boolean property ticked.
EDIT:
It seems to work when I attach a box collider to the UI element. Is this required? I thought it should just work with a GraphicRaycaster attached, but the configuration below doesn't work (when the box collider is disabled and the Graphic Raycaster is enabled).
This is what is on my player's camera:
I don't have a problem using box colliders if I have to, but I wanted to take advantage of the UI Button's highlighted and pressed color transitions.
In Unity, raycasting only works with GameObjects that have colliders. A raycast returns true when it hits a collider; without colliders there is nothing the ray can hit.
Unity Physics.Raycast documentation
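For illustration, here is a minimal sketch of the kind of collider-based gaze raycast this is describing (the class and field names are my own, not taken from the VR samples scripts):

using UnityEngine;

public class GazeRaycastExample : MonoBehaviour
{
    // Hypothetical setting, not from the VR samples
    public float maxDistance = 10f;

    void Update()
    {
        Ray ray = new Ray(transform.position, transform.forward);
        RaycastHit hit;
        // Physics.Raycast only returns true if the ray hits a collider,
        // which is why a cube registers hits but a bare UI Button does not.
        if (Physics.Raycast(ray, out hit, maxDistance))
        {
            Debug.Log("Looking at: " + hit.collider.gameObject.name);
        }
    }
}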
I believe, for anyone just seeing this for the first time, a potential reason it is not working is that the canvas in the above picture is using a Graphic Raycaster component and not an OVR Raycaster component. The OVR Raycaster is meant to replace the Graphic Raycaster to connect Oculus to Unity UI.
If you want to use Unity's UI in VR, you might want to take a look at this asset: VRTK.
There are some examples of VR UI using controllers or camera targeting.
Go to your canvas; there is an option called "Plane Distance" that is set to 100 by default. I changed it to 0.5 and it works quite well.
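If you'd rather set this from a script, here is a tiny sketch, assuming the script sits on a Canvas that uses the Screen Space - Camera render mode (the 0.5 value is just the one from the answer above):

using UnityEngine;

public class SetCanvasPlaneDistance : MonoBehaviour
{
    void Awake()
    {
        // Assumes this GameObject has a Canvas in Screen Space - Camera mode
        Canvas canvas = GetComponent<Canvas>();
        canvas.planeDistance = 0.5f;
    }
}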

Sphere not rendering in Unity for Google Cardboard

I was following this blog post on how to implement 360 degree video in Unity. At the end, I used ffmpeg to split the video into individual frames as recommended. I also set the first frame as the texture for each material on each sphere. The end result looks like this
Picture: bad sphere
The big problem, though, is that once I build and run it on my phone, or just play the scene itself, the sphere simply fails to render. Could this be caused by the texture being the first frame? Or am I making some other sort of error? Many thanks.
Movies in Unity are usually rendered as textures on objects. On mobile, the issue is that the device only wants to display video in a video player, so the Unity class MovieTexture is not supported.
I am having success circumventing this and rendering 360 video on the inside of a sphere using a Unity plug-in from the Unity Asset Store called Easy Movie Texture.
For working on a Mac, here's what I did:
Download the Easy Movie Texture plug-in from the Unity Asset Store
Open the Demo Sphere demo scene from Assets/EasyMovieTexture/Scene
Create a new (empty) Prefab in your project, and drag the Sphere GameObject from the Demo Sphere scene onto the Prefab.
Reopen your Cardboard scene and drag the new videosphere prefab into your hierarchy.
Open your source 360-video in Quicktime
File -> Export -> 720p
Change file extension from '.mov' to '.mp4'
Drag your new mp4 file into your project's Assets/StreamingAssets directory. Note: don't import it through the menu system, as this will force Unity to convert it to OGG.
On the "Media Player Ctrl" script component of your videosphere GameObject, locate the "Str_File_Name" field and provide the FULL filename of your newly exported video file. Make sure to include the extension as part of the string, e.g. "mymovie.mp4".
Pretty sure that's everything. Hope it helps other folks stuck on this problem.
Final note, the video will only render on the device. In the editor you will only see a white texture on the sphere. You have to publish to the device in order to see your awesome 360-video.

How to create VR Video player using Google Cardboard SDK for Unity

I just downloaded the Google Cardboard SDK for Unity. I am able to create a VR project; the setup is fine and everything is working.
I am a noob at VR apps, having just stepped into them.
I am planning to create my own VR-enabled video player for Android, just like the default Google Cardboard YouTube player.
Can anyone suggest a link or guide me in developing this app?
Scott Driscoll's answer totally works. I had some initial problems getting the Easy Movie Texture Unity plug-in to work for me, but finally figured it out, and it works flawlessly. I now have 360-video running as a texture on the inside of a sphere on my iPhone 6. And I have to say, I didn't think it would happen.
For working on a Mac, here's what I did:
Download the Easy Movie Texture plug-in from the Unity Asset Store
Open the Demo Sphere demo scene from Assets/EasyMovieTexture/Scene
Create a new (empty) Prefab in your project, and drag the Sphere GameObject from the Demo Sphere scene onto the Prefab.
Reopen your Cardboard scene and drag the new videosphere prefab into your hierarchy.
Open your source 360-video in Quicktime
File -> Export -> 720p
Change file extension from '.mov' to '.mp4'
Drag your new mp4 file into your project's Assets/StreamingAssets directory. Note: don't import it through the menu system, as this will force Unity to convert it to OGG.
On the "Media Player Ctrl" script component of your videosphere GameObject, locate the "Str_File_Name" field and provide the FULL filename. Make sure to include the extension as part of the string, e.g. "mymovie.mp4".
Pretty sure that's everything. Hope it helps other folks stuck on this problem. Thanks Scott Driscoll!
One last note, you can only view the video on the phone, not in preview in the editor. It would be better if it didn't work this way, but really once the initial issues of resolution and placement are resolved, I don't really need to see the video every time I run the scene in the editor.
Here are the major steps for how we do this:
Add a sphere with an equirectangular UV mapping and inward-facing normals around the camera (a runtime normal-flip sketch follows this list).
Purchase a plugin to play a movie on that sphere’s texture. I recommend Easy Movie Texture.
Use MP4 or Ogg Vorbis files that are compatible with the platform; this is phone- and OS-dependent.
Full details: http://immersivetechblog.foundry45.com/2015/07/31/implementing-360-video-in-unity-for-gear-vr-and-cardboard/
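If you'd rather not build an inverted sphere in a modelling tool, here is a minimal sketch (my own, not part of any plugin) of flipping a standard sphere's normals and triangle winding at runtime so a texture is visible from the inside:

using UnityEngine;

// Attach to a sphere to make its surface visible from the inside
public class InvertSphereNormals : MonoBehaviour
{
    void Awake()
    {
        Mesh mesh = GetComponent<MeshFilter>().mesh;

        // Flip every normal so lighting treats the inside as the front
        Vector3[] normals = mesh.normals;
        for (int i = 0; i < normals.Length; i++)
        {
            normals[i] = -normals[i];
        }
        mesh.normals = normals;

        // Reverse triangle winding so the inside faces are not back-face culled
        for (int sub = 0; sub < mesh.subMeshCount; sub++)
        {
            int[] tris = mesh.GetTriangles(sub);
            for (int i = 0; i < tris.Length; i += 3)
            {
                int tmp = tris[i];
                tris[i] = tris[i + 2];
                tris[i + 2] = tmp;
            }
            mesh.SetTriangles(tris, sub);
        }
    }
}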
I saw the answers above, but all of them either required the Easy Movie Texture Unity plug-in or coding your way through.
There's another easy solution as well, which won't require you to buy that asset or write code.
Oculus provides an already-built free sample framework which you can use without much trouble.
The solution below shows how to create both a photo viewer and a video viewer for Unity.
Building your 360 degree PhotoViewer:
Go to Blender, delete all the pre-existing objects (if any), and make an icosphere. Increase the subdivisions to a point where it looks more like a sphere (such as 6) and hit Generate UVs (both these options are found in the settings underneath the Create tab when adding the sphere). Then go to Edit mode and choose Flip Normals so that you see the inside rather than the outside, and save it.
Bring that saved icosphere file into your Assets folder in Unity.
Download the GoogleVR SDK and bring the GoogleVR plugin into your Assets folder as well (you can download it here: https://developers.google.com/vr/unity/).
Delete the default main camera and directional light.
Bring your icosphere asset into your scene.
Bring GVR Main from your Assets folder into your scene: GoogleVR plugin -> Legacy -> Prefab -> GVR Main
Take any panorama or 360 photo and bring it into your Assets folder.
Drag that photo from your Assets folder onto the icosphere in your scene and hit Play. You should be able to see your 360 degree photo.
Building your 360 degree MoviePlayer:
Step 1 is the same.
Now go to the Oculus developer console and download this file: https://developer3.oculus.com/downloads/game-engines/1.5.0/Oculus_Sample_Framework_for_Unity_5_Project/
Bring the file you downloaded above into your Assets folder.
Find MoviePlayer in your Assets folder and bring it into your scene.
Bring the icosphere into your Assets folder as well and scale it up a little so you can see correctly.
Copy the MoviePlayer Sample script and the Audio Source from the components of the MovieSurface in the project and add them to the components of the sphere in the scene; also get rid of the Animator on the sphere.
Take the Movie Player material, found under Materials in the Mesh Renderer of the MovieSurface, and add it to your sphere.
The sphere you have now formed is your 360 degree movie player, so store it as an asset in the Assets folder.
Create a new scene, delete the directional light, bring your saved icosphere asset into this scene, and move the main camera to the centre.
Delete the assets you no longer need to clean up some space in your project, keeping MoviePlayer, Plugins, and StreamingAssets.
You'll have to convert the desired mp4 into an OGV file as well for the plugin to play it in VR; bring both the mp4 and OGV files into your StreamingAssets folder, change the MovieName, and click Play. You should be able to see your 360 degree video playing.
To play it on your devices, just go to Build Settings, choose the desired platform, delete all the scenes, use Add Open Scenes, and tick Virtual Reality Supported in Other Settings under Player Settings.
To play it on your Android phone, you need to download the GoogleVR SDK just like above, bring it into your Assets folder, find GVRViewerMain in the Assets folder and bring it into the scene, uncheck the Virtual Reality Supported option you ticked above, and then build and run the whole thing on your device (you should be able to see the view in Game mode when you hit Play in Unity).
You should be able to see the video in your respective gear.
There's also a video tutorial available, but I'm only able to share two links with my new Stack Overflow profile.
I can't help you with Unity, but in Java you can create a texture with OpenGL ES:
private static int GL_TEXTURE_EXTERNAL_OES = 0x8D65;
....
GLES20.glGenTextures(1, textureHandle, 0);
GLES20.glBindTexture(GL_TEXTURE_EXTERNAL_OES, textureHandle[0]);
Use it to create a surface texture and a surface:
SurfaceTexture surfaceTexture = new SurfaceTexture(textureHandle[0]);
Surface surface = new Surface(surfaceTexture);
And then pass that surface to android.media.MediaPlayer:
MediaPlayer mediaPlayer = MediaPlayer.create(getContext(), uriToMyMediaFile);
mediaPlayer.setSurface(surface);
Bind that texture to a square in your scene and call this every frame:
surfaceTexture.updateTexImage()
and the video will play when you call mediaPlayer.start();
If Unity allows you to write your own java code to run behind the scenes, this should work if you bind that texture to a surface from Unity.
If you have a video stream that you can't play with mediaPlayer (like a live video chat, etc), you can use the surface with android.media.MediaCodec as well, but there's a lot more setup work involved.
This has become very simple in Unity 5.6 and above.
You just need a sphere with its normals inverted, which you can either find online, make in Blender (create an icosphere and flip its normals), or fake with a shader on a normal sphere. In either case, use an Unlit texture shader.
Add a Video Player component, which comes with Unity by default, to this sphere.
Add any 360 degree video to this Video Player; you can also use an online link. It plays consistently well across the range of platforms from GoogleVR to SteamVR (a short code sketch follows below).
The only downside is that it only plays monoscopic images/videos by default, and it requires some tweaking to run stereoscopic images/videos.
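For reference, here is a minimal sketch of adding and starting the built-in Video Player from code, assuming the script sits on the inverted sphere (the URL below is a placeholder, not a real video):

using UnityEngine;
using UnityEngine.Video;

// Attach to the inverted sphere; the Video Player renders onto the sphere's material
public class SphereVideoExample : MonoBehaviour
{
    void Start()
    {
        VideoPlayer player = gameObject.AddComponent<VideoPlayer>();
        player.source = VideoSource.Url;
        // Placeholder URL - point this at your own 360 video file or stream
        player.url = "https://example.com/my360video.mp4";
        player.isLooping = true;
        player.Play();
    }
}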