I'm rendering a WebCamTexture onto a RawImage in Unity, then deploying the app to the HoloLens 2 to display the built-in video camera image. It works, but the application tanks to about 15-20 fps. Nothing else is in the scene. If I disable the texture, it runs at 60 fps.
I'm using Unity 2020.3.27f1 and MRTK2.
Here's the class I'm using:
using UnityEngine;
using UnityEngine.UI;

public class PlayMovieTextureOnUI : MonoBehaviour
{
    WebCamTexture tex;
    public RawImage rawimage;

    void Awake()
    {
        WebCamDevice device = WebCamTexture.devices[0];
        tex = new WebCamTexture(device.name);
        rawimage.texture = tex;
        tex.Play();
    }
}
Any ideas?
What I've tried so far:
Using a different Unity version
Changing the texture scale
Changing WebCamTexture.requestedFPS, height and width (one variant is sketched below)
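For example, one of the variants I tried looked roughly like this (the resolution and frame rate values are illustrative, not a known fix):

// Variant of the Awake() body above: request a lower resolution
// and frame rate before Play() to reduce the per-frame upload cost.
WebCamDevice device = WebCamTexture.devices[0];
tex = new WebCamTexture(device.name, 896, 504, 30);
rawimage.texture = tex;
tex.Play();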
Related
I am using the Oculus Quest with Unity 2018.4.22f1 and the Oculus SDK. Moving around in my application works well, but every time I try to get the position of the headset, a zero vector is returned.
I tried these solutions:
OVRPose tracker = OVRManager.tracker.GetPose();
return tracker.position;
And
GameObject camera = GameObject.Find("OVRCameraRig");
return camera.transform.position;
This is my position tracking setup:
Do you have any idea how to get the headset position?
When I want to get the headset's position, I use the transform.position of the CenterEyeAnchor, which is inside OVRPlayerController/OVRCameraRig/TrackingSpace.
using UnityEngine;

public class YourScriptNameHere : MonoBehaviour
{
    GameObject centerEye;
    Vector3 headsetPos;

    void Start()
    {
        // Cache the anchor once; GameObject.Find is too slow to call every frame.
        centerEye = GameObject.Find("CenterEyeAnchor");
    }

    void Update()
    {
        headsetPos = centerEye.transform.position;
    }
}
I copied the code exactly from a tutorial, and the webcam is plugged in and works. But when I run the game in the Unity Editor, there is no "No Device connected" error and no script errors. I'm confused as to why it isn't working.
Why isn't it being displayed?
My webCamScript
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class webCamScript : MonoBehaviour {

    public GameObject webCameraPlane;

    // Use this for initialization
    void Start () {
        if (Application.isMobilePlatform) {
            GameObject cameraParent = new GameObject ("camParent");
            cameraParent.transform.position = this.transform.position;
            this.transform.parent = cameraParent.transform;
            cameraParent.transform.Rotate (Vector3.right, 90);
        }

        Input.gyro.enabled = true;

        WebCamTexture webCameraTexture = new WebCamTexture ();
        webCameraPlane.GetComponent<MeshRenderer> ().material.mainTexture = webCameraTexture;
        webCameraTexture.Play ();
    }

    // Update is called once per frame
    void Update () {
        // Map the gyro attitude into Unity's left-handed coordinate space
        // by negating the z and w components.
        Quaternion cameraRotation = new Quaternion (Input.gyro.attitude.x, Input.gyro.attitude.y, -Input.gyro.attitude.z, -Input.gyro.attitude.w);
        this.transform.localRotation = cameraRotation;
    }
}
Solved
I found the problem: I had a custom texture on the plane, which was stopping the camera texture from being applied.
I presume it has something to do with the fact that you have your code wrapped in an if statement that checks whether you are running on a mobile platform. The Editor is not classed as a mobile platform, and hence that code is ignored.
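A quick way to confirm this is to log the check yourself (a minimal sketch you could drop into any MonoBehaviour in your scene):

// Logs "false" when running in the Unity Editor and "true" in an
// Android/iOS build, so any setup guarded by this check is skipped
// while testing in the Editor.
void Awake()
{
    Debug.Log("isMobilePlatform: " + Application.isMobilePlatform);
}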
I want to play a stereo 360-degree video in virtual reality in Unity on Android. So far I have done some research, and I have two cameras for the right and left eye, each with a sphere around it. I also need a custom shader to make the image render on the inside of the spheres. I currently have the upper half of the image showing on one sphere by setting the y-tiling to 0.5, and the lower half showing on the other sphere with y-tiling 0.5 and y-offset 0.5. With this, I can already display a 3D 360-degree image correctly. The whole idea is from this tutorial.
Now, for video, I need control over the playback speed, so it turned out I need the VideoPlayer from the new Unity 5.6 beta. My setup so far would require the VideoPlayer to play the same video on both spheres, with one sphere showing the upper part (one eye) and the other showing the lower part (other eye).
Here is my problem: I don't know how to get the VideoPlayer to play the same video on two different materials (since they have different tiling values). Is there a way to do that?
I got a hint that I could use the same material and achieve the tiling effect via UVs, but I don't know how that works, and I haven't even got the VideoPlayer to play the video on two objects using the same material on both of them. I have a screenshot of that here. The right sphere just has the material videoMaterial, with no tiling, since I'd have to do that via UVs.
Which way should I go, and how do I do it? Am I on the right track here?
Am I on the right track here?
Almost, but you are currently using a Renderer and Material instead of a RenderTexture and Material.
Which way should I go, and how do I do it?
You need to use a RenderTexture for this. Basically, you render the video to the RenderTexture, then you assign that texture to the materials of both spheres:
1. Create a RenderTexture and assign it to the VideoPlayer.
2. Create two materials for the spheres.
3. Set VideoPlayer.renderMode to VideoRenderMode.RenderTexture.
4. Set the texture of both spheres to the texture from the RenderTexture.
5. Prepare and play the video.
The code below does exactly that and should work out of the box. The only thing you need to do is modify the tiling and offset of each material to your needs; a sketch of that follows.
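For the top/bottom layout described in the question, the per-eye tiling could look roughly like this (a sketch; which half maps to which eye depends on your video):

// Upper half of the frame on the left-eye sphere...
leftSphereMat.mainTextureScale = new Vector2(1f, 0.5f);
leftSphereMat.mainTextureOffset = new Vector2(0f, 0.5f);

// ...and the lower half on the right-eye sphere.
rightSphereMat.mainTextureScale = new Vector2(1f, 0.5f);
rightSphereMat.mainTextureOffset = new Vector2(0f, 0f);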
You should also comment out:
leftSphere = createSphere("LeftEye", new Vector3(-5f, 0f, 0f), new Vector3(4f, 4f, 4f));
rightSphere = createSphere("RightEye", new Vector3(5f, 0f, 0f), new Vector3(4f, 4f, 4f));
then use a sphere imported from any 3D application. Those lines are only there for testing purposes, and it's not a good idea to play video on Unity's built-in sphere because it doesn't have enough vertices to make the video look smooth.
using UnityEngine;
using UnityEngine.Video;

public class StereoscopicVideoPlayer : MonoBehaviour
{
    RenderTexture renderTexture;

    Material leftSphereMat;
    Material rightSphereMat;

    public GameObject leftSphere;
    public GameObject rightSphere;

    private VideoPlayer videoPlayer;

    //Audio
    private AudioSource audioSource;

    void Start()
    {
        //Create Render Texture
        renderTexture = createRenderTexture();

        //Create Left and Right Sphere Materials
        leftSphereMat = createMaterial();
        rightSphereMat = createMaterial();

        //Create the Left and Right Spheres
        leftSphere = createSphere("LeftEye", new Vector3(-5f, 0f, 0f), new Vector3(4f, 4f, 4f));
        rightSphere = createSphere("RightEye", new Vector3(5f, 0f, 0f), new Vector3(4f, 4f, 4f));

        //Assign material to the Spheres
        leftSphere.GetComponent<MeshRenderer>().material = leftSphereMat;
        rightSphere.GetComponent<MeshRenderer>().material = rightSphereMat;

        //Add VideoPlayer to the GameObject
        videoPlayer = gameObject.AddComponent<VideoPlayer>();

        //Add AudioSource
        audioSource = gameObject.AddComponent<AudioSource>();

        //Disable Play on Awake for both Video and Audio
        videoPlayer.playOnAwake = false;
        audioSource.playOnAwake = false;

        //We want to play from url
        videoPlayer.source = VideoSource.Url;
        videoPlayer.url = "http://www.quirksmode.org/html5/videos/big_buck_bunny.mp4";

        //Set Audio Output to AudioSource
        videoPlayer.audioOutputMode = VideoAudioOutputMode.AudioSource;

        //Assign the Audio from Video to AudioSource to be played
        videoPlayer.EnableAudioTrack(0, true);
        videoPlayer.SetTargetAudioSource(0, audioSource);

        //Set the mode of output to be RenderTexture
        videoPlayer.renderMode = VideoRenderMode.RenderTexture;

        //Set the RenderTexture to store the images to
        videoPlayer.targetTexture = renderTexture;

        //Set the Texture of both Spheres to the Texture from the RenderTexture
        assignTextureToSphere();

        //Prepare Video to prevent Buffering
        videoPlayer.Prepare();

        //Subscribe to prepareCompleted event
        videoPlayer.prepareCompleted += OnVideoPrepared;
    }

    RenderTexture createRenderTexture()
    {
        RenderTexture rd = new RenderTexture(1024, 1024, 16, RenderTextureFormat.ARGB32);
        rd.Create();
        return rd;
    }

    Material createMaterial()
    {
        return new Material(Shader.Find("Specular"));
    }

    void assignTextureToSphere()
    {
        //Set the Texture of both Spheres to the Texture from the RenderTexture
        leftSphereMat.mainTexture = renderTexture;
        rightSphereMat.mainTexture = renderTexture;
    }

    GameObject createSphere(string name, Vector3 spherePos, Vector3 sphereScale)
    {
        GameObject sphere = GameObject.CreatePrimitive(PrimitiveType.Sphere);
        sphere.transform.position = spherePos;
        sphere.transform.localScale = sphereScale;
        sphere.name = name;
        return sphere;
    }

    void OnVideoPrepared(VideoPlayer source)
    {
        Debug.Log("Done Preparing Video");

        //Play Video
        videoPlayer.Play();

        //Play Sound
        audioSource.Play();

        //Change Play Speed
        if (videoPlayer.canSetPlaybackSpeed)
        {
            videoPlayer.playbackSpeed = 1f;
        }
    }
}
There is also a Unity tutorial on how to do this with a special shader, but it does not work for me and for some other people. I suggest you use the method above until VR support is added to the VideoPlayer API.
I'm experimenting with showing real-time video from the device camera (a Pixel phone) in a Daydream app built with Unity. In Unity I tried WebCamTexture, and when I test-run in the Editor on a PC, WebCamTexture captures the webcam feed. But when I build the app for Daydream/Cardboard, there is no camera feed. However, I did put a canvas text element in place to print whether any camera is detected and, if so, its name. When the Daydream app runs, two cameras are detected (Camera 0, Camera 1), but no image is shown. Does anyone have ideas or suggestions on how to solve this? The following is the C# code I wrote and attached to a Unity Plane/Quad; it also references a Canvas Text.
using UnityEngine;
using System.Collections;
using UnityEngine.UI;

public class tWebCam : MonoBehaviour
{
    public Text textPanel;
    WebCamTexture mCamera = null;
    public GameObject camPlane;

    // Use this for initialization
    void Start ()
    {
        WebCamDevice[] devices = WebCamTexture.devices;
        //camPlane = GameObject.FindWithTag("Player");

        if (devices.Length > 0)
        {
            mCamera = new WebCamTexture(devices[devices.Length - 1].name, 1920, 1920, 30);
            camPlane.GetComponent<Renderer>().material.mainTexture = mCamera;
            mCamera.deviceName = devices[devices.Length - 1].name;

            Debug.Log(devices.Length);
            Debug.Log(mCamera.deviceName);

            mCamera.Play();
            textPanel.text = "# of devices: " + devices.Length.ToString() + "; Device 1: " + mCamera.deviceName;
        }
        else
        {
            textPanel.text = "No device detected...";
        }
    }

    // Update is called once per frame
    void Update () {
    }
}
If I understand your question, the camera won't work because the phone is inside either the Cardboard or Daydream headset, and the camera is thus physically obstructed.
I have a problem getting a video texture to show up in Unity 5.2 Personal Edition. I have applied a material with an unlit shader and assigned the video texture to it. I also call the specific video texture through a script attached to the object carrying it.
using UnityEngine;
using System.Collections;

[RequireComponent(typeof(AudioSource))]
public class CallMovie : MonoBehaviour {

    public MovieTexture myMovie;

    // Use this for initialization
    void Start () {
        Debug.Log ("start intro");
        GetComponent<Renderer>().material.mainTexture = myMovie; // get movie
        myMovie.Play (); // shall play movie
    }

    void Update () {
        myMovie.loop = true;
    }
}
When I hit the Play button in Unity, the video texture stays black and nothing happens on screen, although the Debug.Log output says the video is running.
Since I can't post questions as a comment on your initial post, the following is an attempt to answer with what I know.
In your first statement after the debug call you are setting the mainTexture property of the instanced material to myMovie; depending on the shader, this may or may not work, as mainTexture may not reference the texture you expect.
You can ensure you hit the desired texture using the following method:
//Note the difference between a material instance and the shared material.
//Don't forget to clean up instances if you make them, which happens when you call .material
Material instancedMaterial = gameObject.GetComponent<Renderer>().material;
Material sharedMaterial = gameObject.GetComponent<Renderer>().sharedMaterial;

//_MainTex is the name of the main texture for most stock shaders ... but not all.
//You can select the shader by clicking the gear in the Inspector of the material;
//this will display the shader in the Inspector where you can see its properties by name.
instancedMaterial.SetTexture("_MainTex", myMovie);
The following code is from a working class I use to set a Unity UI RawImage to render a movie. From what I see in your example, you have the movie part correct; I suspect your issue is with the shader parameter.
using UnityEngine;
using System.Collections;

public class RawImageMovePlayer : MonoBehaviour
{
    public UnityEngine.UI.RawImage imageSource;
    public bool play;
    public bool isLoop = true;
    public MovieTexture movie;

    // Use this for initialization
    void Start ()
    {
        movie = (MovieTexture)imageSource.texture;
        movie.loop = isLoop;
    }

    // Update is called once per frame
    void Update ()
    {
        if (!movie.isPlaying && play)
            movie.Play();
    }

    public void ChangeMovie(MovieTexture movie)
    {
        imageSource.texture = movie;
        this.movie = (MovieTexture)imageSource.texture;
        this.movie.loop = isLoop;
    }

    public void OnDisable()
    {
        if (movie != null && movie.isPlaying)
            movie.Stop();
    }
}