WebCamTexture doesn't show on Daydream (Pixel device camera)? - unity3d

I'm experimenting with showing the device camera's (Pixel phone) real-time video in a Daydream app built with Unity. In Unity I tried WebCamTexture, and when I test-run in the Unity editor on a PC, WebCamTexture captures the webcam feed, but when I build the app for Daydream/Cardboard, there is no camera feed. I did add a canvas text element to print out whether any camera is detected and, if so, the name of the camera. When the Daydream app runs, two cameras are detected (Camera 0, Camera 1), but no image is shown. Does anyone have ideas or suggestions on how to solve this? The following is the C# code I wrote; it is attached to a Unity Plane/Quad and also references a Canvas Text.
using UnityEngine;
using System.Collections;
using UnityEngine.UI;

public class tWebCam : MonoBehaviour
{
    public Text textPanel;
    WebCamTexture mCamera = null;
    public GameObject camPlane;

    // Use this for initialization
    void Start()
    {
        WebCamDevice[] devices = WebCamTexture.devices;
        //camPlane = GameObject.FindWithTag("Player");
        if (devices.Length > 0)
        {
            // Use the last reported device (on Android this is usually the front camera).
            mCamera = new WebCamTexture(devices[devices.Length - 1].name, 1920, 1920, 30);
            camPlane.GetComponent<Renderer>().material.mainTexture = mCamera;
            mCamera.deviceName = devices[devices.Length - 1].name; // redundant: already set by the constructor
            Debug.Log(devices.Length);
            Debug.Log(mCamera.deviceName);
            mCamera.Play();
            textPanel.text = "# of devices: " + devices.Length.ToString() + "; Device 1: " + mCamera.deviceName;
        }
        else
        {
            textPanel.text = "No device detected...";
        }
    }

    // Update is called once per frame
    void Update()
    {
    }
}

If I understand your question, the camera won't work because the phone is inside either the Cardboard or Daydream headset, so the camera lens is physically obstructed.
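One way to confirm this from a script is to check whether any frames are actually arriving, as opposed to a texture that was created but never updates. A minimal diagnostic sketch, untested on Daydream; it assumes the same camPlane and textPanel wiring as the question's script:

using UnityEngine;
using UnityEngine.UI;

public class WebCamProbe : MonoBehaviour
{
    public Text textPanel;     // same canvas text as in the question
    public GameObject camPlane;

    void Update()
    {
        WebCamTexture cam = camPlane.GetComponent<Renderer>().material.mainTexture as WebCamTexture;
        if (cam == null) return;
        // didUpdateThisFrame is true only on frames where new camera data arrived,
        // so a value that stays false means the feed is stalled or blocked.
        textPanel.text = "playing: " + cam.isPlaying
            + "  updated: " + cam.didUpdateThisFrame
            + "  size: " + cam.width + "x" + cam.height;
    }
}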

Related

Hololens 2 Unity WebCamTexture render low framerate

I'm rendering the WebCamTexture onto a RawImage in Unity and then deploying it to the HoloLens 2 to display the internal video camera image. It works, but the application's frame rate tanks to about 15-20 fps. Nothing else is in the scene. If I disable the texture, it runs at 60 fps.
Using Unity version 2020.3.27f1 and MRTK2
Here's the class I'm using:
using UnityEngine;
using UnityEngine.UI;

public class PlayMovieTextureOnUI : MonoBehaviour
{
    WebCamTexture tex;
    public RawImage rawimage;

    void Awake()
    {
        WebCamDevice device = WebCamTexture.devices[0];
        tex = new WebCamTexture(device.name);
        rawimage.texture = tex;
        tex.Play();
    }
}
Any ideas?
What I've tried so far:
Use a different Unity version
Change texture scale
Change WebCamTexture.requestedFPS, height and width (see the sketch below)
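A minimal sketch of that last item: request an explicitly low resolution and frame rate so the per-frame texture copy is cheaper. Untested on HoloLens 2, and the 896x504 @ 30 fps numbers are an assumption based on one of the device's lower camera profiles:

using UnityEngine;
using UnityEngine.UI;

public class PlayMovieTextureOnUILowRes : MonoBehaviour
{
    WebCamTexture tex;
    public RawImage rawimage;

    void Awake()
    {
        WebCamDevice device = WebCamTexture.devices[0];
        // Requesting a modest profile instead of the device default
        // can reduce the cost of uploading each camera frame.
        tex = new WebCamTexture(device.name, 896, 504, 30);
        rawimage.texture = tex;
        tex.Play();
    }
}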

Headset position always returns zero in Oculus Quest with Unity and Oculus SDK

I am using the Oculus Quest with Unity 2018.4.22f1 and the Oculus SDK. Moving around in my application works well, but every time I try to get the position of the headset, a zero vector is returned.
I tried these solutions:
OVRPose tracker = OVRManager.tracker.GetPose();
return tracker.position;
And
GameObject camera = GameObject.Find("OVRCameraRig");
return camera.transform.position;
This is the position tracking setup (screenshot omitted):
Do you have any idea how to get the headset position?
When I want to get the headset's position, I use the transform.position of the CenterEyeAnchor, which is inside OVRPlayerController/OVRCameraRig/TrackingSpace.
using UnityEngine;

public class YourScriptNameHere : MonoBehaviour
{
    GameObject centerEye;
    Vector3 headsetPos;

    void Start()
    {
        centerEye = GameObject.Find("CenterEyeAnchor");
    }

    void Update()
    {
        headsetPos = centerEye.transform.position;
    }
}
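If you'd rather not find the anchor by name, Unity's built-in XR API can report the same pose. A hedged alternative sketch using UnityEngine.XR.InputTracking, which is available in Unity 2018.4 (though deprecated in later versions):

using UnityEngine;
using UnityEngine.XR;

public class HeadsetPose : MonoBehaviour
{
    void Update()
    {
        // Head position in local tracking space, relative to the tracking origin;
        // combine with the camera rig's transform for a world-space position.
        Vector3 headsetPos = InputTracking.GetLocalPosition(XRNode.CenterEye);
        Debug.Log("Headset position: " + headsetPos);
    }
}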

Unity | webCamScript not displaying

I have the exact code from a tutorial I copied, and the webcam is plugged in and works. But when I run the game in the Unity Editor, there is no "No Device connected" error and no script errors. I'm confused as to why it isn't working.
Why isn't it being displayed?
My webCamScript
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class webCamScript : MonoBehaviour
{
    public GameObject webCameraPlane;

    // Use this for initialization
    void Start()
    {
        if (Application.isMobilePlatform)
        {
            GameObject cameraParent = new GameObject("camParent");
            cameraParent.transform.position = this.transform.position;
            this.transform.parent = cameraParent.transform;
            cameraParent.transform.Rotate(Vector3.right, 90);
        }
        Input.gyro.enabled = true;
        WebCamTexture webCameraTexture = new WebCamTexture();
        webCameraPlane.GetComponent<MeshRenderer>().material.mainTexture = webCameraTexture;
        webCameraTexture.Play();
    }

    // Update is called once per frame
    void Update()
    {
        // Convert the right-handed gyro attitude to Unity's left-handed space.
        Quaternion cameraRotation = new Quaternion(Input.gyro.attitude.x, Input.gyro.attitude.y, -Input.gyro.attitude.z, -Input.gyro.attitude.w);
        this.transform.localRotation = cameraRotation;
    }
}
Solved
I found the problem: I had a custom texture on the plane, which was stopping the camera texture from being applied.
I presume it has something to do with the fact that your code is wrapped in an if statement that checks whether you are running on a mobile platform. The editor is not classed as a mobile platform, so that code is skipped.
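A minimal sketch of that idea: broaden the guard so the editor takes the same setup path in Play mode, and log which branch ran so it's obvious in the Console. The log messages are illustrative:

using UnityEngine;

public class PlatformGuardExample : MonoBehaviour
{
    void Start()
    {
        // Letting the editor take the mobile path makes the script testable in Play mode.
        if (Application.isMobilePlatform || Application.isEditor)
        {
            Debug.Log("Running mobile-style camera setup");
            // ... gyro / camera-parent setup here ...
        }
        else
        {
            Debug.Log("Skipping mobile-only setup on this platform");
        }
    }
}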

How to trigger "Fire2" button using bluetooth controller of Google Cardboard in Unity3d

I am trying to manipulate an object in a VR app using the Google Cardboard SDK in Unity3d. I wrote a script that compares input buttons in the Update method. My code is below.
void Update()
{
    if (Input.GetButtonDown("Fire1"))
    {
        onObjectDown();
    }
    if (Input.GetButtonDown("Fire2"))
    {
        onObjectExit();
    }
    if (Input.GetButtonDown("Fire3"))
    {
        onObjectExit();
    }
    if (dragging)
    {
        Ray ray = Camera.main.ScreenPointToRay(Input.mousePosition);
        Vector3 rayPoint = ray.GetPoint(distance);
        transform.position = rayPoint;
    }
}
The onObjectDown() and onObjectExit() methods are as follows:
public void onObjectDown()
{
    Debug.Log(name + " Game Object Down!");
    distance = Vector3.Distance(transform.position, Camera.main.transform.position);
    dragging = true;
}

public void onObjectExit()
{
    dragging = false;
    Debug.Log(name + " Game Object Exit!");
    GetComponent<Renderer>().material.color = originalColor;
}
This code works perfectly in Play mode on the desktop. But on the emulator only onObjectDown() executes, which means only the "Fire1" button is registered. Does anyone know how to get input from buttons "Fire2" and "Fire3" via the Bluetooth controller?
For reference, my project's input settings are as below (screenshot omitted).
Button 1 on the Bluetooth controller can be read with Input.GetMouseButton(0) or "Fire1", because it simulates a screen tap. The other controller button simulates (in my case) the back button on an Android device, so you can read it with Input.GetKeyDown(KeyCode.Escape).
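Putting that together, a minimal sketch of an Update loop for a typical Cardboard Bluetooth controller; which physical button maps to which event varies by controller, so treat the mapping as an assumption:

using UnityEngine;

public class CardboardControllerInput : MonoBehaviour
{
    void Update()
    {
        // Primary button: reported as a screen tap, i.e. mouse button 0 / "Fire1".
        if (Input.GetMouseButtonDown(0))
        {
            Debug.Log("Primary button pressed");
        }
        // Secondary button: often reported as the Android back key.
        if (Input.GetKeyDown(KeyCode.Escape))
        {
            Debug.Log("Secondary button pressed");
        }
    }
}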

Can I take a photo in Unity using the device's camera?

I'm entirely unfamiliar with Unity3D's more complex feature set and am curious if it has the capability to take a picture and then manipulate it. Specifically my desire is to have the user take a selfie and then have them trace around their face to create a PNG that would then be texture mapped onto a model.
I know that the face mapping onto a model is simple, but I'm wondering if I need to write the photo/carving functionality into the encompassing Chrome app, or if it can all be done from within Unity. I don't need a tutorial on how to do it, just asking if it's something that is possible.
Yes, this is possible. You will want to look at the WebCamTexture functionality.
You create a WebCamTexture and call its Play() function, which starts the camera. WebCamTexture, like any Texture, allows you to get the pixels via a GetPixels() call. This lets you take a snapshot whenever you like, and you can save it in a Texture2D. A call to EncodeToPNG() and a subsequent write to file should get you there.
Do note that the code below is a quick write-up based on the documentation. I have not tested it. You might have to select the correct device if more than one is available.
using UnityEngine;
using System.Collections;
using System.IO;

public class WebCamPhotoCamera : MonoBehaviour
{
    WebCamTexture webCamTexture;

    void Start()
    {
        webCamTexture = new WebCamTexture();
        // The GameObject this script is attached to needs a Renderer (e.g. a MeshRenderer).
        GetComponent<Renderer>().material.mainTexture = webCamTexture;
        webCamTexture.Play();
    }

    IEnumerator TakePhoto() // Start this Coroutine on some button click
    {
        // NOTE - you almost certainly have to do this here:
        yield return new WaitForEndOfFrame();
        // It's a rare case where the Unity documentation is pretty clear:
        // http://docs.unity3d.com/ScriptReference/WaitForEndOfFrame.html
        // Be sure to scroll down to the SECOND long example on that page.

        Texture2D photo = new Texture2D(webCamTexture.width, webCamTexture.height);
        photo.SetPixels(webCamTexture.GetPixels());
        photo.Apply();

        // Encode to a PNG
        byte[] bytes = photo.EncodeToPNG();

        // Write out the PNG. Of course you have to substitute your_path for something sensible.
        File.WriteAllBytes(your_path + "photo.png", bytes);
    }
}
For those trying to get the camera to render a live feed, here's how I managed to pull it off. First, I edited Bart's answer so the texture is assigned in Update rather than just in Start:
void Start()
{
    webCamTexture = new WebCamTexture();
    webCamTexture.Play();
}

void Update()
{
    GetComponent<RawImage>().texture = webCamTexture;
}
Then I attached the script to a GameObject with a RawImage component. You can easily create one via Right Click -> UI -> Raw Image in the Hierarchy in the Unity Editor (this requires Unity 4.6 or above). Running it should show a live feed of the camera in your view. As of this writing, webcam use is supported in the free Personal edition of Unity 5.
I hope this helps anyone looking for a good way to capture a live camera feed in Unity.
It is possible. I highly recommend you look at the WebCamTexture Unity API. It has some useful functions:
GetPixel() -- Returns pixel color at coordinates (x, y).
GetPixels() -- Get a block of pixel colors.
GetPixels32() -- Returns the pixels data in raw format.
MarkNonReadable() -- Marks WebCamTexture as unreadable
Pause() -- Pauses the camera.
Play() -- Starts the camera.
Stop() -- Stops the camera.
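For instance, a small hedged sketch that exercises Play(), Pause() and Stop() (the key bindings are arbitrary, and the GameObject is assumed to have a Renderer):

using UnityEngine;

public class WebCamToggle : MonoBehaviour
{
    WebCamTexture cam;

    void Start()
    {
        cam = new WebCamTexture();
        GetComponent<Renderer>().material.mainTexture = cam;
        cam.Play();
    }

    void Update()
    {
        // Space pauses/resumes the feed; Escape stops the camera entirely.
        if (Input.GetKeyDown(KeyCode.Space))
        {
            if (cam.isPlaying) cam.Pause();
            else cam.Play();
        }
        if (Input.GetKeyDown(KeyCode.Escape)) cam.Stop();
    }
}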
Bart's answer needs one modification: I used his code and the picture I was getting was black. The required change is to convert TakePhoto to a coroutine and add
yield return new WaitForEndOfFrame();
at the start of the coroutine. (Courtesy of @fafase.)
For more details see
http://docs.unity3d.com/ScriptReference/WaitForEndOfFrame.html
You can also refer to
Take photo using webcam is giving black output [Unity3D]
Yes, you can. I created an Android native camera plugin that can open your Android device's camera, capture an image, record video, and save it to the desired location on your device with just a few lines of code.
You need to find your webcam device's index by searching for it in the devices list and selecting it for the WebCamTexture to play.
You can use this code:
using UnityEngine;
using System.Collections;
using System.IO;
using UnityEngine.UI;
using System.Collections.Generic;

public class GetCam : MonoBehaviour
{
    WebCamTexture webCam;
    string your_path = "C:\\Users\\Jay\\Desktop"; // any path you want to save your image to
    public RawImage display;
    public AspectRatioFitter fit;

    public void Start()
    {
        if (WebCamTexture.devices.Length == 0)
        {
            Debug.LogError("Cannot find any camera!");
            return;
        }
        int index = -1;
        for (int i = 0; i < WebCamTexture.devices.Length; i++)
        {
            if (WebCamTexture.devices[i].name.ToLower().Contains("your webcam name"))
            {
                Debug.LogError("WebCam Name:" + WebCamTexture.devices[i].name + " Webcam Index:" + i);
                index = i;
            }
        }
        if (index == -1)
        {
            Debug.LogError("Cannot find a camera with that name!");
            return;
        }
        WebCamDevice device = WebCamTexture.devices[index];
        webCam = new WebCamTexture(device.name);
        webCam.Play();
        StartCoroutine(TakePhoto());
        display.texture = webCam;
    }

    public void Update()
    {
        if (webCam == null) return; // no camera was found in Start

        // Keep the RawImage's aspect ratio, mirroring and rotation in sync with the feed.
        float ratio = (float)webCam.width / (float)webCam.height;
        fit.aspectRatio = ratio;

        float scaleY = webCam.videoVerticallyMirrored ? -1f : 1f;
        display.rectTransform.localScale = new Vector3(1f, scaleY, 1f);

        int orient = -webCam.videoRotationAngle;
        display.rectTransform.localEulerAngles = new Vector3(0, 0, orient);
    }

    public void callTakePhoto() // call this function in a button click event
    {
        StartCoroutine(TakePhoto());
    }

    IEnumerator TakePhoto() // Start this Coroutine on some button click
    {
        // Wait until the frame has fully rendered before reading pixels; see
        // http://docs.unity3d.com/ScriptReference/WaitForEndOfFrame.html
        // (the SECOND long example on that page).
        yield return new WaitForEndOfFrame();

        Texture2D photo = new Texture2D(webCam.width, webCam.height);
        photo.SetPixels(webCam.GetPixels());
        photo.Apply();

        // Encode to a PNG
        byte[] bytes = photo.EncodeToPNG();

        // Write out the PNG. Of course you have to substitute your_path for something sensible.
        File.WriteAllBytes(your_path + "\\photo.png", bytes);
    }
}
There is a plugin available for this type of functionality called Camera Capture Kit - https://www.assetstore.unity3d.com/en/#!/content/56673 - and while the functionality provided is geared towards mobile, it contains a demo of how you can use WebCamTexture to take a still image.
If you want to do this without a third-party plugin, then @FuntionR's solution will help you. But if you want to save the captured photo to the gallery (Android & iOS), that's not possible within Unity; you have to write native code to transfer the photo to the gallery and then call it from Unity.
Here is a summary blog post that will guide you to achieve your goal:
http://unitydevelopers.blogspot.com/2018/07/pick-image-from-gallery-in-unity3d.html
Edit: Note that the above thread describes picking an image from the gallery, but the same process applies to saving an image to the gallery.