Unity | webCamScript not displaying

I copied this code exactly from a tutorial, and the webcam is plugged in and works. But when I run the game in the Unity Editor nothing is displayed, even though there is no "No Device connected" error and no script errors. I'm confused as to why it isn't working.
Why isn't it being displayed?
My webCamScript
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class webCamScript : MonoBehaviour {

    public GameObject webCameraPlane;

    // Use this for initialization
    void Start () {
        if (Application.isMobilePlatform) {
            GameObject cameraParent = new GameObject ("camParent");
            cameraParent.transform.position = this.transform.position;
            this.transform.parent = cameraParent.transform;
            cameraParent.transform.Rotate (Vector3.right, 90);
        }

        Input.gyro.enabled = true;

        WebCamTexture webCameraTexture = new WebCamTexture ();
        webCameraPlane.GetComponent<MeshRenderer> ().material.mainTexture = webCameraTexture;
        webCameraTexture.Play ();
    }

    // Update is called once per frame
    void Update () {
        Quaternion cameraRotation = new Quaternion (Input.gyro.attitude.x, Input.gyro.attitude.y, -Input.gyro.attitude.x, -Input.gyro.attitude.y);
        this.transform.localRotation = cameraRotation;
    }
}
Solved
I found the problem: I had a custom texture on the plane, which was stopping the camera texture from being applied.

I presume it has something to do with the fact that your code is wrapped in an if statement that checks whether you are running on a mobile platform. The Editor is not classed as a mobile platform, so that code will be ignored.
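If the platform check were the cause (note that in the script above only the camera-parenting block sits inside the if; the WebCamTexture assignment runs regardless), a minimal sketch of a workaround for testing would be to widen the condition so it also passes in the Editor:

    if (Application.isMobilePlatform || Application.isEditor) {
        // Hypothetical tweak: also run the camera-parent setup when testing in the Editor.
        GameObject cameraParent = new GameObject ("camParent");
        cameraParent.transform.position = this.transform.position;
        this.transform.parent = cameraParent.transform;
        cameraParent.transform.Rotate (Vector3.right, 90);
    }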

Related

Unity3D: Raycaster.Raycast on UI, get Worldposition of the hit point

I am in Unity3D. I am trying to raycast onto a UI element and get the world position of the hit. I have a small test code here. It gives me the name of the target UI element, but not the position; the world position of the hit is always (0,0,0). Please, can someone suggest how I can get it right? Thank you a lot.
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.UI;
using UnityEngine.EventSystems;

public class RaycasterRayScriptTest : MonoBehaviour
{
    [SerializeField] private GraphicRaycaster m_Raycaster;
    private PointerEventData m_PointerEventData;
    private EventSystem m_EventSystem;

    [SerializeField] private Vector3 HitPosition;

    // Update is called once per frame
    void Update()
    {
        m_PointerEventData = new PointerEventData(m_EventSystem);
        m_PointerEventData.position = Input.mousePosition;

        List<RaycastResult> results = new List<RaycastResult>();
        m_Raycaster.Raycast(m_PointerEventData, results);

        foreach (RaycastResult result in results)
        {
            Debug.Log("Hit " + result.gameObject.name);
            HitPosition = result.worldPosition;
            Debug.Log("HitPosition " + HitPosition.ToString());
        }
    }
}
I just solved this problem. You have to set the Canvas Render Mode to Screen Space - Camera and select the camera you want to use for the UI. With this, the UI plane appears as a GameObject in the world. The UI plane's position is not directly adjustable, so you have to move the camera back and adjust the camera's field of view instead.
And because the UI plane is now an object in the world, the line
HitPosition = result.worldPosition;
works now.
Without these changes to the canvas and camera, the code doesn't work, because the UI plane is virtual and its position is set to the camera position. If you try to get the worldPosition, it is derived from the distance between the camera and the UI plane, and since that distance is zero, you get zero for the worldPosition as well.
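For reference, the same canvas setup can also be done from a script; here is a minimal, untested sketch (the canvas and uiCamera fields are assumptions to be assigned in the Inspector):

using UnityEngine;

public class CanvasSetupSketch : MonoBehaviour
{
    [SerializeField] private Canvas canvas;     // the canvas holding your UI
    [SerializeField] private Camera uiCamera;   // the camera that should render the UI

    void Awake()
    {
        // Screen Space - Camera makes the UI plane a real object in the world,
        // so RaycastResult.worldPosition returns a meaningful value.
        canvas.renderMode = RenderMode.ScreenSpaceCamera;
        canvas.worldCamera = uiCamera;
        canvas.planeDistance = 10f; // distance of the UI plane from the camera
    }
}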

TryGetFeatureValue always 0 for Unity XR Input's CommonUsages.trigger

I am developing a VR game in Unity (2020.3.15f2) using the XR Interaction Toolkit package (1.0.0-pre.5) for my Oculus Quest 2. At this stage in my development, I am trying to recognize presses to the trigger and grip buttons on the controllers respectively in order to animate some 3D hand models. Here's the script I've written to accomplish this:
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

public class HandPresence : MonoBehaviour {
    public InputDeviceCharacteristics controllerCharacteristics;
    public GameObject handModelPrefab;

    private InputDevice targetDevice;
    private GameObject spawnedHandModel;
    private Animator handAnimator;

    void Start() {
        TryInitialize();
    }

    void TryInitialize() {
        List<InputDevice> devices = new List<InputDevice>();
        InputDevices.GetDevicesWithCharacteristics(controllerCharacteristics, devices);
        if (devices.Count > 0) {
            targetDevice = devices[0];
            spawnedHandModel = Instantiate(handModelPrefab, transform);
            handAnimator = spawnedHandModel.GetComponent<Animator>();
        }
    }

    void UpdateHandAnimation() {
        if (targetDevice.TryGetFeatureValue(CommonUsages.trigger, out float triggerValue)) {
            handAnimator.SetFloat("Trigger", triggerValue);
        } else {
            handAnimator.SetFloat("Trigger", 0);
        }
        if (targetDevice.TryGetFeatureValue(CommonUsages.grip, out float gripValue)) {
            handAnimator.SetFloat("Grip", gripValue);
        } else {
            handAnimator.SetFloat("Grip", 0);
        }
    }

    void Update()
    {
        if (!targetDevice.isValid) {
            TryInitialize();
        } else {
            spawnedHandModel.SetActive(true);
            UpdateHandAnimation();
        }
    }
}
The issue I'm experiencing is that the values of both triggerValue and gripValue are always 0. The value of targetDevice looks fine. I also tried using triggerButton, gripButton, primaryButton, etc. and they are always 0/false as well. The hand models show up just fine and their movement is in sync with the movement of the controllers, but they just don't seem to want to register any button presses.
I've been stuck on this one for hours and would very much appreciate any insight, thank you!
Is your project set up with the (new) Input System? I have no problem detecting the trigger and grip values there.
Also make sure the targetDevice actually supports the trigger and grip features; it might be another device, such as the HMD.
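For example, here is a rough sketch (reusing the field names from the question's script) of narrowing the characteristics so the query cannot return the HMD; the same flags can of course be set in the Inspector instead:

// Request only a hand-held controller; combining Controller with Right (or Left)
// avoids matching the HMD, which exposes no trigger or grip features.
controllerCharacteristics = InputDeviceCharacteristics.Controller | InputDeviceCharacteristics.Right;

List<InputDevice> devices = new List<InputDevice>();
InputDevices.GetDevicesWithCharacteristics(controllerCharacteristics, devices);
foreach (InputDevice device in devices)
{
    // Sanity check which device was actually matched.
    Debug.Log(device.name + " " + device.characteristics);
}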

GVRReticlePointer not working properly with onClick Events

I followed this tutorial and did everything that was mentioned. Everything looked and worked fine, except that while the reticle pointer expands and contracts when I gaze at the cube, the click event does not trigger. Any help would be greatly appreciated.
The code for the random teleport is:
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.UI;
using UnityEngine.EventSystems;

public class ReticleTest : MonoBehaviour
{
    public void RandomlyTeleport()
    {
        Debug.Log("reached here");
        gameObject.transform.position = new Vector3(
            GetRandomCoordinate(), Random.Range(0.5f, 2), GetRandomCoordinate());
    }

    private float GetRandomCoordinate()
    {
        var coordinate = Random.Range(-7, 7);
        while (coordinate > -1.5 && coordinate < 1.5)
        {
            coordinate = Random.Range(-5, 5);
        }
        return coordinate;
    }
}
These are screenshots
Update:
It turns out that the same thing happens when I open the HelloVR scene that comes premade with the SDK. Although the hexagon (the only interactive thing in the scene) changes color when I gaze at it, nothing else happens when I click it. So this is not a problem with what I made, but an inherent problem with Unity or the SDK.
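For what it's worth, the GVR reticle drives clicks through Unity's standard EventSystem, so one way to sanity-check the event wiring is to put a pointer handler directly on the cube instead of relying on an Event Trigger entry. A minimal sketch, assuming the cube has a Collider and the ReticleTest component on the same GameObject:

using UnityEngine;
using UnityEngine.EventSystems;

// Attach to the cube; the gaze raycast needs a Collider on it to register hits.
public class ReticleClickHandler : MonoBehaviour, IPointerClickHandler
{
    public void OnPointerClick(PointerEventData eventData)
    {
        Debug.Log("Pointer click received");

        // Hypothetical wiring: forward the click to the teleport method from the question.
        ReticleTest teleport = GetComponent<ReticleTest>();
        if (teleport != null)
        {
            teleport.RandomlyTeleport();
        }
    }
}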

Unity SteamVR Actions not Working in Build

I am trying to switch to another scene when the headset is removed. This already works in the Unity Editor, but not in the build.
Setup:
Unity 2018.3.6f1
SteamVR Unity Plugin v.2.2.0
Vive Pro
This code is working in the editor:
using System.Collections;
using UnityEngine;
using UnityEngine.SceneManagement;
using Valve.VR;

public class EndGame : MonoBehaviour
{
    [Tooltip("This action lets you know when the player has placed the headset on their head")]
    public SteamVR_Action_Boolean headsetOnHead = SteamVR_Input.GetBooleanAction("HeadsetOnHead");

    void Update()
    {
        if (SteamVR.initializedState != SteamVR.InitializedStates.InitializeSuccess)
        {
            return;
        }

        if (headsetOnHead != null)
        {
            if (headsetOnHead.GetStateDown(SteamVR_Input_Sources.Head))
            {
                StopCoroutine(RestartGame());
            }
            else if (headsetOnHead.GetStateUp(SteamVR_Input_Sources.Head))
            {
                StartCoroutine(RestartGame());
            }
        }
    }

    IEnumerator RestartGame()
    {
        yield return new WaitForSecondsRealtime(3);
        SceneManager.LoadScene("Startscene", LoadSceneMode.Single);
        yield return null;
    }
}
In the Editor, \actions\default\in\HeadsetOnHead is assigned to the public SteamVR_Action_Boolean headsetOnHead field.
The actions.json (including /actions/default/in/HeadsetOnHead) is in the build folder, and the logs don't show any errors.
Switching to SteamVR_LoadLevel instead of SceneManager solved the issue for the current Unity version.
Also, the script was originally attached to the SteamVR Player prefab in the scene. Now it's placed on a separate GameObject in the scene, to avoid issues with DontDestroyOnLoad on the Player prefab.
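For completeness, a sketch of what the coroutine could look like with SteamVR_LoadLevel instead of SceneManager (fade settings are left at their defaults; treat this as an illustration rather than the exact code used):

IEnumerator RestartGame()
{
    yield return new WaitForSecondsRealtime(3);
    // SteamVR_LoadLevel ended up working in the build where SceneManager.LoadScene did not.
    SteamVR_LoadLevel.Begin("Startscene");
    yield return null;
}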

Video texture Unity 5

I have a problem getting a video texture to show up in Unity 5.2 Personal Edition. I have applied a material with an unlit shader and assigned the video as its texture. I also play that specific video texture through a script attached to the object with the video texture.
using UnityEngine;
using System.Collections;

[RequireComponent(typeof(AudioSource))]
public class CallMovie : MonoBehaviour {
    public MovieTexture myMovie;

    // Use this for initialization
    void Start () {
        Debug.Log ("start intro");
        GetComponent<Renderer>().material.mainTexture = myMovie; // get movie
        myMovie.Play (); // shall play movie
    }

    void Update () {
        myMovie.loop = true;
    }
}
When I hit the Play button in Unity, the video texture stays black and nothing happens on screen, although the program says it ran the video when checked with Debug.Log.
Since I can't post questions as comments on your initial post, the following is an attempt to answer with what I know.
In your first statement after the debug call you are setting the mainTexture of the instanced material to myMovie. Depending on the shader this may or may not work, as 'mainTexture' may not be referencing the texture you expect.
You can ensure you hit the desired texture using the following method:
// Note the difference between a material instance and the shared material.
// ... don't forget to clean up instances if you make them, which happens when you call .material.
Material instancedMaterial = gameObject.GetComponent<Renderer>().material;
Material sharedMaterial = gameObject.GetComponent<Renderer>().sharedMaterial;

// _MainTex is the name of the main texture for most stock shaders ... but not all.
// You can select the shader by clicking the gear in the inspector of the material;
// this will display the shader in the inspector, where you can see its properties by name.
instancedMaterial.SetTexture("_MainTex", movie);
The following code is from a working class I use to make a Unity UI RawImage render a movie. From what I see in your example, you have the movie part correct; I suspect your issue is with the shader parameter.
using UnityEngine;
using System.Collections;

public class RawImageMovePlayer : MonoBehaviour
{
    public UnityEngine.UI.RawImage imageSource;
    public bool play;
    public bool isLoop = true;
    public MovieTexture movie;

    // Use this for initialization
    void Start ()
    {
        movie = (MovieTexture)imageSource.texture;
        movie.loop = isLoop;
    }

    // Update is called once per frame
    void Update ()
    {
        if (!movie.isPlaying && play)
            movie.Play();
    }

    public void ChangeMovie(MovieTexture movie)
    {
        imageSource.texture = movie;
        this.movie = (MovieTexture)imageSource.texture;
        this.movie.loop = isLoop;
    }

    public void OnDisable()
    {
        if (movie != null && movie.isPlaying)
            movie.Stop();
    }
}