Hi, I developed a VR gaze app by following this tutorial: https://www.youtube.com/watch?v=_YTVsLnK-XU. When the reticle (VR gaze cursor) clicks the cube, the cube should move upward, and each further click should move it up again. The project works in the Unity editor, but after building it as an app I used a Bluetooth controller, and the gaze cursor selection no longer works: even when I click, the cube does not move. Please help!
My Unity version is 5.5.2 and the Google VR SDK version is v1.0.0.
The code I used is:
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class MoveStoolUp : MonoBehaviour
{
    // Use this for initialization
    void Start()
    {
    }

    // Update is called once per frame
    void Update()
    {
    }

    // Hooked up to the reticle's click event; moves the stool up by one unit.
    public void movestoolupward()
    {
        transform.position += new Vector3(0f, 1f, 0f);
    }
}
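As a quick way to check whether the Bluetooth controller input reaches the app at all, you could call the same method from a plain joystick-button poll. This is only a diagnostic sketch, not part of the tutorial; the exact KeyCode depends on how your controller maps its buttons, so JoystickButton0 is an assumption:

using UnityEngine;

public class ControllerClickFallback : MonoBehaviour
{
    // Reference to the component the gaze click is supposed to call.
    public MoveStoolUp stool;

    void Update()
    {
        // Fire the move when the first joystick button is pressed; adjust the KeyCode for your controller.
        if (Input.GetKeyDown(KeyCode.JoystickButton0))
        {
            stool.movestoolupward();
        }
    }
}

If this moves the cube in the build, the controller input itself is fine and the problem is in the gaze/EventSystem wiring rather than in the controller.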
I am in Unity3D. I am trying to raycast against a UI element and get the world position of the hit. My small test code is below. It gives me the name of the target UI element, but not the position: the world position of the hit is always (0,0,0). Can someone please suggest how I can get it right? Thank you a lot.
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.UI;
using UnityEngine.EventSystems;

public class RaycasterRayScriptTest : MonoBehaviour
{
    [SerializeField] private GraphicRaycaster m_Raycaster;
    private PointerEventData m_PointerEventData;
    private EventSystem m_EventSystem;
    [SerializeField] private Vector3 HitPosition;

    // Update is called once per frame
    void Update()
    {
        // Build pointer data at the current mouse position and raycast against the canvas.
        m_PointerEventData = new PointerEventData(m_EventSystem);
        m_PointerEventData.position = Input.mousePosition;

        List<RaycastResult> results = new List<RaycastResult>();
        m_Raycaster.Raycast(m_PointerEventData, results);

        foreach (RaycastResult result in results)
        {
            Debug.Log("Hit " + result.gameObject.name);
            HitPosition = result.worldPosition;
            Debug.Log("HitPosition " + HitPosition.ToString());
        }
    }
}
I just solved this problem. You have to set the Canvas Render Mode to Screen Space - Camera and select the camera that you use for the UI. With this, the UI plane appears as a GameObject in the world. The UI plane's position itself is not adjustable, so you have to move the camera back and adjust the camera's field of view instead.
And because the UI plane is now an object in the world, the line
HitPosition = result.worldPosition;
works now.
Without the changes to the canvas and camera, the code doesn't work, because the UI plane is virtual and its position is set to the camera position. So if you try to get the worldPosition, it is derived from the distance between the camera and the UI plane, and since that distance is zero, you get zero for the worldPosition as well.
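If you prefer to set this up from code rather than in the Inspector, a minimal sketch could look like the following; the uiCanvas and uiCamera fields are placeholders for whatever canvas and camera you actually use, and the plane distance is just an example value:

using UnityEngine;

public class CanvasCameraSetup : MonoBehaviour
{
    // Assign these in the Inspector; the names are only placeholders.
    [SerializeField] private Canvas uiCanvas;
    [SerializeField] private Camera uiCamera;

    void Awake()
    {
        // Render the canvas through a camera so it exists as a plane in the world.
        uiCanvas.renderMode = RenderMode.ScreenSpaceCamera;
        uiCanvas.worldCamera = uiCamera;

        // Push the plane away from the camera so hit positions are not at the camera itself.
        uiCanvas.planeDistance = 10f;
    }
}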
I am using the Oculus Quest with Unity 2018.4.22f1 and the Oculus SDK. Moving around in my application works well, but every time I try to get the position of the headset, the zero vector is returned.
I tried these solutions:
OVRPose tracker = OVRManager.tracker.GetPose();
return tracker.position;
And
GameObject camera = GameObject.Find("OVRCameraRig");
return camera.transform.position;
This is the position tracking setup:
Do you have any idea how to get the headset position?
When I want to get the headset's position, I use the transform.position of the CenterEyeAnchor, which is inside OVRPlayerController/OVRCameraRig/TrackingSpace.
using UnityEngine;

public class YourScriptNameHere : MonoBehaviour
{
    GameObject centerEye;
    Vector3 headsetPos;

    void Start()
    {
        // Find the CenterEyeAnchor under OVRPlayerController/OVRCameraRig/TrackingSpace.
        centerEye = GameObject.Find("CenterEyeAnchor");
    }

    void Update()
    {
        // The anchor's world position follows the headset every frame.
        headsetPos = centerEye.transform.position;
    }
}
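If you only need the tracked head pose rather than the anchor's world position, Unity's built-in XR API can also be queried directly. This is an alternative sketch, not part of the original answer; note that it returns the position in tracking space, so you still have to combine it with the rig's transform if you need world coordinates:

using UnityEngine;
using UnityEngine.XR;

public class HeadPoseReader : MonoBehaviour
{
    void Update()
    {
        // Head position relative to the tracking origin (not world space).
        Vector3 headLocalPos = InputTracking.GetLocalPosition(XRNode.CenterEye);
        Debug.Log("Head position (tracking space): " + headLocalPos);
    }
}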
I followed this tutorial and did everything that was mentioned. Everything looks and works fine, except that the reticle point expands and contracts when I gaze at the cube but the click event does not trigger. Any help would be greatly appreciated.
The code for the random teleport is:
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.UI;
using UnityEngine.EventSystems;

public class ReticleTest : MonoBehaviour
{
    public void RandomlyTeleport()
    {
        Debug.Log("reached here");
        gameObject.transform.position = new Vector3(
            GetRandomCoordinate(), Random.Range(0.5f, 2), GetRandomCoordinate());
    }

    private float GetRandomCoordinate()
    {
        // Re-roll until the coordinate lands outside the area around the origin.
        var coordinate = Random.Range(-7, 7);
        while (coordinate > -1.5 && coordinate < 1.5)
        {
            coordinate = Random.Range(-5, 5);
        }
        return coordinate;
    }
}
Update:
It turns out that the same thing happens in the HelloVR scene that comes premade with the SDK: although the hexagon (the only interactive thing in the scene) changes color when I gaze at it, nothing happens when I click it. So this is not a problem with what I made but an inherent problem with Unity or the SDK.
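For what it's worth, one way to take the Event Trigger wiring out of the equation is to handle the click directly on the cube through the EventSystems interfaces. This is only a sketch and assumes the scene already has an EventSystem with the Google VR gaze input module, plus a collider on the cube and a physics raycaster on the camera; the class name is a placeholder:

using UnityEngine;
using UnityEngine.EventSystems;

public class TeleportOnClick : MonoBehaviour, IPointerClickHandler
{
    // Called by the EventSystem when the gaze pointer clicks this object's collider.
    public void OnPointerClick(PointerEventData eventData)
    {
        Debug.Log("Cube clicked via gaze pointer");
        // Forward to the existing teleport logic on the same object.
        GetComponent<ReticleTest>().RandomlyTeleport();
    }
}

If OnPointerClick never fires either, the problem is in the EventSystem/raycaster setup rather than in the teleport code.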
I am trying to switch to another scene when the headset is removed. It already works in the Unity editor, but not in the build.
Setup:
Unity 2018.3.6f1
SteamVR Unity Plugin v.2.2.0
Vive Pro
This code is working in the editor:
using System.Collections;
using UnityEngine;
using UnityEngine.SceneManagement;
using Valve.VR;

public class EndGame : MonoBehaviour
{
    [Tooltip("This action lets you know when the player has placed the headset on their head")]
    public SteamVR_Action_Boolean headsetOnHead = SteamVR_Input.GetBooleanAction("HeadsetOnHead");

    void Update()
    {
        if (SteamVR.initializedState != SteamVR.InitializedStates.InitializeSuccess)
        {
            return;
        }

        if (headsetOnHead != null)
        {
            if (headsetOnHead.GetStateDown(SteamVR_Input_Sources.Head))
            {
                StopCoroutine(RestartGame());
            }
            else if (headsetOnHead.GetStateUp(SteamVR_Input_Sources.Head))
            {
                StartCoroutine(RestartGame());
            }
        }
    }

    IEnumerator RestartGame()
    {
        yield return new WaitForSecondsRealtime(3);
        SceneManager.LoadScene("Startscene", LoadSceneMode.Single);
        yield return null;
    }
}
In the editor, \actions\default\in\HeadsetOnHead is referenced by the public SteamVR_Action_Boolean headsetOnHead field.
The actions.json (including /actions/default/in/HeadsetOnHead) is in the build folder, and the logs don't show any errors.
Switching to SteamVR_LoadLevel instead of Unity's scene management solved the issue for the current Unity version.
Also, the script was originally attached to the SteamVR Player prefab in the scene. It is now placed on a separate GameObject in the scene, to avoid issues with DontDestroyOnLoad on the Player prefab.
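A minimal sketch of what the coroutine could look like with SteamVR_LoadLevel instead of SceneManager; this assumes the static SteamVR_LoadLevel.Begin helper from the SteamVR Unity Plugin, so check the exact namespace and signature in your plugin version:

using System.Collections;
using UnityEngine;
using Valve.VR;

public class EndGameWithLoadLevel : MonoBehaviour
{
    IEnumerator RestartGame()
    {
        yield return new WaitForSecondsRealtime(3);

        // Let SteamVR handle the compositor fade and the level transition.
        SteamVR_LoadLevel.Begin("Startscene");
    }
}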
I copied the code exactly from a tutorial, and the webcam is plugged in and works. But when I run the game in the Unity editor, there is no "No device connected" error and no script errors, yet nothing shows up. I'm confused as to why it isn't working.
Why isn't it being displayed?
My webCamScript:
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class webCamScript : MonoBehaviour
{
    public GameObject webCameraPlane;

    // Use this for initialization
    void Start()
    {
        if (Application.isMobilePlatform)
        {
            GameObject cameraParent = new GameObject("camParent");
            cameraParent.transform.position = this.transform.position;
            this.transform.parent = cameraParent.transform;
            cameraParent.transform.Rotate(Vector3.right, 90);
        }

        Input.gyro.enabled = true;

        WebCamTexture webCameraTexture = new WebCamTexture();
        webCameraPlane.GetComponent<MeshRenderer>().material.mainTexture = webCameraTexture;
        webCameraTexture.Play();
    }

    // Update is called once per frame
    void Update()
    {
        Quaternion cameraRotation = new Quaternion(Input.gyro.attitude.x, Input.gyro.attitude.y, -Input.gyro.attitude.x, -Input.gyro.attitude.y);
        this.transform.localRotation = cameraRotation;
    }
}
Solved
I found the problem: I had a custom texture on the plane, which was stopping the camera texture from being applied.
I presume it has something to do with the fact that you have your code wrapped in an if statement that checks whether you are running on a mobile platform. The editor will not be classed as a mobile platform, and hence that code will be ignored.
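If a blank plane ever comes back, a quick way to rule out device problems is to list what Unity actually detects before playing the texture. This is only a diagnostic sketch; webCameraPlane is the same plane reference as in the script above:

using UnityEngine;

public class WebCamCheck : MonoBehaviour
{
    public GameObject webCameraPlane;

    void Start()
    {
        // Log every camera Unity can see; an empty list means no device is available.
        foreach (WebCamDevice device in WebCamTexture.devices)
        {
            Debug.Log("Found webcam: " + device.name);
        }

        if (WebCamTexture.devices.Length > 0)
        {
            // Explicitly open the first detected device and show it on the plane.
            WebCamTexture webCamTexture = new WebCamTexture(WebCamTexture.devices[0].name);
            webCameraPlane.GetComponent<MeshRenderer>().material.mainTexture = webCamTexture;
            webCamTexture.Play();
        }
    }
}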