As seen in a video of the HoloLens 2 presentation, it is possible to have a pinch slider, a touch slider, and also a push slider. Unfortunately there is only a prefab for a pinch slider, and I can't get the slider to also work as a touch slider.
Which version of the MRTK supported the touch slider? I couldn't find it in any version >= 2.5.0.
How can I do this? I thought adding a NearInteractionTouchable script to the thumbRoot object would help, but it does not.
If you want to adjust the value by poking with one finger in MRTK 2.7.2, check out the TouchSliderExample.unity scene, which is located at /MRTK/Examples/Experimental/TouchSlider/Scenes.
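If you'd rather see roughly how a poke-driven slider can be wired up, here is a minimal sketch using MRTK 2.7's touch APIs (PinchSlider, NearInteractionTouchable, IMixedRealityTouchHandler). The component name, the trackStart/trackEnd transforms, and the projection math are my own illustration, not how the example scene is actually implemented:

    using Microsoft.MixedReality.Toolkit.Input;
    using Microsoft.MixedReality.Toolkit.UI;
    using UnityEngine;

    // Needs a collider plus a NearInteractionTouchable on the same object
    // so articulated-hand touches are routed to this handler.
    public class TouchSliderDriver : MonoBehaviour, IMixedRealityTouchHandler
    {
        [SerializeField] private PinchSlider slider;    // the slider to drive
        [SerializeField] private Transform trackStart;  // world-space start of the track
        [SerializeField] private Transform trackEnd;    // world-space end of the track

        public void OnTouchStarted(HandTrackingInputEventData eventData)
        {
            UpdateValue(eventData.InputData);
        }

        public void OnTouchUpdated(HandTrackingInputEventData eventData)
        {
            UpdateValue(eventData.InputData);
        }

        public void OnTouchCompleted(HandTrackingInputEventData eventData) { }

        // Project the fingertip onto the track axis and map it to [0, 1].
        private void UpdateValue(Vector3 fingertip)
        {
            Vector3 track = trackEnd.position - trackStart.position;
            float t = Vector3.Dot(fingertip - trackStart.position, track)
                      / track.sqrMagnitude;
            slider.SliderValue = Mathf.Clamp01(t);
        }
    }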
Related
Greetings, fellow programmers and game devs.
I am fairly new to Unity3D. I have the latest 2017 beta build of Unity Personal.
I have a bit of a problem. I have a first person controller prefab, a dual touch control, and an event system, yet when I test the game on my iPhone 5s, I can look around, but as soon as I finish panning with a swipe, the camera resets to where I was looking previously, and I am unable to move forward or back.
Now, I also tried removing the dual touch control, and it does the same thing.
Also, I checked the input settings: one axis is set to Horizontal and Vertical, and the other to Mouse X and Mouse Y. I also tried dragging in two mobile joysticks, assigning one Mouse X and Mouse Y and the other Horizontal and Vertical.
Any help would be appreciated. I also want to note that I had another project that worked by just dragging and dropping the dual touch controls into the hierarchy; even though the settings are the same for the FPSController, the Camera, and the Controls, it just doesn't seem to work here.
I am using GVR Unity SDK version 1.1 with Unity 5.5.0f3. I need to display a button similar to the gear icon that is rendered by the Google VR SDK. This button should accept touches and should be visible at all times. The issue with using a Button created in a Unity Canvas is that it does not process touches, because the GVR input module takes precedence over the StandaloneInputModule.
I have looked at the Google VR SDK code and found that the gear icon is rendered using OpenGL calls in the PostRender.cs file, but I still am not able to find where the touch is processed. Beyond this I am stuck, as I don't have much knowledge of OpenGL.
My question: how do I render a button on top of the existing UI (so that it is visible at all times, just like the gear icon) and get it to accept touches (by calling a function in my code)?
I know that you have to do some raycasting from the controller to the UI. There is a demo scene called "Scrolling UI"; try checking there to see if there is something that can help.
I'm not a hundred percent sure, but hopefully this points you in the right direction.
Hope you find your answer!
The simplest way I found to enable regular touch input on top of the VR scene is to use a Screen Space - Overlay canvas with a regular GraphicRaycaster and add a StandaloneInputModule to the EventSystem game object.
This way the buttons in that canvas are clickable.
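As a minimal sketch of that same setup done from code (assuming stock Unity UI; the object names and the Awake-time wiring are just illustration, you can equally do all of this in the editor):

    using UnityEngine;
    using UnityEngine.EventSystems;
    using UnityEngine.UI;

    public class OverlayCanvasSetup : MonoBehaviour
    {
        void Awake()
        {
            // A Screen Space - Overlay canvas renders on top of the VR scene.
            var canvasGO = new GameObject("OverlayCanvas",
                typeof(Canvas), typeof(GraphicRaycaster));
            canvasGO.GetComponent<Canvas>().renderMode = RenderMode.ScreenSpaceOverlay;

            // Make sure the EventSystem also has a StandaloneInputModule,
            // so touches reach the GraphicRaycaster even though the GVR
            // input module is present.
            var eventSystem = FindObjectOfType<EventSystem>();
            if (eventSystem != null &&
                eventSystem.GetComponent<StandaloneInputModule>() == null)
            {
                eventSystem.gameObject.AddComponent<StandaloneInputModule>();
            }
        }
    }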
Hope it helps.
Using Xcode 6 GM, I created a new project based on the Game template using SceneKit and Swift. Then I dragged a slider onto the SCNView. Now in the simulator, when I move the slider, the slider moves (as expected), but the camera moves as well (as if I were simply touching the screen). I'm not an experienced iOS developer, but I never had this problem in the past. Did I forget to do something? Did anything change recently in the default behaviour of sliders? It looks more like a bug to me: if you add a slider, you want to control only that slider, not what is underneath.
Can anyone help me or confirm the issue? Sorry if the answer is very simple; I googled my problem but could not find anything.
There is a property on the SCNView called allowsCameraControl, and if it is set to true then panning will control the camera. It is probably turned on by the Game template; setting allowsCameraControl to false on your SCNView should stop the camera from moving when you drag the slider.
I want to detect movement of an object with the iOS camera. I am working on a project where the iOS device's camera is placed somewhere, and when it detects any movement in the camera's view it should show a notification or fire a particular event.
Please suggest how I can achieve this.
I have a 3D model in my Unity project and I have a JavaScript that rotates the camera based on keyboard arrow keys (left/right).
Now, I need to have a script that detects a horizontal swipe hand gesture and returns a vector that I would use to rotate the camera.
I am using the ZigFu SDK with PrimeSense OpenNI/NITE. The ZigFu SDK comes with sample scripts, one of which is SwipeDetector. I am wondering how it works.
My setup:
I have 3 GameObjects: a 3D model, a MainCamera, and a Directional Light.
So, how do I use the SwipeDetector script in my project? The way I do it right now is: 1) create an empty game object called "SwipeDetection", and 2) drag and drop the SwipeDetector script from ZigFu onto it. I've put logs in the SwipeDetector script, but I don't see them.
The ZigFu bindings (I'm assuming you're using version 1.4?) don't have a SwipeDetector sample, but they do include a SwipeDetector MonoBehaviour. The SwipeDetector detects vertical and horizontal swipes, but unfortunately doesn't detect the velocity of the swipe.
You have a few options:
Use the provided Swipe Detector, and rotate the camera by a fixed amount every time you detect a horizontal swipe (SwipeDetector_Left or SwipeDetector_Right events)
Use the provided Swipe Detector, start rotating on Swipe, and stop rotating on the SwipeDetector_Release event. This would be similar to pressing on the arrow keys (assuming you have the same behaviour on keydown/keyup events)
Keep track of the hand velocity, and check its value when the swipe occurs. Use this value to rotate the camera. You can keep track of velocity by creating a new MonoBehaviour and implementing Hand_Create, Hand_Update, and Hand_Destroy (look at any of the scripts in the HandpointControls folder). Keep a queue with the hand points from the last n frames. The delta between the newest and oldest points will be your velocity over those n frames (I recommend you start with 15 frames, or about half a second); see the sketch after this list.
(This will be included in a future Zigfu release :))
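Here is a rough sketch of option 3. It assumes the Hand_Create/Hand_Update/Hand_Destroy messages carry the hand's position as a Vector3, the way the HandpointControls scripts receive them; the class name and the window size are just illustration:

    using System.Collections.Generic;
    using UnityEngine;

    public class HandVelocityTracker : MonoBehaviour
    {
        // ~15 frames is about half a second at 30 fps.
        const int WindowFrames = 15;

        readonly Queue<Vector3> points = new Queue<Vector3>();
        Vector3 newest;

        void Hand_Create(Vector3 position) { points.Clear(); Hand_Update(position); }

        void Hand_Update(Vector3 position)
        {
            newest = position;
            points.Enqueue(position);
            while (points.Count > WindowFrames)
                points.Dequeue();
        }

        void Hand_Destroy() { points.Clear(); }

        // Delta between the newest and oldest hand points in the window.
        // Read this when the SwipeDetector fires and scale your camera
        // rotation by it.
        public Vector3 WindowDelta
        {
            get { return points.Count < 2 ? Vector3.zero : newest - points.Peek(); }
        }
    }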
Your game object setup sounds right; if you don't see any logs, you may not be performing the 'focus gesture' correctly. Try waving or performing a tap towards the sensor - this should cause the Hand_Create event to be called. Once you have a valid hand point, you should get the proper events from the SwipeDetector.
Also worth mentioning: your swipe detection game object should have a HandPointControl component (added implicitly via RequireComponent), and 'ActiveOnStart' should be true.