I am building an Augmented Reality app for HoloLens. I need to test the air tap gesture without deploying to the HoloLens.
Is there any method to test the air tap functionality through the HoloLens Emulator or the Holographic Emulation provided by Unity3D?
This will depend on your emulation environment:
HoloLens Emulator (from Microsoft, allows multiple input devices):
- Air tap gesture: right-click the mouse, press the Enter key on your keyboard, or press the A button on an Xbox controller.
Windows Holographic Emulation (from Unity, requires a game controller):
- Perform a tap gesture with a virtual hand: the left and right trigger buttons, or the A button.
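Whichever of these you use, the simulated tap reaches the application through the normal gesture API, so a small logging handler is enough to verify the input arrives. Below is a minimal sketch against the Unity 2017.x-era HoloLens gesture API (UnityEngine.XR.WSA.Input); class and event names may differ in other Unity versions.

```csharp
using UnityEngine;
using UnityEngine.XR.WSA.Input;   // HoloLens gesture API in Unity 2017.x-era versions

public class AirTapLogger : MonoBehaviour
{
    private GestureRecognizer recognizer;

    void Start()
    {
        // Recognize only the tap gesture and log every time it fires,
        // whether it comes from the device, the emulator, or editor simulation.
        recognizer = new GestureRecognizer();
        recognizer.SetRecognizableGestures(GestureSettings.Tap);
        recognizer.Tapped += OnTapped;
        recognizer.StartCapturingGestures();
    }

    private void OnTapped(TappedEventArgs args)
    {
        Debug.Log("Air tap detected, tap count: " + args.tapCount);
    }

    void OnDestroy()
    {
        recognizer.Tapped -= OnTapped;
        recognizer.StopCapturingGestures();
        recognizer.Dispose();
    }
}
```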
Adding to the other answer: if you don't have a HoloLens this will not help, but if you do, it will speed up development significantly.
On your HoloLens, download the Holographic Remoting app. Once it is installed, open and run it and you will see the IP address of your HoloLens.
Now go to the Unity editor -> Window tab -> Holographic Emulation. This will open a new window. For Emulation Mode, select Remote to Device. Under Remote Machine, enter the IP address. You can tinker with the other settings as you see fit.
Now, with the application running on your HoloLens, click Connect. Once you are connected, the Unity editor will tell you and the HoloLens display should go blank. Now when you press Play in the editor, the app will run on the HoloLens.
This does not deploy the app to your HoloLens, and it stops running when you stop the editor.
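For reference, the same remoting connection can also be driven from a script instead of the Holographic Emulation window. This is a rough sketch against the Unity 2017.x-era UnityEngine.XR.WSA.HolographicRemoting API; the IP address value and the "WindowsMR" device name are assumptions you may need to adjust for your Unity version.

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.XR;
using UnityEngine.XR.WSA;   // HolographicRemoting lives here in Unity 2017.x-era versions

public class RemotingConnector : MonoBehaviour
{
    // IP address shown by the Holographic Remoting app on the HoloLens (placeholder value).
    public string remoteMachineIp = "192.168.0.2";

    private bool connected;

    void Start()
    {
        HolographicRemoting.Connect(remoteMachineIp);
    }

    void Update()
    {
        // Once the streamer reports a connection, load the XR device so play mode renders on the HoloLens.
        if (!connected && HolographicRemoting.ConnectionState == HolographicStreamerConnectionState.Connected)
        {
            connected = true;
            StartCoroutine(LoadDevice("WindowsMR"));  // device name from 2017.2-era docs; may differ by version
        }
    }

    IEnumerator LoadDevice(string newDevice)
    {
        XRSettings.LoadDeviceByName(newDevice);
        yield return null;
        XRSettings.enabled = true;
    }
}
```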
If you are working in the Unity editor:
- Shift + LMB (left mouse button) simulates an air tap with the left hand;
- Space + LMB simulates an air tap with the right hand.
Other useful points:
- Gaze can be controlled with the mouse by keeping RMB held down, and also with the Q/W/E/A/S/D keys. (I know the question was only about the air tap, but they kind of come together.)
- If you need to test typing some data with the standard keyboard, there is no need to simulate an air tap; just click with the mouse.
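If you want to confirm which simulated hand the editor is reporting (left for Shift + LMB, right for Space + LMB), the lower-level interaction source events expose handedness. A minimal sketch, again assuming the Unity 2017.x-era UnityEngine.XR.WSA.Input API:

```csharp
using UnityEngine;
using UnityEngine.XR.WSA.Input;

public class HandednessLogger : MonoBehaviour
{
    void OnEnable()
    {
        InteractionManager.InteractionSourcePressed += OnSourcePressed;
    }

    void OnDisable()
    {
        InteractionManager.InteractionSourcePressed -= OnSourcePressed;
    }

    private void OnSourcePressed(InteractionSourcePressedEventArgs args)
    {
        // Select is the press that corresponds to the air tap.
        if (args.pressType == InteractionSourcePressType.Select)
        {
            Debug.Log("Air tap pressed by: " + args.state.source.handedness);
        }
    }
}
```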
Related
I'm developing a WebGL game that needs to be controlled by a joystick, but I don't have a joystick device.
I don't have any idea which joystick emulator is best for Unity or how to use it.
You could maybe connect a PlayStation/Xbox controller to your computer via a USB cable. Then you can check whether your controller works on your computer by using this website: https://gamepad-tester.com
The different parts of the controller shown in the image should react to what you do with the controller in real life.
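Once the controller shows up on that site, you can check whether Unity sees it as well. A small sketch using Unity's legacy Input manager; the "Horizontal"/"Vertical" axis names assume the default Input Manager mappings:

```csharp
using UnityEngine;

public class JoystickProbe : MonoBehaviour
{
    void Start()
    {
        // Lists every controller the OS/browser exposes to Unity.
        foreach (string name in Input.GetJoystickNames())
        {
            Debug.Log("Connected joystick: " + name);
        }
    }

    void Update()
    {
        // Default Input Manager axes; remap them in Edit -> Project Settings -> Input if needed.
        float h = Input.GetAxis("Horizontal");
        float v = Input.GetAxis("Vertical");

        if (Mathf.Abs(h) > 0.1f || Mathf.Abs(v) > 0.1f)
        {
            Debug.Log($"Stick input: {h:F2}, {v:F2}");
        }

        // Joystick buttons are exposed as KeyCode.JoystickButton0, JoystickButton1, ...
        if (Input.GetKeyDown(KeyCode.JoystickButton0))
        {
            Debug.Log("Button 0 pressed");
        }
    }
}
```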
For an app I am making on macOS I want different behaviour for the trackpad vs. a mouse. Is there any way in Swift to check whether an external mouse is connected to the Mac (e.g. via USB or Bluetooth)? That is, the assumption will be that if an external mouse is connected, the user will be using that rather than the trackpad. If they have an external mouse connected but would like to use the trackpad as a pointer, they can change this in a settings menu.
I am currently encountering issues with detecting a touch for an iOS test game. I have imported the CrossPlatformInput library provided by Unity and included the DualTouchControls in my scene like so:
When I run the app on my phone I can see the black box, but when I press the screen my rocket (the character) does not jump. These are the settings of my Jump DualTouchControl:
And here is the rocket itself with its game environment:
What would I like to achieve?
I would like the rocket to jump whenever a user taps the screen on their mobile phone. Previously I didn't work with the DualTouchControls because I mainly built the game for desktop for testing purposes. It works perfectly on desktop using the spacebar, though.
Hopefully someone can help me out with some tips.
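For comparison, independent of the DualTouchControls prefab, the described behaviour (spacebar on desktop, tap on mobile) can also be read directly from Unity's input API. A minimal sketch, where the jump force value and the Rigidbody-based movement are assumptions about the rocket's setup:

```csharp
using UnityEngine;

public class RocketJump : MonoBehaviour
{
    public float jumpForce = 5f;   // assumption: tune to your rocket
    private Rigidbody rb;

    void Start()
    {
        rb = GetComponent<Rigidbody>();
    }

    void Update()
    {
        // Spacebar on desktop, first finger touching down on mobile.
        bool tapped = Input.touchCount > 0 && Input.GetTouch(0).phase == TouchPhase.Began;

        if (Input.GetKeyDown(KeyCode.Space) || tapped)
        {
            rb.AddForce(Vector3.up * jumpForce, ForceMode.Impulse);
        }
    }
}
```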
How can I see the game on my glasses when pressing play in the editor?
I'm using GearVR. The USB cable is plugged into my headset.
You have to build and deploy the app to your phone, and when it runs, mount the phone in the headset. You cannot use Unity Remote for VR apps, because the GearVR headset takes priority when you mount the phone. VR apps which have the GearVR SDK enabled should prompt "insert your phone into the headset" by default when you run them. An app can be either VR or non-VR, and Unity Remote is a non-VR app; it can only be used as a container for non-VR games.
I want to know whether OpenLaszlo mouse down events will be converted to touch events when compiling in mobile format.
Yes, at least on iPad. I tested this myself last year for R&D purposes on an iPad 2 running the HTML5 (aka DHTML) mode of my OpenLaszlo 4.9.0 application, and the following were confirmed to work:
1) Touching a button on the screen in the application in OpenLaszlo HTML5 mode properly triggered the onclick event of the button.
2) Drag and drop with your finger on a touch screen in OpenLaszlo HTML5 mode has the same result as dragging and dropping with the mouse on a non-touch screen system.
Note: this was only tested on the iPad 2; it was not tested on Android, Windows Phone, BlackBerry, etc.
Flash is not relevant for mobile (since Flash Player has just been removed from the Google Play store), but Adobe AIR for Android and iOS is an option if you want to build native applications. In that case, you have to capture the touch events using the ActionScript 3 API.