My game has a swipe-up gesture recognizer that triggers an action. Because I use it a lot and performing a swipe in the iPhone Simulator is not easy, I want to use the real keyboard instead.
How can I capture presses on the real keyboard (the one connected to the computer running the Simulator) to trigger the action instead of having to swipe with the mouse?
Well, I would suggest using real hardware if the Simulator is the problem. But you obviously don't want to take that route, or don't want to pay the $99 yet. So I suggest creating a macro that does the mouse movement for you: one that repeatedly performs a click, a move, and a release (a swipe gesture), over and over with a pause in between. You can then set a hotkey such as Ctrl+M to start and stop the macro. Good luck.
In a nutshell, my goal is to create my own program for visualizing game controller input.
However, it appears that once the Unity application is in the background, i.e. once I switch to a different window to play a game, my Unity program no longer reads the controller input.
I've seen that the user Elringus had a project called UnityRawInput to help with this, but it only works for keyboard input. Many other answers to this kind of question mention using native libraries to "hook" background input, but that approach also seems to cover only keyboards.
What information is out there on hooking game controllers themselves? I can't seem to find material about game controllers (and their various axes) rather than keyboards.
Why not a keyboard? Because I am very interested in a game controller's triggers: when pressed slightly they can perform "weaker" actions than when pressed all the way, which a keyboard cannot discern as far as I can tell.
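To illustrate the kind of low-level polling I have in mind, here is a rough sketch of reading a trigger's analog value through the Windows XInput API via P/Invoke. This is just an assumption about one possible route, not something I have working; the class and method names are placeholders. Since XInput is polled rather than event-driven, it isn't tied to which window has focus:

using System;
using System.Runtime.InteropServices;

public static class XInputPoller
{
    [StructLayout(LayoutKind.Sequential)]
    public struct XInputGamepad
    {
        public ushort wButtons;       // digital buttons as bit flags
        public byte bLeftTrigger;     // analog trigger, 0..255
        public byte bRightTrigger;    // analog trigger, 0..255
        public short sThumbLX;
        public short sThumbLY;
        public short sThumbRX;
        public short sThumbRY;
    }

    [StructLayout(LayoutKind.Sequential)]
    public struct XInputState
    {
        public uint dwPacketNumber;   // increments whenever the controller state changes
        public XInputGamepad Gamepad;
    }

    // xinput1_4.dll ships with Windows 8 and later; older systems have xinput1_3.dll.
    [DllImport("xinput1_4.dll")]
    private static extern uint XInputGetState(uint dwUserIndex, out XInputState pState);

    // Returns the left trigger of controller userIndex as 0..1,
    // or -1 if no controller is connected at that index. Because XInput
    // is polled, this keeps working while the window is in the background.
    public static float LeftTrigger(uint userIndex = 0)
    {
        XInputState state;
        if (XInputGetState(userIndex, out state) != 0)  // non-zero means error / not connected
            return -1f;
        return state.Gamepad.bLeftTrigger / 255f;
    }
}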
Though, wherever the solution might be, even outside of Unity, I'd like to know. Thanks! :D
How do I stop an EV3 program that is stuck in an infinite loop? (I didn't remove the batteries because that might damage the brick.) I have the stock LEGO firmware installed.
Just press the button on the top left. Simple.
I found out how: hold the back button, the left button, and the center button. When the screen goes blank, release the back button.
If you're using the official LabVIEW-based software, you can just press the back button. This doesn't work when you use c4ev3.
Your problem is that you need to use LabVIEW, which should come free with the kit. If you are using C, then you need to program a parallel wait block as an exit contingency: it keeps watching for input from the user or from the sensors, say the brick buttons or a touch sensor, and when triggered it terminates the rest of the program and then itself.
P.S. You can also download the software/firmware package (including LabVIEW) for free from the LEGO website, under 'Downloads' in the Mindstorms section.
I am currently developing a game in Unity3D. One of the features in the game involves having both LMB and RMB pressed at the same time.
My problem is that I tested my code with my own mouse, a Trust/15315, and it was not able to register both buttons pressed at the same time, only each of the two individually. I then tested the same code with three other mice and it worked perfectly.
My questions are: Is my mouse simply broken? Or is this a feature of my mouse? Are there other mice built to act like this (unable to detect both mouse buttons held down at the same time)?
As asked, I have added the code I used:
using UnityEngine;
using System.Collections;

public class Gestures : MonoBehaviour {

    void Update () {
        // Log the held state of the left (0), middle (2) and right (1) buttons every frame.
        print (Input.GetMouseButton(0) + " " + Input.GetMouseButton(2) + " " + Input.GetMouseButton(1));
    }
}
Also, I would like to add that the left and middle buttons work together, but the right button doesn't work with the other two.
Hard to say whether it's intended by Unity or not; getting a clean button-press implementation that behaves the same across different mice and OSs is tricky.
I struggled with a similar problem and found out that it's more reliable to read the mouse button status from Input.GetMouseButtonDown and Input.GetMouseButtonUp events.
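Something along these lines (just a rough sketch; the class and field names are mine, and the Debug.Log stands in for whatever your feature actually does):

using UnityEngine;

public class ChordTracker : MonoBehaviour
{
    private bool leftHeld;
    private bool rightHeld;

    void Update()
    {
        // Latch the state on the Down/Up events instead of polling GetMouseButton.
        if (Input.GetMouseButtonDown(0)) leftHeld = true;   // 0 = left button
        if (Input.GetMouseButtonUp(0))   leftHeld = false;
        if (Input.GetMouseButtonDown(1)) rightHeld = true;  // 1 = right button
        if (Input.GetMouseButtonUp(1))   rightHeld = false;

        if (leftHeld && rightHeld)
            Debug.Log("Both buttons held");
    }
}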
I then tested the same code with 3 other mouses and it worked perfectly.
It sounds like it is just your mouse. Tracking down the exact issue is tricky, since there are so many steps in the process (hardware -> onboard software -> wireless -> drivers -> OS -> Unity). A sloppy implementation at any point in that chain could theoretically cause an issue like the one you described.
As others have mentioned, the mouse events can be a bit glitchy even at the best of times. The good news is that, depending on what exactly you need to do, there is usually a workaround.
If you are aiming to handle single vs. double clicks, this might be useful: simultaneous single and double click functions
If you are looking for click-and-hold behavior, you will likely want to look at mapping the mouse buttons via Input.GetAxis; see the sketch below.
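A sketch of that route, assuming the stock Input Manager defaults where the "Fire1" and "Fire2" axes are bound to mouse button 0 and mouse button 1; the axis names and threshold here are mine, so adjust them to your own setup:

using UnityEngine;

public class AxisHold : MonoBehaviour
{
    void Update()
    {
        // With the default Input Manager settings, "Fire1" maps to mouse 0
        // and "Fire2" maps to mouse 1, so these read near 1 while held.
        bool leftHeld  = Input.GetAxis("Fire1") > 0.5f;
        bool rightHeld = Input.GetAxis("Fire2") > 0.5f;

        if (leftHeld && rightHeld)
            Debug.Log("Click-and-hold on both mouse buttons");
    }
}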
I'm working on a game for Google TV (Sony NSZ-GS7) which uses the touchpad to move things around.
AFAIK it is not possible (yet) to change or disable the pointer's arrow bitmap programmatically, which is pretty shitty from a game-design perspective. Is it at least possible to disable the default click sound the device plays on every click, so I can replace it with my own?
Try disabling the standard click sound by calling setSoundEffectsEnabled(false) on the view.
Basically, we either have remote access to the iPhone, or the phone is connected to a network through which we can control it (send it messages, etc.). How can I simulate a swipe without touching the actual phone? I know there are swipe gesture recognizers, but I haven't found a way to HARDCODE coordinates to simulate a swipe; for example, without touching the phone, perform the swipe to unlock.
A swipe is input. You'd normally recognize the swipe, either with a gesture recognizer or by handling the touch directly, and then perform some sort of action. If you want to simulate a swipe, just perform the action that would be performed if the user made the equivalent gesture.
For example, if a swipe would normally switch to a different view, simply call the method that switches to that view. If possible, do it with animation so that the user has some visual indication of what's going on.
I'm not entirely sure that what you are trying to do is possible, unless maybe with a script in the Automation instrument. However, if the iPhone is jailbroken, you could install Veency and connect the phone through any VNC client and interact with it that way.