I am coding a game on a touchscreen with many players at the same time. The issue is that when there are 2 or more touches, a little square appears on the screen. It seems to be a Unity built-in feature, as it is still present in an empty project.
Is there a way to prevent this annoying little square from appearing? I have already disabled the magic touch shortcuts in Windows, and the square doesn't appear on the desktop home screen.
I am able to listen to the touches. It seems to be only a visual thing.
Even when I disable multitouch with Input.multiTouchEnabled = false, the square still appears.
I also tried removing the 18 default axes in the Input Manager.
My goal is to handle every touch separately, without listening for pinch, long-press, or scroll interactions. Each player only has to tap somewhere on the screen.
Thanks for your time
Solved it by myself: I completely disabled touch feedback in the Windows settings. I don't think it is the only way to do it, but it works.
Control Panel > Pen and Touch
Uncheck "Show visual feedback when touching the screen"
I am a beginner with touch interactions on iPhone. I have 3 buttons: one on the left and two on the right. I would like to press the left button with my finger and display, in real time, a line that stretches as my finger moves across the screen toward whichever button on the right I'm heading to. Then, when I reach the button on the right and release my finger, the line stays on screen.
Where should I start in the manual to understand how to do something like this?
- please don't say page 1 ;-)
thanks for all the pointers so I can learn and write the code myself
Cheers,
geebee
Wow, uh, you're jumping right into the deep end.
First, the answer to your question is ... check out the Sample Code at developer.apple.com. By playing with those sample projects, and running them in the debugger, and reading that code, you can learn a whole lot about what you're trying to do.
Using those samples as your tutorial, refer to specific information in the reference docs also available at developer.apple.com. There are specific docs on the classes you'll be using (e.g. UIButton Class Reference), and there are high level guides that talk about different areas, like the Event Handling Guide for iOS, and Drawing and Printing Guide for iOS.
Here are some of the issues you're going to face, and some of the areas that you should read up on:
UIButtons are going to respond to your touch and won't allow any other view to receive that touch without some special handling in your code.
Check out samples that deal with UITouch events, Quartz 2D drawing, creating custom controls
Try doing this without any buttons first; that's fairly simple. Just figure out how to detect a touch-down event, track the finger's movement across the screen, and detect a touch-up event, all in a normal UIView. (There may be a sample for this.)
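Here's a minimal sketch of that exercise, assuming a plain UIView subclass (the class name and line styling are my own choices): track the finger in touchesBegan:/touchesMoved:/touchesEnded: and draw the stretching line with Quartz 2D in drawRect:.

```objc
#import <UIKit/UIKit.h>

// RubberBandView (hypothetical): anchors a line where the finger goes down,
// stretches it as the finger moves, and leaves it on screen when the finger lifts.
@interface RubberBandView : UIView {
    CGPoint startPoint;
    CGPoint currentPoint;
    BOOL hasLine;
}
@end

@implementation RubberBandView

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    startPoint = [[touches anyObject] locationInView:self]; // finger down: anchor here
    currentPoint = startPoint;
    hasLine = YES;
    [self setNeedsDisplay];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    currentPoint = [[touches anyObject] locationInView:self]; // stretch toward the finger
    [self setNeedsDisplay];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    currentPoint = [[touches anyObject] locationInView:self]; // finger up: line stays as drawn
    [self setNeedsDisplay];
}

- (void)drawRect:(CGRect)rect {
    if (!hasLine) return; // nothing to draw until the first touch
    CGContextRef ctx = UIGraphicsGetCurrentContext();
    CGContextSetLineWidth(ctx, 3.0);
    CGContextSetStrokeColorWithColor(ctx, [UIColor blueColor].CGColor);
    CGContextMoveToPoint(ctx, startPoint.x, startPoint.y);
    CGContextAddLineToPoint(ctx, currentPoint.x, currentPoint.y);
    CGContextStrokePath(ctx);
}

@end
```

Once this works, the remaining step is hit-testing the start and end points against your buttons' frames.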
I am not a newbie to Cocos2D but I am building quite an advanced HUD with several sliding and overlapping CCLayer and CCMenu/CCMenuItemImage objects.
They are all responding to touches correctly in turn. However when things overlap, it seems the buttons underneath take priority over the things on the top, no matter what order I add them to the world.
Indeed, even implementing the registerWithTouchDispatcher method and returning YES/NO from ccTouchBegan:withEvent: seems not to have the desired effect. It also appears that ccTouchBegan:withEvent: is then called on all buttons/menus in the world rather than just those underneath the touch.
I'd really like advice on a reliable way to detect and consume a touch on an object that is top most in the view without anything else hearing about the touch.
Thanks in advance!
How about this commit on the develop branch of cocos2d-iphone?
v1.0.0-rc3 or earlier doesn't have the mechanism for touch priority. This commit seems to implement it.
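In the meantime, the usual workaround is a targeted touch delegate that swallows touches. A sketch for a CCLayer subclass on cocos2d-iphone 1.x (kTopMostPriority is a hypothetical constant you'd define; lower priority numbers are dispatched first, so the top-most layer should register with the lowest number):

```objc
// In a CCLayer subclass. The layer must have self.isTouchEnabled = YES so
// registerWithTouchDispatcher gets called.
- (void)registerWithTouchDispatcher {
    [[CCTouchDispatcher sharedDispatcher] addTargetedDelegate:self
                                                     priority:kTopMostPriority
                                              swallowsTouches:YES];
}

- (BOOL)ccTouchBegan:(UITouch *)touch withEvent:(UIEvent *)event {
    // Only claim touches that actually land on this node. Returning NO lets the
    // dispatcher offer the touch to the next delegate in priority order, which
    // is why ccTouchBegan: appears to fire on everything: each delegate is
    // asked in turn until one claims the touch.
    CGPoint p = [self convertTouchToNodeSpace:touch];
    CGRect localBounds = CGRectMake(0, 0, self.contentSize.width, self.contentSize.height);
    if (!CGRectContainsPoint(localBounds, p)) return NO;
    return YES; // claimed; with swallowsTouches:YES, nothing below hears about it
}
```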
Why can't you use tags? I'm not sure offhand how to check z-order, but I would personally just use tags.
I'm currently drawing some mockups of my future iPhone app.
One of the app's functionalities is to display a bar graph showing the evolution of a value over time. Users can perform a few gestures on the graph:
swipe/drag to move through time;
pinch to zoom in or zoom out (and therefore display a longer or shorter period of time);
double tap to add a cursor to the graph (i.e. a vertical line with a label on top).
What I'm afraid of is users not noticing these gestures. Of course, I would provide buttons for doing the same tasks, but if users ended up only using those, the interface's usability would not be very great...
Therefore, I am wondering if there is any way to show some visual clues to indicate the presence of gestures on the interface. Do you know any app that does something similar?
I think that animating the graph behaviors you mention would be a great clue for users to perform these actions with their fingers. For example, if they choose another date, move the graph through time smoothly with an ease-in/ease-out animation. Or if the user changes the scale, zoom gradually from scale 1 to scale 2 instead of jumping.
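For instance, a minimal sketch of the zoom case with UIKit's block-based animations (graphView and the scale factor are hypothetical):

```objc
// Animate the graph from scale 1 to scale 2 with an ease-in/ease-out curve,
// instead of snapping, so the change itself hints that pinch-to-zoom exists.
// graphView is a hypothetical UIView that renders the bar graph; scaling only
// the x-axis widens the visible time window.
[UIView animateWithDuration:0.3
                      delay:0.0
                    options:UIViewAnimationOptionCurveEaseInOut
                 animations:^{
                     graphView.transform = CGAffineTransformMakeScale(2.0, 1.0);
                 }
                 completion:nil];
```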
I'm starting a new project which involves developing an interface for a machine that measures wedge and roundness of lenses and stores the information in a database and reports on it. There's a decent chance we're going to be putting a touch screen on this machine so that it doesn't need to have a mouse or keyboard...
I don't have any experience developing for full size touch screens, so I'm looking for advice/tips/info from you guys...
I can imagine you want to make the elements a little larger than normal... space buttons out a bit more.... things like that... anyone have anything else to add?
A few things to consider:
You need to account for parallax error when touching controls. Basically, the user may touch the screen above or below your actual control and therefore miss it. This is a combination of the size of the control (eg you can make the active area larger than the visible control so the user can miss and still activate it; see the sketch after this list), the viewing angle of the user (which you may or may not be able to predict/control) and the type of touch screen you're using. If you know where the user will be placed relative to the screen when using it, you can usually accommodate this with appropriate calibration.
Depending on the type of touch screen, you may need to ensure that your users aren't wearing gloves or using an implement other than their fingers (eg the end of a pen) to touch the screen. Some screens (eg those depending on conductance) don't respond well to anything other than flesh and blood.
Avoid using double clicks because it can be very hard for users to reliably double click a control. This can be partly mitigated if you've got experienced/trained users working in a fairly controlled environment where they're used to the screens.
Linked to the above, if you are using double clicks, you may find the double click activated when the user only wants to single click. This is because it's very easy for the user's finger to bounce slightly on touching the screen and, depending on how sensitive the double click settings are, trigger a double rather than a single click. For this and the previous reason, we always disable double clicks and only use single clicks (or similar single activation controls).
However big you think you need to make the controls to allow for touch activation, they almost certainly need to be bigger still. Make sure you test the interface with real users in the real deployment environment (or as close to it as you can get). For example, we deployed some screens with nice big buttons you couldn't miss only to find that the control room was unheated and that the users were wearing thick gloves in the middle of winter, making their fingers way bigger than we had allowed for.
Don't put any controls near the edges of the screen - it's very hard to get your finger into the edges (particularly if the screen has a deep bezel) and a slight calibration problem can easily shift the control too close to the edge to use. Standard menus and scroll bars are a good example of controls that can be very tricky to use on a touch screen and you should either avoid them (which is preferable - they're not good for touch screens) or replicate them with jumbo equivalents.
Remember that the user's hand will be over the screen, obscuring some of the screen and controls (typically those below where the user is touching, but it depends on the position of the user relative to the screen). Don't put instructions or indicators where the user's hand or arm will obscure them when trying to use the control they relate to (eg typically put them above rather than below the control).
Depending on the environment, make sure your touch screen is suitably proofed against dust, damp, grease etc and make sure it's easy to clean without damaging it. You wouldn't believe the slime that can quickly accumulate on a touch screen in an industrial or public setting.
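To illustrate the first point: platform aside, the usual trick is to make the control's hit-test region larger than its visible bounds. A sketch with UIKit (the class name and the 20-point margin are my own assumptions):

```objc
#import <UIKit/UIKit.h>

// A button whose active area extends beyond its visible bounds, so a touch
// that misses because of parallax still activates the control.
@interface BigHitButton : UIButton
@end

@implementation BigHitButton

- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event {
    // Accept touches up to 20 points outside the drawn bounds (negative insets
    // grow the rect). Tune the margin to the expected viewing angle and screen.
    CGRect activeArea = CGRectInset(self.bounds, -20.0, -20.0);
    return CGRectContainsPoint(activeArea, point);
}

@end
```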
The other obvious one is that there's no equivalent of pointer 'hover'. Not that that affects many apps though.
If you decide to put in analog controls (scrollbars, rotation widgets, etc) be sure to put in a digital control also. Some companies think that a touch screen means perfect control over something with your fingers. In real life, this translates to minutes of frustration trying to fix a number that's just a little off.
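Whatever the platform, the pairing can be as simple as keeping an analog and a digital control bound to the same value. A UIKit sketch to show the idea (the class name, ranges, and layout are all hypothetical):

```objc
#import <UIKit/UIKit.h>

// Pairs an analog slider with a digital stepper over the same 0-10 value,
// so a number that's "just a little off" can be fixed exactly with the stepper.
@interface MeasurementViewController : UIViewController
@property (nonatomic, strong) UISlider *slider;
@property (nonatomic, strong) UIStepper *stepper;
@end

@implementation MeasurementViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    self.slider = [[UISlider alloc] initWithFrame:CGRectMake(20, 100, 280, 44)];
    self.slider.minimumValue = 0.0;
    self.slider.maximumValue = 10.0;

    self.stepper = [[UIStepper alloc] init];
    self.stepper.frame = CGRectMake(20, 160, 94, 29);
    self.stepper.minimumValue = 0.0;
    self.stepper.maximumValue = 10.0;
    self.stepper.stepValue = 0.1; // fine-grained digital adjustment

    [self.slider addTarget:self action:@selector(sliderChanged:)
          forControlEvents:UIControlEventValueChanged];
    [self.stepper addTarget:self action:@selector(stepperChanged:)
           forControlEvents:UIControlEventValueChanged];

    [self.view addSubview:self.slider];
    [self.view addSubview:self.stepper];
}

- (void)sliderChanged:(UISlider *)sender {
    self.stepper.value = sender.value;  // keep the digital control in sync
}

- (void)stepperChanged:(UIStepper *)sender {
    self.slider.value = sender.value;   // keep the analog control in sync
}

@end
```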
The most obvious thing is that everything on the GUI needs to be big enough for a fingertip to hit, which is sometimes bigger than you think.
As has been mentioned, there's really no way for a right-click action to happen. Also, double-clicking can be tricky with a fingertip on a touch screen.
The other major thing is that you'll want to create an on-screen keyboard that pops up for text entry, and an on-screen numpad for number-only fields.
I wrote my own set of controls for a POS application designed specifically to be touchscreen friendly.
Remember to allow enough real estate for stubby fingers and talons. In our application, some users have manicures that force them to use the pad of their finger instead of the tip. This means you need to allow more space for activation areas than you would normally consider in any other type of application.
I would also recommend that you accommodate yourself as a programmer, both for testing and because things change: there may need to be a keyboard and mouse attached to a non-touch workstation. I cannot tell you how many times I went to touch my flat-panel LCD expecting something to happen, before remembering that I had to use the mouse.
Make sure to read up on basic UI principles like Fitts's law (the time to acquire a target is a function of the distance to and the size of the target).
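For reference, the usual (Shannon) formulation is

$$T = a + b \log_2\!\left(\frac{D}{W} + 1\right)$$

where T is the time to acquire the target, D the distance to it, W its width along the axis of motion, and a, b are empirically fitted constants. The practical upshot for a touch screen: bigger, closer targets are faster and less error-prone.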
Also consider whether or not the device is stationary when it is in use (e.g., a PalmPilot or iPhone is not); research shows that you must accommodate that in your design.
Larger GUI elements are the major thing, and it applies to all elements: scroll bars, tabs, and even text fields.
The other major thing I can think of is that it's hard for the user to right-click, so things that require a right click should be avoided; context menus are the only example that comes to mind at the moment.
The other responses are pretty good, but are you totally sure that a touch screen would actually be easier to use? There are a lot of devices where a touch screen actually makes them much harder to use, not easier. The main problem is that you can't use the device when you're not looking at it. If users are going to be doing a lot of repetitive actions, a keyboard could be a lot more efficient.
Also, a touch screen might be a lot harder to use by someone with a disability, if you think there's even a small chance that could happen.
Even though this is quite old now, I found it still useful as a starting point for design considerations.
http://www.sapdesignguild.org/resources/tsdesigngl/index.htm
If you've not already done so, have a look at some of the documentation available for developers on mobile platforms, eg Windows Mobile, iPhone.