I have received comments from blind users that some of my sound- and music-related apps only work with VoiceOver off.
With VoiceOver accessibility enabled on an iOS device, is it possible to set up a music keyboard or drum pad touch area so that tapping a keyboard key or virtual drum (etc.) plays the sound immediately, instead of triggering VoiceOver prompts?
Just setting the UIAccessibilityTraitPlaysSound accessibility trait on a UIView subview doesn't seem to do it: with VoiceOver enabled I get VoiceOver clicking instead of piano or drum sounds.
A blind user can turn VoiceOver completely off, but then all the other buttons and controls (instrument selection, configuration, help, etc.) will no longer have VoiceOver assistance.
I can now answer my own question.
iOS 5 has added a new API exactly for this need:
[mySubView setAccessibilityTraits:UIAccessibilityTraitAllowsDirectInteraction];
will disable VoiceOver just for that UIView subview, but leave other subviews (other buttons, etc.) unaffected. This API allows an app to get responsive touch handlers more suitable for keying a musical instrument within the specified subview, even with VoiceOver enabled and providing assistance for other portions of the app's UI outside the specified UIView.
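A minimal sketch of how this might look in practice (the view names drumPadView and instrumentButton are illustrative, not from the original post):

```objc
// Let touches reach the drum pad directly, even with VoiceOver running.
drumPadView.isAccessibilityElement = YES;
drumPadView.accessibilityLabel = @"Drum pad";
drumPadView.accessibilityTraits = UIAccessibilityTraitAllowsDirectInteraction;

// Sibling controls are left alone, so VoiceOver still announces them.
instrumentButton.accessibilityLabel = @"Instrument selection";
```

In my understanding, VoiceOver still announces the pad's label when focused, but touches inside it pass straight through to the app's touch handlers.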
I don't think it's possible for you, the developer, to turn off VoiceOver, and as you've discovered it's impractical for users to play an instrument with VoiceOver intercepting gestures.
I think it's acceptable to leave this in the hands of the user. It's easy to toggle VoiceOver off/on by triple-clicking the home button, although this is off by default (users can turn it on via Settings > General > Accessibility > Triple-click Home). With that option engaged, users can explore your interface with VoiceOver on, toggle it off when they're ready to make music, and toggle it back on when they're finished.
With that in mind you just need to be sure all interactive components (e.g., piano keys, parts of the drum kit) have meaningful labels.
How can I check if the device is a touch device or a mouse device?
Using kIsWeb is not sufficient, because the web version of the app running on a mobile device returns true for kIsWeb, yet I need it treated as a touch device.
Checking the platform doesn't work either, because the web version of the app running on an iOS device, for example, returns false for the iOS platform check.
Use case - I have two different types of video players for my app. One is suitable for touch devices (you tap to show and hide controls) and one is suitable for mouse devices (controls show when you mouse into the player and hide when you mouse out).
YouTube has the same idea. If I use the YouTube app or website on my iPhone I get touch controls. If I use the YouTube app or website on my iPad Pro I get touch controls. If I use the YouTube website on my Mac I get mouse controls at all screen sizes (even mobile screen sizes).
So I guess I really just need to know the platform on the web; I can already get the platform when not on the web.
Great question, @jwasher! I had the same issue: a touch- and swipe-based UI that was great as a native mobile app and as a single-page web app (SPA) in mobile browsers, but weird and clunky for mouse-based interaction when the SPA ran in a desktop browser.
The solution I have settled on is to wrap sensitive widgets in a MouseRegion (https://api.flutter.dev/flutter/widgets/MouseRegion-class.html) to detect mouse activity at runtime, and then react by augmenting the UI with buttons that give mouse users a way to trigger actions previously tied only to touch.
You could use this to "mouse-enable" individual widgets, or you could wrap the whole app in a MouseRegion, flip a state field when mouse activity is detected, and rebuild the widget tree in a substantially different way for point-and-click users.
This strategy may incur some minor complexity/CPU overhead on devices that will never have a mouse attached (like a smartphone), but the solution will be more flexible than a build or configuration time capability determination. If a user starts the app on a tablet, then puts it in a stand and attaches a bluetooth mouse - it will react appropriately.
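A minimal sketch of the whole-app variant (all names here are mine; it relies on MouseRegion.onHover, which only fires for hover-capable pointers such as a mouse):

```dart
import 'package:flutter/material.dart';

class AdaptiveShell extends StatefulWidget {
  const AdaptiveShell({super.key});

  @override
  State<AdaptiveShell> createState() => _AdaptiveShellState();
}

class _AdaptiveShellState extends State<AdaptiveShell> {
  bool _mouseDetected = false; // flipped once, the first time a mouse hovers

  @override
  Widget build(BuildContext context) {
    return MouseRegion(
      onHover: (_) {
        if (!_mouseDetected) setState(() => _mouseDetected = true);
      },
      // Swap in a substantially different subtree for point-and-click users.
      child: _mouseDetected
          ? const Text('mouse/point-and-click UI goes here')
          : const Text('touch-first UI goes here'),
    );
  }
}
```

Because the flag is only ever flipped on, a tablet that gains a Bluetooth mouse mid-session upgrades to the mouse UI without a restart.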
A device isn't "a mouse device" or "a touch device". Individual events have an input type (see PointerEvent.kind), but the device as a whole does not. A laptop can have a touch screen, and a tablet can have a stylus or an external mouse; an app running in those environments can receive both kinds of event.
Without knowing what you are trying to accomplish with this classification, it is hard to advise on how to accomplish it. Styling your UI based on a guess at the primary interaction mode, for instance, is a completely different problem from reacting to a mouse event differently than to a touch event.
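As an illustration of the per-event approach (a sketch; the handler bodies are placeholders):

```dart
import 'package:flutter/gestures.dart';
import 'package:flutter/widgets.dart';

/// Wraps [child] and branches on the kind of each pointer-down event.
Widget playerSurface(Widget child) {
  return Listener(
    onPointerDown: (PointerDownEvent event) {
      switch (event.kind) {
        case PointerDeviceKind.mouse:
          // e.g. reveal hover-style controls
          break;
        case PointerDeviceKind.touch:
        case PointerDeviceKind.stylus:
          // e.g. toggle the controls on tap
          break;
        default:
          break;
      }
    },
    child: child,
  );
}
```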
My team is developing an HTML5 web application with Edge and JavaScript. We also need to support touch devices, but we've run into a problem:
How can we simulate a rollover or mouse-over event on a touch device?
Any idea is welcome, not necessarily a code example.
This is an ergonomic problem, not a technical one.
And the short answer is: you can't :)
Put simply, all the rollover actions designed for a standard device must be rethought as click actions.
For example, a rollover top-navigation menu must, on a touch-screen device, work with clicks on the menu instead of rollover actions.
At least this is what we do for multi-support web applications...
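A minimal sketch of that click-instead-of-hover pattern (the element wiring and names are illustrative, not from the original answer):

```javascript
// Toggle a menu panel on click/tap; hover still works for mouse users.
function setupTapMenu(button, panel) {
  let open = false;
  const show = (value) => {
    open = value;
    panel.style.display = open ? 'block' : 'none';
  };
  // Touch (and mouse) users: each click/tap toggles the menu.
  button.addEventListener('click', (event) => {
    event.preventDefault(); // keep the first tap from following the link
    show(!open);
  });
  // Mouse users keep the familiar rollover behaviour; touch devices
  // simply never fire these events.
  button.addEventListener('mouseenter', () => show(true));
  button.addEventListener('mouseleave', () => show(false));
}
```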
Is there any way in iOS SDK to detect the presence of an active Bluetooth keyboard? As many well know, when a Bluetooth keyboard is active, the on-screen keyboard does not show, so interface placements might have to change...
Right now I am doing this semi-passively by responding to keyboard notifications, but those notifications are a little slow to post and don't jibe perfectly with my animation code. It would be nice to just have a BOOL somewhere to read...
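For reference, the semi-passive approach looks roughly like this (a sketch; the height threshold is my own guess). The idea is that when a hardware keyboard is attached, the keyboard frame reported with UIKeyboardWillShowNotification sits mostly or entirely off-screen:

```objc
// Registered elsewhere for UIKeyboardWillShowNotification.
- (void)keyboardWillShow:(NSNotification *)note {
    CGRect frame =
        [note.userInfo[UIKeyboardFrameEndUserInfoKey] CGRectValue];
    CGRect screen = [UIScreen mainScreen].bounds;
    // How much of the keyboard actually lands inside the screen.
    CGFloat visibleHeight = CGRectGetMaxY(screen) - CGRectGetMinY(frame);
    BOOL hardwareKeyboard = visibleHeight < 100.0; // threshold is a guess
    // ...adjust interface placement accordingly...
}
```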
This is a bit curly, and I am not sure there is an answer.
I have a simple application that uses a handheld Bluetooth scanner paired to an iPhone to keep lists of parcels coming off the back of trucks.
The scanner acts as a keyboard, sending character strings on each scan. In the application the user must pair with the scanner in Settings.
When a scan completes, a text field is populated with the sent string. The last character is a return character, at which point the contents are added to a data source for a UITableView.
The problem is this: once the scanner has been used, the system seems to recognize it as the only user input. Any future attempt to bring up the soft keyboard fails. This goes beyond the scanning application; quitting the app completely and attempting to use Apple's SMS app also fails to bring up the keyboard.
Is there any (Apple-legal) way of either using both or setting a preferred input device? There seem to be a myriad of legal issues around Bluetooth and accessories, so I am wondering if I am out of luck. Has anyone heard of anything that might help me out?
It appears I am not alone (as in this post regarding iPad soft keyboard)
I think you've pretty much covered it.
According to HT4111:
You can stop using a Bluetooth accessory by either turning off the accessory, or turning off Bluetooth on iPad.
According to Gizmodo's 10 iPad Essential Tips & Tricks:
When you have a Bluetooth keyboard connected to your iPad, the virtual keyboard will cease to appear. (This is a good thing.) However, what if, for some random reason, you needed that virtual keyboard? Don't unpair your Bluetooth. Just... Hit the eject key on Apple's physical keyboard. It'll bring up the virtual one.
If there's an off button on the scanner, hit that. If you have an actual Bluetooth keyboard, use that (or hit its eject key if it's an Apple keyboard). If you have control over the design of the scanner hardware, you could add a "show keyboard" button (I'm not sure which keycode Apple uses for "eject") if turning the scanner off is too tedious.
Socket Mobile just added a new "double tap" feature to their Bluetooth barcode scanners that lets you open the on-screen keyboard. There's a video demo on YouTube: http://www.youtube.com/socketmobile
I would like to allow controlling the iPod on top of my application when the user double taps the home button. I know there are a few apps out there that allow this. Unfortunately my app just quits.
Do I have to set anything to allow this behavior?
Is it possible that my app prevents the iPod app from showing above it because it also plays a sound from time to time?
The double-tap-for-iPod behaviour, if people are using the defaults (and most people do), only brings up the mini player if music is already playing.
Check out Settings > General > Home. As you can see, there are a variety of things the user can choose to have happen on double-click. Below that list is the option for "iPod Controls": when playing music, show iPod controls. This is what you want, and unfortunately there is no way to override the default behaviour and show iPod controls when music isn't playing.
There are two other options (both for the user, unfortunately, not for you):
If they turn the screen off whilst in your app and then double-click the home button, the mini player will show on screen and they can play music. They will then still be within your app when they turn on and unlock the screen.
If they have a 3GS or the latest iPod touch, then holding the home button will bring up the Voice Control menu, from which they can play music. They will then still be in your app when that menu closes.
Probably not the answer you wanted, but that's iPhone development for you!