Apple Watch Remote App - swift

I was wondering how the Apple Watch Remote app is able to detect swiping gestures and taps on an area that does not look like a button. Is this restricted to Apple only at this time, or is there a way for developers to take advantage of this?

At least for now (June 1, 2015), before Apple previews a new SDK at the upcoming WWDC, this is not available to third-party developers. The only interactions you currently have on the watch are tapping a button or table row, force-touching to show a menu, swiping horizontally between pages in a paginated UI, and scrolling the entire screen vertically.

You can't currently add that kind of gesture recognizer in WatchKit. This is only available to Apple's own apps (native Apple Watch apps) at this time.
However, it is worth waiting for WWDC 2015, which takes place in San Francisco (June 8-12), to see the new kits and developer materials.
They are expected to introduce a new WatchKit version that allows you to add gesture recognizers and build native Apple Watch apps without needing the iPhone nearby.
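For reference, here is a minimal sketch (in current Swift syntax, with illustrative class and outlet names) of the tap-style interactions WatchKit did expose at the time: button taps, table-row selection, and a force-touch menu item.

```swift
import WatchKit

// Illustrative controller showing the interactions WatchKit exposes:
// button taps, table-row taps, and a force-touch menu item.
class DemoInterfaceController: WKInterfaceController {
    @IBOutlet weak var table: WKInterfaceTable!

    // A tap on a button wired up in the storyboard.
    @IBAction func buttonTapped() {
        // react to the tap
    }

    // A tap on a table row.
    override func table(_ table: WKInterfaceTable, didSelectRowAt rowIndex: Int) {
        // react to the row selection
    }

    override func awake(withContext context: Any?) {
        super.awake(withContext: context)
        // A force-touch menu item is the other tap-style interaction available.
        addMenuItem(with: .accept, title: "Done", action: #selector(menuDone))
    }

    @objc func menuDone() {
        // handle the force-touch menu action
    }
}
```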

Related

In Flutter, how can I check if a device is a mouse device or a touch device?

How can I check if the device is a touch device or a mouse device?
Using kIsWeb is not sufficient, because when the web version of the app runs on a mobile device, kIsWeb returns true, while I need it treated as a touch device.
Checking the platform doesn't work either, because the web version of the app running on an iOS device, for example, returns false for the iOS platform check.
Use case - I have two different types of video players for my app. One is suitable for touch devices (you tap to show and hide controls) and one is suitable for mouse devices (controls show when you mouse into the player and hide when you mouse out).
YouTube has the same idea. If I use the YouTube app or website on my iPhone, I get touch controls. If I use the YouTube app or website on my iPad Pro, I get touch controls. If I use the YouTube website on my Mac, I get mouse controls at all screen sizes (even mobile screen sizes).
So I guess I really just need to know platform on the web. I can get platform if not on the web.
Great question #jwasher! I had the same issue - a touch- and swipe-based UI that was great as a native mobile app and great as a single-page web app (SPA) in mobile web browsers, but that was weird and clunky for mouse-based interaction when the SPA was used in a desktop web browser.
The solution I have settled on is to wrap sensitive widgets in a MouseRegion (https://api.flutter.dev/flutter/widgets/MouseRegion-class.html) to detect mouse activity at runtime, then react by augmenting the UI with buttons that provide a mouse-focused way of triggering actions previously linked only to touch triggers.
You could use this to "mouse-enable" individual widgets, or you could wrap the whole app in a MouseRegion, trip a state field when activity is detected, and then rebuild the widget tree in a substantially different way for point-and-click users.
This strategy may incur some minor complexity/CPU overhead on devices that will never have a mouse attached (like a smartphone), but the solution is more flexible than a build-time or configuration-time capability determination. If a user starts the app on a tablet, then puts it in a stand and attaches a Bluetooth mouse, it will react appropriately.
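A minimal sketch of that idea (widget and field names here are illustrative, not from the original answer): flip a flag the first time hover activity is seen and rebuild with mouse-friendly controls.

```dart
import 'package:flutter/material.dart';

// Sketch: trip a flag the first time mouse hover is seen, then rebuild
// with mouse-friendly controls. Widget and field names are illustrative.
class AdaptivePlayer extends StatefulWidget {
  const AdaptivePlayer({super.key});

  @override
  State<AdaptivePlayer> createState() => _AdaptivePlayerState();
}

class _AdaptivePlayerState extends State<AdaptivePlayer> {
  bool _mouseDetected = false;

  @override
  Widget build(BuildContext context) {
    return MouseRegion(
      // Hover events only arrive from pointer devices such as a mouse.
      onHover: (_) {
        if (!_mouseDetected) setState(() => _mouseDetected = true);
      },
      child: _mouseDetected
          ? const Text('hover-style controls go here')
          : const Text('tap-style controls go here'),
    );
  }
}
```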
A device isn't "a mouse device" or "a touch device". Individual events have an input type (see PointerEvent.kind), but the device as a whole does not. A laptop can have a touch screen, and a tablet can have a stylus or an external mouse; an app running in those environments can receive both kinds of event.
Without knowing what you are trying to accomplish with this classification, it is hard to advise on how to accomplish it. Styling your UI based on a guess about the primary interaction mode, for instance, is a completely different problem from reacting to a mouse event differently than a touch event.
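For example, a per-event check might look like this (a sketch; the wrapper function is illustrative):

```dart
import 'package:flutter/gestures.dart';
import 'package:flutter/widgets.dart';

// Sketch: classify each incoming event by its kind instead of trying to
// classify the whole device. The wrapper function is illustrative.
Widget classifyPerEvent({required Widget child}) {
  return Listener(
    onPointerDown: (PointerDownEvent event) {
      if (event.kind == PointerDeviceKind.mouse) {
        // react to a mouse press
      } else if (event.kind == PointerDeviceKind.touch) {
        // react to a touch
      }
    },
    child: child,
  );
}
```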

How to dismiss navigation bars in iPhone iOS 8 Mobile Safari when using tap, drag, and swipe gestures over the entire webpage

I'm working on a fullscreen iPhone web application that uses gestures like tap, drag, and swipe over the entire webpage. The minimal-ui viewport property was the best solution I found for this kind of project.
According to Apple's specifications:
The minimal-ui viewport property is no longer supported in iOS 8.
What is the new way to simulate the old minimal-ui behavior?
Here is some information on telling iOS that a webpage is web-app capable, so users can save it to their Home screen and use it as if it were a separate app, with absolutely no Safari controls visible.
From the apple developer docs:
A web application is designed to look and behave in a way similar to a native application—for example, it is scaled to fit the entire screen on iOS. You can tailor your web application for Safari on iOS even further, by making it appear like a native application when the user adds it to the Home screen. You do this by using settings for iOS that are ignored by other platforms.
...
On iOS, as part of optimizing your web application, have it use the standalone mode to look more like a native application. When you use this standalone mode, Safari is not used to display the web content—specifically, there is no browser URL text field at the top of the screen or button bar at the bottom of the screen. Only a status bar appears at the top of the screen.
...
Your web application can link to other built-in iOS apps by creating a link with a special URL. Available functionality includes calling a phone number, sending an SMS or iMessage, and opening a YouTube video in its native app if it is installed.
This would enable you to completely hide the Safari navigation AND link to other built-in functionality such as placing a phone call or composing an SMS.
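As a sketch, the relevant pieces usually look like this (the icon filename and phone number are placeholders):

```html
<!-- Head tags for standalone ("web app") mode; the user then saves the
     page to the Home screen to get the chrome-free presentation. -->
<meta name="apple-mobile-web-app-capable" content="yes">
<meta name="apple-mobile-web-app-status-bar-style" content="black">
<link rel="apple-touch-icon" href="icon.png">

<!-- Special URLs that hand off to built-in apps (placeholder number): -->
<a href="tel:1-408-555-5555">Call us</a>
<a href="sms:1-408-555-5555">Text us</a>
```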

rollover on touch device

My team is developing an HTML5 web application with Edge and JavaScript. We need to support touch devices as well, but we've bumped into a problem:
How can we simulate a rollover or mouse-over event on a touch device?
Any idea is welcome, not necessarily a code example.
This is an ergonomic problem, not a technical one.
And the short answer is: you can't. :)
Put simply, all the rollover actions designed for a standard device must be rethought as click (tap) actions.
For example, a rollover top navigation menu on a touch-screen device must open with a tap on the menu instead of a rollover.
At least, this is what we do for multi-device web applications...

Play a splash movie on every app launch (with multitasking support)

The app I'm working on supports the iOS multitasking feature by default, and I want to keep it that way.
On app launch, a splash movie clip is played (the code is in the AppDelegate). After the user hits the home button and re-launches the app, I want the same splash movie to play before showing the last view the user was on.
I know I can achieve this by switching off multitasking support, but then I lose the multitasking feature and have to write code to save and restore user state. So, is there any workaround for this? Thanks!
You could try the app delegate's applicationDidBecomeActive: method, but quite frankly I'd consider this user-hostile behaviour. Who wants to see a movie every time they switch between apps? The point of multitasking on the iPhone is to change between apps quickly, and this violates that.
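If you do go that route anyway, a minimal sketch in modern Swift could look like the following (the "splash.mp4" resource name is a placeholder, and the caveat above still applies):

```swift
import AVFoundation
import AVKit
import UIKit

// Sketch: replay a bundled splash clip each time the app becomes active.
class AppDelegate: UIResponder, UIApplicationDelegate {
    var window: UIWindow?

    func applicationDidBecomeActive(_ application: UIApplication) {
        guard let url = Bundle.main.url(forResource: "splash", withExtension: "mp4"),
              let root = window?.rootViewController else { return }

        let playerVC = AVPlayerViewController()
        playerVC.player = AVPlayer(url: url)
        playerVC.showsPlaybackControls = false

        // Dismiss the splash automatically when the clip finishes.
        var token: NSObjectProtocol?
        token = NotificationCenter.default.addObserver(
            forName: .AVPlayerItemDidPlayToEndTime,
            object: playerVC.player?.currentItem,
            queue: .main
        ) { _ in
            playerVC.dismiss(animated: true)
            if let token = token { NotificationCenter.default.removeObserver(token) }
        }

        root.present(playerVC, animated: false) {
            playerVC.player?.play()
        }
    }
}
```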

custom camera preview on the iPhone?

I have had two of my apps rejected by Apple, sitting on the "shelves of approval" for two months, because both apps were using UIImagePickerController and I dared to add a rectangle on top of it, using something like
[picker.view addSubview:rectangle];
On the other hand, applications like CameraZoom and others ditch the regular UIImagePickerController appearance completely and have their own interface, with custom graphics and sliders on top of the camera preview, and even the ability to zoom the preview image in real time.
My question is: how can one do that and not be crucified by Apple?
Thanks for any insight!
As far as I know, it's been hit and miss. Some apps get through, some don't, and it's really quite annoying (as is app approval in general).
In SDK 3.1, there is a new Camera Overlay concept, where you can overlay your own view on top of the camera view. You can find more documentation on the iPhone Developer website (since it is 3.1, it is under NDA).
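That overlay hook is the cameraOverlayView property on UIImagePickerController (available since SDK 3.1). A minimal sketch in current Swift (the presenting view controller and the rectangle are illustrative):

```swift
import UIKit

// Sketch of the supported camera-overlay route: put custom chrome in a
// transparent view and hand it to the picker, rather than adding subviews
// directly to picker.view.
func presentCameraWithOverlay(from presenter: UIViewController) {
    guard UIImagePickerController.isSourceTypeAvailable(.camera) else { return }

    let picker = UIImagePickerController()
    picker.sourceType = .camera

    // Draw the rectangle in a transparent overlay view.
    let overlay = UIView(frame: presenter.view.bounds)
    overlay.isUserInteractionEnabled = false   // let taps fall through to the camera controls
    let rectangle = UIView(frame: CGRect(x: 40, y: 120, width: 240, height: 240))
    rectangle.layer.borderColor = UIColor.yellow.cgColor
    rectangle.layer.borderWidth = 2
    overlay.addSubview(rectangle)

    picker.cameraOverlayView = overlay         // the sanctioned hook for custom camera UI
    presenter.present(picker, animated: true)
}
```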