I've tried looking for similar posts, but it looks like they are all for the browser. I'm running Jelly Bean and don't have a touch device, so some apps (such as SubwaySurfer) don't respond. Is there a way to use mouse events instead? I would truly appreciate any help.
Is the app supposed to take care of this? Or can it be done somewhere within Android?
Is anybody aware of any event API for the Apple Watch? I'm looking to detect events like on/off-wrist events, to determine whether glances should be displayed or not.
There aren't any events other than the simple ones declared in the various interface controllers, like willActivate, didDeactivate, button taps, etc.
There are rumors that the next version of WatchKit this Fall will include more capabilities like being able to run some code on the watch itself, so maybe it'll be possible to have more event notifications once that comes out, but who knows.
I might be using the wrong title, but I will try to explain here what I want.
In iOS I need to implement functionality to get notified when the user is using their iOS device.
My app will be running in the background using location services, and I need to find out whether the user is using their device. It is doable: I have looked into this application, which sends notifications in the background to drivers who are using their devices while driving.
https://itunes.apple.com/fr/app/cellcontrol/id661169580?l=en&mt=8&ign-mpt=uo=2
So I need a similar kind of functionality to find out whether a user is using the iOS device or not. If any of you can suggest an approach, it would be a great starting point for me.
Thank you!
Note: I have researched detecting touch events in the background, but that does not appear to be possible.
You won't be able to receive touch events while the app is in the background using public APIs. However, you can do it with the help of the MobileSubstrate library (http://iphonedevwiki.net/index.php/MobileSubstrate; the MobileHooker component is the one you would use) by hooking your process into the OS. As an example, the Display Recorder app in Cydia tracks global gestures while recording the screen. This would be a Cydia tweak, and you would need to jailbreak your device to do all that.
Coming to your specific use case, the example app you cited is probably using one of the exceptions for background applications mentioned in https://developer.apple.com/library/ios/documentation/iphone/conceptual/iphoneosprogrammingguide/ManagingYourApplicationsFlow/ManagingYourApplicationsFlow.html (see the section "Implementing Long-Running Background Tasks"), most likely the one for receiving updates from external accessories.
I'm developing a WebGL desktop app, and I'd like to provide a multi-touch interface.
It seems that the only solution for desktops is https://developer.mozilla.org/En/DOM/Mouse_gesture_events, but how do I include it in my JavaScript code? How do I call the gesture callback functions, and what do I have to include?
I'm new to web development and still learning a lot, so forgive me if this is a dumb question.
Thanks!
How were you planning to trigger the multi-touch events on a desktop? Touchscreen? Magic Trackpad style input? If you’re going down that route then Gecko has touch events that handle multi-touch fine.
The main problem with your original solution is summed up by this paragraph:
Note: These gesture events are available to add-ons and other browser chrome code, but are never sent to regular web page content.
If you’re just building your app as a Fluid-style SSB then you’re not going to have these events available; you’d have to build your app as a browser extension to get into the correct security model.
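If you can rely on a touchscreen rather than the chrome-only gesture events, the standard touch events (touchstart, touchmove, touchend, touchcancel) are delivered to regular page content. A minimal sketch of tracking multiple fingers; the canvas id and function names here are illustrative, not from any particular library:

```javascript
// Keep the position of every active finger, keyed by its touch identifier.
// This pure function is separated from the DOM wiring so the logic is
// easy to test; `event` only needs `type` and `changedTouches`.
function trackTouches(state, event) {
  for (const t of event.changedTouches) {
    if (event.type === "touchend" || event.type === "touchcancel") {
      delete state[t.identifier]; // finger lifted: forget it
    } else {
      state[t.identifier] = { x: t.clientX, y: t.clientY };
    }
  }
  return state;
}

// Wiring it up in the page (browser-only, so shown as comments):
// const canvas = document.getElementById("gl-canvas"); // your WebGL canvas
// const fingers = {};
// for (const type of ["touchstart", "touchmove", "touchend", "touchcancel"]) {
//   canvas.addEventListener(type, (e) => {
//     e.preventDefault(); // stop the browser's own panning/zooming
//     trackTouches(fingers, e);
//   });
// }
```

With the `fingers` map updated on every event, your render loop can read however many touch points are down and implement pinch, rotate, or multi-finger drag on top of it.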
Is it possible to create an app for the Mac (and iPhone afterwards) that does something when it detects that the focus is on a certain object on the screen?
Meaning, the program runs in the background, and when it detects that the focus (or cursor) is on an edit box, it will run something.
Hopefully I made myself clear!
Thanks!
You can do this on the Mac by using the Accessibility framework.
Note that users will have to manually enable assistive devices and you will not be able to distribute your app on the Mac App Store due to Apple's soon-to-be-implemented sandboxing restriction.
On iOS, you can detect focus on certain but not all elements using specialized delegate methods such as textViewDidBeginEditing:. That said, since users navigate iOS apps mostly by tapping, simple tap handling seems like a much better approach.
On the iPhone, you can only detect focus within your own app, there's no way to observe other apps from the background.
On the Mac, as 0x90 noted, the closest you'll get are the Accessibility APIs. The UIElementInspector sample code may help you to get started.
I need to know if this is possible. I want to develop an iPhone app that uses Facebook credentials to log in (this is possible, I know), then creates an event (like a dinner) and invites friends from Facebook. When the time for the event comes (15-20 minutes before the start time), all the users attending the event can see how far the other participants are from the event using GPS (Core Location lookups) and watch them on a map as they move towards the place of the event.
Yes. Everything you have described in your question is possible.
However, the iPhone doesn't allow things like that to run in the background: your app would only work if each guest had the app open while travelling towards the event. The app would then update a server somewhere with their locations.
You might be able to do this with a notification that told them to open the app 10 minutes before the event started?
Android allows background tasks so you might want to write this for Android devices first and then make an iPhone version later?
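Once each client reports its coordinates to your server, the "how far away is everyone" part is just the haversine formula over pairs of latitude/longitude points. A sketch (the function name is illustrative; this could live on the server or in the app):

```javascript
// Great-circle distance between two lat/lon points via the haversine
// formula, returned in kilometres. Inputs are in decimal degrees.
function haversineKm(lat1, lon1, lat2, lon2) {
  const R = 6371; // mean Earth radius in km
  const toRad = (deg) => (deg * Math.PI) / 180;
  const dLat = toRad(lat2 - lat1);
  const dLon = toRad(lon2 - lon1);
  const a =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(lat1)) * Math.cos(toRad(lat2)) * Math.sin(dLon / 2) ** 2;
  return 2 * R * Math.asin(Math.sqrt(a));
}
```

Each guest's distance to the venue (or to every other guest) can then be recomputed whenever a fresh location update arrives and pushed back out to the attending clients.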