How can I know when the Magnifying Glass comes up in UIWebView? - iphone

How I can detect when the magnifying glass comes up in a UIWebView?

Assuming you're writing an app to release on the App Store, you can't - the structure of a UIWebView is opaque to you, and you can't drill down into its component parts. While I suspect it would be possible to descend into its structure and figure out exactly what gets triggered when, you run a very large risk that Apple will change UIWebView's structure in a subsequent OS update and break your app.
So unless you're going outside of the public API (and thus distributing through channels other than the App Store), I'm afraid you may be out of luck.

Related

Rendering a preview of a UIView in another UIView

I have a very intriguing obstacle to overcome. I am trying to display the live contents of a UIView in another, separate UIView.
What I am trying to accomplish is very similar to Mission Control in Mac OS X. In Mission Control, there are large views in the center, displaying the desktop or an application. Above that, there are small views that can be reorganized. These small views display a live preview of their corresponding app. The preview is instant, and the framerate is exact. Ultimately, I am trying to recreate this effect, as cheaply as possible.
I have tried many possible solutions, and the one shown here is as close as I have gotten. It works; however, the - (void)drawLayer:(CALayer *)layer inContext:(CGContextRef)ctx method isn't called on every change. My workaround was to call [cloneView setNeedsDisplay] from a CADisplayLink, so it is called on every screen refresh. It is very near my goal, but the frame rate is extremely low. I think that [CALayer renderInContext:] is much too slow.
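A minimal sketch of what that approach looks like, assuming a hypothetical CloneView class and a display link owned by the containing controller (the names are illustrative, not the asker's actual code):

#import <UIKit/UIKit.h>
#import <QuartzCore/QuartzCore.h>

// Hypothetical preview view that software-renders a source view's layer.
@interface CloneView : UIView
@property (nonatomic, assign) UIView *sourceView; // the view being previewed
@end

@implementation CloneView

- (void)drawRect:(CGRect)rect {
    CGContextRef ctx = UIGraphicsGetCurrentContext();
    // Scale the source down to the preview size, then re-render it in software.
    CGFloat scale = self.bounds.size.width / self.sourceView.bounds.size.width;
    CGContextScaleCTM(ctx, scale, scale);
    [self.sourceView.layer renderInContext:ctx]; // this call is the bottleneck
}

- (void)displayLinkDidFire:(CADisplayLink *)link {
    [self setNeedsDisplay]; // redraw on every screen refresh
}

@end

// In the owning controller:
// CADisplayLink *link = [CADisplayLink displayLinkWithTarget:cloneView
//                                                   selector:@selector(displayLinkDidFire:)];
// [link addToRunLoop:[NSRunLoop mainRunLoop] forMode:NSRunLoopCommonModes];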
If it is possible to have two CALayers render the same source, that would be golden. However, I am not sure how to approach this. Luckily, this is simply a concept app and isn't destined for the App Store, so I can make use of the private APIs. I have looked into IOSurface and Quartz contexts, but I haven't been able to solve this puzzle so far. Any input would be greatly appreciated!
iOS and OS X are actually mostly the same underneath at the lowest level. (However, when you get higher up the stack, iOS is in many ways more advanced than OS X, as it is newer and had a fresh start.)
In this case, though, I believe they both use the same mechanism. You'll notice something about Mission Control: it isolates "windows" rather than views. On iOS, each UIWindow has a contextID property, which a CALayerHost can use to make the render server share the render context between the two layers.
So my advice is to make your views separate UIWindows and get native mirroring for free(ish). In my experience the CALayerHost takes over the target layer's place with the render server, so if both the CALayerHost and the window are visible, the window won't be any more - only the layer host will be (which, given the way they are used on OS X and iOS, isn't a problem).
So if you are after true mirroring, 2 copies of it, you'll need to resort to the sort of thing you were thinking about.
One option is to create a UIView subclass that uses this private UIView method:
https://github.com/yyfrankyy/iOS5.1-Framework-Headers/blob/master/UIKit.framework/UIView-Rendering.h#L12
to get an IOSurface for the target view, and then use a CADisplayLink to fetch and draw the surface once per second.
Another option which may work (I'm not sure, as I don't know your setup or desired effect) is to use a CAReplicatorLayer, which displays a mirror of a CALayer using the same backing store (very fast and efficient, and a public, stable API).
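A rough sketch of the CAReplicatorLayer approach, assuming a hypothetical container view; the class name and offset are placeholders, and note the copy always lives inside the replicator layer itself:

#import <UIKit/UIKit.h>
#import <QuartzCore/QuartzCore.h>

// Hypothetical container: any subview added to it is drawn twice from the same
// backing store - the original plus a live copy offset to the right.
@interface MirroringContainerView : UIView
@end

@implementation MirroringContainerView

+ (Class)layerClass {
    return [CAReplicatorLayer class]; // back this view with a replicator layer
}

- (id)initWithFrame:(CGRect)frame {
    self = [super initWithFrame:frame];
    if (self) {
        CAReplicatorLayer *replicator = (CAReplicatorLayer *)self.layer;
        replicator.instanceCount = 2; // original + one replica
        // Draw the replica half a container-width to the right of the original.
        replicator.instanceTransform =
            CATransform3DMakeTranslation(frame.size.width / 2.0, 0.0, 0.0);
    }
    return self;
}

@end

The replication is done by the render server from the same backing store, so the copy stays in sync with no extra drawing on your side; the limitation is that the replica can only appear within the replicator layer's own subtree, not in an arbitrary view elsewhere on screen.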
Sorry I couldn't give you a definitive "this is the answer" reply, but hopefully I've given you enough ideas and possibilities to get started.
I've also included some links to things you might find useful to read.
What's the magic behind CAReplicatorLayer?
http://aptogo.co.uk/2011/08/no-fuss-reflections/
http://iphonedevwiki.net/index.php/SBAppContextHostManager
http://iphonedevwiki.net/index.php/SBAppContextHostView
http://iphonedevwiki.net/index.php/CALayerHost
http://iky1e.tumblr.com/post/33109276151/recreating-remote-views-ios5
http://iky1e.tumblr.com/post/14886675036/current-projects-understanding-ios-app-rendering

iOS - How to show hints for gestures for iOS app?

I have seen some apps where, when you launch them for the first time after downloading (e.g. the Chrome app on iPhone), they show you a list of animated gestures on the screen, kind of giving you a tour of the app.
How do I build something like that? And how does the app know to show it only on the first launch after download and never again? For the second question, I am guessing a "shown=TRUE" value can be saved in a plist file and checked each time the app finishes launching. But I am more curious about the mechanism involved in creating a guided app tour.
You can use transparent and semi-transparent images with a UIImageView, so you can make up an image with arrows and notes and put it over the whole screen. You could fade it out when the user taps.
To know whether it's the first time running the app, you should use NSUserDefaults instead of a plist; it's much easier, and you should be able to find a quick tutorial on that fairly easily.
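For example, a minimal sketch of the first-run check plus a tap-to-dismiss overlay in the root view controller (the defaults key and image name are placeholders):

- (void)showGestureTourIfNeeded {
    NSUserDefaults *defaults = [NSUserDefaults standardUserDefaults];
    if ([defaults boolForKey:@"HasSeenGestureTour"]) { // placeholder key
        return;
    }

    // Full-screen semi-transparent image with arrows and notes drawn on it.
    UIImageView *overlay = [[UIImageView alloc] initWithFrame:self.view.bounds];
    overlay.image = [UIImage imageNamed:@"gesture-tour"]; // placeholder asset
    overlay.userInteractionEnabled = YES;
    [overlay addGestureRecognizer:
        [[UITapGestureRecognizer alloc] initWithTarget:self
                                                action:@selector(dismissTour:)]];
    [self.view addSubview:overlay];

    [defaults setBool:YES forKey:@"HasSeenGestureTour"];
    [defaults synchronize];
}

- (void)dismissTour:(UITapGestureRecognizer *)recognizer {
    UIView *overlay = recognizer.view;
    [UIView animateWithDuration:0.3
                     animations:^{ overlay.alpha = 0.0; }
                     completion:^(BOOL finished) { [overlay removeFromSuperview]; }];
}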
Also, you could check around on this site for controls like this one. I haven't used any of them myself, so I'm not sure how much they differ from a regular UIImageView. They look nice though.

Simulate touch on iphone

I'm trying to simulate a touch on a UIWebView. How can I programmatically fire a touch event at a certain location (x and y coordinates)?
Just call touchesBegan?
Ideally I'd like to do it without any JavaScript hack, because in the future it may not be a UIWebView.
It's not easy to synthesize a touch event on the iPhone: you have to use undisclosed API, so you have a high probability of breaking on every iOS update and of getting rejected by Apple.
Here's another question on Stack Overflow that demonstrates how to synthesize a touch event on the iPhone: How to send a touch event to iPhone OS?
It's worth pointing out the KIF framework here. It's intended to run in the simulator, but part of the code simulates touch events in code. With luck, this will be a good starting point.
https://github.com/square/KIF
Specifically, look at stepToTapViewWithAccessibilityLabel in KIFTestStep.m and the line
[view tapAtPoint:tappablePointInElement];
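For instance, with the old KIFTestScenario/KIFTestStep API a tap is normally driven by accessibility label, and the framework finds a tappable point in the matched view for you (the label below is a placeholder):

// Hypothetical scenario tapping a control labelled "Reload".
KIFTestScenario *scenario =
    [KIFTestScenario scenarioWithDescription:@"Tap the reload control"];
[scenario addStep:
    [KIFTestStep stepToWaitForViewWithAccessibilityLabel:@"Reload"]];
[scenario addStep:
    [KIFTestStep stepToTapViewWithAccessibilityLabel:@"Reload"]];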
What you need to do is first create the events you want, and then send them to SpringBoard over the "purple port", i.e. a mach port. To make them system-wide you must forward them to each application over its port. That means you need to actually do what the window manager does: look at which app is active, whether the screen is locked, etc.
There are a handful of private framework APIs (IOSurface, GraphicsServices, SpringBoardServices, etc.) that get you the pieces you need.
You will have to load these private frameworks at runtime using something like dlopen().
This is 100% possible without a jailbreak as of iOS 6.1.4 (current at the time of writing), but you will be loading private frameworks, which Apple does not allow for App Store apps ;)
It is possible. Exactly as you mentioned: using GSEvents and sending them to the purple-named port of the application you are trying to control/simulate. Of course you need KennyTM's GSEvent.h to accomplish this.
I've done this for iOS 4.3, just by changing some of the values that Kenny had (like kGSHandInfoTypeTouchDown), but now I'm trying to do it for iOS 5 and it isn't working so far.
EDIT: It is now working for iOS 5.1.
Without jailbreaking there is no real way to hook a gesture recognizer into all views of the entire system. First off, your app running in the background doesn't have the ability to execute this code.

Using the webView method, "loadRequest", do I need to worry about threading? (iphone sdk question)

Let's suppose I am creating an application for the iPhone with a web view down at the bottom of the window (the other part of the screen has a button the user can interact with).
I don't want the web view to stop the user from interacting with the other part of the UI when it loads a new URL. From my limited testing in the iPhone Simulator, I haven't been able to determine whether it already behaves this way. Most of my web sites load pretty fast.
I seem to be able to load new requests and click the ui button while that happens.
So, again, do I need to worry about threading in this case?
No, you do not. The iPhone threads a great deal of the UI components' behavior, or schedules it for you in the main run loop, in such a way that you rarely need to be concerned; the UI elements will remain available for user interaction.
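As a small illustration (the URL is a placeholder), -loadRequest: returns immediately, the load happens asynchronously, and the UIWebViewDelegate callbacks come back on the main thread:

- (void)loadPage {
    NSURL *url = [NSURL URLWithString:@"http://example.com"]; // placeholder URL
    [self.webView loadRequest:[NSURLRequest requestWithURL:url]];
    // Returns right away; the button elsewhere on screen stays tappable
    // while the page loads in the background.
}

- (void)webViewDidFinishLoad:(UIWebView *)webView {
    // Delivered on the main thread once loading finishes.
    NSLog(@"Finished loading, on main thread: %d", [NSThread isMainThread]);
}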

iPhone: Custom UIImagePickerController?

It seems like the built-in UIImagePickerController cannot accept sources other than the device's Camera Roll.
I would like to get the functionality of picking and enlarging pictures within my own app. I would also like to allow the user to select pictures and save them into their Camera Roll (so they can later use them as wallpaper).
1) What is the recommended way of building a custom UIImagePickerController that supports what I need? Is there another built-in controller I am missing?
3) Is there a way to take a UIImage and save it directly as the device's wallpaper? Or is it a two-step process: first save to the Camera Roll, then have the user load the picture from there to set as wallpaper?
Thanks in advance!!
At this point, it seems that the only way to build a custom UIImagePickerController functionality is to subclass, and then muck with the view hierarchy directly. This allows you to move, hide, and replace UI elements and access the non-public classes that control the operation of the camera, but as you probably gather, this technique is both unsupported (in that it may, and probably will, break with future updates) and not recommended (in that it may, and probably will, get your app rejected from the App Store if detected).
As far as your second point (somehow numbered 3), John is right: there is no call in the public SDK to accomplish this. You could probably hack something together if you're clever, but remember the warnings in my first paragraph...
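For the two-step route, the first step (saving into the Camera Roll) does have a public call, UIImageWriteToSavedPhotosAlbum; a minimal sketch, with the completion selector wired up as the SDK expects:

- (void)saveToCameraRoll:(UIImage *)image {
    // Writes the image into the user's Camera Roll; the user can then set it
    // as wallpaper themselves from the Photos app.
    UIImageWriteToSavedPhotosAlbum(image, self,
        @selector(image:didFinishSavingWithError:contextInfo:), NULL);
}

- (void)image:(UIImage *)image
        didFinishSavingWithError:(NSError *)error
                     contextInfo:(void *)contextInfo {
    if (error) {
        NSLog(@"Saving to the Camera Roll failed: %@", error);
    }
}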
Regarding #3), there is no public call in the SDK to save a UIImage directly as the device's wallpaper.
I don't know about the rest of your questions though.
Just today I started an open-source UIImagePickerController clone; it is not perfect, but it works quite well. Feel free to fork http://github.com/jeena/JPImagePickerController