Smoke (or) explosion animation while removing an item from my iPhone App - iphone

I want to remove an item (let's say a UIButton) from my iPhone application, and I want to add some animation when I tap on the button to remove it.
This is the animation I want:
In the OS X Dock, if you right-click on an item and choose *Remove from Dock*,
it **explodes** with a funny noise and removes itself from the Dock.
Does anyone know how to do that smoke (or tiny explosion) animation on the iPhone? Is there a pre-defined name for it?

Believe it or not, there's an API for exactly that on Mac OS X:
NSPoint centrePoint = ...;
NSSize size = ...;
NSShowAnimationEffect(NSAnimationEffectPoof, centrePoint, size, nil, NULL, NULL);
On iOS, there isn't because the animation is a Mac-specific animation. On iOS, one typically sees the deleted object collapse into a point. That animation can be done by animating the transform of a view (using CAAnimation or the UIView class methods) so that it scales to nothingness.
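The collapse-to-nothing animation described above can be sketched with the block-based UIView animation API. This is an illustrative sketch, not Apple's implementation; the duration and the 0.01 scale factor are arbitrary choices (an exact zero scale produces a degenerate transform, so a tiny value is used instead):

```objc
// Shrink a button to (almost) nothing, then remove it from its superview.
- (void)removeButtonWithCollapse:(UIButton *)button {
    [UIView animateWithDuration:0.3
                     animations:^{
                         // Scaling toward zero reads as "collapsing into a point".
                         button.transform = CGAffineTransformMakeScale(0.01, 0.01);
                         button.alpha = 0.0;
                     }
                     completion:^(BOOL finished) {
                         [button removeFromSuperview];
                     }];
}
```

The completion block is the important part: the view is only removed once the animation has finished, so the user sees the whole collapse.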

You can roll your own using CAEmitterLayer - there are lots of examples on SO and elsewhere. You could do some really nice stuff (or maybe find some really nice open source code).
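As a starting point, a one-shot "poof" with CAEmitterLayer might look like the following. This is a rough sketch: `smoke.png` is a hypothetical small grey particle image you would supply yourself, and all the numeric values are tuning parameters to experiment with:

```objc
// Emit a short burst of smoke particles at a point, then clean up.
- (void)poofAtPoint:(CGPoint)point inView:(UIView *)view {
    CAEmitterLayer *emitter = [CAEmitterLayer layer];
    emitter.emitterPosition = point;
    emitter.emitterShape = kCAEmitterLayerPoint;

    CAEmitterCell *cell = [CAEmitterCell emitterCell];
    cell.contents = (__bridge id)[UIImage imageNamed:@"smoke.png"].CGImage;
    cell.birthRate = 60;            // particles per second
    cell.lifetime = 0.6;            // seconds each particle lives
    cell.velocity = 40;
    cell.velocityRange = 20;
    cell.emissionRange = 2 * M_PI;  // emit in all directions
    cell.scale = 0.3;
    cell.scaleSpeed = 0.4;          // particles grow as they drift
    cell.alphaSpeed = -1.5;         // and fade out over their lifetime
    emitter.emitterCells = @[cell];

    [view.layer addSublayer:emitter];

    // Stop emitting after a short burst, then remove the layer.
    dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(0.2 * NSEC_PER_SEC)),
                   dispatch_get_main_queue(), ^{
        emitter.birthRate = 0;
    });
    dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(1.0 * NSEC_PER_SEC)),
                   dispatch_get_main_queue(), ^{
        [emitter removeFromSuperlayer];
    });
}
```

Combined with the scale-to-nothing animation on the view itself, this gets reasonably close to the Dock's poof effect.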

Related

Rendering a preview of a UIView in another UIView

I have a very intriguing obstacle to overcome. I am trying to display the live contents of a UIView in another, separate UIView.
What I am trying to accomplish is very similar to Mission Control in Mac OS X. In Mission Control, there are large views in the center, displaying the desktop or an application. Above that, there are small views that can be reorganized. These small views display a live preview of their corresponding app. The preview is instant, and the framerate is exact. Ultimately, I am trying to recreate this effect, as cheaply as possible.
I have tried many possible solutions, and the one shown here is as close as I have gotten. It works; however, the - (void)drawLayer:(CALayer *)layer inContext:(CGContextRef)ctx method isn't called on every change. My workaround was to call [cloneView setNeedsDisplay] from a CADisplayLink, so it runs on every screen refresh. That is very near my goal, but the framerate is extremely low. I think -[CALayer renderInContext:] is simply too slow.
If it is possible to have two CALayers render the same source, that would be golden. However, I am not sure how to approach this. Luckily, this is simply a concept app and isn't destined for the App Store, so I can make use of the private APIs. I have looked into IOSurface and Quartz contexts, but I haven't been able to solve this puzzle so far. Any input would be greatly appreciated!
iOS and OS X are actually mostly the same underneath at the lowest level. (Higher up the stack, though, iOS is in some ways more advanced than OS X, since it is newer and had a fresh start.)
In this case, however, I believe they both use the same mechanism. You'll notice something about Mission Control: it isolates "windows" rather than views. On iOS, each UIWindow has a contentID property, which a CALayerHost can use to make the render server share the render context between the two layers.
So my advice is to make your views separate UIWindows and get native mirroring essentially for free. (In my experience, the CALayerHost takes over the target layer's place with the render server, so if both the CALayerHost and the window are visible, only the layer host will be drawn, not the window. Given the way they are used on OS X and iOS, that isn't a problem.)
So if you are after true mirroring, 2 copies of it, you'll need to resort to the sort of thing you were thinking about.
One option is to create a UIView subclass that uses this private UIView method to get an IOSurface for a target view, then use a CADisplayLink to periodically fetch and draw the surface:
https://github.com/yyfrankyy/iOS5.1-Framework-Headers/blob/master/UIKit.framework/UIView-Rendering.h#L12
Another option which may work (I'm not sure as I don't know your setup or desired effect) is possibly just to use a CAReplicatorLayer which displays a mirror of a CALayer using the same backing store (very fast and efficient + public stable API).
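The CAReplicatorLayer route can be sketched with a UIView subclass whose backing layer is a replicator, so everything added to it is drawn twice from the same backing store. This is an illustrative sketch (the class name and the below-the-original placement are just example choices):

```objc
// A view backed by CAReplicatorLayer: sublayers added to it are drawn
// instanceCount times, each offset by instanceTransform, with no extra
// rendering passes.
@interface MirroringView : UIView
@end

@implementation MirroringView

+ (Class)layerClass {
    return [CAReplicatorLayer class];
}

- (void)didMoveToWindow {
    [super didMoveToWindow];
    CAReplicatorLayer *replicator = (CAReplicatorLayer *)self.layer;
    replicator.instanceCount = 2;  // the original plus one live copy
    // Draw the copy directly below the original content.
    replicator.instanceTransform =
        CATransform3DMakeTranslation(0, self.bounds.size.height, 0);
}

@end
```

The big caveat, as noted above, is that the copy lives inside the same replicator layer tree as the original, so this works for reflections and in-place mirrors but not for mirroring into an arbitrary, distant view.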
Sorry I couldn't give you a definitive "this is the answer" reply, but hopefully I've given you enough ideas and possibilities to get started.
I've also included some links to things you might find useful to read.
What's the magic behind CAReplicatorLayer?
http://aptogo.co.uk/2011/08/no-fuss-reflections/
http://iphonedevwiki.net/index.php/SBAppContextHostManager
http://iphonedevwiki.net/index.php/SBAppContextHostView
http://iphonedevwiki.net/index.php/CALayerHost
http://iky1e.tumblr.com/post/33109276151/recreating-remote-views-ios5
http://iky1e.tumblr.com/post/14886675036/current-projects-understanding-ios-app-rendering

How do I display all touches on screen in order to demo an iPhone app?

Now that we have display mirroring on the iPad 2 (wired now... wireless coming in iOS 5), is there an easy way to display all touches on screen? This would be useful when demoing an app.
What I am looking for is the ability to just include some SDK, and maybe change a line of code after which all of my touches will be displayed on screen.
I have seen many other ways to demo apps:
1) Using the simulator along with a screen capture tool that will turn your mouse cursor into a big white circle
2) Jailbreak hacks that can record the screen/display all touches
However, my goal is to just have touches displayed on a regular app running on an actual device.
I realise this question is old now, but none of the existing solutions were good enough for me. I needed something that worked out of the box with multiple windows, without having to subclass windows or do any fiddling about.
So I created ShowTime, which (up until Swift 4) literally just requires you to either add the pod to your podfile or add ShowTime.swift to your target. The rest is totally automatic unless you want to configure the defaults.
https://github.com/KaneCheshire/ShowTime
Starting from Swift 4 there's one extra step: in your AppDelegate, just set ShowTime.enabled = .always or ShowTime.enabled = .debugOnly
Edit: Swift 4 now has auto-enabling again so no need to manually enable.
You can use Touchposé: https://github.com/toddreed/Touchpose
After looking at all of the various libraries for this, I realized that, at least in my case, they were massive overkill. I just wanted to make an App Preview video, and I just needed it for two places in my app.
So, I spent a little time and came up with the following code that you can use with a SpriteKit-based scene to enable touch display on a scene-by-scene basis. (And, if you are starting from scratch, you could subclass SKScene and bake this right in for use by all your scenes.)
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInNode:self];
    SKNode *node = [self nodeAtPoint:location];
#ifdef weBeRecording
    // ^^ Only do this if we're building a special version for making App Previews.
    // Could also be switched on and off by a runtime control, I suppose.
    // Create a new sprite at the touch location.
    SKSpriteNode *touchesNode = [SKSpriteNode spriteNodeWithImageNamed:@"touchMarker.png"];
    touchesNode.position = location;
    // Initially see-through
    touchesNode.alpha = 0.0;
    // Position it above all your other nodes
    touchesNode.zPosition = 199;
    // And size it appropriately for your needs
    touchesNode.size = CGSizeMake(80, 80);
    // Add it to the scene
    [self addChild:touchesNode];
    // And then make it fade in and out. (Adjust in and out times as needed.)
    SKAction *showTouch = [SKAction sequence:@[
        [SKAction fadeInWithDuration:0.25],
        [SKAction fadeOutWithDuration:0.25],
        [SKAction removeFromParent]
    ]];
    [touchesNode runAction:showTouch];
#endif
    /* process original touch on node here */
}
Of course, you'll have to make the touchMarker.png file yourself. I created mine in Photoshop and it's just a simple 100x100 white circle with a 50% transparency. Honestly, it took longer to get that image just right than it did to get the code working. (If there's a way to attach images here, I'll do that. But, um, it's my first day. Yeah.)
One caveat: you have to find and save off the originally touched node (saved here in the "node" variable) before you do this. If you don't, that original value gets lost and none of your node testing will work after you display the touch marker.
Hope this helps someone else!
Diz
You'd have to code in custom touches on the container view and move the center of a circular view to the touch's locationInView.
I would probably try to capture all touches on the KeyWindow with touchesBegan, touchesMoved and touchesEnded. You could then have a transparent view lying over the application which would then show the touches at the appropriate location.
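One way to sketch this intercept-everything idea is a UIWindow subclass that overrides sendEvent:, drops a fading dot at each touch location, and then delivers the event normally. This is an illustrative sketch (the dot size, alpha, and fade duration are arbitrary), assuming you use it as your app's main window:

```objc
// A window that visualizes every touch it delivers.
@interface TouchDisplayWindow : UIWindow
@end

@implementation TouchDisplayWindow

- (void)sendEvent:(UIEvent *)event {
    for (UITouch *touch in event.allTouches) {
        if (touch.phase == UITouchPhaseBegan ||
            touch.phase == UITouchPhaseMoved) {
            [self showDotAtPoint:[touch locationInView:self]];
        }
    }
    [super sendEvent:event];  // deliver the event to the app as usual
}

- (void)showDotAtPoint:(CGPoint)point {
    UIView *dot = [[UIView alloc] initWithFrame:CGRectMake(0, 0, 40, 40)];
    dot.center = point;
    dot.layer.cornerRadius = 20;
    dot.backgroundColor = [[UIColor whiteColor] colorWithAlphaComponent:0.5];
    dot.userInteractionEnabled = NO;  // must not swallow real touches
    [self addSubview:dot];
    [UIView animateWithDuration:0.3
                     animations:^{ dot.alpha = 0.0; }
                     completion:^(BOOL finished) { [dot removeFromSuperview]; }];
}

@end
```

Since every touch in the app passes through the key window's sendEvent:, no per-view changes are needed; this is essentially what libraries like ShowTime and Touchposé do under the hood.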
An alternative solution I cobbled together recently lets you target any app (including SpringBoard and other iOS system applications, e.g. Settings) and requires zero modifications or libraries for your own application; the only drawback is that it requires a jailbreak. I figured I'd share regardless, just in case anyone else finds it useful.
https://github.com/lechium/touchy
I based it around https://github.com/mapbox/Fingertips but inject into UIWindow instead of replacing it to add everything necessary to make showing the touches possible.
If you are jailbroken but don't feel like building and installing touchy on your own its on my Cydia repo: https://nitosoft.com/beta2

view based iphone development question

I am trying to develop an app that will display a number or some text; the user will physically speak the displayed number or text, then touch it, and a new window will show another number or text.
This continues many times.
How should I go about developing this?
Brian, first of all you should take a look at UIViewController Class Reference:
http://developer.apple.com/library/ios/#documentation/uikit/reference/UIViewController_Class/Reference/Reference.html
This file will help you a lot, you'll know how to manage different view controllers, and it's basically what you need.
Second: handling a touch on the number or text is straightforward, but having your app recognize speech is quite a bit harder. I would suggest you develop the touch control first, and once you finish with that, start studying how to build the speech part.
Third, as a basic model, you should think about instantiating an NSArray with the "right" content, meaning the content that should be touched to go to the next "step".
Create a button with this "right" content, and for every touch on the screen call a method that checks whether the touch was inside the right button or not.
You can use one UIViewController for each index of this array, or you can always use the same UIViewController and just change the words or numbers on the screen with a custom method.
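The single-controller approach above might be sketched like this. All the names here (QuizViewController, items, itemButton) are illustrative, not from any framework:

```objc
// One view controller cycling through an array of expected items.
@interface QuizViewController : UIViewController
@property (nonatomic, strong) NSArray *items;       // e.g. @[@"1", @"2", @"cat"]
@property (nonatomic, assign) NSUInteger index;     // which item is on screen
@property (nonatomic, strong) UIButton *itemButton; // shows the current item
@end

@implementation QuizViewController

- (void)showCurrentItem {
    [self.itemButton setTitle:self.items[self.index]
                     forState:UIControlStateNormal];
}

- (void)itemTapped:(UIButton *)sender {
    // The user has spoken the item and tapped it; advance to the next one.
    self.index++;
    if (self.index < self.items.count) {
        [self showCurrentItem];  // reuse the same controller, just swap content
    }
}

@end
```

Wiring itemTapped: to the button via UIControlEventTouchUpInside gives you the "touch advances to the next item" flow; the speech-recognition piece can be layered on separately later.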
Hope it helps.

Minimize window effect of OS X on the iPhone - Objective-C

I would like to imitate the minimize-window effect of OS X with a UIView on the iPhone. I was thinking about making two animations: the first one extends and distorts the view, and the second one shrinks it.
What do you think? My problem is distorting the view.
Do you have any idea how I could do this?
There has been a lot of discussion on this; see the response from Brad Larson on "How can I replicate the trashing animation of Mail.app".

Changing the location of a UIButton when it is clicked

I'm putting together an app that uses a keyboard (like the native iPhone one) that has symbols on the keys. My question is: how do I change the location of a button when it is clicked (since the UIControlStateHighlighted image is bigger than the normal-state one)?
Thanks for any help/pointers.
All UIView instances have frame, bounds, center, and transform properties. You can use any or all of these to manipulate your button. Changing the center may be all you need to do.
Try this:
[yourView setFrame:CGRectMake(yourHorizontalPosition, yourVerticalPosition, yourView.frame.size.width, yourView.frame.size.height)];
Does that work for you?
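For the highlighted-key case specifically, one hedged sketch is to listen for touch-down and touch-up control events and nudge the button's center while it is pressed, so the larger highlighted image appears to pop out of the key row. The 10-point offset is illustrative:

```objc
// In your setup code (e.g. viewDidLoad), wire up both directions:
[keyButton addTarget:self action:@selector(keyDown:)
    forControlEvents:UIControlEventTouchDown];
[keyButton addTarget:self action:@selector(keyUp:)
    forControlEvents:(UIControlEventTouchUpInside |
                      UIControlEventTouchUpOutside |
                      UIControlEventTouchCancel)];

// Then the handlers:
- (void)keyDown:(UIButton *)button {
    // Shift up while pressed so the bigger highlighted image stays centered.
    button.center = CGPointMake(button.center.x, button.center.y - 10);
}

- (void)keyUp:(UIButton *)button {
    button.center = CGPointMake(button.center.x, button.center.y + 10);
}
```

Handling TouchUpOutside and TouchCancel as well as TouchUpInside matters here; otherwise a drag off the key leaves the button stuck in its shifted position.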