A while ago I remember seeing a constant of some kind that defined the animation rate of the keyboard on the iPhone, and I cannot for the life of me remember where I saw it... any insight?
- (NSTimeInterval)keyboardAnimationDurationForNotification:(NSNotification*)notification
{
NSDictionary* info = [notification userInfo];
NSValue* value = [info objectForKey:UIKeyboardAnimationDurationUserInfoKey];
NSTimeInterval duration = 0;
[value getValue:&duration];
return duration;
}
UIKeyboardAnimationDurationUserInfoKey is now an NSNumber object, which makes the code shorter:
- (void)keyboardWillShowNotification:(NSNotification *)notification
{
NSDictionary *info = [notification userInfo];
NSNumber *number = [info objectForKey:UIKeyboardAnimationDurationUserInfoKey];
double duration = [number doubleValue];
}
Since this is the first Google hit, I'd like to point out that hard-coding 0.3 means your view will animate incorrectly when international users (e.g. Japanese) swap between different-sized keyboards (when that action ought to be instant).
Always use the notification's userInfo dictionary's UIKeyboardAnimationDurationUserInfoKey value - it gets set to 0 when the user is flicking through keyboards.
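For example, here's a minimal sketch (not from the original answers) that pulls both the duration and the curve out of the notification; the keys are the standard UIKit ones, and the left-shift by 16 is the usual way to turn the curve constant into a UIViewAnimationOptions value:
- (void)keyboardWillShow:(NSNotification *)notification
{
NSDictionary *info = [notification userInfo];
NSTimeInterval duration = [info[UIKeyboardAnimationDurationUserInfoKey] doubleValue];
UIViewAnimationCurve curve = [info[UIKeyboardAnimationCurveUserInfoKey] integerValue];
CGRect keyboardFrame = [info[UIKeyboardFrameEndUserInfoKey] CGRectValue];
[UIView animateWithDuration:duration
delay:0
options:(UIViewAnimationOptions)(curve << 16) // curve constant shifted into the options bitmask
animations:^{
// Example only: slide the whole view up by the keyboard height.
self.view.frame = CGRectOffset(self.view.frame, 0, -keyboardFrame.size.height);
}
completion:nil];
}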
To add a bit more to what Shaggy Frog wrote, the full implementation would be something like:
[[NSNotificationCenter defaultCenter] addObserver:self
selector:@selector(keyboardMovement:)
name:UIKeyboardWillShowNotification
object:nil];
[[NSNotificationCenter defaultCenter] addObserver:self
selector:@selector(keyboardMovement:)
name:UIKeyboardWillHideNotification
object:nil];
-(void)keyboardMovement:(NSNotification *)notification{
if (_numericKeyboardShowing == false){
[UIView animateWithDuration:[self keyboardAnimationDurationForNotification:notification] delay:0
options:UIViewAnimationOptionCurveEaseInOut
animations:^ {
self.bottomContainerView.center = CGPointMake(self.bottomContainerView.center.x, (self.bottomContainerView.center.y - 218));
}
completion:NULL];
_numericKeyboardShowing = true;
}
else{
[UIView animateWithDuration:[self keyboardAnimationDurationForNotification:notification] delay:0
options:UIViewAnimationOptionCurveLinear
animations:^ {
self.bottomContainerView.center = CGPointMake(self.bottomContainerView.center.x, (self.bottomContainerView.center.y + 218));
}
completion:NULL];
_numericKeyboardShowing = false;
}
}
- (NSTimeInterval)keyboardAnimationDurationForNotification:(NSNotification *)notification
{
NSDictionary *info = [notification userInfo];
NSValue* value = [info objectForKey:UIKeyboardAnimationDurationUserInfoKey];
NSTimeInterval duration = 0;
[value getValue:&duration];
return duration;
}
UIKeyboardAnimationDurationUserInfoKey
The key for an NSValue object containing a double that identifies the duration of the animation in seconds.
In Swift your code will look like this:
let keyboardSize: CGSize = userInfo[UIKeyboardFrameBeginUserInfoKey]!.CGRectValue.size
let animationDuration = ((userInfo[UIKeyboardAnimationDurationUserInfoKey]) as! NSNumber).floatValue
let animationOptions = ((userInfo[UIKeyboardAnimationCurveUserInfoKey]) as! NSNumber).unsignedLongValue
UIView.animateWithDuration(NSTimeInterval(animationDuration), delay: 0,
options: UIViewAnimationOptions(rawValue: animationOptions),
animations: { () -> Void in
self.view.frame.origin.y += keyboardSize.height
},
completion: nil)
Swift 4 - worked for me:
if let duration = notification.userInfo?[UIKeyboardAnimationDurationUserInfoKey] as? Double {
UIView.animate(withDuration: duration, animations: {
self.view.layoutIfNeeded()
})
}
In debug mode my duration was 3.499999
I've been dabbling with the new iOS 7 custom transition API and looked through all the tutorials/documentation I could find but I can't seem to figure this stuff out for my specific scenario.
So essentially what I'm trying to implement is a UIPanGestureRecognizer on a view where I would swipe up and transition to a VC whose view would slide up from the bottom while the current view would slide up as I drag my finger higher.
I have no problem accomplishing this without the interaction transition, but once I implement the interaction (the pan gesture) I can't seem to complete the transition.
Here's the relevant code from the VC that conforms to the UIViewControllerTransitionDelegate which is needed to vend the animator controllers:
- (void)prepareForSegue:(UIStoryboardSegue *)segue sender:(id)sender {
if ([segue.identifier isEqualToString:#"Swipe"]) {
NSLog(#"PREPARE FOR SEGUE METHOD CALLED");
UIViewController *toVC = segue.destinationViewController;
[interactionController wireToViewController:toVC];
toVC.transitioningDelegate = self;
toVC.modalPresentationStyle = UIModalPresentationCustom;
}
}
#pragma mark UIViewControllerTransition Delegate Methods
- (id <UIViewControllerAnimatedTransitioning>)animationControllerForPresentedController: (UIViewController *)presented presentingController: (UIViewController *)presenting sourceController:(UIViewController *)source {
NSLog(#"PRESENTING ANIMATION CONTROLLER CALLED");
SwipeDownPresentationAnimationController *transitionController = [SwipeDownPresentationAnimationController new];
return transitionController;
}
- (id <UIViewControllerAnimatedTransitioning>)animationControllerForDismissedController:(UIViewController *)dismissed {
NSLog(#"DISMISS ANIMATION CONTROLLER CALLED");
DismissAnimatorViewController *transitionController = [DismissAnimatorViewController new];
return transitionController;
}
- (id <UIViewControllerInteractiveTransitioning>)interactionControllerForDismissal:(id <UIViewControllerAnimatedTransitioning>)animator {
NSLog(#"Interaction controller for dimiss method caled");
return interactionController.interactionInProgress ? interactionController:nil;
}
NOTE: The interaction swipe is only for the dismissal of the VC which is why it's in the interactionControllerForDismissal method
Here's the code for the animator of the dismissal which works fine when I tap on a button to dismiss it:
#import "DismissAnimatorViewController.h"
@implementation DismissAnimatorViewController
- (NSTimeInterval)transitionDuration:(id <UIViewControllerContextTransitioning>)transitionContext {
return 1.0;
}
- (void)animateTransition:(id <UIViewControllerContextTransitioning>)transitionContext {
NSTimeInterval duration = [self transitionDuration:transitionContext];
UIViewController *toVC = [transitionContext viewControllerForKey:UITransitionContextToViewControllerKey];
UIViewController *fromVC = [transitionContext viewControllerForKey:UITransitionContextFromViewControllerKey];
CGRect initialFrameFromVC = [transitionContext initialFrameForViewController:fromVC];
UIView *containerView = [transitionContext containerView];
CGRect screenBounds = [[UIScreen mainScreen] bounds];
NSLog(#"The screen bounds is :%#", NSStringFromCGRect(screenBounds));
toVC.view.frame = CGRectOffset(initialFrameFromVC, 0, screenBounds.size.height);
toVC.view.alpha = 0.2;
CGRect pushedPresentingFrame = CGRectOffset(initialFrameFromVC, 0, -screenBounds.size.height);
[containerView addSubview:toVC.view];
[UIView animateWithDuration:duration
delay:0
usingSpringWithDamping:0.6
initialSpringVelocity:0
options:UIViewAnimationOptionCurveEaseIn
animations:^{
fromVC.view.frame = pushedPresentingFrame;
fromVC.view.alpha = 0.2;
toVC.view.frame = initialFrameFromVC;
toVC.view.alpha = 1.0;
} completion:^(BOOL finished) {
[transitionContext completeTransition:YES];
}];
}
@end
Here's the code for the UIPercentDrivenInteractiveTransition subclass which serves as the interaction controller:
#import "SwipeInteractionController.h"
@implementation SwipeInteractionController {
BOOL _shouldCompleteTransition;
UIViewController *_viewController;
}
- (void)wireToViewController:(UIViewController *)viewController {
_viewController = viewController;
[self prepareGestureRecognizerInView:_viewController.view];
}
- (void)prepareGestureRecognizerInView:(UIView*)view {
UIPanGestureRecognizer *gesture = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(handleGesture:)];
gesture.minimumNumberOfTouches = 1;
[view addGestureRecognizer:gesture];
}
- (CGFloat)completionSpeed {
NSLog(@"PERCENT COMPLETE:%f",self.percentComplete);
return 1 - self.percentComplete;
}
- (void)handleGesture:(UIPanGestureRecognizer*)gestureRecognizer {
// CGPoint translation = [gestureRecognizer translationInView:gestureRecognizer.view.superview];
CGPoint translation = [gestureRecognizer translationInView:gestureRecognizer.view.superview];
switch (gestureRecognizer.state) {
case UIGestureRecognizerStateBegan:
// 1. Start an interactive transition!
self.interactionInProgress = YES;
[_viewController dismissViewControllerAnimated:YES completion:nil];
break;
case UIGestureRecognizerStateChanged: {
// 2. compute the current position
CGFloat fraction = fabsf(translation.y / 568);
NSLog(#"Fraction is %f",fraction);
fraction = fminf(fraction, 1.0);
fraction = fmaxf(fraction, 0.0);
// 3. should we complete?
_shouldCompleteTransition = (fraction > 0.23);
// 4. update the animation controller
[self updateInteractiveTransition:fraction];
NSLog(#"Percent complete:%f",self.percentComplete);
break;
}
case UIGestureRecognizerStateEnded:
case UIGestureRecognizerStateCancelled: {
// 5. finish or cancel
NSLog(#"UI GESTURE RECOGNIZER STATE CANCELED");
self.interactionInProgress = NO;
if (!_shouldCompleteTransition || gestureRecognizer.state == UIGestureRecognizerStateCancelled) {
[self cancelInteractiveTransition];
NSLog(#"Interactive Transition is cancled.");
}
else {
NSLog(#"Interactive Transition is FINISHED");
[self finishInteractiveTransition];
}
break;
}
default:
NSLog(#"Default is being called");
break;
}
}
@end
Once again, when I run the code now and don't swipe all the way (to purposely cancel the transition), I just get a flash and am presented with the view controller I want to swipe to. This happens regardless of whether the transition completes or is cancelled.
However, when I dismiss via the button I get the transition specified in my animator view controller.
I can see a couple of issues here - although I cannot be certain that these will fix your problem!
Firstly, your animation controller's UIView animation completion block has the following:
[transitionContext completeTransition:YES];
Whereas it should return completion based on the result of the interaction controller as follows:
[transitionContext completeTransition:![transitionContext transitionWasCancelled]]
Also, I have found that if you tell the UIPercentDrivenInteractiveTransition that a transition is 100% complete, it does not call the animation controller completion block. As a workaround, I limit it to ~99.9%
https://github.com/ColinEberhardt/VCTransitionsLibrary/issues/4
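A minimal sketch of that workaround inside handleGesture:, using the names from the SwipeInteractionController above (the 0.99 cap is my reading of "~99.9%", not code from the library):
// In the UIGestureRecognizerStateChanged case, cap the progress just below 1.0
// so UIKit still calls the animation controller's completion block.
fraction = fminf(fraction, 0.99);
[self updateInteractiveTransition:fraction];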
I've created a number of example interaction and animation controllers here, that you might find useful:
https://github.com/ColinEberhardt/VCTransitionsLibrary
I had this same problem. I tried the fixes above and others, but nothing worked. Then I stumbled upon https://github.com/MrAlek/AWPercentDrivenInteractiveTransition, which fixed everything.
Once you add it to your project, just replace UIPercentDrivenInteractiveTransition with AWPercentDrivenInteractiveTransition.
Also, you have to set the animator before starting an interactive transition. In my case, I use the same class for UIViewControllerAnimatedTransitioning and UIViewControllerInteractiveTransitioning, so I just did it in init():
init() {
super.init()
self.animator = self
}
I am new to iPhone development and am trying to play a video in a web view, and I need to handle the Done button action. Please help me out.
If the video is played in a UIWebView, then you can detect when the video starts playing and when the Done button is pressed using the following code, along with the code you have used to load the URL in the web view:
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(videoPlayStarted:) name:@"UIMoviePlayerControllerDidEnterFullscreenNotification" object:nil];
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(videoPlayFinished:) name:@"UIMoviePlayerControllerDidExitFullscreenNotification" object:nil];
And it can be accessed through
BOOL isVideoInFullScreenMode; // Temporary: declare this in the .h file or at the top of the class if required
-(void)videoPlayStarted:(NSNotification *)notification{
//Your stuff here
isVideoInFullScreenMode = YES;
}
-(void)videoPlayFinished:(NSNotification *)notification{
//Your stuffs here
isVideoInFullScreenMode = NO;
}
Finally I found a solution for the Done button action. This logic works on both iPad and iPhone.
I solved this problem by using a UIGestureRecognizer.
Here is my code.
First I add an observer for the enter-fullscreen notification:
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(youTubeStarted:) name:@"UIMoviePlayerControllerDidEnterFullscreenNotification" object:nil];
Now, in the youTubeStarted: method, I added this code:
-(void)youTubeStarted:(NSNotification *)notification{
if (!tapRecognizer1) {
tapRecognizer1 = [[UITapGestureRecognizer alloc] init];
tapRecognizer1.delegate = self;
[tapRecognizer1 setNumberOfTapsRequired:1];
[self.view addGestureRecognizer:tapRecognizer1];
}
}
This adds the tap recognizer to the movie player.
The following delegate method then checks whether it was the Done button or not:
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldReceiveTouch:(UITouch *)touch{
NSLog(#"%#",touch.description);
NSString *ViewName = touch.description ;
CGPoint location = [touch locationInView:touch.view];
NSLog(#"location x===%f , location y=== %f",location.x,location.y);
NSRange Check = [ViewName rangeOfString:#"UINavigationButton"];
NSRange checkFrameOfButton;
if (UI_USER_INTERFACE_IDIOM()==UIUserInterfaceIdiomPad) {
checkFrameOfButton = [ViewName rangeOfString:#"frame = (7 7; 50 30)"];
if (checkFrameOfButton.length <= 0) {
checkFrameOfButton = [ViewName rangeOfString:#"frame = (7 7; 51 30)"];
}
}
else {
checkFrameOfButton = [ViewName rangeOfString:#"frame = (5 7; 48 30)"];
if (Check.length<=0) {
Check = [ViewName rangeOfString:#"UIView"];
}
}
if (location.y<40 && location.x<85 && Check.length>0 && checkFrameOfButton.length>0) {
[self endplaying];
}
return YES;
}
Now, in the endplaying method, you can do whatever you want.
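For illustration only, a hypothetical endplaying might simply tear down the recognizer added in youTubeStarted: and reset whatever playback state you track; what "done" means beyond that is up to your app:
-(void)endplaying{
// Hypothetical cleanup: remove the tap recognizer and reset your own flags here.
[self.view removeGestureRecognizer:tapRecognizer1];
tapRecognizer1 = nil;
}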
I have my NSTimer embedded in a class that plays image sequences. Basically it loops and changes a UIImageView. Everything goes well if I let the image sequence finish on its own, but if I try to stop the timer while it is playing I get a SIGABRT. Edit: no longer a SIGABRT, but now a dealloc I can't explain.
The "stop" at the end of a frame sequence is the same stop I am calling mid sequence.
So what might cause an NSTimer to break mid-function and dealloc? More to the point, what might I look at to fix it?
Thanks.
I am using some example code from here: http://www.modejong.com/iOS/PNGAnimatorDemo.zip
Edit: I'll add what I believe to be the pertinent code here.
// Invoke this method to start the animation
- (void) startAnimating
{
self.animationTimer = [NSTimer timerWithTimeInterval: animationFrameDuration target: self selector: @selector(animationTimerCallback:) userInfo: NULL repeats: TRUE];
[[NSRunLoop currentRunLoop] addTimer: animationTimer forMode: NSDefaultRunLoopMode];
animationStep = 0;
if (avAudioPlayer != nil)
[avAudioPlayer play];
// Send notification to object(s) that registered interest in a start action
[[NSNotificationCenter defaultCenter]
postNotificationName:ImageAnimatorDidStartNotification
object:self];
}
- (void) animationTimerCallback: (NSTimer *)timer {
if (![self isAnimating])
return;
NSTimeInterval currentTime;
NSUInteger frameNow;
if (avAudioPlayer == nil) {
self.animationStep += 1;
// currentTime = animationStep * animationFrameDuration;
frameNow = animationStep;
} else {
currentTime = avAudioPlayer.currentTime;
frameNow = (NSInteger) (currentTime / animationFrameDuration);
}
// Limit the range of frameNow to [0, SIZE-1]
if (frameNow < 0) {
frameNow = 0;
} else if (frameNow >= animationNumFrames) {
frameNow = animationNumFrames - 1;
}
[self animationShowFrame: frameNow];
// animationStep = frameNow + 1;
if (animationStep >= animationNumFrames) {
[self stopAnimating];
// Continue to loop animation until loop counter reaches 0
if (animationRepeatCount > 0) {
self.animationRepeatCount = animationRepeatCount - 1;
[self startAnimating];
}
}
}
- (void) stopAnimating
{
if (![self isAnimating])
return;
[animationTimer invalidate];
self.animationTimer = nil;
animationStep = animationNumFrames - 1;
[self animationShowFrame: animationStep];
if (avAudioPlayer != nil) {
[avAudioPlayer stop];
avAudioPlayer.currentTime = 0.0;
self->lastReportedTime = 0.0;
}
// Send notification to object(s) that registered interest in a stop action
[[NSNotificationCenter defaultCenter]
postNotificationName:ImageAnimatorDidStopNotification
object:self];
}
Edit2: So I commented out an NSAssert in dealloc, which shed a bit more light. Now it gets to self.animationTimer = nil; and says *** -[ImageAnimator setAnimationTimer:]: Message sent to deallocated instance.
dealloc is being called right when I invalidate the timer, so I'm a bit confused here.
I have a solution which is working but not optimal.
I just added another function to set the frame it's on to the last frame and it ends the sequence as I need.
But it doesn't solve the question of why it crashes if I try to stop the sequence mid-run.
Well, I wrote this code, so I can assure you that it should be working just fine :) You need to invoke stopAnimating before you are done with the view; the only way you could have got the assert in dealloc is if the view was deallocated while it was still animating. That should not happen, which is exactly why there was an assert there. There is also a big comment explaining the assert - what is unclear about that?
- (void)dealloc {
// This object can't be deallocated while animating, this could
// only happen if user code incorrectly dropped the last ref.
NSAssert([self isAnimating] == FALSE, @"dealloc while still animating");
self.animationURLs = nil;
self.imageView = nil;
self.animationData = nil;
self.animationTimer = nil;
[super dealloc];
}
Just call:
[view stopAnimating];
Before you remove the view from its superview and everything will be just fine.
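For example, a minimal sketch from the owning view controller's side (animatorView here is just a placeholder for however you keep a reference to the animator view):
- (void)viewWillDisappear:(BOOL)animated
{
[super viewWillDisappear:animated];
// Stop the animation (and invalidate its timer) before the view can be deallocated.
[self.animatorView stopAnimating];
[self.animatorView removeFromSuperview];
}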
Is it just me, or has the action sheet on <img> tags been disabled in UIWebView? In Safari, e.g., when you want to save an image locally, you touch and hold on the image to get an action sheet shown. But it's not working in my custom UIWebView. I mean, it is still working for <a> tags, i.e., when I touch and hold on HTML links, an action sheet shows up. But not for the <img> tags.
I've tried things like putting img { -webkit-touch-callout: inherit; } in css, which didn't work. On the other hand, when I double-tap and hold on the images, a copy-balloon shows up.
So the question is, has the default action sheet callout for <img> tags been disabled for UIWebView? If so, is there a way to re-enable it? I've googled around and seen many Q&As on how to disable it in UIWebView, so is it just me who isn't seeing the popup?
Thanks in advance!
Yes, Apple has disabled this feature (among others) in UIWebViews and kept it for Safari only.
However you can recreate this yourself by extending this tutorial, http://www.icab.de/blog/2010/07/11/customize-the-contextual-menu-of-uiwebview/.
Once you've finished this tutorial you'll want to add a few extras so you can actually save images (which the tutorial doesn't cover).
I added an extra notification called @"TapAndHoldShortNotification" after 0.3 seconds, which calls a method with just the disable-callout code in it (to prevent both the default menu and your own menu popping up while the page is still loading - a little bug fix).
Also to detect images you'll need to extend the JSTools.js, here's mine with the extra functions.
function MyAppGetHTMLElementsAtPoint(x,y) {
var tags = ",";
var e = document.elementFromPoint(x,y);
while (e) {
if (e.tagName) {
tags += e.tagName + ',';
}
e = e.parentNode;
}
return tags;
}
function MyAppGetLinkSRCAtPoint(x,y) {
var tags = "";
var e = document.elementFromPoint(x,y);
while (e) {
if (e.src) {
tags += e.src;
break;
}
e = e.parentNode;
}
return tags;
}
function MyAppGetLinkHREFAtPoint(x,y) {
var tags = "";
var e = document.elementFromPoint(x,y);
while (e) {
if (e.href) {
tags += e.href;
break;
}
e = e.parentNode;
}
return tags;
}
Now you can detect the user tapping on images and actually find out the image URL they are tapping on, but we need to change the -(void)openContextualMenuAtPoint: method to provide the extra options.
Again here's mine (I tried to copy Safari's behaviour for this):
- (void)openContextualMenuAt:(CGPoint)pt{
// Load the JavaScript code from the Resources and inject it into the web page
NSString *path = [[NSBundle mainBundle] pathForResource:@"JSTools" ofType:@"js"];
NSString *jsCode = [NSString stringWithContentsOfFile:path encoding:NSUTF8StringEncoding error:nil];
[webView stringByEvaluatingJavaScriptFromString:jsCode];
// get the Tags at the touch location
NSString *tags = [webView stringByEvaluatingJavaScriptFromString:
[NSString stringWithFormat:@"MyAppGetHTMLElementsAtPoint(%i,%i);",(NSInteger)pt.x,(NSInteger)pt.y]];
NSString *tagsHREF = [webView stringByEvaluatingJavaScriptFromString:
[NSString stringWithFormat:@"MyAppGetLinkHREFAtPoint(%i,%i);",(NSInteger)pt.x,(NSInteger)pt.y]];
NSString *tagsSRC = [webView stringByEvaluatingJavaScriptFromString:
[NSString stringWithFormat:@"MyAppGetLinkSRCAtPoint(%i,%i);",(NSInteger)pt.x,(NSInteger)pt.y]];
UIActionSheet *sheet = [[UIActionSheet alloc] initWithTitle:nil delegate:self cancelButtonTitle:nil destructiveButtonTitle:nil otherButtonTitles:nil];
selectedLinkURL = @"";
selectedImageURL = @"";
// If an image was touched, add image-related buttons.
if ([tags rangeOfString:@",IMG,"].location != NSNotFound) {
selectedImageURL = tagsSRC;
if (sheet.title == nil) {
sheet.title = tagsSRC;
}
[sheet addButtonWithTitle:#"Save Image"];
[sheet addButtonWithTitle:#"Copy Image"];
}
// If a link is pressed add image buttons.
if ([tags rangeOfString:#",A,"].location != NSNotFound){
selectedLinkURL = tagsHREF;
sheet.title = tagsHREF;
[sheet addButtonWithTitle:#"Open"];
[sheet addButtonWithTitle:#"Copy"];
}
if (sheet.numberOfButtons > 0) {
[sheet addButtonWithTitle:#"Cancel"];
sheet.cancelButtonIndex = (sheet.numberOfButtons-1);
[sheet showInView:webView];
}
[selectedLinkURL retain];
[selectedImageURL retain];
[sheet release];
}
(NOTE: selectedLinkURL and selectedImageURL are declared in the .h file so they can be accessed throughout the class, for saving or opening the link later.)
So far we've just been going back over the tutorial's code making changes, but now we move into what the tutorial doesn't cover (it stops before actually mentioning how to handle saving the images or opening the links).
To handle the user's choice we now need to add the actionSheet:clickedButtonAtIndex: method.
-(void)actionSheet:(UIActionSheet *)actionSheet clickedButtonAtIndex:(NSInteger)buttonIndex{
if ([[actionSheet buttonTitleAtIndex:buttonIndex] isEqualToString:#"Open"]){
[webView loadRequest:[NSURLRequest requestWithURL:[NSURL URLWithString:selectedLinkURL]]];
}
else if ([[actionSheet buttonTitleAtIndex:buttonIndex] isEqualToString:#"Copy"]){
[[UIPasteboard generalPasteboard] setString:selectedLinkURL];
}
else if ([[actionSheet buttonTitleAtIndex:buttonIndex] isEqualToString:#"Copy Image"]){
[[UIPasteboard generalPasteboard] setString:selectedImageURL];
}
else if ([[actionSheet buttonTitleAtIndex:buttonIndex] isEqualToString:#"Save Image"]){
NSOperationQueue *queue = [NSOperationQueue new];
NSInvocationOperation *operation = [[NSInvocationOperation alloc] initWithTarget:self selector:#selector(saveImageURL:) object:selectedImageURL];
[queue addOperation:operation];
[operation release];
}
}
This checks what the user wants to do and handles most of them; only the "Save Image" operation needs another method to handle it. For the progress I used MBProgressHUD.
Add an MBProgressHUD *progressHud; to the interface declaration in the .h and set it up in the init method (of whatever class you're handling the webview from).
progressHud = [[MBProgressHUD alloc] initWithView:self.view];
progressHud.customView = [[[UIImageView alloc] initWithImage:[UIImage imageNamed:@"Tick.png"]] autorelease];
progressHud.opacity = 0.8;
[self.view addSubview:progressHud];
[progressHud hide:NO];
progressHud.userInteractionEnabled = NO;
And the -(void)saveImageURL:(NSString*)url; method will actually save it to the image library.
(A better way would be to do the download through an NSURLRequest and update the progress HUD in MBProgressHUDModeDeterminate to reflect how long it'll actually take to download, but this is a more hacked-together implementation than that.)
-(void)saveImageURL:(NSString*)url{
[self performSelectorOnMainThread:@selector(showStartSaveAlert) withObject:nil waitUntilDone:YES];
UIImageWriteToSavedPhotosAlbum([UIImage imageWithData:[NSData dataWithContentsOfURL:[NSURL URLWithString:url]]], nil, nil, nil);
[self performSelectorOnMainThread:@selector(showFinishedSaveAlert) withObject:nil waitUntilDone:YES];
}
-(void)showStartSaveAlert{
progressHud.mode = MBProgressHUDModeIndeterminate;
progressHud.labelText = #"Saving Image...";
[progressHud show:YES];
}
-(void)showFinishedSaveAlert{
// Set custom view mode
progressHud.mode = MBProgressHUDModeCustomView;
progressHud.labelText = #"Completed";
[progressHud performSelector:#selector(hide:) withObject:[NSNumber numberWithBool:YES] afterDelay:0.5];
}
And of course add [progressHud release]; to the dealloc method.
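As noted above, a nicer approach is to drive a determinate progress HUD from an actual download rather than blocking on dataWithContentsOfURL:. Here's a rough, untested sketch of that variant; imageData (an NSMutableData ivar) and expectedImageBytes (a long long ivar) are my own hypothetical additions, and the delegate methods are the standard NSURLConnection ones:
- (void)saveImageURLWithProgress:(NSString *)url {
// Start the download on the main thread so the delegate callbacks arrive there too.
progressHud.mode = MBProgressHUDModeDeterminate;
progressHud.labelText = @"Saving Image...";
[progressHud show:YES];
imageData = [[NSMutableData alloc] init];
[NSURLConnection connectionWithRequest:[NSURLRequest requestWithURL:[NSURL URLWithString:url]] delegate:self];
}
- (void)connection:(NSURLConnection *)connection didReceiveResponse:(NSURLResponse *)response {
expectedImageBytes = [response expectedContentLength];
}
- (void)connection:(NSURLConnection *)connection didReceiveData:(NSData *)data {
[imageData appendData:data];
if (expectedImageBytes > 0) {
progressHud.progress = (float)[imageData length] / (float)expectedImageBytes;
}
}
- (void)connectionDidFinishLoading:(NSURLConnection *)connection {
UIImageWriteToSavedPhotosAlbum([UIImage imageWithData:imageData], nil, nil, nil);
[imageData release];
imageData = nil;
[self showFinishedSaveAlert];
}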
Hopefully this shows you how to add some of the options to a web view that Apple left out.
Of course you can add more things to this, like a "Read Later" option for Instapaper or an "Open in Safari" button.
(Looking at the length of this post, I'm seeing why the original tutorial left out the final implementation details.)
Edit: (updated with more info)
I was asked about the detail I glossed over at the top, the @"TapAndHoldShortNotification", so this is clarifying it.
This is my UIWindow subclass; it adds the second notification to cancel the default selection menu (because when I tried the tutorial it showed both menus).
- (void)tapAndHoldAction:(NSTimer*)timer {
contextualMenuTimer = nil;
UIView* clickedView = [self hitTest:CGPointMake(tapLocation.x, tapLocation.y) withEvent:nil];
while (clickedView != nil) {
if ([clickedView isKindOfClass:[UIWebView class]]) {
break;
}
clickedView = clickedView.superview;
}
if (clickedView) {
NSDictionary *coord = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithFloat:tapLocation.x],#"x",
[NSNumber numberWithFloat:tapLocation.y],#"y",nil];
[[NSNotificationCenter defaultCenter] postNotificationName:#"TapAndHoldNotification" object:coord];
}
}
- (void)tapAndHoldActionShort:(NSTimer*)timer {
UIView* clickedView = [self hitTest:CGPointMake(tapLocation.x, tapLocation.y) withEvent:nil];
while (clickedView != nil) {
if ([clickedView isKindOfClass:[UIWebView class]]) {
break;
}
clickedView = clickedView.superview;
}
if (clickedView) {
NSDictionary *coord = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithFloat:tapLocation.x],#"x",
[NSNumber numberWithFloat:tapLocation.y],#"y",nil];
[[NSNotificationCenter defaultCenter] postNotificationName:#"TapAndHoldShortNotification" object:coord];
}
}
- (void)sendEvent:(UIEvent *)event {
NSSet *touches = [event touchesForWindow:self];
[touches retain];
[super sendEvent:event]; // Call super to make sure the event is processed as usual
if ([touches count] == 1) { // We're only interested in one-finger events
UITouch *touch = [touches anyObject];
switch ([touch phase]) {
case UITouchPhaseBegan: // A finger touched the screen
tapLocation = [touch locationInView:self];
[contextualMenuTimer invalidate];
contextualMenuTimer = [NSTimer scheduledTimerWithTimeInterval:0.8 target:self selector:@selector(tapAndHoldAction:) userInfo:nil repeats:NO];
NSTimer *myTimer;
myTimer = [NSTimer scheduledTimerWithTimeInterval:0.2 target:self selector:@selector(tapAndHoldActionShort:) userInfo:nil repeats:NO];
break;
case UITouchPhaseEnded:
case UITouchPhaseMoved:
case UITouchPhaseCancelled:
[contextualMenuTimer invalidate];
contextualMenuTimer = nil;
break;
}
} else { // Multiple fingers are touching the screen
[contextualMenuTimer invalidate];
contextualMenuTimer = nil;
}
[touches release];
}
The notification is then handled like this:
// in -viewDidLoad
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(stopSelection:) name:@"TapAndHoldShortNotification" object:nil];
- (void)stopSelection:(NSNotification*)notification{
[webView stringByEvaluatingJavaScriptFromString:#"document.documentElement.style.webkitTouchCallout='none';"];
}
It's only a little change, but it fixes the annoying little bug where you get two menus appearing (the standard one and yours).
Also, you could easily add iPad support by sending the touch location as the notification fires and then showing the UIActionSheet from that point, though this was written before the iPad so it doesn't include support for that.
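A minimal sketch of that iPad tweak, assuming the same sheet and the pt already passed to openContextualMenuAt: (this is not part of the original answer's code):
if (UI_USER_INTERFACE_IDIOM() == UIUserInterfaceIdiomPad) {
// On iPad, anchor the action sheet at the touch point instead of covering the whole view.
[sheet showFromRect:CGRectMake(pt.x, pt.y, 1, 1) inView:webView animated:YES];
} else {
[sheet showInView:webView];
}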
After struggling for two or three days non-stop on this problem, it seems the position is computed relative to the UIWebView's top-left corner (I am programming for iOS 7).
So, to make this work, when you get the position in the controller where your WebView is (I'll put a snippet of my code below), don't add the scroll offset.
SNIPPET - ContextualMenuAction:
- (void)contextualMenuAction:(NSNotification*)notification {
// Load javascript
[self loadJavascript];
// Initialize the coordinates
CGPoint pt;
pt.x = [[[notification object] objectForKey:@"x"] floatValue];
pt.y = [[[notification object] objectForKey:@"y"] floatValue];
// Convert point from window to view coordinate system
pt = [self.WebView convertPoint:pt fromView:nil];
// Get PAGE and UIWEBVIEW dimensions
CGSize pageDimensions = [self.WebView documentSize];
CGSize webviewDimensions = self.WebView.frame.size;
/***** If the page is in MOBILE version *****/
if (webviewDimensions.width == pageDimensions.width) {
}
/***** If the page is in DESKTOP version *****/
else {
// convert point from view to HTML coordinate system
CGSize viewSize = [self.WebView frame].size;
// Contains the portion of the page visible in the webview (depending on the zoom)
CGSize windowSize = [self.WebView windowSize];
CGFloat factor = windowSize.width / viewSize.width;
CGFloat factorHeight = windowSize.height / viewSize.height;
NSLog(#"factor: %f", factor);
pt.x = pt.x * factor; // ** logically, we would add the offset **
pt.y = pt.y * factorHeight; // ** logically, we would add the offset **
}
NSLog(#"x: %f and y: %f", pt.x, pt.y);
NSLog(#"WINDOW: width: %f height: %f", [self.WebView windowSize].width, [self.WebView windowSize].height);
NSLog(#"DOCUMENT: width: %f height: %f", pageDimensions.width, pageDimensions.height);
[self openContextualMenuAt:pt];
}
SNIPPET - in openContextualMenuAt:
To load the correct JS function:
- (void)openContextualMenuAt:(CGPoint)pt {
// Load javascript
[self loadJavascript];
// get the Tags at the touch location
NSString *tags = [self.WebView stringByEvaluatingJavaScriptFromString:[NSString stringWithFormat:@"getHTMLTagsAtPoint(%li,%li);",(long)pt.x,(long)pt.y]];
...
}
SNIPPET - in JSTools.js:
This is the function I use to get the element touched
function getHTMLTagsAtPoint(x,y) {
var tags = ",";
var element = document.elementFromPoint(x,y);
while (element) {
if (element.tagName) {
tags += element.tagName + ',';
}
element = element.parentNode;
}
return tags;
}
SNIPPET - loadJavascript
I use this one to inject my JS code in the webview
-(void)loadJavascript {
[self.WebView stringByEvaluatingJavaScriptFromString:
[NSString stringWithContentsOfFile:[[NSBundle mainBundle] pathForResource:@"JSTools" ofType:@"js"] encoding:NSUTF8StringEncoding error:nil]];
}
This part (everything I did to override the default UIActionSheet) is HEAVILY (should I say completely) based on
this post
@Freerunning's answer is complete (I did almost everything he said in my other classes, as in the post my code is based on); the snippets I posted are just to show you more "completely" how my code works.
Hope this helps! ^^
First of all thanks to Freerunnering for the great solution!
But you can do this with a UILongPressGestureRecognizer instead of a custom long-press recognizer. This makes things a bit easier to implement:
In the view controller containing the webView:
Add UIGestureRecognizerDelegate to your ViewController
let mainJavascript = "function MyAppGetHTMLElementsAtPoint(x,y) { var tags = \",\"; var e = document.elementFromPoint(x,y); while (e) { if (e.tagName) { tags += e.tagName + ','; } e = e.parentNode; } return tags; } function MyAppGetLinkSRCAtPoint(x,y) { var tags = \"\"; var e = document.elementFromPoint(x,y); while (e) { if (e.src) { tags += e.src; break; } e = e.parentNode; } return tags; } function MyAppGetLinkHREFAtPoint(x,y) { var tags = \"\"; var e = document.elementFromPoint(x,y); while (e) { if (e.href) { tags += e.href; break; } e = e.parentNode; } return tags; }"
func viewDidLoad() {
...
let longPressRecognizer = UILongPressGestureRecognizer(target: self, action: #selector(CustomViewController.longPressRecognizerAction(_:)))
self.webView.scrollView.addGestureRecognizer(longPressRecognizer)
longPressRecognizer.delegate = self
...
}
func longPressRecognizerAction(sender: UILongPressGestureRecognizer) {
if sender.state == UIGestureRecognizerState.Began {
let tapPostion = sender.locationInView(self.webView)
let tags = self.webView.stringByEvaluatingJavaScriptFromString("MyAppGetHTMLElementsAtPoint(\(tapPostion.x),\(tapPostion.y));")
let href = self.webView.stringByEvaluatingJavaScriptFromString("MyAppGetLinkHREFAtPoint(\(tapPostion.x),\(tapPostion.y));")
let src = self.webView.stringByEvaluatingJavaScriptFromString("MyAppGetLinkSRCAtPoint(\(tapPostion.x),\(tapPostion.y));")
print("tags: \(tags)\nhref: \(href)\nsrc: \(src)")
// handle the results, for example with an UIDocumentInteractionController
}
}
// Without this function, the customLongPressRecognizer would be replaced by the original UIWebView LongPressRecognizer
func gestureRecognizer(gestureRecognizer: UIGestureRecognizer, shouldRecognizeSimultaneouslyWithGestureRecognizer otherGestureRecognizer: UIGestureRecognizer) -> Bool {
return true
}
And that's it!
I have an AVAudioPlayer playing some audio (duh!)
The audio is initiated when the user presses a button.
When they release it I want the audio to fade out.
I am using Interface builder...so I am trying to hook up a function on "touch up inside" that fades the audio out over 1 sec then stops.
Any ideas?
Thanks
Here's how I'm doing it:
-(void)doVolumeFade
{
if (self.player.volume > 0.1) {
self.player.volume = self.player.volume - 0.1;
[self performSelector:@selector(doVolumeFade) withObject:nil afterDelay:0.1];
} else {
// Stop and get the sound ready for playing again
[self.player stop];
self.player.currentTime = 0;
[self.player prepareToPlay];
self.player.volume = 1.0;
}
}
11 years later: do note that setVolume:fadeDuration: now exists!
Swift has an AVAudioPlayer method you can use for fading out which was included as of iOS 10.0:
var audioPlayer = AVAudioPlayer()
...
audioPlayer.setVolume(0, fadeDuration: 3)
I tackled this problem using an NSOperation subclass so fading the volume doesn't block the main thread. It also allows fades to be queued and forgotten about. This is especially useful for playing one-shot sounds with fade-in and fade-out effects, as they are deallocated after the last fade is completed.
// Example of MXAudioPlayerFadeOperation in NSOperationQueue
NSOperationQueue *audioFaderQueue = [[NSOperationQueue alloc] init];
[audioFaderQueue setMaxConcurrentOperationCount:1]; // Execute fades serially.
NSString *filePath = [[NSBundle mainBundle] pathForResource:@"bg" ofType:@"mp3"]; // path to bg.mp3
AVAudioPlayer *player = [[[AVAudioPlayer alloc] initWithContentsOfURL:[NSURL fileURLWithPath:filePath] error:NULL] autorelease];
[player setNumberOfLoops:-1];
[player setVolume:0.0];
// Note that delay is delay after last fade due to the Operation Queue working serially.
MXAudioPlayerFadeOperation *fadeIn = [[MXAudioPlayerFadeOperation alloc] initFadeWithAudioPlayer:player toVolume:1.0 overDuration:3.0];
[fadeIn setDelay:2.0];
MXAudioPlayerFadeOperation *fadeDown = [[MXAudioPlayerFadeOperation alloc] initFadeWithAudioPlayer:player toVolume:0.1 overDuration:3.0];
[fadeDown setDelay:0.0];
MXAudioPlayerFadeOperation *fadeUp = [[MXAudioPlayerFadeOperation alloc] initFadeWithAudioPlayer:player toVolume:1.0 overDuration:4.0];
[fadeUp setDelay:0.0];
MXAudioPlayerFadeOperation *fadeOut = [[MXAudioPlayerFadeOperation alloc] initFadeWithAudioPlayer:player toVolume:0.0 overDuration:3.0];
[fadeOut setDelay:2.0];
[audioFaderQueue addOperation:fadeIn]; // 2.0s - 5.0s
[audioFaderQueue addOperation:fadeDown]; // 5.0s - 8.0s
[audioFaderQueue addOperation:fadeUp]; // 8.0s - 12.0s
[audioFaderQueue addOperation:fadeOut]; // 14.0s - 17.0s
[fadeIn release];
[fadeDown release];
[fadeUp release];
[fadeOut release];
For MXAudioPlayerFadeOperation class code see this post.
I ended up combining some of the answers and converting them to Swift, ending up with this method:
func fadeVolumeAndPause(){
if self.player?.volume > 0.1 {
self.player?.volume = self.player!.volume - 0.1
var dispatchTime: dispatch_time_t = dispatch_time(DISPATCH_TIME_NOW, Int64(0.1 * Double(NSEC_PER_SEC)))
dispatch_after(dispatchTime, dispatch_get_main_queue(), {
self.fadeVolumeAndPause()
})
} else {
self.player?.pause()
self.player?.volume = 1.0
}
}
These are all good answers; however, they don't deal with specifying the rate of fade (or applying a logarithmic curve to the fade, which is sometimes desirable), or with specifying the number of dB of reduction from unity that you are fading to.
This is an excerpt from one of my apps, with a few "bells and whistles" removed that are not relevant to this question.
Enjoy!
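// Note (added): the conversions below are the standard decibel relations:
// dB = 20 * log10(linear), clamped here to the [-100, +10] dB range, and linear = 10^(dB / 20).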
#define linearToDecibels(linear) (MIN(10,MAX(-100,20.0 * log10(linear))))
#define decibelsToLinear(decibels) (pow (10, (0.05 * decibels)))
#define fadeInfoId(n) [fadeInfo objectForKey:@#n]
#define fadeInfoObject(NSObject,n) ((NSObject*) fadeInfoId(n))
#define fadeInfoFloat(n) [fadeInfoId(n) floatValue]
#define useFadeInfoObject(n) * n = fadeInfoId(n)
#define useFadeInfoFloat(n) n = fadeInfoFloat(n)
#define setFadeInfoId(n,x) [fadeInfo setObject:x forKey:@#n]
#define setFadeInfoFloat(n,x) setFadeInfoId(n,[NSNumber numberWithFloat:x])
#define setFadeInfoFlag(n) setFadeInfoId(n,[NSNumber numberWithBool:YES])
#define saveFadeInfoId(n) setFadeInfoId(n,n)
#define saveFadeInfoFloat(n) setFadeInfoFloat(n,n)
#define fadeAVAudioPlayer_default nil
#define fadeAVAudioPlayer_linearFade @"linearFade"
#define fadeAVAudioPlayer_fadeToStop @"fadeToStop"
#define fadeAVAudioPlayer_linearFadeToStop @"linearFadeToStop"
-(void) fadeAVAudioPlayerTimerEvent:(NSTimer *) timer {
NSMutableDictionary *fadeInfo = timer.userInfo;
NSTimeInterval elapsed = 0 - [fadeInfoObject(NSDate,startTime) timeIntervalSinceNow];
NSTimeInterval useFadeInfoFloat(fadeTime);
float useFadeInfoFloat(fadeToLevel);
AVAudioPlayer useFadeInfoObject(player);
double linear;
if (elapsed>fadeTime) {
if (fadeInfoId(stopPlaybackAtFadeTime)) {
[player stop];
linear = fadeInfoFloat(fadeFromLevel);
} else {
linear = fadeToLevel;
}
[timer invalidate];
[fadeInfo release];
} else {
if (fadeInfoId(linearCurve)) {
float useFadeInfoFloat(fadeFromLevel);
float fadeDelta = fadeToLevel-fadeFromLevel;
linear = fadeFromLevel + (fadeDelta * (elapsed/fadeTime));
} else {
float useFadeInfoFloat(fadeToDB);
float useFadeInfoFloat(fadeFromDB);
float fadeDelta = fadeToDB-fadeFromDB;
float decibels = fadeFromDB + (fadeDelta * (elapsed/fadeTime));
linear = decibelsToLinear(decibels);
}
}
[player setVolume: linear];
//[self displayFaderLevelForMedia:player];
//[self updateMediaVolumeLabel:player];
}
-(void) fadeAVAudioPlayerLinear:(AVAudioPlayer *)player over:(NSTimeInterval) fadeTime fadeToLevel:(float) fadeToLevel fadeMode:(NSString*)fadeMode {
NSMutableDictionary *fadeInfo = [[NSMutableDictionary alloc ]init];
saveFadeInfoId(player);
float fadeFromLevel = player.volume;// to optimize macros put value in var, so we don't call method 3 times.
float fadeFromDB = linearToDecibels(fadeFromLevel);
float fadeToDB = linearToDecibels(fadeToLevel);
saveFadeInfoFloat(fadeFromLevel);
saveFadeInfoFloat(fadeToLevel);
saveFadeInfoFloat(fadeToDB);
saveFadeInfoFloat(fadeFromDB);
saveFadeInfoFloat(fadeTime);
setFadeInfoId(startTime,[NSDate date]);
if([fadeMode isEqualToString:fadeAVAudioPlayer_fadeToStop]||[fadeMode isEqualToString:fadeAVAudioPlayer_linearFadeToStop]){
setFadeInfoFlag(stopPlaybackAtFadeTime);
}
if([fadeMode isEqualToString:fadeAVAudioPlayer_linearFade]||[fadeMode isEqualToString:fadeAVAudioPlayer_linearFadeToStop]){
setFadeInfoFlag(linearCurve);
}
[NSTimer scheduledTimerWithTimeInterval:0.05 target:self selector:@selector(fadeAVAudioPlayerTimerEvent:) userInfo:fadeInfo repeats:YES];
}
-(void) fadeAVAudioPlayer:(AVAudioPlayer *)player over:(NSTimeInterval) fadeTime fadeToDB:(float) fadeToDB fadeMode:(NSString*)fadeMode {
[self fadeAVAudioPlayerLinear:player over:fadeTime fadeToLevel:decibelsToLinear(fadeToDB) fadeMode:fadeMode ];
}
-(void) fadeoutAVAudioPlayer:(AVAudioPlayer *)player {
[self fadeAVAudioPlayerLinear:player over:5.0 fadeToLevel:0 fadeMode:fadeAVAudioPlayer_default];
}
-(void) fadeinAVAudioPlayer:(AVAudioPlayer *)player {
[self fadeAVAudioPlayerLinear:player over:5.0 fadeToLevel:1.0 fadeMode:fadeAVAudioPlayer_default];
}
An extension for Swift 3 inspired by the most voted answer. For those of you who like copy-pasting :)
extension AVAudioPlayer {
func fadeOut() {
if volume > 0.1 {
// Fade
volume -= 0.1
perform(#selector(fadeOut), with: nil, afterDelay: 0.1)
} else {
// Stop and get the sound ready for playing again
stop()
prepareToPlay()
volume = 1
}
}
}
I wrote a helper class in Swift for fading AVAudioPlayer in and out. You can use a logarithmic volume function for a more gradual fading effect.
let player = AVAudioPlayer(contentsOfURL: soundURL, error: nil)
let fader = iiFaderForAvAudioPlayer(player: player)
fader.fadeIn()
fader.fadeOut()
Here is a demo app: https://github.com/evgenyneu/sound-fader-ios
Swift 3
I like Ambroise Collon's answer, so I voted it up, but since Swift is statically typed the performSelector: methods are falling by the wayside; an alternative is dispatch async (in this version I've also added the destination volume as a parameter).
func dispatchDelay(delay:Double, closure:@escaping ()->()) {
DispatchQueue.main.asyncAfter(deadline: DispatchTime.now() + delay, execute: closure)
}
extension AVAudioPlayer {
func fadeOut(vol:Float) {
if volume > vol {
//print("vol is : \(vol) and volume is: \(volume)")
dispatchDelay(delay: 0.1, closure: {
[weak self] in
guard let strongSelf = self else { return }
strongSelf.volume -= 0.01
strongSelf.fadeOut(vol: vol)
})
} else {
volume = vol
}
}
func fadeIn(vol:Float) {
if volume < vol {
dispatchDelay(delay: 0.1, closure: {
[weak self] in
guard let strongSelf = self else { return }
strongSelf.volume += 0.01
strongSelf.fadeIn(vol: vol)
})
} else {
volume = vol
}
}
}
This seems to me to be a decent use of an NSOperationQueue.
Hence here is my solution:
-(void) fadeIn
{
if (self.currentPlayer.volume >= 1.0f) return;
else {
self.currentPlayer.volume+=0.10;
__weak typeof (self) weakSelf = self;
[NSThread sleepForTimeInterval:0.2f];
[self.fadingQueue addOperationWithBlock:^{
NSLog(#"fading in %.2f", self.currentPlayer.volume);
[weakSelf fadeIn];
}];
}
}
-(void) fadeOut
{
if (self.currentPlayer.volume <= 0.0f) return;
else {
self.currentPlayer.volume -=0.1;
__weak typeof (self) weakSelf = self;
[NSThread sleepForTimeInterval:0.2f];
[self.fadingQueue addOperationWithBlock:^{
NSLog(#"fading out %.2f", self.currentPlayer.volume);
[weakSelf fadeOut];
}];
}
}
How about this: (if time passed in is negative then fade out the sound, otherwise fade in)
- (void) fadeInOutVolumeOverTime: (NSNumber *)time
{
#define fade_out_steps 0.1
float theVolume = player.volume;
NSTimeInterval theTime = [time doubleValue];
int sign = (theTime >= 0) ? 1 : -1;
// before we call this, if we are fading out, we save the volume
// so that we can restore back to that level in the fade in
if ((sign == 1) &&
((theVolume >= savedVolume) ||
(theTime == 0))) {
player.volume = savedVolume;
}
else if ((sign == -1) && (theVolume <= 0)) {
NSLog(#"fading");
[player pause];
[self performSelector:#selector(fadeInOutVolumeOverTime:) withObject:[NSNumber numberWithDouble:0] afterDelay:1.0];
}
else {
theTime *= fade_out_steps;
player.volume = theVolume + fade_out_steps * sign;
[self performSelector:@selector(fadeInOutVolumeOverTime:) withObject:time afterDelay:fabs(theTime)];
}
}
In Objective-C try this:
NSURL *trackURL = [[NSURL alloc]initFileURLWithPath:#"path to audio track"];
// fade duration in seconds
NSTimeInterval fadeDuration = 0.3;
// duration of the audio track in seconds
NSTimeInterval audioTrackDuration = 5.0;
AVAudioPlayer *audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:trackURL error:nil];
[audioPlayer play];
// first we set the volume to 1 - highest
[audioPlayer setVolume:1.0];
// then to 0 - lowest
[audioPlayer setVolume:0 fadeDuration:audioTrackDuration - fadeDuration];
Swift solution:
The top rated answer here is great but it gives a stuttering effect as the volume step of 0.1 is too much. Using 0.01 gives a smoother fade effect to hear.
Using this code you can specify how long you want the fade transition to last.
let fadeVolumeStep: Float = 0.01
let fadeTime = 0.5 // Fade time in seconds
var fadeVolumeStepTime: Double {
return fadeTime / Double(1.0 / fadeVolumeStep)
}
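// With the values above this is 0.5 / (1.0 / 0.01) = 0.005 s per step,
// i.e. 100 volume steps spread over half a second.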
func fadeOut() {
guard let player = self.player else {
return
}
if !player.playing { return }
func fadeOutPlayer() {
if player.volume > fadeVolumeStep {
player.volume -= fadeVolumeStep
delay(time: fadeVolumeStepTime, closure: {
fadeOutPlayer()
})
} else {
player.stop()
player.currentTime = 0
player.prepareToPlay()
}
}
fadeOutPlayer()
}
func fadeIn() {
guard let player = self.player else {
return
}
if player.playing { return }
player.volume = 0
player.play()
func fadeInPlayer() {
if player.volume <= 1 - fadeVolumeStep {
player.volume += fadeVolumeStep
delay(time: fadeVolumeStepTime, closure: {
fadeInPlayer()
})
} else {
player.volume = 1
}
}
fadeInPlayer()
}
func delay(time delay:Double, closure:()->()) {
dispatch_after(
dispatch_time(
DISPATCH_TIME_NOW,
Int64(delay * Double(NSEC_PER_SEC))
),
dispatch_get_main_queue(), closure)
}
You can adjust the time using the fadeTime constant.