Playing a .mov file in UIImageView appears to lose frame rate? - ios5

I have a 480p MOV file that looks fantastic (quality-wise) in a UIImageView. However, it appears to lose frame rate when being played. Is there a better way to play this movie without the controls? It needs to play like an animation on top of a UI.
Thanks in advance!
Here's my code:
-(void)imagesExtractionThread {
    NSDate *now = [[NSDate alloc] init];
    startTime = [now timeIntervalSinceReferenceDate];
    do {
        [self performSelectorOnMainThread:@selector(setImageToAvatar) withObject:nil waitUntilDone:YES];
        usleep(9000);
    } while (!closing);
}
-(void)setImageToAvatar {
    NSDate *now = [[NSDate alloc] init];
    NSTimeInterval curTime = [now timeIntervalSinceReferenceDate];
    currentTime = curTime - startTime;
    if (currentTime > player.playableDuration) {
        NSLog(@"Current time %lf", currentTime);
        currentTime = 0.0;
        startTime = curTime;
        if (player.playableDuration > 0) {
            closing = YES;
            avatar.hidden = YES;
        }
    }
    UIImage *singleFrameImage = [player thumbnailImageAtTime:currentTime
                                                  timeOption:1]; // 1 == MPMovieTimeOptionExact
    //singleFrameImage = [self changeGreenColorWithTransparent:singleFrameImage];
    /*sourcePicture = [[GPUImagePicture alloc] initWithImage:singleFrameImage smoothlyScaleOutput:YES];
    [sourcePicture addTarget:filter];
    [sourcePicture processImage];
    singleFrameImage = [filter imageFromCurrentlyProcessedOutput];*/
    avatar.image = singleFrameImage;
}
I start the play process with this code:
player = [[MPMoviePlayerController alloc] initWithContentURL:url];
player.shouldAutoplay = NO;
[NSThread detachNewThreadSelector:@selector(imagesExtractionThread) toTarget:self withObject:nil];

Use the AVFoundation framework and the AVPlayer class to play the video file. You get a free hand to customize the appearance, controls, etc., and you can hide the controls if you don't want them.
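For instance, here is a minimal sketch (url and containerView are assumed locals, following the manual-reference-counting style of the rest of this thread). An AVPlayerLayer draws no playback controls at all, so the movie plays like an animation on top of your UI:

#import <AVFoundation/AVFoundation.h>

AVPlayer *avPlayer = [[AVPlayer alloc] initWithURL:url];
AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:avPlayer];
playerLayer.frame = containerView.bounds; // size the layer to the hosting view
[containerView.layer addSublayer:playerLayer];
[avPlayer play];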

Related

ALAsset thumbnail at specific timestamp

I'm working on an iPhone application for uploading video files to a specific platform, and one feature I would really love is to be able to present, say, ten different thumbnails for the same video for the user to pick from.
The problem is that ALAsset only provides a thumbnail method, which just returns the default thumbnail. I have read through the ALAssetRepresentation and ALAsset documentation and I can't seem to find a way to get a thumbnail for a specific timestamp.
I guess one option would be to use something along the lines of libav to get thumbnails, but that seems a little "over the top" for something like this. Can anyone help me on this one?
Best regards,
Nick
I think this will help you; you can also look at this question:
Video File thumbnail timestamp missing in ALAsset
{
    if ([theAsset valueForProperty:ALAssetPropertyType] == ALAssetTypeVideo) {
        // Black semi-transparent background at the bottom of the item
        CGRect containerFrame = CGRectMake(0, frame.size.height - AGIPC_ITEM_HEIGHT, frame.size.width, AGIPC_ITEM_HEIGHT);
        UIView *containerForMovieInfo = [[[UIView alloc] initWithFrame:containerFrame] autorelease];
        containerForMovieInfo.backgroundColor = [UIColor blackColor];
        containerForMovieInfo.alpha = 0.7f;
        // Movie icon on left side
        CGRect movieFrame = CGRectMake(4, 60, 26, 15);
        UIImageView *movieImageView = [[[UIImageView alloc] initWithFrame:movieFrame] autorelease];
        if (IS_IPAD()) {
            movieImageView.image = [UIImage imageNamed:@"AGIPC-Movie-iPad"];
        } else {
            movieImageView.image = [UIImage imageNamed:@"AGIPC-Movie-iPhone"];
        }
        [containerForMovieInfo addSubview:movieImageView];
        // Movie duration on right side
        if ([theAsset valueForProperty:ALAssetPropertyDuration] != ALErrorInvalidProperty) {
            NSDateFormatter *formatter = [[[NSDateFormatter alloc] init] autorelease];
            [formatter setDateFormat:@"mm:ss"];
            CGRect durationFrame = CGRectMake(frame.size.width - 26 - 4, 60, 26, 15);
            UILabel *durationView = [[[UILabel alloc] initWithFrame:durationFrame] autorelease];
            durationView.backgroundColor = [UIColor clearColor];
            durationView.textColor = [UIColor whiteColor];
            durationView.text = [formatter stringFromDate:[NSDate dateWithTimeIntervalSince1970:[[theAsset valueForProperty:ALAssetPropertyDuration] doubleValue]]];
            durationView.font = [UIFont systemFontOfSize:10];
            [containerForMovieInfo addSubview:durationView];
        }
        [self addSubview:containerForMovieInfo];
    }
}
Last but not least, you must create the camera image on your own.
// Get URL from ALAsset* asset:
NSURL *assetURL = [asset valueForProperty:ALAssetPropertyAssetURL];
// Create AVURLAsset using this URL (assetOptions is optional):
NSDictionary *assetOptions = nil;
// assetOptions = @{AVURLAssetPreferPreciseDurationAndTimingKey : @(YES)};
AVAsset *avAsset = [[AVURLAsset alloc] initWithURL:assetURL options:assetOptions];
// Create generator:
AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:avAsset];
generator.appliesPreferredTrackTransform = YES;
// Create array with CMTimes of thumbnails using your own logic.
// (Use +(NSValue *)valueWithCMTime:(CMTime)time to add a CMTime to the array.)
NSArray *times = [self generateThumbnailTimesForVideo:avAsset];
// Generate thumbnail images asynchronously:
[generator generateCGImagesAsynchronouslyForTimes:times
                                 completionHandler:^(CMTime requestedTime,
                                                     CGImageRef image,
                                                     CMTime actualTime,
                                                     AVAssetImageGeneratorResult result,
                                                     NSError *error)
{
    // This block is performed once for each CMTime in the times array.
    if (result == AVAssetImageGeneratorSucceeded) {
        UIImage *thumbnail = [[UIImage alloc] initWithCGImage:image];
        // ... use the thumbnail ...
    }
}];
A synchronous method to get a thumbnail at any time is:
// PS: SYNC method:
NSError *error = nil;
CGImageRef imgRef = [generator copyCGImageAtTime:time actualTime:NULL error:&error];
UIImage *thumbnail = [[UIImage alloc] initWithCGImage:imgRef];
CGImageRelease(imgRef); // balance the copy
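The generateThumbnailTimesForVideo: call above is not a framework method; a sketch of such a helper, assuming you simply want evenly spaced sample points (e.g. the ten thumbnails mentioned in the question), could be:

- (NSArray *)generateThumbnailTimesForVideo:(AVAsset *)asset {
    NSUInteger count = 10; // ten thumbnails for the user to pick from
    CMTime duration = [asset duration];
    NSMutableArray *times = [NSMutableArray arrayWithCapacity:count];
    for (NSUInteger i = 0; i < count; i++) {
        // Spread the sample points evenly across the asset's duration.
        CMTime time = CMTimeMultiplyByFloat64(duration, (Float64)i / (Float64)count);
        [times addObject:[NSValue valueWithCMTime:time]];
    }
    return times;
}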

Updating the progress bar in iOS 5

I am developing an iOS audio player, and I want to implement a progress bar that indicates the progress of the song currently playing.
In my ViewController class, I have two double instance variables, time and duration, and an AVAudioPlayer instance called background.
- (IBAction)play:(id)sender {
    NSString *filePath = [[NSBundle mainBundle] pathForResource:@"some_song" ofType:@"mp3"];
    NSURL *fileURL = [[NSURL alloc] initFileURLWithPath:filePath];
    background = [[AVAudioPlayer alloc] initWithContentsOfURL:fileURL error:nil];
    background.delegate = self;
    [background setNumberOfLoops:1];
    [background setVolume:0.5];
    [background play];
    time = 0;
    duration = [background duration];
    while (time < duration) {
        [progressBar setProgress:(double)time/duration animated:YES];
        time += 1;
    }
}
Could anyone explain what I am doing wrong?
Thanks in advance.
You never update the progress bar during playback. When you start to play the sound, you set the progress bar to 1, to 2, to 3, to 4, to 5... to 100%, all without leaving the current run loop, which means you will only see the last step: a full progress bar.
You should use an NSTimer to update the progress bar. Something like this:
- (IBAction)play:(id)sender {
    /* ... */
    [self.player play];
    self.timer = [NSTimer scheduledTimerWithTimeInterval:0.23 target:self selector:@selector(updateProgressBar:) userInfo:nil repeats:YES];
}

- (void)updateProgressBar:(NSTimer *)timer {
    NSTimeInterval playTime = [self.player currentTime];
    NSTimeInterval duration = [self.player duration];
    float progress = playTime / duration;
    [self.progressView setProgress:progress];
}
When you stop playing, invalidate the timer:
[self.timer invalidate];
self.timer = nil;

iPhone UIImageView delays display when setImage

I am working on displaying several images in a scroll view, loaded from a map server which returns an image of a map.
What I did is create 4 UIImageViews and put them in an NSMutableDictionary.
When scrolling to the desired image, it starts loading the data from the URL asynchronously:
first it displays a UIActivityIndicatorView, then it loads the data, and at the end it hides the UIActivityIndicatorView and shows the UIImageView.
Everything works more or less fine, except that it takes very long until the image displays, although the image is not that big, and I have a log message indicating that the end of the function was reached... the log message appears immediately, but the image still does not show.
If I request the URL in a web browser, the image is shown immediately.
Below is my code.
- (void)loadSRGImage:(int)page {
    UIImageView *currentSRGMap = (UIImageView *)[srgMaps objectForKey:[NSString stringWithFormat:@"image_%i", page]];
    UIActivityIndicatorView *currentLoading = (UIActivityIndicatorView *)[srgMaps objectForKey:[NSString stringWithFormat:@"loading_%i", page]];
    // if the image has been loaded already, do not load again
    if (currentSRGMap.image != nil) return;
    if (page > 1) {
        MKCoordinateSpan currentSpan;
        currentSpan.latitudeDelta = [[[srgMaps objectForKey:[NSString stringWithFormat:@"span_%i", page]] objectForKey:@"lat"] floatValue];
        currentSpan.longitudeDelta = [[[srgMaps objectForKey:[NSString stringWithFormat:@"span_%i", page]] objectForKey:@"lon"] floatValue];
        region.span = currentSpan;
        region.center = mapV.region.center;
        [mapV setRegion:region animated:TRUE];
        //[mapV regionThatFits:region];
    }
    srgLegende.hidden = NO;
    currentSRGMap.hidden = YES;
    currentLoading.hidden = NO;
    [currentLoading startAnimating];
    NSOperationQueue *queue = [NSOperationQueue new];
    NSInvocationOperation *operation = [[NSInvocationOperation alloc]
                                        initWithTarget:self
                                              selector:@selector(loadImage:)
                                                object:[NSString stringWithFormat:@"%i", page]];
    [queue addOperation:operation];
    [operation release];
}

// NOTE: the object handed to the NSInvocationOperation is the page NSString,
// so that is what this parameter actually receives at runtime.
- (void)loadImage:(NSInvocationOperation *)operation {
    NSString *imgStr = [@"image_" stringByAppendingString:(NSString *)operation];
    NSString *loadStr = [@"loading_" stringByAppendingString:(NSString *)operation];
    WGS84ToCH1903 *converter = [[WGS84ToCH1903 alloc] init];
    CLLocationCoordinate2D coord1 = [mapV convertPoint:mapV.bounds.origin toCoordinateFromView:mapV];
    CLLocationCoordinate2D coord2 = [mapV convertPoint:CGPointMake(mapV.bounds.size.width, mapV.bounds.size.height) toCoordinateFromView:mapV];
    int x1 = [converter WGStoCHx:coord1.longitude withLat:coord1.latitude];
    int y1 = [converter WGStoCHy:coord1.longitude withLat:coord1.latitude];
    int x2 = [converter WGStoCHx:coord2.longitude withLat:coord2.latitude];
    int y2 = [converter WGStoCHy:coord2.longitude withLat:coord2.latitude];
    NSString *URL = [NSString stringWithFormat:@"http://map.ssatr.ch/mapserv?mode=map&map=import/dab/maps/dab_online.map&mapext=%i+%i+%i+%i&mapsize=320+372&layers=DAB_Radio_Top_Two", y1, x1, y2, x2];
    NSData *imageData = [[NSData alloc] initWithContentsOfURL:[NSURL URLWithString:URL]];
    UIImage *image = [[UIImage alloc] initWithData:imageData];
    [imageData release];
    UIImageView *currentSRGMap = (UIImageView *)[srgMaps objectForKey:imgStr];
    UIActivityIndicatorView *currentLoading = (UIActivityIndicatorView *)[srgMaps objectForKey:loadStr];
    // NOTE: everything below touches UIKit from the operation's background
    // thread; UIKit is only safe on the main thread, which is the likely
    // cause of the delayed display.
    currentSRGMap.hidden = NO;
    currentLoading.hidden = YES;
    [currentLoading stopAnimating];
    [currentSRGMap setImage:image]; //UIImageView
    [image release];
    NSLog(@"finished loading image: %@", URL);
}
I had a similar thing in my app, and I used SDWebImage from https://github.com/rs/SDWebImage. It provides a category on UIImageView: you give it a placeholder image and a URL, it displays the image once it is downloaded, and it maintains a cache of downloaded images, avoiding repeated server calls.
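Usage is roughly like this (a sketch; the placeholder image name is made up, and the exact selector has changed across SDWebImage versions, with newer releases prefixing it sd_):

#import "UIImageView+WebCache.h"

// Shows the placeholder immediately, then swaps in the downloaded image once
// it arrives; downloaded images are cached, so repeat requests skip the server.
[currentSRGMap setImageWithURL:[NSURL URLWithString:URL]
              placeholderImage:[UIImage imageNamed:@"placeholder.png"]];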

AVFoundation - Retiming CMSampleBufferRef Video Output

First time asking a question here. I hope the post is clear and the sample code is formatted correctly.
I'm experimenting with AVFoundation and time-lapse photography.
My intent is to grab every Nth frame from the video camera of an iOS device (my iPod touch, version 4) and write each of those frames out to a file to create a time lapse. I'm using AVCaptureVideoDataOutput, AVAssetWriter and AVAssetWriterInput.
The problem is, if I use the CMSampleBufferRef passed to captureOutput:didOutputSampleBuffer:fromConnection:, the playback of each frame is the length of time between the original input frames, i.e. a frame rate of, say, 1 fps. I'm looking to get 30 fps.
I've tried using CMSampleBufferCreateCopyWithNewTiming(), but then, after 13 frames are written to the file, captureOutput:didOutputSampleBuffer:fromConnection: stops being called. The interface is active and I can tap a button to stop the capture and save it to the photo library for playback. It appears to play back as I want it, 30 fps, but it only has those 13 frames.
How can I accomplish my goal of 30 fps playback?
How can I tell where the app is getting lost, and why?
I've placed a flag called useNativeTime so I can test both cases. When set to YES, I get all the frames I'm interested in, as the callback doesn't 'get lost'. When I set that flag to NO, I only ever get 13 frames processed and am never returned to that method again. As mentioned above, in both cases I can play back the video.
Thanks for any help.
Here is where I'm trying to do the retiming.
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    BOOL useNativeTime = NO;
    BOOL appendSuccessFlag = NO;
    //NSLog(@"in captureOutput sample buffer method");
    if (!CMSampleBufferDataIsReady(sampleBuffer)) {
        NSLog(@"sample buffer is not ready. Skipping sample");
        //CMSampleBufferInvalidate(sampleBuffer);
        return;
    }
    if (![inputWriterBuffer isReadyForMoreMediaData]) {
        NSLog(@"Not ready for data.");
    }
    else {
        // Write every first frame of n frames (30 native from camera).
        intervalFrames++;
        if (intervalFrames > 30) {
            intervalFrames = 1;
        }
        else if (intervalFrames != 1) {
            //CMSampleBufferInvalidate(sampleBuffer);
            return;
        }
        // Need to initialize start session time.
        if (writtenFrames < 1) {
            if (useNativeTime) imageSourceTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
            else imageSourceTime = CMTimeMake(0 * 20, 600); //CMTimeMake(1,30);
            [outputWriter startSessionAtSourceTime:imageSourceTime];
            NSLog(@"Starting CMtime");
            CMTimeShow(imageSourceTime);
        }
        if (useNativeTime) {
            imageSourceTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
            CMTimeShow(imageSourceTime);
            // CMTime myTiming = CMTimeMake(writtenFrames * 20, 600);
            // CMSampleBufferSetOutputPresentationTimeStamp(sampleBuffer, myTiming); // Tried but has no effect.
            appendSuccessFlag = [inputWriterBuffer appendSampleBuffer:sampleBuffer];
        }
        else {
            CMSampleBufferRef newSampleBuffer;
            CMSampleTimingInfo sampleTimingInfo;
            sampleTimingInfo.duration = CMTimeMake(20, 600);
            sampleTimingInfo.presentationTimeStamp = CMTimeMake((writtenFrames + 0) * 20, 600);
            sampleTimingInfo.decodeTimeStamp = kCMTimeInvalid;
            OSStatus myStatus;
            //NSLog(@"numSamples of sampleBuffer: %i", CMSampleBufferGetNumSamples(sampleBuffer));
            myStatus = CMSampleBufferCreateCopyWithNewTiming(kCFAllocatorDefault,
                                                             sampleBuffer,
                                                             1,
                                                             &sampleTimingInfo, // maybe a little confused on this param.
                                                             &newSampleBuffer);
            // These confirm the good health of our newSampleBuffer.
            if (myStatus != 0) NSLog(@"CMSampleBufferCreateCopyWithNewTiming() myStatus: %i", myStatus);
            if (!CMSampleBufferIsValid(newSampleBuffer)) NSLog(@"CMSampleBufferIsValid NOT!");
            // No effect.
            //myStatus = CMSampleBufferMakeDataReady(newSampleBuffer); // How is this different from CMSampleBufferSetDataReady?
            //if (myStatus != 0) NSLog(@"CMSampleBufferMakeDataReady() myStatus: %i", myStatus);
            imageSourceTime = CMSampleBufferGetPresentationTimeStamp(newSampleBuffer);
            CMTimeShow(imageSourceTime);
            appendSuccessFlag = [inputWriterBuffer appendSampleBuffer:newSampleBuffer];
            //CMSampleBufferInvalidate(sampleBuffer); // Docs don't describe the action. Doesn't seem to affect my problem. Used with CMSampleBufferSetInvalidateCallback maybe?
            //CFRelease(sampleBuffer); // - Not surprisingly - "EXC_BAD_ACCESS"
        }
        if (!appendSuccessFlag) {
            NSLog(@"Failed to append pixel buffer");
        }
        else {
            writtenFrames++;
            NSLog(@"writtenFrames: %i", writtenFrames);
        }
    }
    //[self displayOuptutWritterStatus]; // Expect and see AVAssetWriterStatusWriting.
}
My setup routine.
- (IBAction)recordingStartStop:(id)sender
{
    NSError *error;
    if (self.isRecording) {
        NSLog(@"~~~~~~~~~ STOPPING RECORDING ~~~~~~~~~");
        self.isRecording = NO;
        [recordingStarStop setTitle:@"Record" forState:UIControlStateNormal];
        //[self.captureSession stopRunning];
        [inputWriterBuffer markAsFinished];
        [outputWriter endSessionAtSourceTime:imageSourceTime];
        [outputWriter finishWriting]; // Blocks until the file is completely written, or an error occurs.
        NSLog(@"finished CMtime");
        CMTimeShow(imageSourceTime);
        // Really, I should loop through the outputs and close all of them or target specific ones.
        // Since I'm only recording video right now, I feel safe doing this.
        [self.captureSession removeOutput:[[self.captureSession outputs] objectAtIndex:0]];
        [videoOutput release];
        [inputWriterBuffer release];
        [outputWriter release];
        videoOutput = nil;
        inputWriterBuffer = nil;
        outputWriter = nil;
        NSLog(@"~~~~~~~~~ STOPPED RECORDING ~~~~~~~~~");
        NSLog(@"Calling UIVideoAtPathIsCompatibleWithSavedPhotosAlbum.");
        NSLog(@"filePath: %@", [projectPaths movieFilePath]);
        if (UIVideoAtPathIsCompatibleWithSavedPhotosAlbum([projectPaths movieFilePath])) {
            NSLog(@"Calling UISaveVideoAtPathToSavedPhotosAlbum.");
            UISaveVideoAtPathToSavedPhotosAlbum([projectPaths movieFilePath], self, @selector(video:didFinishSavingWithError:contextInfo:), nil);
        }
        NSLog(@"~~~~~~~~~ WROTE RECORDING to PhotosAlbum ~~~~~~~~~");
    }
    else {
        NSLog(@"~~~~~~~~~ STARTING RECORDING ~~~~~~~~~");
        projectPaths = [[ProjectPaths alloc] initWithProjectFolder:@"TestProject"];
        intervalFrames = 30;
        videoOutput = [[AVCaptureVideoDataOutput alloc] init];
        NSMutableDictionary *cameraVideoSettings = [[[NSMutableDictionary alloc] init] autorelease];
        NSString *key = (NSString *)kCVPixelBufferPixelFormatTypeKey;
        NSNumber *value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA]; //kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange
        [cameraVideoSettings setValue:value forKey:key];
        [videoOutput setVideoSettings:cameraVideoSettings];
        [videoOutput setMinFrameDuration:CMTimeMake(20, 600)]; //CMTimeMake(1, 30) // 30fps
        [videoOutput setAlwaysDiscardsLateVideoFrames:YES];
        queue = dispatch_queue_create("cameraQueue", NULL);
        [videoOutput setSampleBufferDelegate:self queue:queue];
        dispatch_release(queue);
        NSMutableDictionary *outputSettings = [[[NSMutableDictionary alloc] init] autorelease];
        [outputSettings setValue:AVVideoCodecH264 forKey:AVVideoCodecKey];
        [outputSettings setValue:[NSNumber numberWithInt:1280] forKey:AVVideoWidthKey]; // currently assuming
        [outputSettings setValue:[NSNumber numberWithInt:720] forKey:AVVideoHeightKey];
        NSMutableDictionary *compressionSettings = [[[NSMutableDictionary alloc] init] autorelease];
        [compressionSettings setValue:AVVideoProfileLevelH264Main30 forKey:AVVideoProfileLevelKey];
        //[compressionSettings setValue:[NSNumber numberWithDouble:1024.0*1024.0] forKey:AVVideoAverageBitRateKey];
        [outputSettings setValue:compressionSettings forKey:AVVideoCompressionPropertiesKey];
        inputWriterBuffer = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:outputSettings];
        [inputWriterBuffer retain];
        inputWriterBuffer.expectsMediaDataInRealTime = YES;
        outputWriter = [AVAssetWriter assetWriterWithURL:[projectPaths movieURLPath] fileType:AVFileTypeQuickTimeMovie error:&error];
        [outputWriter retain];
        if (error) NSLog(@"error for outputWriter = [AVAssetWriter assetWriterWithURL:fileType:error:]");
        if ([outputWriter canAddInput:inputWriterBuffer]) [outputWriter addInput:inputWriterBuffer];
        else NSLog(@"can not add input");
        if (![outputWriter canApplyOutputSettings:outputSettings forMediaType:AVMediaTypeVideo]) NSLog(@"outputSettings are NOT supported");
        if ([captureSession canAddOutput:videoOutput]) [self.captureSession addOutput:videoOutput];
        else NSLog(@"could not addOutput: videoOutput to captureSession");
        //[self.captureSession startRunning];
        self.isRecording = YES;
        [recordingStarStop setTitle:@"Stop" forState:UIControlStateNormal];
        writtenFrames = 0;
        imageSourceTime = kCMTimeZero;
        [outputWriter startWriting];
        //[outputWriter startSessionAtSourceTime:imageSourceTime];
        NSLog(@"~~~~~~~~~ STARTED RECORDING ~~~~~~~~~");
        NSLog(@"recording to fileURL: %@", [projectPaths movieURLPath]);
    }
    NSLog(@"isRecording: %@", self.isRecording ? @"YES" : @"NO");
    [self displayOuptutWritterStatus];
}
OK, I found the bug in my first post.
When using
myStatus = CMSampleBufferCreateCopyWithNewTiming(kCFAllocatorDefault,
                                                 sampleBuffer,
                                                 1,
                                                 &sampleTimingInfo,
                                                 &newSampleBuffer);
you need to balance that with a CFRelease(newSampleBuffer);
The same idea holds true when using a CVPixelBufferRef with the pixelBufferPool of an AVAssetWriterInputPixelBufferAdaptor instance. You would use CVPixelBufferRelease(yourCVPixelBufferRef); after calling the appendPixelBuffer:withPresentationTime: method.
Hope this is helpful to someone else.
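In context, the balanced pattern looks roughly like this (a sketch reusing the variable names from the capture callback above):

CMSampleBufferRef newSampleBuffer = NULL;
OSStatus myStatus = CMSampleBufferCreateCopyWithNewTiming(kCFAllocatorDefault,
                                                          sampleBuffer,
                                                          1,
                                                          &sampleTimingInfo,
                                                          &newSampleBuffer);
if (myStatus == noErr && newSampleBuffer != NULL) {
    appendSuccessFlag = [inputWriterBuffer appendSampleBuffer:newSampleBuffer];
    // Balance the Create; without this release, buffers pile up and the
    // capture callback eventually stops being called (the 13-frame stall).
    CFRelease(newSampleBuffer);
}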
With a little more searching and reading I have a working solution. I don't know that it is the best method but, so far, so good.
In my setup area I've set up an AVAssetWriterInputPixelBufferAdaptor. The code addition looks like this:
inputWriterBufferAdaptor = [AVAssetWriterInputPixelBufferAdaptor
                            assetWriterInputPixelBufferAdaptorWithAssetWriterInput:inputWriterBuffer
                                                       sourcePixelBufferAttributes:nil];
[inputWriterBufferAdaptor retain];
For completeness in understanding the code below, I also have these three lines in the setup method:
fpsOutput = 30; // Some possible values: 30, 10, 15, 24, 25, 30/1.001 or 29.97.
cmTimeSecondsDenominatorTimescale = 600 * 100000; // To more precisely handle 29.97.
cmTimeNumeratorValue = cmTimeSecondsDenominatorTimescale / fpsOutput;
Instead of applying a retiming to a copy of the sample buffer, I now have the following three lines of code that effectively do the same thing. Notice the withPresentationTime: parameter of the adaptor: by passing my custom value to it, I get the correct timing I'm after.
CVPixelBufferRef myImage = CMSampleBufferGetImageBuffer(sampleBuffer);
imageSourceTime = CMTimeMake(writtenFrames * cmTimeNumeratorValue, cmTimeSecondsDenominatorTimescale);
appendSuccessFlag = [inputWriterBufferAdaptor appendPixelBuffer:myImage withPresentationTime:imageSourceTime];
Use of the AVAssetWriterInputPixelBufferAdaptor.pixelBufferPool property may bring some gains, but I haven't figured that out.
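If you want to experiment with the pool, the route would presumably look something like this (an untested sketch; note the adaptor's pixelBufferPool is only available once startWriting has been called):

CVPixelBufferRef poolBuffer = NULL;
CVReturn cvRet = CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault,
                                                    inputWriterBufferAdaptor.pixelBufferPool,
                                                    &poolBuffer);
if (cvRet == kCVReturnSuccess) {
    // ... render/copy the source frame into poolBuffer here ...
    imageSourceTime = CMTimeMake(writtenFrames * cmTimeNumeratorValue, cmTimeSecondsDenominatorTimescale);
    appendSuccessFlag = [inputWriterBufferAdaptor appendPixelBuffer:poolBuffer withPresentationTime:imageSourceTime];
    CVPixelBufferRelease(poolBuffer); // balance the Create, per the earlier answer
}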

How to pause and resume an NSTimer on the iPhone

Hello,
I am developing a small game app. I need to pause the timer when the user goes to another view (say, a settings view).
When the user comes back to that view, I need to resume the timer.
Can anybody solve this issue?
Thanks in advance.
NSTimer does not give you the ability to pause it. However, with a few simple variables, you can create the effect yourself:
NSTimer *timer;
double timerInterval = 10.0;
double timerElapsed = 0.0;
NSDate *timerStarted;

-(void)startTimer {
    timer = [NSTimer scheduledTimerWithTimeInterval:(timerInterval - timerElapsed) target:self selector:@selector(fired) userInfo:nil repeats:NO];
    timerStarted = [NSDate date];
}

-(void)fired {
    [timer invalidate];
    timer = nil;
    timerElapsed = 0.0;
    [self startTimer];
    // react to the timer event here
}

-(void)pauseTimer {
    [timer invalidate];
    timer = nil;
    timerElapsed = [[NSDate date] timeIntervalSinceDate:timerStarted];
}
This has been working out quite well for me.
You can't pause a timer. However, when the user goes to the settings view, you can save the timer's fireDate and the current date, then invalidate the timer and let the user do his/her stuff.
Once he/she switches back to the game, you create a new timer object and set its fire date to the old fire date pushed back by the time the user spent in the menu, as in the sketch below.
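A minimal sketch of that approach, assuming manual reference counting, a one-second repeating timer, and made-up names (gameTimer, pausedFireDate, pauseStart; tick: stands in for your timer callback):

- (void)pauseGameTimer {
    pausedFireDate = [[gameTimer fireDate] retain]; // remember when it would have fired
    pauseStart = [[NSDate date] retain];
    [gameTimer invalidate];
    gameTimer = nil;
}

- (void)resumeGameTimer {
    // Push the saved fire date back by however long we were paused.
    NSTimeInterval pausedFor = [[NSDate date] timeIntervalSinceDate:pauseStart];
    gameTimer = [NSTimer scheduledTimerWithTimeInterval:1.0 target:self selector:@selector(tick:) userInfo:nil repeats:YES];
    [gameTimer setFireDate:[pausedFireDate dateByAddingTimeInterval:pausedFor]];
    [pausedFireDate release]; pausedFireDate = nil;
    [pauseStart release]; pauseStart = nil;
}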
You can use this code to implement pause and resume functionality with NSTimer:
//========= new timer update method =========
-(void)updateTimer {
    NSDate *currentDate = [NSDate date];
    NSTimeInterval timeInterval = [currentDate timeIntervalSinceDate:startDate];
    NSDate *timerDate = [NSDate dateWithTimeIntervalSince1970:timeInterval];
    NSDateFormatter *dateFormatter = [[NSDateFormatter alloc] init];
    [dateFormatter setDateFormat:@"mm:ss.S"];
    [dateFormatter setTimeZone:[NSTimeZone timeZoneForSecondsFromGMT:0.0]];
    NSString *timeString = [dateFormatter stringFromDate:timerDate];
    lblMessage.text = timeString;
    pauseTimeInterval = timeInterval;
}

-(IBAction)startBtnPressed:(id)sender
{
    //========= new update with start/pause =========
    if (running == NO) {
        running = YES;
        startDate = [NSDate date];
        startDate = [[startDate dateByAddingTimeInterval:((-1) * (pauseTimeInterval))] retain];
        stopWatchTimer = [NSTimer scheduledTimerWithTimeInterval:1.0/10.0 target:self selector:@selector(updateTimer) userInfo:nil repeats:YES];
    }
    else {
        running = NO;
        [stopWatchTimer invalidate];
        stopWatchTimer = nil;
        [self updateTimer];
    }
}
Declare NSDate *startDate; and NSTimeInterval pauseTimeInterval; in your .h file, and set pauseTimeInterval = 0.0; in viewDidLoad. Good luck!
You can't pause an NSTimer. You can, however invalidate it and create a new one when needed.
On start:
startNewCapture = [NSDate date];
On pause:
captureSessionPauseDate = [NSDate date];
captureSessionTimeInterval = [captureSessionPauseDate timeIntervalSinceDate:startNewCapture];
On resume:
NSDate *dateNow = [NSDate date];
startNewCapture = [NSDate dateWithTimeInterval:-captureSessionTimeInterval sinceDate:dateNow];
- (IBAction)pauseResumeTimer:(id)sender {
    if (timerRunning == NO) {
        timerRunning = YES;
        [pauseTimerBtn setTitle:@"Resume" forState:UIControlStateNormal];
        NSString *stringVal = [NSString stringWithFormat:@"%@", timeTxt.text];
        stringVal = [stringVal stringByReplacingOccurrencesOfString:@":" withString:@"."];
        float tempFloatVal = [stringVal floatValue];
        int minuteValue = floorf(tempFloatVal);
        float tempSecVal = [stringVal floatValue] - floorf(tempFloatVal);
        int secondVal = tempSecVal * 100;
        minuteValue = minuteValue * 60;
        oldTimeValue = minuteValue + secondVal;
        [timer invalidate];
        timer = nil;
    }
    else {
        timerRunning = NO;
        [pauseTimerBtn setTitle:@"Pause" forState:UIControlStateNormal];
        startDate = [NSDate date];
        timer = [NSTimer scheduledTimerWithTimeInterval:0.25 target:self selector:@selector(runTimer:) userInfo:nil repeats:YES]; // selector must match the method below
    }
}

- (void)runTimer:(NSTimer *)timer {
    NSInteger secondsAtStart = (NSInteger)[[NSDate date] timeIntervalSinceDate:startDate];
    secondsAtStart = secondsAtStart + oldTimeValue;
    NSInteger seconds = secondsAtStart % 60;
    NSInteger minutes = (secondsAtStart / 60) % 60;
    NSInteger hours = secondsAtStart / (60 * 60);
    NSString *result = [NSString stringWithFormat:@"%02ld:%02ld", (long)minutes, (long)seconds];
    timeTxt.text = result;
}
Did you end up figuring it out? I saw that you said you cannot invalidate your timer when you go into another view. Can you explain what you mean by that? NSTimers cannot be paused, and methods that simulate pausing them usually involve invalidating them. You can then simulate "unpausing" by creating a new timer that starts up again.
For people who would like a potentially more convenient method, I wrote a controller class that conveniently handles pausing and unpausing timers. You can find it here: https://github.com/LianaChu/LCPausableTimer
You can use the controller class to create new timers, and then pause and unpause them by calling the methods pauseTimer and unpauseTimer on the controller class.
I would greatly appreciate any comments or feedback, such as how you used it, what changes you would like to see, or any features you would like me to add. Please don't hesitate to reply with your comments here, or post on the issues tab of the GitHub page.