I have a camera preview window which works well 90% of the time. Sometimes, however, when returning to my app after it has been in the background, the preview does not display. This is the code I call when the view loads:
- (void)startCamera {
    session = [[AVCaptureSession alloc] init];
    session.sessionPreset = AVCaptureSessionPresetPhoto;
    AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
    captureVideoPreviewLayer.frame = _cameraView.bounds;
    [_cameraView.layer addSublayer:captureVideoPreviewLayer];
    captureVideoPreviewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    captureVideoPreviewLayer.position = CGPointMake(CGRectGetMidX(_cameraView.bounds), 160);
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (!input) {
        NSLog(@"ERROR: %@", error);
        UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Important!"
                                                        message:@"Unable to find a camera."
                                                       delegate:nil
                                              cancelButtonTitle:@"Ok"
                                              otherButtonTitles:nil];
        [alert show];
        [alert autorelease];
    }
    [session addInput:input];
    stillImage = [[AVCaptureStillImageOutput alloc] init];
    NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil];
    [stillImage setOutputSettings:outputSettings];
    [session addOutput:stillImage];
    [session startRunning];
}
If this happens, I can switch to my preferences view and back again and all is well, but it's an annoying bug I'd like to kill. The preview window is a UIView in my storyboard.
Do not start the capture session on view load; instead, start it in viewWillAppear: and stop it in viewWillDisappear:, as in the sketch below.
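A minimal sketch of that approach, assuming the controller keeps the session ivar configured in startCamera above:
- (void)viewWillAppear:(BOOL)animated {
    [super viewWillAppear:animated];
    // Restart the preview whenever the view comes back on screen.
    if (![session isRunning]) {
        [session startRunning];
    }
}

- (void)viewWillDisappear:(BOOL)animated {
    [super viewWillDisappear:animated];
    // Stop the camera while the view is hidden or the app is backgrounded.
    if ([session isRunning]) {
        [session stopRunning];
    }
}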
It seems your view controller is releasing some memory while the app is in the background. Make sure you initialize your capture session with this in mind.
Allocate your session lazily in a private property getter rather than in your start method; you will avoid memory leaks this way.
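A hedged sketch of such a getter (MRC, matching the question's code; the session ivar is assumed):
- (AVCaptureSession *)session {
    // Create the session on first use, so a fresh one exists after any memory cleanup.
    if (!session) {
        session = [[AVCaptureSession alloc] init];
        session.sessionPreset = AVCaptureSessionPresetPhoto;
    }
    return session;
}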
I am having difficulty implementing what I think should be a pretty simple capability. I have a button that compares the user's location to another set of lat/lon coordinates and displays an alert telling the user if they are "nearby" or "far away" from the location. This works well if the app remains in the foreground.
However, if the app gets sent to the background, the locationManager stops updating the user's location. If this happens, when the user brings the app back up and presses the "compare location" button, the app doesn't seem to be able to get the user's new coordinates correctly.
Adding location updates to the background modes seems to work, but I know this will drain the battery and may be rejected by Apple.
All I really want is for the button to wait for an accurate set of lat/lon coordinates before trying to calculate the distance between the user and the other set of lat/lon coordinates. Here is what I have so far:
The Location Button Does This:
- (void)locationButton {
    NSString *userLatitude = [(PDCAppDelegate *)[UIApplication sharedApplication].delegate getUserLatitude];
    NSString *userLongitude = [(PDCAppDelegate *)[UIApplication sharedApplication].delegate getUserLongitude];
    NSString *placeLatitude = [[NSUserDefaults standardUserDefaults] stringForKey:@"savedLatitude"];
    NSString *placeLongitude = [[NSUserDefaults standardUserDefaults] stringForKey:@"savedLongitude"];
    NSString *distanceURL = [NSString stringWithFormat:@"http://www.website.com/page.php?lat1=%@&lon1=%@&lat2=%@&lon2=%@", userLatitude, userLongitude, placeLatitude, placeLongitude];
    // Note: this is a synchronous request and blocks the calling thread.
    NSData *distanceURLResult = [NSData dataWithContentsOfURL:[NSURL URLWithString:distanceURL]];
    NSString *distanceInFeet = [[NSString alloc] initWithData:distanceURLResult encoding:NSUTF8StringEncoding];
    if ([distanceInFeet isEqualToString:@"1"]) {
        UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"You're Near!"
            message:@"" delegate:self cancelButtonTitle:@"OK" otherButtonTitles:nil, nil];
        [alert show];
        return;
    }
    if ([distanceInFeet isEqualToString:@"0"]) {
        UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"You're Far!"
            message:@"" delegate:self cancelButtonTitle:@"OK" otherButtonTitles:nil, nil];
        [alert show];
        return;
    }
}
The first few lines of the locationButton method get the userLatitude and userLongitude from my App Delegate:
- (NSString *)getUserCoordinates {
    // Note: the string is built from the manager's current location *before*
    // the manager is (re)created below, so the first call returns no fix.
    NSString *userCoordinates = [NSString stringWithFormat:@"latitude: %f longitude: %f",
                                 locationManager.location.coordinate.latitude,
                                 locationManager.location.coordinate.longitude];
    locationManager = [[CLLocationManager alloc] init];
    locationManager.distanceFilter = kCLDistanceFilterNone; // whenever we move
    locationManager.desiredAccuracy = kCLLocationAccuracyHundredMeters; // 100 m
    [locationManager startUpdatingLocation];
    return userCoordinates;
}

- (NSString *)getUserLatitude {
    NSString *userLatitude = [NSString stringWithFormat:@"%f",
                              locationManager.location.coordinate.latitude];
    return userLatitude;
}

- (NSString *)getUserLongitude {
    NSString *userLongitude = [NSString stringWithFormat:@"%f",
                               locationManager.location.coordinate.longitude];
    return userLongitude;
}
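One way to get the "wait for an accurate fix" behavior the question asks for is to react in the location manager's delegate callback; a sketch, assuming the app delegate is the CLLocationManagerDelegate and using the pre-iOS 6 callback to match the era of this code (the notifyLocationReady helper and the 15-second freshness cutoff are illustrative, not from the question):
- (void)locationManager:(CLLocationManager *)manager
    didUpdateToLocation:(CLLocation *)newLocation
           fromLocation:(CLLocation *)oldLocation {
    // Ignore cached or coarse fixes; 100 m matches desiredAccuracy above.
    NSTimeInterval age = -[newLocation.timestamp timeIntervalSinceNow];
    if (age < 15.0 && newLocation.horizontalAccuracy <= 100.0) {
        [self notifyLocationReady]; // hypothetical hook: e.g. enable the compare button
    }
}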
I want to change the video orientation from the front camera, so I chose previewLayer.orientation, and it indeed works.
Since that method is deprecated as of iOS 6, I get a warning to use previewLayer.connection.videoOrientation,
but I cannot access the connection property from previewLayer.
- (void)addDeviceInput {
    [session beginConfiguration];
    [session removeInput:videoInput];
    [self setVideoInput:nil];
    AVCaptureDeviceInput *newDeviceInput = [[AVCaptureDeviceInput alloc] initWithDevice:[self frontFacingCamera] error:nil];
    if ([session canAddInput:newDeviceInput]) {
        [session addInput:newDeviceInput];
        [self setVideoInput:newDeviceInput];
    }
    newDeviceInput = nil;
    if (previewLayer) {
        [previewLayer removeFromSuperlayer];
        [previewLayer release];
        previewLayer = nil;
    }
    [session commitConfiguration];
}
- (id)setupCameraSession {
    if (!(self = [super init])) {
        return nil;
    }
    self.frameRate = 0;
    session = [[AVCaptureSession alloc] init];
    [session beginConfiguration];
    [session setSessionPreset:AVCaptureSessionPreset352x288];
    [self addDeviceInput]; // invoke previous method
    AVCaptureVideoDataOutput *newVideoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
    newVideoDataOutput.videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                        [NSNumber numberWithInt:kCVPixelFormatType_32BGRA],
                                        kCVPixelBufferPixelFormatTypeKey, nil];
    newVideoDataOutput.alwaysDiscardsLateVideoFrames = YES;
    if ([session canAddOutput:newVideoDataOutput]) {
        [session addOutput:newVideoDataOutput];
        [self setVideoOutput:newVideoDataOutput];
    }
    [newVideoDataOutput release];
    self.frameRate = VIDEO_FRAME_RATE;
    [session commitConfiguration];
    AVCaptureVideoPreviewLayer *newPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
    //[newPreviewLayer setSession:session]; // it doesn't work? and get error
    [self setPreviewLayer:newPreviewLayer];
    [newPreviewLayer release];
    [session startRunning];
    return self;
}
I get the error
error: Execution was interrupted, reason: Attempted to dereference an invalid ObjC Object or send it an unrecognized selector.
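For reference, on a project built against the iOS 6 SDK the connection is reachable through the preview layer; a defensive sketch (the runtime availability check is an assumption for older deployment targets):
// Use the iOS 6 connection-based API where available, falling back to the
// deprecated layer property on older systems.
if ([previewLayer respondsToSelector:@selector(connection)]) {
    AVCaptureConnection *connection = previewLayer.connection;
    if ([connection isVideoOrientationSupported]) {
        connection.videoOrientation = AVCaptureVideoOrientationPortrait;
    }
} else if ([previewLayer isOrientationSupported]) {
    previewLayer.orientation = AVCaptureVideoOrientationPortrait; // pre-iOS 6 path
}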
My application appears to work fine when deployed to my iPhone via Xcode; however, now that it is on the App Store, it crashes when I download it. It must have worked for Apple, as otherwise it would never have passed review.
I have tried other phones but they crash as well. From looking at the crash logs, the code below is the area of concern.
dispatch_async(dispatch_get_global_queue( DISPATCH_QUEUE_PRIORITY_LOW, 0), ^{
// Do a tasks in the background
library = [[ALAssetsLibrary alloc] init];
NSMutableArray *itemFileName = [[NSMutableArray alloc] init];
for (Item *it in tag.Items) {
[itemFileName addObject:it.name];
}
void (^assetEnumerator)( ALAsset *, NSUInteger, BOOL *) = ^(ALAsset *asset, NSUInteger index, BOOL *stop) {
if(asset != NULL && [asset valueForProperty:ALAssetPropertyType] == ALAssetTypePhoto) {
ALAssetRepresentation *rep = [asset defaultRepresentation];
NSString *fileName = [rep filename];
if ([itemFileName containsObject:fileName]) {
AssetView *assetView = [[AssetView alloc] initWithAsset:asset];
assetView.delegate = self;
assetView.fileName = fileName;
//assetView.index = counter;
[self.assetViews addObject:assetView];
//NSLog(#"%i",index);
}
}
};
void (^assetGroupEnumerator)( ALAssetsGroup *, BOOL *) = ^(ALAssetsGroup *group, BOOL *stop) {
if(group != nil) {
[group enumerateAssetsUsingBlock:assetEnumerator];
}
else{
//[self performSelectorOnMainThread:@selector(reloadTableView) withObject:nil waitUntilDone:YES];
// Hide the HUD in the main tread
dispatch_async(dispatch_get_main_queue(), ^{
[self reloadTableView];
[MBProgressHUD hideHUDForView:self.navigationController.view animated:YES];
[self loadImageTags];
if([self.assetViews count] != [tag.Items count]){
UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Info"
message:@"Some items have been deleted from the device. If none remain for this tag it will be deleted."
delegate:self
cancelButtonTitle:nil
otherButtonTitles: @"OK", nil];
[alert show];
MBProgressHUD *hud = [MBProgressHUD showHUDAddedTo:self.navigationController.view animated:YES];
hud.labelText = @"Cleaning...";
dispatch_async(dispatch_get_global_queue( DISPATCH_QUEUE_PRIORITY_LOW, 0), ^{
[self clearupOrphanedItems];
dispatch_async(dispatch_get_main_queue(), ^{
[MBProgressHUD hideHUDForView:self.navigationController.view animated:YES];
});
});
}
});
}
};
// Group Enumerator Failure Block
void (^assetGroupEnumberatorFailure)(NSError *) = ^(NSError *error) {
UIAlertView * alert = [[UIAlertView alloc] initWithTitle:@"Error" message:[NSString stringWithFormat:@"Album Error: %@ - %@", [error localizedDescription], [error localizedRecoverySuggestion]] delegate:nil cancelButtonTitle:@"Ok" otherButtonTitles:nil];
[alert show];
NSLog(@"A problem occurred %@", [error description]);
};
// Enumerate Albums
[library enumerateGroupsWithTypes:ALAssetsGroupAll
usingBlock:assetGroupEnumerator
failureBlock:assetGroupEnumberatorFailure]; //HERE IS WHERE IT DIES!!!
});
I am not sure if the issue is due to instantiating the ALAssetsLibrary inside the block, as either way it works fine during debugging, or when deployed to the phone via Xcode.
Can anyone shed some light on this for me?
To avoid Core Data and threading issues, please make sure:
1) that you create only one ALAssetsLibrary instance for the complete lifecycle of the application. It is therefore recommended to initialize your assets library instance either in the app delegate or using dispatch_once, as in the sketch below.
2) that you create the ALAssetsLibrary instance on the main thread.
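A minimal sketch of the dispatch_once approach (the sharedLibrary accessor name is illustrative, not from the question):
// One shared ALAssetsLibrary for the whole application lifecycle.
+ (ALAssetsLibrary *)sharedLibrary {
    static ALAssetsLibrary *library = nil;
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{
        library = [[ALAssetsLibrary alloc] init];
    });
    return library;
}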
Cheers,
Hendrik
I am calling a URL on my Search button event to load data. I have put the code for the loading-indicator alert view on an NSThread. I am using Xcode 3.2 with iOS 4.3. Everything runs smoothly, but on the search button it shows the loading indicator, then crashes, showing the following in the console:
Program received signal: “EXC_BAD_ACCESS”.
warning: Unable to read symbols for /Developer/Platforms/iPhoneOS.platform/DeviceSupport/4.3.2 (8H7)/Symbols/Developer/usr/lib/libXcodeDebuggerSupport.dylib (file not found).
Code under the Search Button click event:
- (IBAction) searchButton {
if([addressField.text length]==0)
{
UIAlertView *myAlert = [[[UIAlertView alloc] initWithTitle:@"Alert" message:@"Please Tap on 'Show Me' & choose the 'Radius' first!!!" delegate:self cancelButtonTitle:@"Ok" otherButtonTitles:nil] autorelease];
[myAlert show];
}
else
{
[NSThread detachNewThreadSelector:@selector(updateFilterProgress) toTarget:self withObject:nil];
appDelegate = (MapTutorialAppDelegate *)[[UIApplication sharedApplication] delegate];
CLLocationCoordinate2D location;
float radius = [[arrayNo objectAtIndex:[pickerView selectedRowInComponent:0]] floatValue];
NSString *url = [NSString stringWithFormat:@"http://....url...../hespdirectory/phpsqlsearch_genxml.php?lat=%f&lng=%f&radius=%f", locationManager.location.coordinate.latitude, locationManager.location.coordinate.longitude, radius];
NSLog(@"%@", url);
NSURL *URL = [NSURL URLWithString:url];
NSXMLParser *xmlParser = [[NSXMLParser alloc] initWithContentsOfURL:URL];
//Initialize the delegate.
XMLParser *parser = [[XMLParser alloc] initXMLParser];
//Set delegate
[xmlParser setDelegate:parser];
//Start parsing the XML file.
BOOL success = [xmlParser parse];
if(success)
{
if([appDelegate.markers count] == 0){
UIAlertView *myAlert = [[[UIAlertView alloc] initWithTitle:@"Alert" message:@"No results found!!!" delegate:self cancelButtonTitle:@"Ok" otherButtonTitles:nil] autorelease];
[myAlert show];
}
else
{
resultButton.userInteractionEnabled = YES;
for (int i = 0; i < [appDelegate.markers count]; i++)
{
marker *aMarker = [appDelegate.markers objectAtIndex:i];
location.latitude = [aMarker.lat floatValue];
location.longitude =[aMarker.lng floatValue];
AddressAnnotation *annob = [[AddressAnnotation alloc] initWithCoordinate:location];
annob.title = aMarker.name;
annob.subTitle = aMarker.address;
[mapView addAnnotation:annob];
[annob release];
CLLocationCoordinate2D ausLoc = {location.latitude,location.longitude}; //for zoom in the showroom results region
MKCoordinateSpan ausSpan = MKCoordinateSpanMake(0.108889, 0.169922);
MKCoordinateRegion ausRegion = MKCoordinateRegionMake(ausLoc, ausSpan);
NSLog(#"No Errors");
mapView.region = ausRegion;
}
}
}
else
NSLog(#"Error Error Error!!!");
[addressField resignFirstResponder];
}
}
And here is the NSThread method that shows the loading indicator while the data loads in the background:
- (void) updateFilterProgress{
NSAutoreleasePool *pool = [NSAutoreleasePool new];
Reachability *r = [Reachability reachabilityWithHostName:@"www.google.com"];
NetworkStatus internetStatus = [r currentReachabilityStatus];
if ((internetStatus != ReachableViaWiFi) && (internetStatus != ReachableViaWWAN))
{
UIAlertView *myAlert = [[[UIAlertView alloc] initWithTitle:@"No Internet Connection" message:@"This app requires an internet connection via WiFi or cellular network to work." delegate:self cancelButtonTitle:@"Ok" otherButtonTitles:nil] autorelease];
[myAlert show];
}
else{
UIAlertView *alertMe = [[[UIAlertView alloc] initWithTitle:@"Loading..." message:nil delegate:self cancelButtonTitle:nil otherButtonTitles:nil] autorelease];
[alertMe show];
UIActivityIndicatorView *indicator = [[UIActivityIndicatorView alloc] initWithActivityIndicatorStyle:UIActivityIndicatorViewStyleWhiteLarge];
// Adjust the indicator so it is up a few pixels from the bottom of the alert
indicator.center = CGPointMake(alertMe.bounds.size.width / 2, alertMe.bounds.size.height - 50);
[indicator startAnimating];
[alertMe addSubview:indicator];
[indicator release];
[alertMe release];
for (int i = 200; i > [appDelegate.markers count]; i--)
{
marker *aMarker = [appDelegate.markers objectAtIndex:i];
[alertMe dismissWithClickedButtonIndex:0 animated:YES];
}
}
[pool release];
}
Is there anything wrong in my code? Please correct me.
The variable alertMe is autoreleased, so you can't also send it a release message. Remove the line [alertMe release]; and it will run perfectly.
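A sketch of the corrected lifecycle, keeping the question's MRC style:
// alertMe is created autoreleased, so no explicit release is needed.
UIAlertView *alertMe = [[[UIAlertView alloc] initWithTitle:@"Loading..." message:nil delegate:self cancelButtonTitle:nil otherButtonTitles:nil] autorelease];
[alertMe show];
// ... configure and add the activity indicator as before ...
// (no [alertMe release] here; the autorelease pool cleans it up)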
You are probably accessing a released object. To see which one it is, set NSZombieEnabled; this will show you which already-released object you are trying to access, and you should be able to identify your problem.
Please have a look here to see how to enable zombies in Xcode 4:
http://42games.net/quick-note-on-setting-nszombieenabled-environment-variable-in-xcode-4/
I am trying to record and play video simultaneously. Is this possible with AVFoundation? Currently I am able to do it as long as I don't record audio. As soon as I add an audio input to the AVCaptureSession and restart the whole thing, I receive "AVCaptureSessionWasInterruptedNotification" and recording stops.
This is how I play video:
MPMoviePlayerController *moviePlayer = [[MPMoviePlayerController alloc] initWithContentURL:[NSURL fileURLWithPath:path]];
[moviePlayer.view setFrame:self.playerView.bounds];
moviePlayer.useApplicationAudioSession = NO;
self.player = moviePlayer;
[moviePlayer release];
[self.playerView addSubview:player.view];
[player play];
And this is how I record video:
NSError *error;
AVCamCaptureManager *captureManager = [[AVCamCaptureManager alloc] init];
if ([captureManager setupSessionWithPreset:AVCaptureSessionPresetLow error:&error])
{
[self setCaptureManager:captureManager];
AVCaptureVideoPreviewLayer *previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:[captureManager session]];
self.captureVideoPreviewLayer= previewLayer;
UIView *view = [self cameraView];
CALayer *viewLayer = [view layer];
[viewLayer setMasksToBounds:YES];
CGRect bounds = [view bounds];
[captureVideoPreviewLayer setFrame:bounds];
if ([captureVideoPreviewLayer isOrientationSupported])
[captureVideoPreviewLayer setOrientation:AVCaptureVideoOrientationPortrait];
[captureVideoPreviewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
[[captureManager session] startRunning];
[self setCaptureVideoPreviewLayer:captureVideoPreviewLayer];
if ([[captureManager session] isRunning])
{
[captureManager setOrientation:AVCaptureVideoOrientationPortrait];
[captureManager setDelegate:self];
[viewLayer insertSublayer:captureVideoPreviewLayer below:[[viewLayer sublayers] objectAtIndex:0]];
NSString *countString = [[NSString alloc] initWithFormat:@"%d", [[AVCaptureDevice devices] count]];
NSLog(@"Device count: %@", countString);
} else {
UIAlertView *alertView = [[UIAlertView alloc] initWithTitle:#"Failure"
message:#"Failed to start session."
delegate:nil
cancelButtonTitle:#"Okay"
otherButtonTitles:nil];
[alertView show];
[alertView release];
}
} else {
UIAlertView *alertView = [[UIAlertView alloc] initWithTitle:#"Input Device Init Failed"
message:[error localizedDescription]
delegate:nil
cancelButtonTitle:#"Okay"
otherButtonTitles:nil];
[alertView show];
[alertView release];
}
[captureManager release];
if (![[self captureManager] isRecording]) {
[[self captureManager] startRecording];
}
I am using "AVCamCaptureManager" from Apple's AVCam sample code.
First, set the moviePlayer to use the application audio session:
moviePlayer.useApplicationAudioSession=YES;
Then, before calling [[captureManager session] startRunning], activate an audio session with its category set to "play and record" and override its property to allow it to mix with others.
// Set audio session category to "play and record"
NSError* error = nil;
AVAudioSession* audioSession = [AVAudioSession sharedInstance];
if (![audioSession setCategory:AVAudioSessionCategoryPlayAndRecord error:&error]) {
NSLog(#"AVAudioSession setCategory failed: %#", [error localizedDescription]);
}
// Set audio session property "allow mixing" to true so audio can be recorded while it is playing
UInt32 allowMixing = true;
OSStatus status = AudioSessionSetProperty(kAudioSessionProperty_OverrideCategoryMixWithOthers, sizeof(allowMixing), &allowMixing);
if (status != kAudioSessionNoError) {
NSLog(#"AudioSessionSetProperty(kAudioSessionProperty_OverrideCategoryMixWithOthers) failed: %ld", status);
}
// Activate the audio session
error = nil;
if (![audioSession setActive:YES error:&error]) {
NSLog(#"AVAudioSession setActive:YES failed: %#", [error localizedDescription]);
}
In case anyone else is wondering: I got the approach that Wertesse proposed above to work, and it also works with an AVPlayer (which doesn't have the useApplicationAudioSession property); a sketch follows.
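A minimal sketch of the AVPlayer variant (the path variable and playerView are assumed from the question's code; the audio session setup above still applies):
// AVPlayer playback alongside the capture session. AVPlayerLayer retains
// its player, and the layer tree retains the sublayer, so both stay alive.
AVPlayer *player = [AVPlayer playerWithURL:[NSURL fileURLWithPath:path]];
AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:player];
playerLayer.frame = self.playerView.bounds;
[self.playerView.layer addSublayer:playerLayer];
[player play];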