The Apple docs seem to indicate that while recording video to a file, the app can change the URL on the fly with no problem. But I'm seeing a problem. When I try this, the recording delegate gets called with an error...
The operation couldn’t be completed. (OSStatus error -12780.) Info dictionary is: {
    AVErrorRecordingSuccessfullyFinishedKey = 0;
}
(funky single quote in "couldn't" comes from logging [error localizedDescription])
Here's the code, which is basically a few tweaks to the WWDC10 AVCam sample:
1) Start recording. Start timer to change the output URL every few seconds
- (void) startRecording
{
    // start the chunk timer
    self.chunkTimer = [NSTimer scheduledTimerWithTimeInterval:5
                                                       target:self
                                                     selector:@selector(chunkTimerFired:)
                                                     userInfo:nil
                                                      repeats:YES];

    AVCaptureConnection *videoConnection = [AVCamCaptureManager connectionWithMediaType:AVMediaTypeVideo fromConnections:[[self movieFileOutput] connections]];
    if ([videoConnection isVideoOrientationSupported]) {
        [videoConnection setVideoOrientation:[self orientation]];
    }

    if ([[UIDevice currentDevice] isMultitaskingSupported]) {
        [self setBackgroundRecordingID:[[UIApplication sharedApplication] beginBackgroundTaskWithExpirationHandler:^{}]];
    }

    NSURL *fileUrl = [[ChunkManager sharedInstance] nextURL];
    NSLog(@"now recording to %@", [fileUrl absoluteString]);
    [[self movieFileOutput] startRecordingToOutputFileURL:fileUrl recordingDelegate:self];
}
2) When the timer fires, change the output file name without stopping recording
- (void)chunkTimerFired:(NSTimer *)aTimer {
    if ([[UIDevice currentDevice] isMultitaskingSupported]) {
        [self setBackgroundRecordingID:[[UIApplication sharedApplication] beginBackgroundTaskWithExpirationHandler:^{}]];
    }

    NSURL *nextUrl = [self nextURL];
    NSLog(@"changing capture output to %@", [[nextUrl absoluteString] lastPathComponent]);
    [[self movieFileOutput] startRecordingToOutputFileURL:nextUrl recordingDelegate:self];
}
Note: [self nextURL] generates file urls like file-0.mov, file-5.mov, file-10.mov and so on.
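For illustration only, a hypothetical nextURL along those lines might look like the sketch below (this is not the question's actual implementation; chunkIndex is an assumed ivar):

- (NSURL *)nextURL {
    // hypothetical helper: file-0.mov, file-5.mov, file-10.mov, ... in the temp directory
    NSString *name = [NSString stringWithFormat:@"file-%d.mov", chunkIndex];
    chunkIndex += 5;
    return [NSURL fileURLWithPath:[NSTemporaryDirectory() stringByAppendingPathComponent:name]];
}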
3) This gets called each time the file changes, and every other invocation is an error...
- (void) captureOutput:(AVCaptureFileOutput *)captureOutput
didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL
       fromConnections:(NSArray *)connections
                 error:(NSError *)error
{
    id delegate = [self delegate];
    if (error && [delegate respondsToSelector:@selector(someOtherError:)]) {
        NSLog(@"got an error, tell delegate");
        [delegate someOtherError:error];
    }

    if ([self backgroundRecordingID]) {
        if ([[UIDevice currentDevice] isMultitaskingSupported]) {
            [[UIApplication sharedApplication] endBackgroundTask:[self backgroundRecordingID]];
        }
        [self setBackgroundRecordingID:0];
    }

    if ([delegate respondsToSelector:@selector(recordingFinished)]) {
        [delegate recordingFinished];
    }
}
When this runs, file-0 gets written, then we see error -12780 right after changing the url to file-5, file-10 gets written, then an error, then okay, and so on.
It appears that changing the URL on the fly doesn't work; the failed attempt does stop the writing, though, which is why the next URL change succeeds.
Thanks all, for the review and good thoughts on this. Here's the word from Apple DTS...
I spoke with our AV Foundation engineers, and it is definitely a bug
in that this method is not doing what the documentation says it should
("You do not need to call stopRecording before calling this method
while another recording is in progress."). Please file a bug report
using the Apple Bug Reporter (http://developer.apple.com/bugreporter/)
so the team can investigate. Make sure and include your minimal
project in the report.
I've filed this with Apple as bug 11632087
The docs state:
If a file at the given URL already exists when capturing starts, recording to the new file will fail.
Are you sure you're checking that nextUrl points to a file that doesn't already exist?
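If not, a minimal sketch of the check, assuming the file URLs from the question's nextURL helper:

// AVCaptureMovieFileOutput fails if a file already exists at the target URL,
// so remove any stale file before starting the recording.
NSURL *nextUrl = [self nextURL];
NSFileManager *fileManager = [NSFileManager defaultManager];
if ([fileManager fileExistsAtPath:[nextUrl path]]) {
    NSError *removeError = nil;
    [fileManager removeItemAtURL:nextUrl error:&removeError];
}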
According to the documentation, two consecutive calls to startRecordingToOutputFileURL: are not supported on iOS. You can read about it here:
In iOS, this frame accurate file switching is not supported. You must call stopRecording before calling this method again to avoid any errors.
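Given that constraint, one workaround is to stop first and kick off the next chunk from the delegate callback. A hedged sketch reusing the question's movieFileOutput and nextURL (frames that arrive between the stop and the restart are dropped, so the chunks won't be gapless):

// Stop on each timer tick instead of switching URLs mid-recording...
- (void)chunkTimerFired:(NSTimer *)aTimer {
    [[self movieFileOutput] stopRecording];
}

// ...and begin the next chunk once the previous file has finished writing.
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput
didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL
       fromConnections:(NSArray *)connections
                 error:(NSError *)error
{
    // handle the finished chunk here, then start the next one
    [[self movieFileOutput] startRecordingToOutputFileURL:[self nextURL]
                                        recordingDelegate:self];
}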
I'm trying to make a simple application that will scan for nearby Bluetooth devices and list their names as they are discovered. I'm using CoreBluetooth in accordance with every guide I've found, including Apple's guide here
However, it never works. I put an iPhone 4S in discoverable mode next to the iPhone 5 running the app, and it never discovers it. I also tried a Bluetooth-enabled car, but I don't know if it has BLE. What am I doing wrong? Here is the essence of my code, in ViewController.m
- (void)viewDidLoad
{
    [super viewDidLoad];
    [activity stopAnimating]; isScanning = NO; //activity is a GUI activity wheel
    centralManager = [[CBCentralManager alloc] initWithDelegate: self queue: nil];
}
- (void)centralManagerDidUpdateState:(CBCentralManager *)central {
    int state = central.state;
    [self log: [NSString stringWithFormat: @"CBCentralManagerDidUpdateState: %d", state]];
    //[self log] just NSLogs the message and adds it to a text view for the user to see.
    if (state != CBCentralManagerStatePoweredOn) {
        [self log: @"Error! Bluetooth not powered on!"]; //I never get this error.
    }
}

- (void)centralManager:(CBCentralManager *)central didDiscoverPeripheral:(CBPeripheral *)peripheral advertisementData:(NSDictionary *)advertisementData RSSI:(NSNumber *)RSSI {
    [self log: [NSString stringWithFormat: @"Peripheral found with CoreBluetooth: %@", peripheral.name]];
    //And I never see any of these "peripheral found" messages.
}
- (IBAction)scanButton:(id)sender {
    if (!isScanning) {
        [activity startAnimating];
        isScanning = YES;
        [centralManager scanForPeripheralsWithServices:nil options:nil];
        [self log: @"Scanning started."];
    }
    else {
        [activity stopAnimating];
        isScanning = NO;
        [centralManager stopScan];
        [self log: @"Scanning stopped."];
    }
}
Thanks for any suggestions.
I found an answer here: Can't seem to get core bluetooth to work
I need an iOS device in peripheral mode or a BLE peripheral. Very annoying because basically no peripherals use BLE.
It works with my brother's iPhone 4S running a free app from the App Store called LightBlue. The app lets you put the device in peripheral mode... it's kind of the developer to put out a nice testing app like this.
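For reference, a minimal sketch of putting a second iOS device into peripheral mode yourself with CBPeripheralManager (requires iOS 6+; the class name and local name below are arbitrary):

#import <CoreBluetooth/CoreBluetooth.h>

@interface Advertiser : NSObject <CBPeripheralManagerDelegate>
@property (strong, nonatomic) CBPeripheralManager *peripheralManager;
@end

@implementation Advertiser

- (id)init {
    if ((self = [super init])) {
        _peripheralManager = [[CBPeripheralManager alloc] initWithDelegate:self queue:nil];
    }
    return self;
}

// Advertising can only start once the stack reports powered-on.
- (void)peripheralManagerDidUpdateState:(CBPeripheralManager *)peripheral {
    if (peripheral.state == CBPeripheralManagerStatePoweredOn) {
        [peripheral startAdvertising:@{ CBAdvertisementDataLocalNameKey : @"TestPeripheral" }];
    }
}

@end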
I am building an iOS application (my first) that processes video still frames on the fly. To dive into this, I followed an example from the AV* documentation from Apple.
The process involves setting up an input (the camera) and an output. The output works with a delegate, which in this case is the controller itself (it conforms and implements the method needed).
The problem I am having is that the delegate method never gets called. The code below is the implementation of the controller and it has a couple of NSLogs. I can see the "started" message, but the "delegate method called" never shows.
This code is all within a controller that implements the "AVCaptureVideoDataOutputSampleBufferDelegate" protocol.
- (void)viewDidLoad {
    [super viewDidLoad];

    // Initialize AV session
    AVCaptureSession *session = [AVCaptureSession new];
    if ([[UIDevice currentDevice] userInterfaceIdiom] == UIUserInterfaceIdiomPhone)
        [session setSessionPreset:AVCaptureSessionPreset640x480];
    else
        [session setSessionPreset:AVCaptureSessionPresetPhoto];

    // Initialize back camera input
    AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];
    if ([session canAddInput:input]) {
        [session addInput:input];
    }

    // Initialize image output
    AVCaptureVideoDataOutput *output = [AVCaptureVideoDataOutput new];
    NSDictionary *rgbOutputSettings = [NSDictionary dictionaryWithObject:
        [NSNumber numberWithInt:kCMPixelFormat_32BGRA] forKey:(id)kCVPixelBufferPixelFormatTypeKey];
    [output setVideoSettings:rgbOutputSettings];
    [output setAlwaysDiscardsLateVideoFrames:YES]; // discard if the data output queue is blocked (as we process the still image)
    //[output addObserver:self forKeyPath:@"capturingStillImage" options:NSKeyValueObservingOptionNew context:@"AVCaptureStillImageIsCapturingStillImageContext"];
    videoDataOutputQueue = dispatch_queue_create("VideoDataOutputQueue", DISPATCH_QUEUE_SERIAL);
    [output setSampleBufferDelegate:self queue:videoDataOutputQueue];
    if ([session canAddOutput:output]) {
        [session addOutput:output];
    }
    [[output connectionWithMediaType:AVMediaTypeVideo] setEnabled:YES];

    [session startRunning];
    NSLog(@"started");
}
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    NSLog(@"delegate method called");
    CGImageRef cgImage = [self imageFromSampleBuffer:sampleBuffer];
    self.theImage.image = [UIImage imageWithCGImage: cgImage ];
    CGImageRelease( cgImage );
}
Note: I'm building with iOS 5.0 as a target.
Edit:
I've found a question that, although asking for a solution to a different problem, is doing exactly what my code is supposed to do. I've copied the code from that question verbatim into a blank Xcode app and added NSLogs to the captureOutput function, and it doesn't get called. Is this a configuration issue? Is there something I'm missing?
Your session is a local variable. Its scope is limited to viewDidLoad. Since this is a new project, I assume it's safe to say that you're using ARC. In that case the object won't leak and thereby continue to live as it did in the linked question; rather, the compiler will ensure the object is deallocated before viewDidLoad exits.
Hence your session isn't running, because it no longer exists.
(Aside: the self.theImage.image = ... is unsafe since it performs a UIKit action off the main queue; you probably want to dispatch_async that over to dispatch_get_main_queue().)
So, sample corrections:
@implementation YourViewController
{
    AVCaptureSession *session;
}

- (void)viewDidLoad {
    [super viewDidLoad];

    // Initialize AV session
    session = [AVCaptureSession new];
    if ([[UIDevice currentDevice] userInterfaceIdiom] == UIUserInterfaceIdiomPhone)
        [session setSessionPreset:AVCaptureSessionPreset640x480];
    else
        /* ... etc ... */
}
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    NSLog(@"delegate method called");
    CGImageRef cgImage = [self imageFromSampleBuffer:sampleBuffer];
    dispatch_sync(dispatch_get_main_queue(),
    ^{
        self.theImage.image = [UIImage imageWithCGImage: cgImage ];
        CGImageRelease( cgImage );
    });
}
Most people advocate using an underscore at the beginning of instance variable names nowadays, but I omitted it for simplicity. You can use Xcode's built-in refactor tool to fix that up after you've verified that the diagnosis is correct.
I moved the CGImageRelease inside the block sent to the main queue to ensure its lifetime extends beyond its capture into a UIImage. I'm not immediately able to find any documentation to confirm that CoreFoundation objects have their lifetime automatically extended when captured in a block.
I've found one more reason why the didOutputSampleBuffer delegate method may not be called: the save-to-file and sample-buffer output connections are mutually exclusive. In other words, if your session already has an AVCaptureMovieFileOutput and you then add an AVCaptureVideoDataOutput, only the AVCaptureFileOutputRecordingDelegate methods are called.
Just for reference: I couldn't find an explicit description of this limitation anywhere in the AV Foundation framework documentation, but Apple support confirmed it a few years ago, as noted in this SO answer.
One way to solve the problem is to remove the AVCaptureMovieFileOutput entirely and manually write the recorded frames to a file in the didOutputSampleBuffer delegate method, alongside your custom buffer-data processing. You may find these two SO answers useful.
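As a rough sketch of that approach, assuming an AVAssetWriter and AVAssetWriterInput (here self.writer and self.writerInput) were configured elsewhere with the desired output URL and settings:

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    // lazily start the writer on the first buffer so the session timeline lines up
    if (self.writer.status == AVAssetWriterStatusUnknown) {
        CMTime startTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
        [self.writer startWriting];
        [self.writer startSessionAtSourceTime:startTime];
    }
    if (self.writerInput.isReadyForMoreMediaData) {
        [self.writerInput appendSampleBuffer:sampleBuffer];
    }
    // ... custom buffer processing continues here ...
}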
In my case, the problem was there because I called

if ([_session canAddOutput:_videoDataOutput])
    [_session addOutput:_videoDataOutput];

before calling

[_session startRunning];

I simply moved the addOutput: call to after startRunning.
Hope it helps somebody.
My captureOutput function was not called either, and the accepted answer did not exactly point at my problem, as my session was already an instance variable.
BUT, the dispatch queue for my video frames was local, and the dispatch queue must ALSO be an instance variable. I don't quite understand why this should be necessary; perhaps the underlying AVCapture code only keeps a weak pointer to it?
The documentation is very confusing on this.
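In other words, something along these lines (a sketch; the names are illustrative):

@implementation MyViewController {
    // both must outlive viewDidLoad, so keep them as ivars
    AVCaptureSession *_session;
    dispatch_queue_t _videoDataOutputQueue;
}

// ... session setup as in the accepted answer ...

@end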
I use ZXing for an app; this is mostly the same code as the original ZXing code, except that I allow scanning several times in a row (i.e., the ZXingWidgetController is not necessarily dismissed as soon as something is detected).
I experience a long, long freeze (sometimes it never ends) when I press the dismiss button, which calls
- (void)cancelled {
    // if (!self.isStatusBarHidden) {
    //     [[UIApplication sharedApplication] setStatusBarHidden:NO];
    // }
    [self stopCapture];

    wasCancelled = YES;
    if (delegate != nil) {
        [delegate zxingControllerDidCancel:self];
    }
}
with
- (void)stopCapture {
    decoding = NO;
#if HAS_AVFF
    if ([captureSession isRunning]) [captureSession stopRunning];

    AVCaptureInput* input = [captureSession.inputs objectAtIndex:0];
    [captureSession removeInput:input];
    AVCaptureVideoDataOutput* output = (AVCaptureVideoDataOutput*)[captureSession.outputs objectAtIndex:0];
    [captureSession removeOutput:output];
    [self.prevLayer removeFromSuperlayer];

    /*
    // heebee jeebees here ... is iOS still writing into the layer?
    if (self.prevLayer) {
        AVCaptureVideoPreviewLayer* layer = prevLayer;
        layer.session = nil;
        [self.prevLayer retain];
        dispatch_after(dispatch_time(DISPATCH_TIME_NOW, 12000000000), dispatch_get_main_queue(), ^{
            [layer release];
        });
    }
    */

    self.prevLayer = nil;
    self.captureSession = nil;
#endif
}
(Please note that the dismissModalViewController that removes the view is inside the delegate method.)
I experience the freeze only while dismissing, only if I made several scans in a row, and only with an iPhone 4 (no freeze with a 4S).
Any idea?
Cheers
Rom
According to the AV Cam View Controller Example, calling startRunning or stopRunning does not return until the session completes the requested operation. Since you are sending these messages to the session on the main thread, the whole UI freezes until the requested operation completes. What I would recommend is that you wrap your calls in an asynchronous dispatch so that the view does not lock up.
- (void)cancelled
{
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        [self stopCapture];
    });

    //You might want to think about putting the following in another method
    //and calling it when the stop capture method finishes (see the sketch below)
    wasCancelled = YES;
    if (delegate != nil) {
        [delegate zxingControllerDidCancel:self];
    }
}
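For the sequencing the comment above mentions, a minimal sketch (assuming wasCancelled and delegate are ivars, as in the question):

- (void)cancelled
{
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        [self stopCapture];
        // hop back to the main queue once capture has fully stopped
        dispatch_async(dispatch_get_main_queue(), ^{
            wasCancelled = YES;
            if (delegate != nil) {
                [delegate zxingControllerDidCancel:self];
            }
        });
    });
}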
There is a way to add an image to the lock screen for Background Audio, along with setting the Track and Artist name. It was also mentioned in a WWDC 2011 video, but nothing specific to go off of. I have looked everywhere in the docs and cannot find it. I know it is an iOS5 only thing, and Spotify's newest version has this feature. Does anyone know where they can point me in the right direction?
Thank You,
Matthew
Here's an answer I found for you:
(1) You must handle remote control events. You can't be the Now Playing app unless you do. (See the AudioMixer (MixerHost) sample code.)
(2) Set the Now Playing info:
MPNowPlayingInfoCenter *infoCenter = [MPNowPlayingInfoCenter defaultCenter];
infoCenter.nowPlayingInfo =
    [NSDictionary dictionaryWithObjectsAndKeys:@"my title", MPMediaItemPropertyTitle,
                                               @"my artist", MPMediaItemPropertyArtist,
                                               nil];
This is independent of whichever API you are using to play audio or video.
As per Michael's answer above, simply append

@{MPMediaItemPropertyArtwork: [[MPMediaItemArtwork alloc] initWithImage:[UIImage ...]]}

to the nowPlayingInfo dict.
The full set of available keys is:
// MPMediaItemPropertyAlbumTitle
// MPMediaItemPropertyAlbumTrackCount
// MPMediaItemPropertyAlbumTrackNumber
// MPMediaItemPropertyArtist
// MPMediaItemPropertyArtwork
// MPMediaItemPropertyComposer
// MPMediaItemPropertyDiscCount
// MPMediaItemPropertyDiscNumber
// MPMediaItemPropertyGenre
// MPMediaItemPropertyPersistentID
// MPMediaItemPropertyPlaybackDuration
// MPMediaItemPropertyTitle
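Putting the pieces together, a small example (the title, artist, duration, and image name are placeholders):

UIImage *artworkImage = [UIImage imageNamed:@"cover"]; // placeholder image
MPMediaItemArtwork *artwork = [[MPMediaItemArtwork alloc] initWithImage:artworkImage];
[MPNowPlayingInfoCenter defaultCenter].nowPlayingInfo = @{
    MPMediaItemPropertyTitle : @"my title",
    MPMediaItemPropertyArtist : @"my artist",
    MPMediaItemPropertyPlaybackDuration : @(240.0),
    MPMediaItemPropertyArtwork : artwork
};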
To make controls work....
- (BOOL)canBecomeFirstResponder {
    return YES;
}

- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    [[UIApplication sharedApplication] beginReceivingRemoteControlEvents];
    [self becomeFirstResponder];
}

- (void)viewWillDisappear:(BOOL)animated {
    [[UIApplication sharedApplication] endReceivingRemoteControlEvents];
    [self resignFirstResponder];
    [super viewWillDisappear:animated];
}

- (void)remoteControlReceivedWithEvent:(UIEvent *)receivedEvent {
    if (receivedEvent.type == UIEventTypeRemoteControl) {
        switch (receivedEvent.subtype) {
            case UIEventSubtypeRemoteControlPlay:
                [player play];
                break;
            case UIEventSubtypeRemoteControlPause:
                [player pause];
                break;
            case UIEventSubtypeRemoteControlTogglePlayPause:
                if (player.playbackState == MPMoviePlaybackStatePlaying) {
                    [player pause];
                }
                else {
                    [player play];
                }
                break;
            default:
                break;
        }
    }
}
It only works on a real iOS device, not on the simulator.
I've been trying to solve this problem for several days; now I have to ask you...
I've got a view (and a ViewController) with a UITableView. There is a TableViewController for that table, which is generated in the ViewController. The TableViewController calls a DataSyncManager shared-instance object (which is obviously in a separate class), which starts to sync data with the server.
I do it this way (first the refresh method):
- (void)refresh {
    [serverQueueProgressView setProgress:0.0];
    [syncingLabel setAlpha:0.5];
    [serverQueueProgressView setAlpha:1];
    [self performSelector:@selector(reloadTableViewDataSource) withObject:nil afterDelay:1.0];
}
Then the method reloadTableViewDataSource (of TableViewController) is called:
- (void)reloadTableViewDataSource
{
    [dataSyncManager getEntriesFromServer];
}
dataSyncManager is my sharedInstance.
In the getEntriesFromServer method of dataSyncManager, I loop over the different sync items and each time call
[[NSNotificationCenter defaultCenter]
    postNotificationName:@"ServerQueueProgress"
                  object:progress];
with the proper progress as an NSNumber (that part works well). The message is now sent and caught by my ViewController (it works, I checked with a breakpoint; it also gets the right progress NSNumber and converts it to float):
- (void)serverQueueProgress:(NSNotification *)notification {
    if (![NSThread isMainThread])
    {
        [self performSelectorOnMainThread:_cmd withObject:notification waitUntilDone:NO];
        return;
    }
    [queueProgressView setProgress:[[notification object] floatValue]];
}
This is one solution I found here on Stack Overflow. But the if branch is always skipped, because obviously I'm already on the main thread.
Unfortunately the UIProgressView doesn't get updated; it just hangs around, even though I connected it properly in Interface Builder (I checked that by setting the progress in another method of the ViewController).
I also tried to catch the notification with my TableViewController and put in some other solutions, but no luck; the UIProgressView doesn't get updated live, only after the sync is done.
Here is the mentioned code in the TableViewController, which also gets executed without errors (I also stepped through it to make sure every line gets executed):
This is the method called when the notification is received:
- (void)serverQueueProgress:(NSNotification *)notification {
    [self performSelectorOnMainThread:@selector(updateProgress:) withObject:[notification object] waitUntilDone:NO];
    [serverQueueProgressView setProgress:[[notification object] floatValue]];
}
Which also calls updateProgress: of the same class:
- (void)updateProgress:(NSNumber *)newProgressValue {
    [serverQueueProgressView setProgress:[newProgressValue floatValue]];
}
No chance. I tried many ways and implemented some in parallel as you see, but the ProgressView won't get updated live. Only at the end of syncing. What am I doing wrong??
EDIT: Here is my getEntriesFromServer and some other stuff in DataSyncManager:
- (void)getEntriesFromServer
{
    [[NSNotificationCenter defaultCenter]
        postNotificationName:@"SynchingStarted"
                      object:nil];
    [self completeServerQueue];
    ...
}
and completeServerQueue is the function which sends messages to my ViewController with the proper progress float value (it's only a dummy for loop, which gets executed properly... I've checked it):
- (void)completeServerQueue {
    NSNumber *progress = [[NSNumber alloc] init];
    for (int i = 0; i < 15; i++) {
        progress = [[NSNumber alloc] initWithFloat:(100/15*i)];
        [[NSNotificationCenter defaultCenter]
            postNotificationName:@"ServerQueueProgress"
                          object:progress];
        sleep(1);
    }
}
Also, when you're having trouble, break the problem down a bit. Instead of:

[serverQueueProgressView setProgress:[[notification object] floatValue]];

do this:

float prog = [[notification object] floatValue];
[serverQueueProgressView setProgress:prog];

Then debugging would give you a clue as to which part isn't working.
My guess is the problem isn't the code you've shown here, but other code in getEntriesFromServer. Are you using NSURLConnection? Something like:
NSURLConnection *connection = [[NSURLConnection alloc] initWithRequest:request delegate:self];
then you will get callbacks asynchronously that you can use to update your progress view.
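A minimal sketch of that pattern (expectedLength and receivedLength are assumed ivars; these are standard NSURLConnection delegate callbacks, delivered on the thread that started the connection):

- (void)connection:(NSURLConnection *)connection didReceiveResponse:(NSURLResponse *)response {
    expectedLength = [response expectedContentLength]; // may be NSURLResponseUnknownLength
    receivedLength = 0;
}

- (void)connection:(NSURLConnection *)connection didReceiveData:(NSData *)data {
    receivedLength += [data length];
    if (expectedLength > 0) {
        [queueProgressView setProgress:(float)receivedLength / (float)expectedLength];
    }
}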