Saving the video when the app is interrupted by a call - iPhone

I set up and use an AVAssetWriterInput -> AVAssetWriter chain to record a video. This works fine.
When the video recording is stopped normally, AVAssetWriter's finishWriting returns a status of Completed.
However, if the recording is interrupted by a call, the status after finishWriting is Failed and the video is not saved.
To fix this issue, the following line of code is used:
self.assetWriter.movieFragmentInterval = CMTimeMakeWithSeconds(1.0, 1000000000);
With this in place, even if the status says Failed, the video is saved.
Will this affect performance, since movie fragments are now written to the file at regular intervals during recording?
Also, is it right to hard-code values such as 1000000000?
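For reference, here is a sketch of where that line lives in the writer setup. The outputURL variable is an assumption, and NSEC_PER_SEC (which equals 1000000000) is used in place of the hard-coded timescale:
#import <AVFoundation/AVFoundation.h>

// Sketch: configure the writer to emit movie fragments every second so that a
// partial file remains playable if writing is interrupted (e.g. by a call).
NSError *error = nil;
AVAssetWriter *assetWriter = [AVAssetWriter assetWriterWithURL:outputURL // assumed destination URL
                                                      fileType:AVFileTypeQuickTimeMovie
                                                         error:&error];
// NSEC_PER_SEC == 1000000000; the named constant avoids the magic number.
assetWriter.movieFragmentInterval = CMTimeMakeWithSeconds(1.0, (int32_t)NSEC_PER_SEC);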
Thanks in advance for your time and help.

Related

For how long does AVPlayer continue to buffer from a URL after a pause?

I was reading the AVPlayer class documentation and I couldn't find the answer to my question.
I'm playing streamed audio from the Internet in my iPhone app, and I'd like to know whether, after a [myAVPlayer pause] invocation, myAVPlayer will keep downloading the audio file in the background for a long time.
If the user pushes the "Pause" button, invoking [myAVPlayer pause], and then leaves the app, will myAVPlayer keep downloading a large amount of data?
I'm concerned about this when the user is on a 3G network.
I am faced with the same question and have done some experimentation. My observations are only valid for video, but if they are also valid for audio, then AVPlayer will try to buffer around 30s of content when you press pause. If you have access to the webserver, you could run tcpdump/wireshark and see how long after you press pause that the server continues to send data.
You can manage how long AVPlayer continues to buffer.
You need to set the preferredForwardBufferDuration of the AVPlayer's currentItem. If you want to stop buffering, set the value to 1; if you set it to 0, the buffer duration will be chosen automatically:
self.avPlayer.currentItem.preferredForwardBufferDuration = 1;
From Apple's documentation: This property defines the preferred forward buffer duration in seconds. If set to 0, the player will choose an appropriate level of buffering for most use cases. Setting this property to a low value will increase the chance that playback will stall and re-buffer, while setting it to a high value will increase demand on system resources.
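As a usage sketch (assuming a self.avPlayer property and iOS 10 or later, where preferredForwardBufferDuration is available), you can clamp the buffer when pausing and restore automatic buffering on resume:
// Sketch: pause and stop aggressive look-ahead buffering (e.g. on 3G).
[self.avPlayer pause];
self.avPlayer.currentItem.preferredForwardBufferDuration = 1; // ~1 s of look-ahead
// Later, when the user resumes, restore automatic buffering:
self.avPlayer.currentItem.preferredForwardBufferDuration = 0; // 0 = automatic
[self.avPlayer play];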

How much video content is pre-buffered when using AVPlayer with HTTP Live Streaming, and can that value be changed?

I am writing a video app that plays streaming videos from the web, and I am using AVPlayer to do so. My question is: how do I find out how much video content is pre-buffered? In MPMoviePlayerController you can see the amount of buffered content on the UISlider. I would like to show the same using AVPlayer, and also be able to change the amount of pre-buffered content.
My ideal situation: a user is streaming a movie file in my app, and if he pauses playback, the movie keeps buffering, just like when you watch YouTube videos.
Please help!
Thank you.
You can see the amount of data that has been loaded and buffered ahead of the playhead by looking at the AVPlayerItem loadedTimeRanges property.
e.g.
AVPlayer *player; // an existing, playing AVPlayer instance
NSArray *loadedTimeRanges = player.currentItem.loadedTimeRanges;
NSLog(@"LoadedTimeRanges: %@", loadedTimeRanges);
In the case of my app I can see:
LoadedTimeRanges: (
"CMTimeRange: {{338070700809/1000000000 = 338.071}, {54651145016/1000000000 = 54.651, rounded}}"
)
where the second value (54.651) appears to be the amount of buffering that exists in front of the playhead. In the case of a stall this value decreases as playback continues until reaching approximately 0.
Between 55 and 60 seconds of pre-buffered content is all I've seen – you can only examine this value and cannot force the player to buffer any more data. You could, however, use this value to visually indicate the amount of data buffered to the user.
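To turn the loadedTimeRanges output above into a single number for a UI indicator, a helper along these lines works. This is a sketch: the method name is ours, and it assumes player.currentItem is non-nil:
#import <AVFoundation/AVFoundation.h>

// Sketch: seconds buffered ahead of the playhead, or 0 if unknown.
- (NSTimeInterval)bufferedSecondsAheadForPlayer:(AVPlayer *)player {
    AVPlayerItem *item = player.currentItem;
    CMTime current = item.currentTime;
    for (NSValue *value in item.loadedTimeRanges) {
        CMTimeRange range = [value CMTimeRangeValue];
        // Only the range containing the playhead matters for look-ahead.
        if (CMTimeRangeContainsTime(range, current)) {
            return CMTimeGetSeconds(CMTimeSubtract(CMTimeRangeGetEnd(range), current));
        }
    }
    return 0.0;
}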

How to get an AVCapture Timecode?

I'm working on a video capture app using the AVFoundation framework, based on Apple's AVCam sample. I'd like to implement functionality to set a maximum video length and have the capture stop automatically when this limit is reached (similar to UIImagePickerController.videoMaximumDuration).
I'm assuming I need to register for some notification while the capture is recording, and check the timestamp in that callback. I looked through the AV Foundation Programming Guide and did a bit of Googling, and I can't find a way to retrieve the elapsed time of an AVCaptureSession, AVCaptureMovieFileOutput, or AVCaptureSomethingElse.
Any insight would help. Thanks!
You can set maxRecordedDuration or maxRecordedFileSize on the AVCaptureMovieFileOutput. However, you need to make sure you handle the error correctly in the captureOutput:didFinishRecordingToOutputFileAtURL:fromConnections:error: delegate call to detect whether the recording stopped due to an error or due to reaching the max duration/file size.
Check the error code like this:
if ([error code] == AVErrorMaximumDurationReached) {
    [delegate captureSessionMaxDurationReached];
}
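For completeness, here is a sketch of both sides: capping the recording length and distinguishing "hit the limit" from a real failure in the delegate. The movieFileOutput variable and the 30-second limit are assumptions:
// Sketch: in your capture setup, cap recordings at 30 seconds (assumed limit).
AVCaptureMovieFileOutput *movieFileOutput = [[AVCaptureMovieFileOutput alloc] init];
movieFileOutput.maxRecordedDuration = CMTimeMakeWithSeconds(30.0, 600); // 600 is a common video timescale

// Then, in the delegate:
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput
didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL
      fromConnections:(NSArray *)connections
                error:(NSError *)error {
    BOOL recordedSuccessfully = YES;
    if (error) {
        // Reaching maxRecordedDuration is reported as an NSError, but the file
        // may still be complete; AVErrorRecordingSuccessfullyFinishedKey says so.
        NSNumber *finished = [[error userInfo] objectForKey:AVErrorRecordingSuccessfullyFinishedKey];
        if (finished != nil) {
            recordedSuccessfully = [finished boolValue];
        }
    }
    if (recordedSuccessfully) {
        // Save or upload the movie at outputFileURL here.
    }
}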

Recording Chunked Audio in Background iOS

I am trying to figure out a way to record audio in the background of an iOS application and have it streamed to the server.
I have pretty much got this working for when the app is in the foreground. I use AVAudioRecorder to record input for X seconds. Once I get the callback that this has finished, I record for another X seconds. Each recording session gets stored to a different file and I send these files asynchronously to the server.
However, this doesn't seem to work while my app goes into background mode.
When going into the background, the current recording session continues until the X seconds are up; however, my app gets suspended before I can start another recording session.
Any ideas?
Here's the code for my callback:
- (void)audioRecorderDidFinishRecording:(AVAudioRecorder *)aRecorder successfully:(BOOL)flag {
    NSLog(@"hello");
    // Start the next fixed-length chunk as soon as the previous one finishes.
    [self initRecorder];
    [recorder recordForDuration:5];
}
You can't restart recording in the background.
So use the Audio Queue or Audio Unit RemoteIO APIs instead, which will give you smaller "chunks" (callback buffer blocks) of audio without stopping the audio recording.
Concatenate small audio callback chunks into larger file chunks if needed for your network protocol.
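A minimal sketch of the Audio Queue approach, assuming the queue was created with AudioQueueNewInput and the destination file with AudioFileCreateWithURL; MyRecorderState and MyInputCallback are our names, not API:
#include <AudioToolbox/AudioToolbox.h>

typedef struct {
    AudioFileID file;      // destination file opened with AudioFileCreateWithURL
    SInt64      packetPos; // next packet index to write
} MyRecorderState;

// Input callback: called with each small chunk of captured audio; recording
// never stops between chunks, unlike back-to-back AVAudioRecorder sessions.
static void MyInputCallback(void *inUserData,
                            AudioQueueRef inQueue,
                            AudioQueueBufferRef inBuffer,
                            const AudioTimeStamp *inStartTime,
                            UInt32 inNumPackets,
                            const AudioStreamPacketDescription *inPacketDesc) {
    MyRecorderState *state = (MyRecorderState *)inUserData;
    if (inNumPackets > 0) {
        // Append this chunk to the file (or hand it to your network layer).
        AudioFileWritePackets(state->file, false, inBuffer->mAudioDataByteSize,
                              inPacketDesc, state->packetPos, &inNumPackets,
                              inBuffer->mAudioData);
        state->packetPos += inNumPackets;
    }
    // Return the buffer to the queue for reuse.
    AudioQueueEnqueueBuffer(inQueue, inBuffer, 0, NULL);
}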
Background audio playing is supported with multitasking, but it's not very clear that background audio recording is; however, I have not tried it. The Audio Unit API might let you continue to record audio while the application is in the background. However, this is kind of a trick, and I imagine it might get pulled out at some point.

Record and play the same file

I am trying to record audio using AVAudioRecorder. My problem is that I want to play the file while it is being recorded. Is there any way I can do this? I have set the category to PlayAndRecord, and I have created the file for recording.
I have tried initializing both the recorder and the player with the same URL and playing the file after starting the recording, but it seems that only one chunk of the file gets played.
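For reference, the session setup the question describes looks roughly like this (a sketch; error handling abbreviated):
#import <AVFoundation/AVFoundation.h>

// Sketch: the PlayAndRecord category mentioned above, activated before use.
NSError *error = nil;
AVAudioSession *session = [AVAudioSession sharedInstance];
[session setCategory:AVAudioSessionCategoryPlayAndRecord error:&error];
[session setActive:YES error:&error];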
I seriously doubt you can do this with AVAudioRecorder and AVAudioPlayer convenience classes.
The AVAudioPlayer assumes that the file it plays is static and complete. For example, it has no method to dynamically update the track length; it reads the track length once at startup, and that is why only the initial chunk of the file gets played. Any modifications made to the file after the AVAudioPlayer instance opens it are ignored.
You'll need to set up a custom audio queue if you want to get fancy.
As an aside, how do you plan on getting around the feedback problem caused by recording what you are playing? You'll have nothing but a shrieking squeal in very short order.