I added KVO to my AVPlayer to know when to play the video (queuePlayer is an AVPlayer):
[self.queuePlayer addObserver:self forKeyPath:@"status" options:0 context:NULL];
The observer method:
-(void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context {
    if ([keyPath isEqualToString:@"status"]) {
        if (self.queuePlayer.status == AVPlayerStatusReadyToPlay) {
            NSInteger step = (NSInteger)(startTimeForVideo/0.04);
            [self.queuePlayer.currentItem stepByCount:step];
            //CMTime seekTime = CMTimeMake(startTimeForVideo*timeScale, timeScale);
            //if (CMTIME_IS_VALID(seekTime))
            //    [self.queuePlayer seekToTime:seekTime toleranceBefore:kCMTimePositiveInfinity toleranceAfter:kCMTimePositiveInfinity];
            //else
            //    NSLog(@"Invalid time");
            [self.queuePlayer play];
        } else if (self.queuePlayer.status == AVPlayerStatusFailed) {
            /* An error was encountered */
        }
    }
}
Here startTimeForVideo is the initial playback time for the video.
Neither seekToTime: nor stepByCount: is working.
EDIT: The values passed to these methods are correct, but still no luck.
Changes in the KVO method (queuePlayer is an AVPlayer):
if (self.queuePlayer.status == AVPlayerStatusReadyToPlay)
{
    // First build a CMTime; startTimeForVideo is a Float64 value in seconds.
    // Use the video's own time scale, since CMTime carries one.
    int32_t timeScale = self.queuePlayer.currentItem.asset.duration.timescale;
    CMTime seekTime = CMTimeMakeWithSeconds(startTimeForVideo, timeScale);
    // Use this CMTime for seekToTime:.
    [self.queuePlayer seekToTime:seekTime toleranceBefore:kCMTimeZero toleranceAfter:kCMTimeZero];
}
EDIT: Passing kCMTimeZero in both tolerance fields of the seekToTime:toleranceBefore:toleranceAfter: method gives an exact seek location.
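Putting the pieces together, a minimal sketch of the working pattern (assuming, as above, that startTimeForVideo is a Float64 number of seconds):

- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object
                        change:(NSDictionary *)change context:(void *)context
{
    if ([keyPath isEqualToString:@"status"] &&
        self.queuePlayer.status == AVPlayerStatusReadyToPlay) {
        // Build the CMTime with the asset's own timescale.
        int32_t timeScale = self.queuePlayer.currentItem.asset.duration.timescale;
        CMTime seekTime = CMTimeMakeWithSeconds(startTimeForVideo, timeScale);
        // Zero tolerance on both sides forces an accurate (not approximate) seek.
        [self.queuePlayer seekToTime:seekTime
                     toleranceBefore:kCMTimeZero
                      toleranceAfter:kCMTimeZero];
        [self.queuePlayer play];
    }
}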
AFTER tapping to take a picture, I want to lock the exposure and turn off the torch as soon as the exposure is no longer adjusting. So, I added an observer to handle adjustingExposure:
- (IBAction)configureImageCapture:(id)sender
{
    [self.session beginConfiguration];
    [self.cameraController device:self.inputDevice exposureMode:AVCaptureExposureModeAutoExpose];
    [self.cameraController device:self.inputDevice torchMode:AVCaptureTorchModeOn torchLevel:0.8f];
    [self.session commitConfiguration];
    [(AVCaptureDevice *)self.inputDevice addObserver:self forKeyPath:@"adjustingExposure" options:NSKeyValueObservingOptionNew context:MyAdjustingExposureObservationContext];
}
Here is the observeValueForKeyPath method:
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context
{
    if (context == MyAdjustingExposureObservationContext) {
        if ([keyPath isEqualToString:@"adjustingExposure"])
        {
            BOOL adjustingExposure = [[change objectForKey:NSKeyValueChangeNewKey] isEqualToNumber:[NSNumber numberWithInt:1]];
            if (!adjustingExposure)
            {
                [(AVCaptureDevice *)self.cameraController.inputDevice removeObserver:self forKeyPath:@"adjustingExposure"];
                if ([self.inputDevice isExposureModeSupported:AVCaptureExposureModeLocked]) {
                    dispatch_async(dispatch_get_main_queue(),
                    ^{
                        NSError *error = nil;
                        if ([self.inputDevice lockForConfiguration:&error]) {
                            // 5) lock the exposure
                            [self.cameraController device:self.inputDevice exposureMode:AVCaptureExposureModeLocked];
                            // 6) turn off the Torch
                            [self.cameraController device:self.inputDevice torchMode:AVCaptureTorchModeOn torchLevel:0.0001f];
                            [self.inputDevice unlockForConfiguration];
                        }
                    });
                }
            }
        }
    } else {
        [super observeValueForKeyPath:keyPath ofObject:object change:change context:context];
    }
}
@user3115647 posted this information, which is exactly what I am trying to do.
But my picture is taken BEFORE the torch is turned off.
Here is my captureStillImageAsynchronouslyFromConnection:completionHandler: call. The completion block runs after the image is taken. The observeValueForKeyPath: method is supposed to fire while the camera is adjusting exposure, BEFORE the image is taken. But the torch is not going low BEFORE the image is taken. Either this is a timing issue or I'm not setting up the camera configuration correctly.
- (void)captureImage
{
    // configureImageCapture has already been done
    [self.stillImageOutput captureStillImageAsynchronouslyFromConnection:self.captureConnection completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error)
    {
        if (imageSampleBuffer != NULL)
        {
            // Log the image properties
            CFDictionaryRef attachmentsRef = CMGetAttachment(imageSampleBuffer, kCGImagePropertyExifDictionary, NULL);
            NSDictionary *properties = (__bridge NSDictionary *)(attachmentsRef);
            NSLog(@"Image Properties => %@", (properties.count) ? properties : @"none");
            // ...
        }
    }];
}
I got something similar to happen by using the flash instead of the torch. I have an observer for @"videoDevice.flashActive" as well. I did try using exposureModeLocked first, but it didn't work for me either.
1. Take a photo with the flash on.
2. Then instantly turn the flash off and take another photo before the exposure has time to adjust.
The code below probably doesn't just work on its own, but it's simplified from what I did.
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context
{
    if (context == AdjustingExposureContext)
    {
        self.isAdjustingExposure = [change[NSKeyValueChangeNewKey] boolValue];
    }
    else if (context == FlashModeChangedContext)
    {
        self.isFlashActive = [change[NSKeyValueChangeNewKey] boolValue];
        if (!self.flashActive)
        {
            // QUICKLY! capture the 2nd image without flash before the exposure adjusts
            [self captureImage];
        }
    }
    if (!self.isAdjustingExposure && self.flashActive)
    {
        [self removeObserver:self forKeyPath:@"videoDevice.adjustingExposure" context:AdjustingExposureContext];
        [self captureImage]; // capture the 1st image with the flash on
    }
}
Now in the callback for captureStillImageAsynchronouslyFromConnection:,
if (self.isFlashActive)
    [self.videoDeviceInput.device setFlashMode:AVCaptureFlashModeOff];
However, if you need to take more than one photo without flash at the lowered exposure, this strategy may not work.
It is almost certainly a timing issue. Call captureStillImageAsynchronouslyFromConnection:completionHandler: inside your if block. Then the capture will always be executed after exposure has been locked.
if ([self.inputDevice isExposureModeSupported:AVCaptureExposureModeLocked]) {
    dispatch_async(dispatch_get_main_queue(),
    ^{
        NSError *error = nil;
        if ([self.inputDevice lockForConfiguration:&error]) {
            // code to lock exposure here
            // take photo here
        }
    });
}
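Filled in, that skeleton might look something like this (a sketch: exposureMode and torchMode are standard AVCaptureDevice properties, captureImage is the asker's own method from above, and turning the torch off here is an assumption based on the stated goal):

if ([self.inputDevice isExposureModeSupported:AVCaptureExposureModeLocked]) {
    dispatch_async(dispatch_get_main_queue(),
    ^{
        NSError *error = nil;
        if ([self.inputDevice lockForConfiguration:&error]) {
            // lock the exposure first
            self.inputDevice.exposureMode = AVCaptureExposureModeLocked;
            // assumed: kill the torch before capturing, per the goal above
            self.inputDevice.torchMode = AVCaptureTorchModeOff;
            [self.inputDevice unlockForConfiguration];
            // only now capture, so the image is guaranteed to see the locked exposure
            [self captureImage];
        }
    });
}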
I'm using AVPlayer to play audio from a remote server.
I want to display a progress bar showing the buffering progress and the "time played" progress, like MPMoviePlayerController does when you play a movie.
Does AVPlayer have any UI component that displays this info? If not, how can I get this info (buffering state)?
Thanks
Does AVPlayer have any UI component that displays this info?
No, there's no UI component for AVPlayer.
If not, how can I get this info (buffering state)?
You should observe AVPlayerItem.loadedTimeRanges:
[yourPlayerItem addObserver:self forKeyPath:@"loadedTimeRanges" options:NSKeyValueObservingOptionNew context:nil];
then use KVO to watch:
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object
                        change:(NSDictionary *)change context:(void *)context {
    if (object == player.currentItem && [keyPath isEqualToString:@"loadedTimeRanges"]) {
        NSArray *timeRanges = (NSArray *)[change objectForKey:NSKeyValueChangeNewKey];
        if (timeRanges && [timeRanges count]) {
            CMTimeRange timerange = [[timeRanges objectAtIndex:0] CMTimeRangeValue];
        }
    }
}
timerange.duration is what you're looking for.
You have to draw the buffer progress manually.
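For example, a minimal sketch of turning that range into a 0..1 fraction for a progress bar (bufferProgressView is an assumed UIProgressView; the item's duration is indefinite for live streams, so guard for that):

CMTimeRange timerange = [[timeRanges objectAtIndex:0] CMTimeRangeValue];
// end of the first buffered range, in seconds
Float64 buffered = CMTimeGetSeconds(CMTimeAdd(timerange.start, timerange.duration));
Float64 total = CMTimeGetSeconds(player.currentItem.duration);
// total is NaN for live/indefinite streams, so check before dividing
if (total > 0) {
    self.bufferProgressView.progress = (float)(buffered / total);
}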
I'm really new to Obj-C and iOS development. I found a lot of useful information here, but there is one question I didn't find an answer to.
I have an instance of AVQueuePlayer which plays an audio stream from a URL.
How can I know that the audio stream is loaded? For example, when I press the "Play" button, there is a couple of seconds' delay between the button press and the actual start of streaming.
I looked at the developer.apple.com library and didn't find any method that I can use to check the status of AVQueuePlayer. There is one in AVPlayer, but AVPlayer does not support streaming over HTTP as far as I know.
Thank you.
I am not sure what you mean by "loaded": do you mean when the item is fully loaded, or when the item is ready to play?
AVQueuePlayer supports HTTP streams (HTTP Live and files) in the same way as AVPlayer. You should review the AVFoundation Programming Guide, Handling Different Types of Asset.
The most common case is when an item is ready to play, so I'll answer that one. If you are working with AVQueuePlayer on iOS < 4.3, you need to check the status of the AVPlayerItem by observing the value of its status key:
static int LoadingItemContext = 1;

- (void)loadExampleItem
{
    NSURL *remoteURL = [NSURL URLWithString:@"http://media.example.com/file.mp3"];
    AVPlayerItem *item = [AVPlayerItem playerItemWithURL:remoteURL];
    // insert the new item at the end
    if (item) {
        [self registerAVItemObserver:item];
        if ([self.player canInsertItem:item afterItem:nil]) {
            [self.player insertItem:item afterItem:nil];
            // now observe item.status for when it is ready to play
        }
    }
}

- (void)registerAVItemObserver:(AVPlayerItem *)playerItem
{
    [playerItem addObserver:self forKeyPath:@"status" options:NSKeyValueObservingOptionNew context:(void *)&LoadingItemContext];
}

- (void)removeAVItemObserver:(AVPlayerItem *)playerItem
{
    @try {
        [playerItem removeObserver:self forKeyPath:@"status"];
    }
    @catch (...) { }
}
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context
{
    if (context == &LoadingItemContext) {
        AVPlayerItem *item = (AVPlayerItem *)object;
        AVPlayerItemStatus status = item.status;
        if (status == AVPlayerItemStatusReadyToPlay) {
            // now you know you can set your player to play, update your UI...
        } else if (status == AVPlayerItemStatusFailed) {
            // handle error here, i.e., skip to next item
        }
    }
}
That is just a pre-4.3 example. After 4.3 you can load a remote file (or HTTP Live playlist) using the code example in the AVFoundation Programming Guide, Preparing an Asset for Use, with loadValuesAsynchronouslyForKeys:completionHandler:. If you are using loadValuesAsynchronouslyForKeys: for an HTTP Live stream, you should observe the @"tracks" property.
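A minimal sketch of that post-4.3 approach (remoteURL is a placeholder, and the completion handler runs on an arbitrary queue, hence the hop back to the main queue):

AVURLAsset *asset = [AVURLAsset URLAssetWithURL:remoteURL options:nil];
NSArray *keys = [NSArray arrayWithObject:@"tracks"];
[asset loadValuesAsynchronouslyForKeys:keys completionHandler:^{
    NSError *error = nil;
    AVKeyValueStatus status = [asset statusOfValueForKey:@"tracks" error:&error];
    dispatch_async(dispatch_get_main_queue(), ^{
        if (status == AVKeyValueStatusLoaded) {
            // the asset is ready; build a player item from it and enqueue it
            AVPlayerItem *item = [AVPlayerItem playerItemWithAsset:asset];
            if ([self.player canInsertItem:item afterItem:nil]) {
                [self.player insertItem:item afterItem:nil];
            }
        } else {
            // loading failed; inspect error
        }
    });
}];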
I'm currently working on a project that involves playing music from the iPhone music library inside the app. I'm using MPMediaPickerController to allow the user to select their music and play it using the iPod music player on the iPhone.
However, I ran into a problem when the user plugs in and removes his earphones. The music suddenly stops playing for no reason. After some testing, I found out that the iPod player pauses playback when the user unplugs the earphones from the device. So is there any way to programmatically detect that the earphones have been unplugged so that I can resume playing the music? Or is there any way to prevent the iPod player from pausing when the user unplugs the earphones?
You should register for the audio route change notification and implement how you want to handle the route changes:
// Registers the audio route change listener callback function
AudioSessionAddPropertyListener (kAudioSessionProperty_AudioRouteChange,
                                 audioRouteChangeListenerCallback,
                                 self);
and within the callback, you can get the reason for the route change:
CFDictionaryRef routeChangeDictionary = (CFDictionaryRef)inPropertyValue;
CFNumberRef routeChangeReasonRef =
    (CFNumberRef)CFDictionaryGetValue (routeChangeDictionary,
                                       CFSTR (kAudioSession_AudioRouteChangeKey_Reason));
SInt32 routeChangeReason;
CFNumberGetValue (routeChangeReasonRef, kCFNumberSInt32Type, &routeChangeReason);

if (routeChangeReason == kAudioSessionRouteChangeReason_OldDeviceUnavailable)
{
    // Headset is unplugged..
}
if (routeChangeReason == kAudioSessionRouteChangeReason_NewDeviceAvailable)
{
    // Headset is plugged in..
}
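Put together, the callback might look like this (a sketch assembled from the fragments above; the signature is the standard AudioSessionPropertyListener shape):

void audioRouteChangeListenerCallback (void                   *inUserData,
                                       AudioSessionPropertyID inPropertyID,
                                       UInt32                 inPropertyValueSize,
                                       const void             *inPropertyValue)
{
    // only handle route changes
    if (inPropertyID != kAudioSessionProperty_AudioRouteChange) return;

    CFDictionaryRef routeChangeDictionary = (CFDictionaryRef)inPropertyValue;
    CFNumberRef routeChangeReasonRef =
        (CFNumberRef)CFDictionaryGetValue (routeChangeDictionary,
                                           CFSTR (kAudioSession_AudioRouteChangeKey_Reason));
    SInt32 routeChangeReason;
    CFNumberGetValue (routeChangeReasonRef, kCFNumberSInt32Type, &routeChangeReason);

    if (routeChangeReason == kAudioSessionRouteChangeReason_OldDeviceUnavailable) {
        // Headset was unplugged: resume playback here if that's the desired behavior.
    }
}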
If you just want to check whether headphones are plugged in at any given time, without listening to route changes, you can simply do the following:
OSStatus error = AudioSessionInitialize(NULL, NULL, NULL, NULL);
if (error)
    NSLog(@"Error %d while initializing session", (int)error);

UInt32 routeSize = sizeof(CFStringRef);
CFStringRef route;
error = AudioSessionGetProperty(kAudioSessionProperty_AudioRoute,
                                &routeSize,
                                &route);
if (error)
    NSLog(@"Error %d while retrieving audio property", (int)error);
else if (route == NULL) {
    NSLog(@"Silent switch is currently on");
} else if ([(NSString *)route isEqualToString:@"Headset"]) {
    NSLog(@"Using headphones");
} else {
    NSLog(@"Using %@", route);
}
Cheers,
Raffaello Colasante
I see you are using the MPMediaPlayer framework; however, the microphone handling is done using AVAudioPlayer, which you will need to add to your project.
Apple's website has AVAudioPlayer sample code which I use to handle interruptions from a user plugging in or removing the Apple microphone headphones.
Check out the Audio Session Programming Guide in Apple's iPhone Dev Center.
- (void) beginInterruption {
    if (playing) {
        playing = NO;
        interruptedWhilePlaying = YES;
        [self updateUserInterface];
    }
}

- (void) endInterruption {
    if (interruptedWhilePlaying) {
        NSError *activationError = nil;
        [[AVAudioSession sharedInstance] setActive: YES error: &activationError];
        [player play];
        playing = YES;
        interruptedWhilePlaying = NO;
        [self updateUserInterface];
    }
}
My code is a little different and some of this may help you:
void interruptionListenerCallback (void *inUserData, UInt32 interruptionState)
{
    // This callback, being outside the implementation block, needs a reference
    // to the AudioViewController object
    RecordingListViewController *controller = (RecordingListViewController *) inUserData;

    if (interruptionState == kAudioSessionBeginInterruption) {
        //NSLog (@"Interrupted. Stopping playback or recording.");
        if (controller.audioRecorder) {
            // if currently recording, stop
            [controller recordOrStop: (id) controller];
        } else if (controller.audioPlayer) {
            // if currently playing, pause
            [controller pausePlayback];
            controller.interruptedOnPlayback = YES;
        }
    } else if ((interruptionState == kAudioSessionEndInterruption) && controller.interruptedOnPlayback) {
        // if the interruption was removed, and the app had been playing, resume playback
        [controller resumePlayback];
        controller.interruptedOnPlayback = NO;
    }
}
void recordingListViewMicrophoneListener (void                   *inUserData,
                                          AudioSessionPropertyID inPropertyID,
                                          UInt32                 inPropertyValueSize,
                                          const void             *isMicConnected)
{
    // ensure that this callback was invoked for a change to microphone connection
    if (inPropertyID != kAudioSessionProperty_AudioInputAvailable) {
        return;
    }

    RecordingListViewController *controller = (RecordingListViewController *) inUserData;

    // kAudioSessionProperty_AudioInputAvailable is a UInt32 (see Apple's Audio Session Services Reference);
    // to read isMicConnected, convert the const void pointer to a UInt32 pointer,
    // then dereference the memory address contained in that pointer
    UInt32 connected = * (UInt32 *) isMicConnected;

    if (connected) {
        [controller setMicrophoneConnected: YES];
    }
    else {
        [controller setMicrophoneConnected: NO];
    }

    // check to see if the microphone was disconnected while recording;
    // cancel the recording if it was
    if (controller.isRecording && !connected) {
        [controller cancelDueToMicrophoneError];
    }
}
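For completeness, that listener would be registered with the same AudioSessionAddPropertyListener call used for the route-change callback, presumably something like:

// Register for microphone availability changes (sketch, following the pattern above)
AudioSessionAddPropertyListener (kAudioSessionProperty_AudioInputAvailable,
                                 recordingListViewMicrophoneListener,
                                 self);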
Hey guys, just check the AddMusic sample app. It will solve all your iPod-related issues.
First, register the iPod player for notifications with the following code:
NSNotificationCenter *notificationCenter = [NSNotificationCenter defaultCenter];
[notificationCenter
    addObserver: self
       selector: @selector (handle_PlaybackStateChanged:)
           name: MPMusicPlayerControllerPlaybackStateDidChangeNotification
         object: musicPlayer];
[musicPlayer beginGeneratingPlaybackNotifications];
And implement the following code in the notification handler:
- (void) handle_PlaybackStateChanged: (id) notification
{
    MPMusicPlaybackState playbackState = [musicPlayer playbackState];

    if (playbackState == MPMusicPlaybackStatePaused)
    {
        [self playiPodMusic];
    }
    else if (playbackState == MPMusicPlaybackStatePlaying)
    {
    }
    else if (playbackState == MPMusicPlaybackStateStopped)
    {
        [musicPlayer stop];
    }
}
A while ago I remember seeing a constant of some kind that defined the animation rate of the keyboard on the iPhone, and I cannot for the life of me remember where I saw it... any insight?
- (NSTimeInterval)keyboardAnimationDurationForNotification:(NSNotification *)notification
{
    NSDictionary *info = [notification userInfo];
    NSValue *value = [info objectForKey:UIKeyboardAnimationDurationUserInfoKey];
    NSTimeInterval duration = 0;
    [value getValue:&duration];
    return duration;
}
UIKeyboardAnimationDurationUserInfoKey is now an NSNumber object, which makes the code shorter:
- (void)keyboardWillShowNotification:(NSNotification *)notification
{
    NSDictionary *info = [notification userInfo];
    NSNumber *number = [info objectForKey:UIKeyboardAnimationDurationUserInfoKey];
    double duration = [number doubleValue];
}
Since this is the first Google hit, I'd like to point out that hard-coding 0.3 will mean that your view animates incorrectly when international users (e.g. Japanese) swap between different-sized keyboards (when that action ought to be instant).
Always use the notification's userInfo dictionary's UIKeyboardAnimationDurationUserInfoKey value - it gets set to 0 when the user is flicking through keyboards.
To add a bit more to what Shaggy Frog wrote, the full implementation would be something like:
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(keyboardMovement:)
                                             name:UIKeyboardWillShowNotification
                                           object:nil];
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(keyboardMovement:)
                                             name:UIKeyboardWillHideNotification
                                           object:nil];
-(void)keyboardMovement:(NSNotification *)notification {
    if (_numericKeyboardShowing == false) {
        [UIView animateWithDuration:[self keyboardAnimationDurationForNotification:notification]
                              delay:0
                            options:UIViewAnimationOptionCurveEaseInOut
                         animations:^{
                             self.bottomContainerView.center = CGPointMake(self.bottomContainerView.center.x, (self.bottomContainerView.center.y - 218));
                         }
                         completion:NULL];
        _numericKeyboardShowing = true;
    }
    else {
        [UIView animateWithDuration:[self keyboardAnimationDurationForNotification:notification]
                              delay:0
                            options:UIViewAnimationOptionCurveLinear
                         animations:^{
                             self.bottomContainerView.center = CGPointMake(self.bottomContainerView.center.x, (self.bottomContainerView.center.y + 218));
                         }
                         completion:NULL];
        _numericKeyboardShowing = false;
    }
}
- (NSTimeInterval)keyboardAnimationDurationForNotification:(NSNotification *)notification
{
    NSDictionary *info = [notification userInfo];
    NSValue *value = [info objectForKey:UIKeyboardAnimationDurationUserInfoKey];
    NSTimeInterval duration = 0;
    [value getValue:&duration];
    return duration;
}
UIKeyboardAnimationDurationUserInfoKey
The key for an NSValue object containing a double that identifies the duration of the animation in seconds.
In Swift your code will look like this:
let keyboardSize = (userInfo[UIKeyboardFrameBeginUserInfoKey] as! NSValue).CGRectValue().size
let animationDuration = (userInfo[UIKeyboardAnimationDurationUserInfoKey] as! NSNumber).doubleValue
let animationOptions = (userInfo[UIKeyboardAnimationCurveUserInfoKey] as! NSNumber).unsignedLongValue

UIView.animateWithDuration(animationDuration, delay: 0,
    options: UIViewAnimationOptions(rawValue: animationOptions),
    animations: { () -> Void in
        self.view.frame.origin.y += keyboardSize.height
    },
    completion: nil)
Swift 4 - worked for me:
if let duration = notification.userInfo?[UIKeyboardAnimationDurationUserInfoKey] as? Double {
UIView.animate(withDuration: duration, animations: {
self.view.layoutIfNeeded()
})
}
In debug mode my duration was 3.499999