I have an AVPlayer that loads a video from a URL, and I put the player inside an AVPlayerViewController, but I do not want it to buffer and download the video until the user presses the play button. How should I do that?
var player: AVPlayer = AVPlayer(URL: nsurl)
var newVideoChunk: AVPlayerViewController = AVPlayerViewController()
newVideoChunk.player = player
AVPlayerViewController with AVPlayer from NSURL?
You will need to set up the video asset and create an AVPlayerItem with that NSURL-based asset. Then you need to add an observer to that player item (immediately):
self.playerItem?.addObserver(self, forKeyPath: "status", options: NSKeyValueObservingOptions.New, context: Constants.AVPlayerStatusObservationContext)
From within the key value observer routine you can trap the context and call an external function:
override func observeValueForKeyPath(keyPath: String?, ofObject object: AnyObject?, change: [String : AnyObject]?, context: UnsafeMutablePointer<Void>) {
    if context == Constants.AVPlayerStatusObservationContext {
        if keyPath! == "status" {
            if player!.status == AVPlayerStatus.ReadyToPlay {
                print("ready")
                readyToPlay()
            } else if player!.status == AVPlayerStatus.Failed {
                // Something went wrong. player.error should contain some information.
            } else if player!.status == AVPlayerStatus.Unknown {
                print("unknown")
            }
        }
    }
}
If you only want the buffering and download to start when the button is tapped, make sure you do that setup and add the observer only within the button's action method. This works just as well with a file URL as with an online URL.
Please check my sample gist for more information:
VideoPlayerViewController.swift
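For reference, here is a minimal sketch of the same idea in current Swift, using block-based KVO instead of the string key path above. Nothing touches the network until the button is tapped; the URL and the outlet wiring are placeholders, not from the question.

import UIKit
import AVKit
import AVFoundation

class VideoViewController: UIViewController {
    // Placeholder URL; replace with your own.
    private let videoURL = URL(string: "https://example.com/video.mp4")!
    private let playerViewController = AVPlayerViewController()
    private var statusObservation: NSKeyValueObservation?

    @IBAction func playTapped(_ sender: Any) {
        // Creating the asset, item, and player here defers all buffering to this point.
        let item = AVPlayerItem(asset: AVURLAsset(url: videoURL))
        let player = AVPlayer(playerItem: item)

        // Block-based KVO replaces the context/observeValue boilerplate.
        statusObservation = item.observe(\.status, options: [.new]) { item, _ in
            switch item.status {
            case .readyToPlay:
                player.play()
            case .failed:
                print("Playback failed: \(item.error?.localizedDescription ?? "unknown error")")
            default:
                print("unknown")
            }
        }

        playerViewController.player = player
        present(playerViewController, animated: true)
    }
}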
Related
I've been trying to add an observer to listen to AVPlayer's "timeControlStatus", mostly taken directly from Apple's example:
https://developer.apple.com/documentation/avfoundation/media_playback_and_selection/observing_playback_state
I created a separate class called Play and I'm calling the code below from the ViewController:
Play().playMusic(url: url!)
The Play class:
import Foundation
import AVFoundation

var player: AVPlayer! = nil
var playerItemContext = 0

class Play: AVPlayer {

    func playMusic(url: URL) {
        let asset = AVAsset(url: url)
        let playerItem = AVPlayerItem(asset: asset)
        if player == nil {
            player = AVPlayer(playerItem: playerItem)
            if player.status.rawValue == 0 {
                player.play()
                player.addObserver(player, forKeyPath: "timeControlStatus", options: [.old, .new], context: &playerItemContext)
            }
        } else {
            player.replaceCurrentItem(with: playerItem)
            player.play()
        }
    }

    override func observeValue(forKeyPath keyPath: String?,
                               of object: Any?,
                               change: [NSKeyValueChangeKey: Any]?,
                               context: UnsafeMutableRawPointer?) {
        // Only handle observations registered with our context.
        guard context == &playerItemContext else {
            super.observeValue(forKeyPath: keyPath,
                               of: object,
                               change: change,
                               context: context)
            return
        }
        if keyPath == "timeControlStatus" { print("Result") }
    }
}
The above always crashes with:
<AVPlayer: 0x6000030a4770>: An -observeValueForKeyPath:ofObject:change:context: message was received but not handled.
Key path: timeControlStatus
Observed object: <AVPlayer: 0x6000030a4770>
Change: {
kind = 1;
new = 1;
old = 1;
}
Context: 0x1003f3e98
If I remove the addObserver call, the code acts as intended and plays the audio file. The weird thing is, if I move all the observer code from the Play class over to the ViewController, it works. What gives?
The difference is merely that when you move the code to the view controller, the view controller persists. It is a stable object living in the view controller hierarchy. So it lives long enough to do some work.
But on the other hand, in this line:
Play().playMusic(url: url!)
...the Play instance is created and immediately goes out of existence again, like a quantum virtual particle. It doesn't live long enough to be there when the playing proceeds. Hence the crash: you have allowed the observer to go out of existence too soon.
If you wanted your Play instance to persist, you would need to assign it to some long-lived variable, such as a property of your view controller.
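A minimal sketch of that fix, assuming the Play class from the question: store the instance in a long-lived property so it outlives the call.

import UIKit

class ViewController: UIViewController {
    // Strong reference: the Play instance (and its observer) now lives as long as the view controller.
    let play = Play()

    func startPlayback(with url: URL) {
        play.playMusic(url: url)
    }
}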
I'm trying to play a video with AVPlayer like this:
if let video = card.pageImageVideoController.controllers[0] as? VideoController {
    video.player.play()
}
I noticed that the video doesn't play. So I dug deeper and found that when I call .play(), the AVPlayer's currentItem is nil.
I thought the solution should be to add a KVO observer to the player to see when the item is ready to play. I used this Stack Overflow question.
And I modified the previous code like this:
var playbackLikelyToKeepUpContext = 0

if let video = card.pageImageVideoController.controllers[0] as? VideoController {
    video.player.addObserver(self, forKeyPath: "currentItem.playbackLikelyToKeepUp",
                             options: .new, context: &playbackLikelyToKeepUpContext)
}

override func observeValue(forKeyPath keyPath: String?, of object: Any?, change: [NSKeyValueChangeKey: Any]?, context: UnsafeMutableRawPointer?) {
    guard let videoController = topCard!.pageImageVideoController.controllers[0] as? VideoController else { return }
    if context == &playbackLikelyToKeepUpContext {
        if videoController.player.currentItem!.isPlaybackLikelyToKeepUp {
            // loadingIndicatorView.stopAnimating() or something else
            print("ready")
        } else {
            // loadingIndicatorView.startAnimating() or something else
            print("not ready")
        }
    }
}
But the function observeValue is never called. I don't know why.
If your goal is to check whether the item is ready to play, you are better off adding an observer for the item's status and checking for an error in the observer function, as described in the following document:
https://developer.apple.com/documentation/avfoundation/avplayeritem
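As a rough sketch of that suggestion, using block-based KVO rather than a string key path, and assuming the player's current item already exists when you start observing:

import AVFoundation

var statusObservation: NSKeyValueObservation?

func observeReadiness(of player: AVPlayer) {
    guard let item = player.currentItem else {
        print("currentItem is nil; set the item before observing it")
        return
    }
    statusObservation = item.observe(\.status, options: [.initial, .new]) { item, _ in
        switch item.status {
        case .readyToPlay:
            print("ready")
        case .failed:
            print("failed: \(item.error?.localizedDescription ?? "unknown error")")
        default:
            print("unknown")
        }
    }
}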
While implementing the Google Interactive Media Ads (IMA) SDK protocols, my mediaPlayer/audioManager, which is an AVPlayer object, is not pausing during the adsManagerDidRequestContentPause delegate method. My mediaPlayer conforms to ObservableObject, which is where I think the problem is coming from, but I'm not 100% sure.
When I press the play button, I play the audio and request the ads from my adsManager class. The issue is that the preroll video plays, but the audio from the content player plays over the preroll. The audio from the content player is supposed to pause while the preroll is playing and resume after it finishes. As you'll see in the code, AudioManager is also a singleton class.
Here's the code for when a user presses play.
@ObservedObject var manager = AudioManager.sharedInstance

func didTapPlayButton() {
    isPlaying.toggle()
    if isPlaying {
        audioManager.playLiveStream(with: pageInfo.tritonMount)
        adManager.requestAds()
    } else {
        audioManager.pause()
    }
}
And here's the adManager class with the Google IMA delegate methods. After setting breakpoints in each delegate method, I found that every delegate method is reached and every call into audioManager is made, yet the audio from audioManager doesn't actually pause.
@ObservedObject var manager = AudioManager.sharedInstance

func setUpContentPlayer() {
    contentPlayhead = IMAAVPlayerContentPlayhead(avPlayer: player)

    // Create a player layer for the player.
    playerLayer = AVPlayerLayer(player: player)
    playerLayer!.frame = videoView.underlyingView.layer.bounds
    videoView.underlyingView.layer.addSublayer(playerLayer!)
}

func setUpAdsLoader() {
    adsLoader = IMAAdsLoader(settings: nil)
    adsLoader!.delegate = self
}

func requestAds() {
    // Create an ad display container for ad rendering.
    let adDisplayContainer = IMAAdDisplayContainer(adContainer: videoView.underlyingView, companionSlots: nil)

    // Create an ad request with our ad tag, display container, and optional user context.
    let request = IMAAdsRequest(
        adTagUrl: kLivePrerollVastTag,
        adDisplayContainer: adDisplayContainer,
        contentPlayhead: contentPlayhead,
        userContext: nil)

    adsLoader!.requestAds(with: request)
}

func adsLoader(_ loader: IMAAdsLoader!, adsLoadedWith adsLoadedData: IMAAdsLoadedData!) {
    // Grab the instance of the IMAAdsManager and set ourselves as the delegate.
    adsManager = adsLoadedData.adsManager
    adsManager!.delegate = self

    // Create ads rendering settings and tell the SDK to use the in-app browser.
    let adsRenderingSettings = IMAAdsRenderingSettings()
    adsRenderingSettings.webOpenerPresentingController = viewController

    // Initialize the ads manager.
    adsManager!.initialize(with: adsRenderingSettings)
}

// The ads manager received the ad request and is loading and starting the ad.
func adsManager(_ adsManager: IMAAdsManager!, didReceive event: IMAAdEvent!) {
    if event.type == IMAAdEventType.LOADED {
        // When the SDK notifies us that ads have been loaded, play them.
        adsManager.start()
    }
}

// The ads loader failed to fulfill the ad request.
func adsLoader(_ loader: IMAAdsLoader!, failedWith adErrorData: IMAAdLoadingErrorData!) {
    print("Error loading ads: \(String(describing: adErrorData.adError.message))")
    audioManager.resume()
}

func adsManager(_ adsManager: IMAAdsManager!, didReceive error: IMAAdError!) {
    // Something went wrong with the ads manager after ads were loaded. Log the
    // error and play the content.
    NSLog("AdsManager error: \(String(describing: error.message))")
    audioManager.resume()
}

/* The ads manager received the request, played the AVPlayer content so far, and is now
   requesting that the content player be paused so the ads manager can play the preroll. */
func adsManagerDidRequestContentPause(_ adsManager: IMAAdsManager!) {
    // The SDK is going to play ads, so pause the content.
    audioManager.pause()
}

/* The ads manager played the preroll and is now
   requesting that the AVPlayer resume the live stream. */
func adsManagerDidRequestContentResume(_ adsManager: IMAAdsManager!) {
    // The SDK is done playing ads (at least for now), so resume the content.
    audioManager.resume()
}
The AudioManager class:
/// Upon setting this property, any observers for 'currentItem' as well as any time observers will
/// be removed from the old value where applicable and added to the new value where applicable.
var player: AVPlayer? {
    didSet {
        oldValue?.removeObserver(self, forKeyPath: "currentItem", context: &AudioManager.ObserveAVPlayerCurrentItem)
        oldValue?.removeObserver(self, forKeyPath: "rate", context: &AudioManager.ObserveAVPlayerRate)

        guard let player = self.player else { return }

        player.addObserver(self, forKeyPath: "currentItem", options: [.initial, .new], context: &AudioManager.ObserveAVPlayerCurrentItem)
        player.addObserver(self, forKeyPath: "rate", options: [.initial, .new], context: &AudioManager.ObserveAVPlayerRate)
        player.automaticallyWaitsToMinimizeStalling = false

        if let tmpObs = timeObserver {
            print("Already have timeobserver, removing it")
            oldValue?.removeTimeObserver(tmpObs)
            timeObserver = nil
        }

        timeObserver = player.addPeriodicTimeObserver(forInterval: CMTimeMakeWithSeconds(1, preferredTimescale: 1),
                                                      queue: nil,
                                                      using: timeObserverCallback) as AnyObject?
    }
}
My problem is that on a very bad connection, the activity indicator animates until the first frame of the video is shown and then disappears, as if the video were playing. But the video actually stays stuck on that loaded first frame until the whole video has loaded, and only then resumes playing. How do I show the activity indicator while the video is stuck on a frame and buffering, and play again once the next frames have loaded?
Notes:
It works when the internet connection is off: the video plays up to the loaded frame and the activity indicator is shown; when the connection comes back on, the video resumes and the indicator is hidden.
It works when a normal internet connection is present.
I show and hide the indicator by overriding observeValue(forKeyPath:) for "currentItem.loadedTimeRanges" and "currentItem.playbackBufferEmpty".
I made a UIView subclass with an AVPlayer in it:
import UIKit
import AVKit
import AVFoundation

class videoplaying: UIView {

    override static var layerClass: AnyClass {
        return AVPlayerLayer.self
    }

    var playerlayer: AVPlayerLayer {
        return layer as! AVPlayerLayer
    }

    var player: AVPlayer? {
        get {
            return playerlayer.player
        }
        set {
            playerlayer.player = newValue
        }
    }

    var playetitem: AVPlayerItem?
}
I assigned a UIView in a UICollectionViewCell to this class (using the storyboard).
The AVPlayer starts playing and the observers are added when play is pressed in the collection view cell:
@IBAction func play(_ sender: Any) {
    activityindicator.isHidden = false
    activityindicator.startAnimating()
    self.butttoonheight.isHidden = true
    self.postimage.isHidden = true

    let url2 = URL(string: "https://clips.vorwaerts-gmbh.de/big_buck_bunny.mp4")
    let avplayer = AVPlayer(url: url2!)
    let playeritem = AVPlayerItem(url: url2!)

    // videoss is the view of the videoplaying (UIView) class above
    videoss.playetitem = playeritem
    videoss.playerlayer.player = avplayer
    videoss.player?.play()

    videoss.player?.addObserver(self, forKeyPath: "currentItem.loadedTimeRanges", options: .new, context: nil)
    videoss.player?.addObserver(self, forKeyPath: "rate", options: .new, context: nil)
    videoss.player?.addObserver(self, forKeyPath: "currentItem.playbackBufferEmpty", options: .new, context: nil)

    playying.isHidden = false
}
// Observing while the video is playing.
// Play/pause button: when the network is bad, the video is stuck on the first frame and the button image does not change when pressed.
@IBAction func playorpause(_ sender: Any) {
    if videoss.player?.timeControlStatus == AVPlayerTimeControlStatus.paused {
        videoss.player?.play()
        playying.setImage(UIImage(named: "pas50"), for: .normal)
    }
    if videoss.player?.timeControlStatus == AVPlayerTimeControlStatus.playing {
        videoss.player?.pause()
        playying.setImage(UIImage(named: "p24"), for: .normal)
    }
}
override public func observeValue(forKeyPath keyPath: String?, of object: Any?, change: [NSKeyValueChangeKey: Any]?, context: UnsafeMutableRawPointer?) {
    if keyPath == "rate" {
        print(videoss.player?.rate)
        if videoss.player?.rate == 0.0 {
            print("dawdaopwdaopwdipo")
        }
    }
    if keyPath == "currentItem.loadedTimeRanges" {
        print("its is working")
        activityindicator.stopAnimating()
        activityindicator.isHidden = true
    }
    if keyPath == "currentItem.playbackBufferEmpty" {
        activityindicator.startAnimating()
        activityindicator.isHidden = false
        print("pkawdawdawd")
    }
}
I solved this problem by adding a timer that fires every 0.3 seconds. Each time it fires, it checks the current time of the video: if the current time equals the previous time, the video is not playing and the activity indicator is shown; otherwise the video is playing and the indicator is hidden. You also need to check whether the user has paused the video. A bonus is that you get the current playback time in the same function.
I also tried watching the rate of the video, but over a slow connection the rate always reported that the video was playing when it wasn't, which didn't help.
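A minimal sketch of that timer approach (the names here are placeholders, not taken from the question's code):

import UIKit
import AVFoundation

final class StallWatcher {
    private var lastTime = CMTime.zero
    private var timer: Timer?

    // Every 0.3 s, compare the player's current time with the previous sample:
    // if it has not advanced and the user has not paused, the player is stalled.
    func start(player: AVPlayer, spinner: UIActivityIndicatorView, isUserPaused: @escaping () -> Bool) {
        timer = Timer.scheduledTimer(withTimeInterval: 0.3, repeats: true) { [weak self] _ in
            guard let self = self else { return }
            let now = player.currentTime()
            let stalled = CMTimeCompare(now, self.lastTime) == 0 && !isUserPaused()
            if stalled {
                spinner.startAnimating()
            } else {
                spinner.stopAnimating()
            }
            spinner.isHidden = !stalled
            self.lastTime = now
        }
    }

    func stop() {
        timer?.invalidate()
        timer = nil
    }
}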
When playing a live stream using the HTTP Live Streaming method, is it possible to read the current metadata (e.g. title and artist)? This is for an iPhone radio app.
Not sure whether this question is still relevant for its author, but maybe it will help someone. After two days of pain I found out that it's quite simple. Here is the code that works for me:
AVPlayerItem *playerItem = [AVPlayerItem playerItemWithURL:[NSURL URLWithString:<here your http stream url>]];
[playerItem addObserver:self forKeyPath:@"timedMetadata" options:NSKeyValueObservingOptionNew context:nil];
AVPlayer *player = [[AVPlayer playerWithPlayerItem:playerItem] retain];
[player play];
and then:
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object
                        change:(NSDictionary *)change context:(void *)context {
    if ([keyPath isEqualToString:@"timedMetadata"])
    {
        AVPlayerItem *playerItem = object;
        for (AVMetadataItem *metadata in playerItem.timedMetadata)
        {
            NSLog(@"\nkey: %@\nkeySpace: %@\ncommonKey: %@\nvalue: %@", [metadata.key description], metadata.keySpace, metadata.commonKey, metadata.stringValue);
        }
    }
}
That's it. I don't know why Apple didn't include a sample like this in the AVPlayerItem docs for accessing the stream's title, which is the key feature for real-world streaming audio. The "AV Foundation Framework Reference" mentions timedMetadata nowhere it is actually needed. And Matt's sample does not work with all streams (but AVPlayer does).
In Swift 2.0, getting the metadata info for streaming music:
PlayerItem.addObserver(self, forKeyPath: "timedMetadata", options: NSKeyValueObservingOptions.New, context: nil)
add this method:
override func observeValueForKeyPath(keyPath: String?, ofObject object: AnyObject?, change: [String : AnyObject]?, context: UnsafeMutablePointer<Void>) {
    // Update the song name
    if keyPath == "timedMetadata" {
        if let meta = PlayerItem.timedMetadata {
            print("New metadata \(meta)")
            for metadata in meta {
                if let nomemusica = metadata.valueForKey("value") as? String {
                    LB_NomeMusica.text = nomemusica
                    if NSClassFromString("MPNowPlayingInfoCenter") != nil {
                        let image: UIImage = UIImage(named: "logo.gif")!
                        let albumArt = MPMediaItemArtwork(image: image)
                        let songInfo: [String: AnyObject] = [
                            MPMediaItemPropertyTitle: nomemusica,
                            MPMediaItemPropertyArtist: "Ao Vivo",
                            MPMediaItemPropertyArtwork: albumArt
                        ]
                        MPNowPlayingInfoCenter.defaultCenter().nowPlayingInfo = songInfo
                    }
                }
            }
        }
    }
}
Swift solution. This is a sample of a simple streaming audio player. You can read the metadata in the AVPlayerItemMetadataOutputPushDelegate delegate method.
import UIKit
import AVFoundation

class PlayerViewController: UIViewController {

    var player = AVPlayer()

    override func viewDidLoad() {
        super.viewDidLoad()
        configurePlayer()
        player.play()
    }

    private func configurePlayer() {
        guard let url = URL(string: "Your stream URL") else { return }
        let asset = AVAsset(url: url)
        let playerItem = AVPlayerItem(asset: asset)
        let metadataOutput = AVPlayerItemMetadataOutput(identifiers: nil)
        metadataOutput.setDelegate(self, queue: DispatchQueue.main)
        playerItem.add(metadataOutput)
        player = AVPlayer(playerItem: playerItem)
    }
}

extension PlayerViewController: AVPlayerItemMetadataOutputPushDelegate {
    func metadataOutput(_ output: AVPlayerItemMetadataOutput, didOutputTimedMetadataGroups groups: [AVTimedMetadataGroup], from track: AVPlayerItemTrack?) {
        let item = groups.first?.items.first
        print(item?.value(forKeyPath: "value") ?? "no value")
    }
}
It is, but it's not easy. Matt Gallagher has a nice post on his blog about streaming audio. To quote him on the subject:
The easiest source of metadata comes from the HTTP headers. Inside the handleReadFromStream:eventType: method, use CFReadStreamCopyProperty to copy the kCFStreamPropertyHTTPResponseHeader property from the CFReadStreamRef, then you can use CFHTTPMessageCopyAllHeaderFields to copy the header fields out of the response. For many streaming audio servers, the stream name is one of these fields.
The considerably harder source of metadata are the ID3 tags. ID3v1 is always at the end of the file (so is useless when streaming). ID3v2 is located at the start so may be more accessible.
I've never read the ID3 tags but I suspect that if you cache the first few hundred kilobytes of the file somewhere as it loads, open that cache with AudioFileOpenWithCallbacks and then read the kAudioFilePropertyID3Tag with AudioFileGetProperty you may be able to read the ID3 data (if it exists). Like I said though: I've never actually done this so I don't know for certain that it would work.
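For the HTTP-headers route the quote describes, a rough modern sketch might look like the following. It uses URLSession instead of CFReadStream, and it assumes the server reports the stream name in an "icy-name" header; both the header name and the HEAD request are assumptions, not taken from the quoted post, and some streaming servers may not honor them.

import Foundation

func fetchStreamName(from url: URL, completion: @escaping (String?) -> Void) {
    var request = URLRequest(url: url)
    // Ask for headers only, to avoid downloading the audio itself.
    request.httpMethod = "HEAD"
    URLSession.shared.dataTask(with: request) { _, response, _ in
        let http = response as? HTTPURLResponse
        // "icy-name" is a common Shoutcast/Icecast-style header, but this varies by server.
        completion(http?.value(forHTTPHeaderField: "icy-name"))
    }.resume()
}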