AVPlayer selectively playing MP3 audio file in documents directory - Swift

I'm writing a podcast app, and I want the user to play the local file they've downloaded if it exists (rather than streaming it). Streaming works fine, but if the local file is found, AVPlayer doesn't play it (this is actually not true for about one out of five files, for reasons beyond me: tapping on one podcast will play locally; I'm in airplane mode to verify).
I believe this is the Swift equivalent of this question, but I haven't been able to implement that solution in Swift.
Here's the code:
func tableView(tableView: UITableView, didSelectRowAtIndexPath indexPath: NSIndexPath) {
    var theURL = posts.objectAtIndex(indexPath.row).valueForKey("enclosure") as? String
    var theMp3Filename = theURL!.lastPathComponent
    for i in downloadedList {
        if i as! String == theMp3Filename {
            // If we've d/l'ed the mp3, insert the local path
            theURL = documentsDirectory().stringByAppendingPathComponent(theMp3Filename)
            var path: NSURL = NSURL.fileURLWithPath(theURL!)!
            player = AVPlayer(URL: path)
            player.play()
        } else {
            let podcastEpisode = AVPlayerItem(URL: NSURL(string: theURL!))
            player = AVPlayer(playerItem: podcastEpisode)
            player.play()
        }
    }
}
downloadedList is just an array of the downloaded filenames that I generate. I do hit the local branch, and the path looks right, but AVPlayer usually doesn't play anything.
Any guidance would be appreciated.

Add this right after you set your path variable for playing downloaded audio:
let asset = AVURLAsset(URL: path)
let playerItem = AVPlayerItem(asset: asset)
player = AVPlayer(playerItem: playerItem)
player.play()
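By way of illustration, here is a minimal sketch (in current Swift rather than the asker's Swift 1.x) of how the local-versus-streaming branch might look with that change. EpisodePlayer, documentsDirectoryURL(), and downloadedFilenames are hypothetical stand-ins for the asker's helpers, not part of the original code:

```swift
import AVFoundation

final class EpisodePlayer {
    // The player must be retained or playback stops immediately.
    private var player: AVPlayer?

    func playEpisode(remoteURLString: String, downloadedFilenames: Set<String>) {
        guard let remoteURL = URL(string: remoteURLString) else { return }
        let filename = remoteURL.lastPathComponent

        let item: AVPlayerItem
        if downloadedFilenames.contains(filename) {
            // Local copy exists: build a file URL and wrap it in an asset, as in the answer above.
            let localURL = documentsDirectoryURL().appendingPathComponent(filename)
            item = AVPlayerItem(asset: AVURLAsset(url: localURL))
        } else {
            // No local copy: stream from the remote URL.
            item = AVPlayerItem(url: remoteURL)
        }

        player = AVPlayer(playerItem: item)
        player?.play()
    }

    private func documentsDirectoryURL() -> URL {
        FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
    }
}
```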

Related

Buffering the next song using a second AVPlayer

I want to play a list of songs, but due to a custom queue system I don't want to use AVQueuePlayer. To get a seamless transition between songs (not having to wait for the next song to load/buffer), I want to create a second player that loads the next song in the queue and then assign it to the main player. Is that a legitimate solution?
Something like this:
var player = AVPlayer()
var nextSongPlayer = AVPlayer()
// ...
func playSong() {
    let playerItem = AVPlayerItem(url: URL(string: "URL1")!)
    player.replaceCurrentItem(with: playerItem)
    // ...
    let nextItem = AVPlayerItem(url: URL(string: "URL2")!)
    nextSongPlayer.replaceCurrentItem(with: nextItem)
    // ...
    player.play()
}

func nextSong() {
    player = nextSongPlayer
    player.play()
}
Or should I create an AVPlayerItem with the second URL and replace the currently playing item with that? Or is there a better solution?
The code is almost right; I had to create a new AVPlayer and assign it to nextSongPlayer:
// ...
let playerItem = AVPlayerItem(url: URL(string: "URL2")!)
let tempPlayer = AVPlayer()
tempPlayer.replaceCurrentItem(with: playerItem)
tempPlayer.play()  // Makes the player buffer the song
tempPlayer.pause()
self.nextSongPlayer = tempPlayer
// ...
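Putting it together, a minimal sketch of the preload-and-swap approach might look like the following. The URL parameters and the trigger for nextSong() are assumptions, not part of the original answer:

```swift
import AVFoundation

final class QueuePlayer {
    private var player = AVPlayer()
    private var nextSongPlayer = AVPlayer()

    func play(current currentURL: URL, next nextURL: URL) {
        player.replaceCurrentItem(with: AVPlayerItem(url: currentURL))
        player.play()
        preloadNext(url: nextURL)
    }

    private func preloadNext(url: URL) {
        // A separate player lets the next item buffer in the background.
        let tempPlayer = AVPlayer(playerItem: AVPlayerItem(url: url))
        tempPlayer.play()   // start buffering
        tempPlayer.pause()  // but stay silent
        nextSongPlayer = tempPlayer
    }

    // Call this when the current item finishes, e.g. from an
    // AVPlayerItemDidPlayToEndTime notification observer.
    func nextSong() {
        player = nextSongPlayer
        player.play()
    }
}
```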

Why is my local video file path within the document directory not working after some time? [Swift]

I have a custom video recording view controller, in which I save the recording using a URL:
Sample URL: file:///var/mobile/Containers/Data/Application/802497BB-8413-4245-B33A-708EF8DD9CAF/Documents/FA6AF78A-2692-4CC8-8BE2-7F18140C3C98.mp4
The URL is generated using these functions:
// Gets the directory that the video is stored in
func getPathDirectory() -> URL {
    // Searches FileManager for paths and returns the first one
    let paths = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)
    let documentDirectory = paths[0]
    return documentDirectory
}

func generateURL() -> URL? {
    let path = getPathDirectory().appendingPathComponent(NSUUID().uuidString + ".mp4")
    return path
}
I then replay the video once in a separate playback controller, and it works with no problem. I have included it right below:
import UIKit
import AVFoundation

class VideoPlaybackController: UIViewController {
    let avPlayer = AVPlayer()
    var avPlayerLayer: AVPlayerLayer!
    var videoURL: URL!

    @IBOutlet weak var videoView: UIView!

    override func viewDidLoad() {
        super.viewDidLoad()
        // print("VideoPlaybackController: \(videoURL!)")
        avPlayerLayer = AVPlayerLayer(player: avPlayer)
        avPlayerLayer.frame = view.bounds
        avPlayerLayer.videoGravity = AVLayerVideoGravity.resizeAspectFill
        videoView.layer.insertSublayer(avPlayerLayer, at: 0)
        view.layoutIfNeeded()

        let playerItem = AVPlayerItem(url: videoURL)
        avPlayer.replaceCurrentItem(with: playerItem)
        avPlayer.play()
    }
}
The video plays at first, but after some time it can no longer be replayed, even though I saved it within the documents directory. I downloaded the xcappdata container and looked at the app's contents, and the video file is still there, so I am confused as to why it becomes unavailable after some time.
Is there something that I am missing? Thank you for your time.
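A likely cause, offered here as an assumption since the thread has no accepted answer, is that the app sandbox container path (the UUID segment in the URL above) can change between app updates, reinstalls, and simulator runs, so a stored absolute URL goes stale even though the file itself survives. A minimal sketch of the usual workaround is to persist only the file name and rebuild the URL against the current documents directory at playback time; the "lastVideoFilename" key is hypothetical:

```swift
import Foundation

// Sketch: store only the file name, never the absolute URL.
func saveFilename(for url: URL) {
    // "lastVideoFilename" is a hypothetical key used only for this sketch.
    UserDefaults.standard.set(url.lastPathComponent, forKey: "lastVideoFilename")
}

// Rebuild the full URL against the *current* documents directory at playback time.
func currentVideoURL() -> URL? {
    guard let filename = UserDefaults.standard.string(forKey: "lastVideoFilename") else { return nil }
    let documents = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
    return documents.appendingPathComponent(filename)
}
```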

AVPlayer cannot play video

I am trying to play a video inside a subclass of JSQMessagesViewController, but it doesn't start playing on the tap event.
I am downloading the video from Firebase and then writing the data to the device under a unique reference, -LK7WSGtGAQ2anpDxkCQ. See the printout below for the complete path.
// Responds to collection view tap events
// (called every time we tap on a message)
override func collectionView(_ collectionView: JSQMessagesCollectionView!, didTapMessageBubbleAt indexPath: IndexPath!) {
    let message = messages[indexPath.item]
    print("message type is \(message.type) and message media uid is \(message.mediaMessage?.mediaUID)")

    if message.type == MessageType.image {
        let jsqMessage = jsqMessages[indexPath.item]
        let mediaItem = jsqMessage.media as! JSQPhotoMediaItem
        let photos = IDMPhoto.photos(withImages: [mediaItem.image])
        let browser = IDMPhotoBrowser(photos: photos)
        self.present(browser!, animated: true, completion: nil)
    }

    if message.type == MessageType.video {
        let jsqMessage = jsqMessages[indexPath.item]
        if jsqMessage.isMediaMessage {
            if let mediaItem = jsqMessage.media as? JSQVideoMediaItem {
                print("mediaItem.fileURL is \(mediaItem.fileURL)")
                let player = AVPlayer(url: mediaItem.fileURL)
                let playerViewController = AVPlayerViewController()
                playerViewController.player = player
                self.present(playerViewController, animated: true) {
                    playerViewController.player!.play()
                }
            }
        }
    }
}
} // end of extension
mediaItem.fileURL is
Optional(file:///Users/bogdanbarbulescu/Library/Developer/CoreSimulator/Devices/7FB42206-997D-4AC2-B0BD-CEE2E22DAFBE/data/Containers/Data/Application/8F17DF31-16B1-4B27-ACB2-015AA55D8979/Documents/-LK7WSGtGAQ2anpDxkCQ)
Update
The video can be played now after appending .MOV to the URL above, but there is no sound when it's played. How can I fix this?
The problem was that when I saved the data to disk, I did not append the .MOV extension to the end of the path, so AVPlayer could not play the video.
Before:
mediaItem.fileURL is file:///Users/bogdanbarbulescu/Library/Developer/CoreSimulator/Devices/7FB42206-997D-4AC2-B0BD-CEE2E22DAFBE/data/Containers/Data/Application/8F17DF31-16B1-4B27-ACB2-015AA55D8979/Documents/-LK7WSGtGAQ2anpDxkCQ
After: file:///Users/bogdanbarbulescu/Library/Developer/CoreSimulator/Devices/7FB42206-997D-4AC2-B0BD-CEE2E22DAFBE/data/Containers/Data/Application/8F17DF31-16B1-4B27-ACB2-015AA55D8979/Documents/-LK7WSGtGAQ2anpDxkCQ.MOV
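For illustration, a minimal sketch of writing the downloaded data with the extension included, so AVPlayer can infer the container format from the URL. The function name and the mediaUID parameter are assumptions based on the question, not code from the original post:

```swift
import Foundation

// Sketch: append the file extension when persisting the downloaded video data.
func saveVideoData(_ data: Data, mediaUID: String) throws -> URL {
    let documents = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
    // Produces e.g. ".../Documents/-LK7WSGtGAQ2anpDxkCQ.MOV"
    let fileURL = documents.appendingPathComponent(mediaUID).appendingPathExtension("MOV")
    try data.write(to: fileURL)
    return fileURL
}
```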

Play video from local file with tvOS

I want to play a short video (20 seconds) when my app opens on the Apple TV, but I can't figure out how to build the URL for a local file.
This is what I've got so far, but unfortunately it doesn't work:
override func viewDidLoad() {
    super.viewDidLoad()

    // I know this is how to play a file from a web server:
    // player = AVPlayer(URL: NSURL(string: "http://clips.vorwaerts-gmbh.de/big_buck_bunny.mp4")!)
    // But I can't find how to set the URL to a local file in the app
    let path = NSBundle.mainBundle().pathForResource("bunny", ofType: "mp4")
    let url = NSURL.fileURLWithPath(path!)
    player = AVPlayer(URL: url)
    player?.play()
}
Your code seems fine. Maybe you should try this:
In the Project Navigator, select your Project Root > Your Target > Build Phases > Copy Bundle Resources. Check that your video is there; if not, click the plus sign to add it.
Source: http://www.brianjcoleman.com/tutorial-play-video-swift/
I'd like to list the most compact way along with all four hurdles: import AVKit, call the playVideo() function from viewDidAppear, watch out for correct Bundle access, and make sure the file has the correct target membership, as stated by Ennio!
private func playVideo() {
    guard let path = Bundle.main.path(forResource: "video", ofType: "mov") else {
        debugPrint("video.mov not found")
        return
    }
    let player = AVPlayer(url: URL(fileURLWithPath: path))

    // Create a new AVPlayerViewController and pass it a reference to the player.
    let controller = AVPlayerViewController()
    controller.player = player

    // Modally present the player and call the player's play() method when complete.
    present(controller, animated: true) {
        player.play()
    }
}
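As a usage sketch of those four points, the call site might look like the following. The class name IntroViewController is an assumption; "video.mov" follows the answer above and must be in Copy Bundle Resources with the right target membership:

```swift
import UIKit
import AVKit         // AVPlayerViewController lives in AVKit
import AVFoundation

class IntroViewController: UIViewController {
    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        // Present the bundled clip once the view is on screen.
        playVideo()
    }

    // Same function as in the answer above, repeated so this sketch is self-contained.
    private func playVideo() {
        guard let path = Bundle.main.path(forResource: "video", ofType: "mov") else {
            debugPrint("video.mov not found")
            return
        }
        let player = AVPlayer(url: URL(fileURLWithPath: path))
        let controller = AVPlayerViewController()
        controller.player = player
        present(controller, animated: true) {
            player.play()
        }
    }
}
```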

Is it possible to read metadata using HTTP Live Streaming in the iPhone SDK?

When playing a live stream using HTTP Live Streaming, is it possible to read the current metadata (e.g. title and artist)? This is for an iPhone radio app.
Not sure whether this question is still relevant for its author, but maybe it will help someone. After two days of pain I found that it's quite simple. Here is the code that works for me:
AVPlayerItem *playerItem = [AVPlayerItem playerItemWithURL:[NSURL URLWithString:@"<here your http stream url>"]];
[playerItem addObserver:self forKeyPath:@"timedMetadata" options:NSKeyValueObservingOptionNew context:nil];
AVPlayer *player = [[AVPlayer playerWithPlayerItem:playerItem] retain];
[player play];
and then:
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object
                        change:(NSDictionary *)change context:(void *)context {
    if ([keyPath isEqualToString:@"timedMetadata"]) {
        AVPlayerItem *playerItem = object;
        for (AVMetadataItem *metadata in playerItem.timedMetadata) {
            NSLog(@"\nkey: %@\nkeySpace: %@\ncommonKey: %@\nvalue: %@",
                  [metadata.key description], metadata.keySpace, metadata.commonKey, metadata.stringValue);
        }
    }
}
That's it. I don't know why Apple didn't provide this sample in the AVPlayerItem docs for accessing the "title" of the stream, which is the key feature for real-world streaming audio. In the "AV Foundation Framework Reference" they mention "timedMetadata" nowhere it's actually needed. And Matt's sample does not work with all streams (but AVPlayer does).
In Swift 2.0, getting metadata info from a music stream:
PlayerItem.addObserver(self, forKeyPath: "timedMetadata", options: NSKeyValueObservingOptions.New, context: nil)
add this method:
override func observeValueForKeyPath(keyPath: String?, ofObject object: AnyObject?, change: [String : AnyObject]?, context: UnsafeMutablePointer<Void>) {
    // Update the song name
    if keyPath == "timedMetadata" {
        if let meta = PlayerItem.timedMetadata {
            print("New metadata \(meta)")
            for metadata in meta {
                if let nomemusica = metadata.valueForKey("value") as? String {
                    LB_NomeMusica.text = nomemusica
                    if NSClassFromString("MPNowPlayingInfoCenter") != nil {
                        let image: UIImage = UIImage(named: "logo.gif")!
                        let albumArt = MPMediaItemArtwork(image: image)
                        var songInfo: [String: AnyObject] = [
                            MPMediaItemPropertyTitle: nomemusica,
                            MPMediaItemPropertyArtist: "Ao Vivo",
                            MPMediaItemPropertyArtwork: albumArt
                        ]
                        MPNowPlayingInfoCenter.defaultCenter().nowPlayingInfo = songInfo
                    }
                }
            }
        }
    }
}
Swift solution. This is a sample of a simple streaming audio player. You can read the metadata in the AVPlayerItemMetadataOutputPushDelegate delegate method.
import UIKit
import AVFoundation

class PlayerViewController: UIViewController {
    var player = AVPlayer()

    override func viewDidLoad() {
        super.viewDidLoad()
        configurePlayer()
        player.play()
    }

    private func configurePlayer() {
        guard let url = URL(string: "Your stream URL") else { return }
        let asset = AVAsset(url: url)
        let playerItem = AVPlayerItem(asset: asset)
        let metadataOutput = AVPlayerItemMetadataOutput(identifiers: nil)
        metadataOutput.setDelegate(self, queue: DispatchQueue.main)
        playerItem.add(metadataOutput)
        player = AVPlayer(playerItem: playerItem)
    }
}

extension PlayerViewController: AVPlayerItemMetadataOutputPushDelegate {
    func metadataOutput(_ output: AVPlayerItemMetadataOutput, didOutputTimedMetadataGroups groups: [AVTimedMetadataGroup], from track: AVPlayerItemTrack?) {
        let item = groups.first?.items.first
        print(item?.value(forKeyPath: "value") ?? "no value")
    }
}
It is, but it's not easy. Matt Gallagher has a nice post on his blog about streaming audio. To quote him on the subject:
The easiest source of metadata comes from the HTTP headers. Inside the handleReadFromStream:eventType: method, use CFReadStreamCopyProperty to copy the kCFStreamPropertyHTTPResponseHeader property from the CFReadStreamRef, then you can use CFHTTPMessageCopyAllHeaderFields to copy the header fields out of the response. For many streaming audio servers, the stream name is one of these fields.

The considerably harder source of metadata are the ID3 tags. ID3v1 is always at the end of the file (so is useless when streaming). ID3v2 is located at the start so may be more accessible.

I've never read the ID3 tags but I suspect that if you cache the first few hundred kilobytes of the file somewhere as it loads, open that cache with AudioFileOpenWithCallbacks and then read the kAudioFilePropertyID3Tag with AudioFileGetProperty you may be able to read the ID3 data (if it exists). Like I said though: I've never actually done this so I don't know for certain that it would work.
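As a side note, if the file (or a cached chunk of it) is already on disk, a simpler route than the Audio Toolbox approach quoted above is to let AVFoundation parse the embedded tags. This sketch swaps in AVURLAsset's commonMetadata and assumes a local cachedFileURL; it is not the technique Matt Gallagher describes:

```swift
import AVFoundation

// Sketch: read embedded tags (e.g. ID3) from a locally cached audio file via AVFoundation.
func printCommonMetadata(of cachedFileURL: URL) {
    let asset = AVURLAsset(url: cachedFileURL)
    // Note: for production code, load the "commonMetadata" key asynchronously
    // (loadValuesAsynchronously) rather than reading it synchronously like this.
    for item in asset.commonMetadata {
        // commonKey normalizes keys like "title" and "artist" across tag formats.
        if let key = item.commonKey?.rawValue, let value = item.stringValue {
            print("\(key): \(value)")
        }
    }
}
```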