I am working on downloading and playing HLS content. To download the HLS stream I am using the following code:
func downloadTask() {
    let videoUrl = URL(string: "https://bitdash-a.akamaihd.net/content/MI201109210084_1/m3u8s/f08e80da-bf1d-4e3d-8899-f0f6155f6efa.m3u8")!
    configuration = URLSessionConfiguration.background(withIdentifier: downloadIdentifier)
    downloadSession = AVAssetDownloadURLSession(configuration: configuration!, assetDownloadDelegate: self, delegateQueue: OperationQueue.main)

    let documentsDirectoryURL = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask).first!
    destinationUrl = documentsDirectoryURL.appendingPathComponent(videoUrl.lastPathComponent)

    var urlComponents = URLComponents(url: videoUrl, resolvingAgainstBaseURL: false)!
    urlComponents.scheme = "https"

    let asset = AVURLAsset(url: urlComponents.url!) // AVURLAsset(url:) does not throw
    asset.resourceLoader.setDelegate(self, queue: DispatchQueue(label: "com.example.AssetResourceLoaderDelegateQueue"))

    if #available(iOS 10.0, *) {
        assetDownloadTask = downloadSession!.makeAssetDownloadTask(asset: asset,
                                                                   assetTitle: "RG-TVVideo",
                                                                   assetArtworkData: nil,
                                                                   options: nil)
        APP_DELEGATE.isProgressRunning = true
        assetDownloadTask?.resume()
    } else {
        // Fallback on earlier versions
    }
}
When the download finishes, this delegate method is called:
func urlSession(_ session: URLSession, assetDownloadTask: AVAssetDownloadTask, didFinishDownloadingTo location: URL) {
    if #available(iOS 11.0, *) {
        let storageManager = AVAssetDownloadStorageManager.shared()
        let newPolicy = AVMutableAssetDownloadStorageManagementPolicy()
        newPolicy.expirationDate = Date()
        newPolicy.priority = .important
        let baseURL = URL(fileURLWithPath: NSHomeDirectory())
        let assetURL = baseURL.appendingPathComponent(location.relativePath)
        storageManager.setStorageManagementPolicy(newPolicy, for: assetURL)
        UserDefaults.standard.set(location.relativePath, forKey: "videoPath")
        strDownloadStatus = "5"
        let dictVideoInfo = ["strDownloadStatus": "5", "VideoID": self.strID]

        // Here I am storing the downloaded location in the database
        DBManager.shared.updateVideoStatus(strVideoID: APP_DELEGATE.arrTempVideoIds.object(at: 0) as! String, strStatus: "5", strSavePath: location.relativePath) { (status) in }

        DispatchQueue.main.async {
            NotificationCenter.default.post(name: NSNotification.Name.init("UpdateProgress"), object: self.percentageComplete, userInfo: dictVideoInfo)
        }
    }
}
Now I am trying to get the video path from the location stored in the database and play it offline (without internet) using the following code:
func setLocalPlayer(strDownloadPath: String) {
    var strDownloadPath = ""
    // Getting path from database
    DBManager.shared.getDownloadedPath(videoID: VideoID) { (strPath) in
        strDownloadPath = strPath
    }

    activityIndicator.isHidden = false
    let baseURL = URL(fileURLWithPath: NSHomeDirectory())
    let assetURL = baseURL.appendingPathComponent(strDownloadPath)
    let asset = AVURLAsset(url: assetURL)
    //if let cache = asset.assetCache, cache.isPlayableOffline {
    //    let videoAsset = AVURLAsset(url: assetURL)
    asset.resourceLoader.preloadsEligibleContentKeys = true
    asset.resourceLoader.setDelegate(self, queue: DispatchQueue(label: "com.example.AssetResourceLoaderDelegateQueue"))

    let playerItem = AVPlayerItem(asset: asset)
    avPlayer = AVPlayer(playerItem: playerItem)
    avPlayerLayer = AVPlayerLayer()
    avPlayerLayer.frame = CGRect(x: 0, y: 0, width: playerContainer.frame.width, height: playerContainer.frame.height)
    avPlayerLayer.videoGravity = .resize
    avPlayerLayer.player = avPlayer
    playerContainer.layer.addSublayer(avPlayerLayer)

    let interval = CMTime(seconds: 0.01, preferredTimescale: CMTimeScale(NSEC_PER_SEC))
    timeObserver = avPlayer?.addPeriodicTimeObserver(forInterval: interval, queue: DispatchQueue.main, using: { elapsedTime in
        self.updateVideoPlayerState()
        if self.avPlayer != nil {
            self.bufferState()
        }
    })

    self.slider.setThumbImage(UIImage(named: "slider_dot"), for: UIControl.State.normal)
    resetTimer()
    avPlayer.play()
    isPlaying = true
    //}
}
NOTE: This code works fine when the internet is on.

I have referred to the following links:
https://developer.apple.com/library/archive/documentation/AudioVideo/Conceptual/MediaPlaybackGuide/Contents/Resources/en.lproj/HTTPLiveStreaming/HTTPLiveStreaming.html
https://assist-software.net/snippets/how-play-encrypted-http-live-streams-offline-avfoundation-ios-using-swift-4
Downloading and playing offline HLS Content - iOS 10
Please guide me on what I am doing wrong.
Thanks
Well, I don't know if this is your error, but for further reading:

Don't do newPolicy.expirationDate = Date(); it's a mistake. According to the "Advances in HTTP Live Streaming" WWDC 2017 session, it will mark your file for deletion as soon as possible.

Before attempting offline playback, you can check whether the asset is still on your device under Settings -> General -> Storage -> MyApp.
The expiration date property is there in case your asset at some point becomes no longer eligible to be played. For instance, you may find that you may be in a situation where a particular show may be leaving your catalog, you no longer have rights to stream it. If that's the case you can set the expiration date and it will be sort of bumped up in the deletion queue. So, using it is fairly straightforward.
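For reference, a minimal sketch of applying a storage management policy without flagging the asset for immediate deletion (the relative-path handling mirrors the delegate code above; the 30-day expiration is just an illustrative value):

import AVFoundation

@available(iOS 11.0, *)
func applyStoragePolicy(forRelativePath relativePath: String) {
    let baseURL = URL(fileURLWithPath: NSHomeDirectory())
    let assetURL = baseURL.appendingPathComponent(relativePath)

    let policy = AVMutableAssetDownloadStorageManagementPolicy()
    policy.priority = .important
    // Per the session quoted above, avoid expirationDate = Date(): that bumps the
    // file up the deletion queue. Use a future date only when the asset must expire.
    policy.expirationDate = Date(timeIntervalSinceNow: 30 * 24 * 60 * 60)

    AVAssetDownloadStorageManager.shared().setStorageManagementPolicy(policy, for: assetURL)
}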
I am trying to play multiple audio files using 2 AVPlayer instances, but one of the players stops for a fraction of a second rather than playing all audio files simultaneously.
The logic of the program is as follows:
var player: AVPlayer? will stream an audio file from my database. On its own it plays perfectly.

fileprivate var countPlayer: AVPlayer? plays the count number of the current item being played by var player. The count is a sequence of 1 to 8, and for each digit I am storing/sandboxing a .wav file locally, such as 1.wav, 2.wav ... 8.wav.

When the current time of var player reaches a certain time, countPlayer is triggered and plays one of the local files 1.wav, 2.wav, etc.

The problem is that when countPlayer starts playing, it causes the background AVPlayer, namely var player, to stop for a fraction of a second, similar to what's described in this comment:
Play multiple Audio Files with AVPlayer
var player: AVPlayer? // plays the song
fileprivate var countPlayer: AVPlayer? // plays the count number of the song

private func addBoundaryTimeObserver(tableIndexPath: IndexPath) {
    let mediaItem = mediaArray[tableIndexPath.row]
    guard let url = URL(string: mediaItem.mediaAudioUrlStringRepresentation ?? "") else { return }
    let playerItem = AVPlayerItem(url: url)
    player = AVPlayer(playerItem: playerItem)
    var timesToTransverse = [NSValue]()

    // Convert the string representation of the times to an array
    let timesRecorded: [String] = mediaItem.timesRecorded.components(separatedBy: ",")

    // Build boundary times from arrayOfBeats keys
    let timeDoubles: [Double] = timesRecorded.compactMap { timeString in
        Double(timeString)
    }
    guard timeDoubles.count > 0 else { return } // unexpected

    timesToTransverse = timeDoubles.map { second in
        let cmtime = CMTime(seconds: second, preferredTimescale: CMTimeScale(NSEC_PER_SEC))
        return NSValue(time: cmtime)
    }
    guard timesToTransverse.count != 0 else { return }
    guard let playerCell = tableView.cellForRow(at: IndexPath(row: 0, section: 0)) as? PlayerCell else { return }

    startTime = Date().timeIntervalSinceReferenceDate
    timeIndex = 0
    player?.play()
    player?.rate = Float(initialPlaybackRate)

    // find the index of time
    // reset timeObserverToken
    // call a function with the new times sorted

    // Queue on which to invoke the callback
    let mainQueue = DispatchQueue.main

    // Add time observer
    timeObserverToken = player?.addBoundaryTimeObserver(forTimes: timesToTransverse, queue: mainQueue) { [weak self] in
        // Because there are no time signature changes, we can simply increment timeIndex
        // by 1 every time the addBoundaryTimeObserver completion handler is called, and
        // subscript timesToTransverse with timeIndex to get the subsequent timeInSeconds.
        guard let strongSelf = self, strongSelf.timeIndex < timesToTransverse.count else { return }
        let timeElement = timesToTransverse[strongSelf.timeIndex]
        strongSelf.timeInSeconds = CMTimeGetSeconds(timeElement.timeValue)

        // Show progress in progressView
        let duration = CMTimeGetSeconds(playerItem.duration)
        let cmtimeSeconds = CMTime(seconds: strongSelf.timeInSeconds, preferredTimescale: CMTimeScale(NSEC_PER_SEC))

        // Total time since the timer started, in seconds
        strongSelf.timeInSeconds = Date().timeIntervalSinceReferenceDate - strongSelf.startTime
        let timeString = String(format: "%.2f", strongSelf.timeInSeconds)
        strongSelf.timeString = timeString

        // Use the remainder operator to determine the beat count
        let beat = (strongSelf.timeIndex + 1) % 8 == 0 ? 8 : ((strongSelf.timeIndex + 1) % 8)

        // Play the beat count: 1, 2, ... 8
        strongSelf.prepareToPlayAudio(beatCount: beat)
        /*
         0: (0 + 1) % 8 = 1
         1: (1 + 1) % 8 = 2
         6: (6 + 1) % 8 = 7
         7: (7 + 1) % 8 = 0 -> mapped to 8 by the ternary above
         */
        strongSelf.timeIndex += 1
    }
} // end addBoundaryTimeObserver

// Determine which wav file to play
private func prepareToPlayAudio(beatCount: Int) {
    switch beatCount {
    case 1:
        guard let url = Bundle.main.url(forResource: "1", withExtension: "wav") else { return }
        playWith(beatCountURL: url)
    // 7 more cases go here .....
    default:
        print("unexpected case here")
    }
} // end prepareToPlayAudio(beatCount:)

private func playWith(beatCountURL: URL) {
    let playerItem = AVPlayerItem(url: beatCountURL)
    countPlayer = AVPlayer(playerItem: playerItem)
    countPlayer?.play()
}
You would be better off using AVAudioPlayerNode, AVAudioMixerNode, and AVAudioEngine. With these classes you won't have problems like the one you have right now. It's also not that difficult to set up.

You can check out my gist; in order to play the sounds in a Playground you need to put the audio files into the Resources folder in the Project Navigator:
https://gist.github.com/standinga/24342d23acfe70dc08cbcc994895f32b
The code works without stopping background audio when top sounds are triggered.
Here's also the same code:
import AVFoundation
import PlaygroundSupport

PlaygroundPage.current.needsIndefiniteExecution = true

class AudioPlayer {
    var backgroundAudioFile: AVAudioFile
    var topAudioFiles: [AVAudioFile] = []
    var engine: AVAudioEngine
    var backgroundAudioNode: AVAudioPlayerNode
    var topAudioAudioNodes = [AVAudioPlayerNode]()
    var mixer: AVAudioMixerNode
    var timer: Timer!
    var urls: [URL] = []

    init(_ url: URL, urls: [URL] = []) {
        backgroundAudioFile = try! AVAudioFile(forReading: url)
        topAudioFiles = urls.map { try! AVAudioFile(forReading: $0) }
        engine = AVAudioEngine()
        mixer = AVAudioMixerNode()
        engine.attach(mixer)
        engine.connect(mixer, to: engine.outputNode, format: nil)
        self.urls = urls
        backgroundAudioNode = AVAudioPlayerNode()
        for _ in topAudioFiles {
            topAudioAudioNodes += [AVAudioPlayerNode()]
        }
    }

    func start() {
        engine.attach(backgroundAudioNode)
        engine.connect(backgroundAudioNode, to: mixer, format: nil)
        backgroundAudioNode.scheduleFile(backgroundAudioFile, at: nil, completionHandler: nil)
        try! engine.start()
        backgroundAudioNode.play()

        for node in topAudioAudioNodes {
            engine.attach(node)
            engine.connect(node, to: mixer, format: nil)
            try! engine.start()
        }

        // Simulate rescheduling files played on top of the background audio
        DispatchQueue.global().async { [unowned self] in
            for i in 0..<1000 {
                sleep(2)
                let index = i % self.topAudioAudioNodes.count
                let node = self.topAudioAudioNodes[index]
                node.scheduleFile(self.topAudioFiles[index], at: nil, completionHandler: nil)
                node.play()
            }
        }
    }
}

let bundle = Bundle.main
let beepLow = bundle.url(forResource: "beeplow", withExtension: "wav")!
let beepMid = bundle.url(forResource: "beepmid", withExtension: "wav")!
let backgroundAudio = bundle.url(forResource: "backgroundAudio", withExtension: "wav")!
let audioPlayer = AudioPlayer(backgroundAudio, urls: [beepLow, beepMid])
audioPlayer.start()
I have used this process to download a file from the link. Now I want the file path to access this video and play it with AVPlayer:
@IBAction func btnplayClicked(sender: AnyObject) {
    let videoImageUrl = "https://devimages-cdn.apple.com/samplecode/avfoundationMedia/AVFoundationQueuePlayer_HLS2/master.m3u8"
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), {
        let url = NSURL(string: videoImageUrl)
        let urlData = NSData(contentsOfURL: url!)
        if urlData != nil {
            let documentsPath = NSSearchPathForDirectoriesInDomains(.DocumentDirectory, .UserDomainMask, true)[0]
            let filePath = "\(documentsPath)/video.mp4"
            dispatch_async(dispatch_get_main_queue(), {
                urlData?.writeToFile(filePath, atomically: true)
                PHPhotoLibrary.sharedPhotoLibrary().performChanges({
                    PHAssetChangeRequest.creationRequestForAssetFromVideoAtFileURL(NSURL(fileURLWithPath: filePath))
                }) { completed, error in
                    if completed {
                        print("Video is saved!")
                        if let path = NSBundle.mainBundle().pathForResource("video", ofType: ".mp4") {
                            let apath = NSURL(fileURLWithPath: path)
                            let video = AVPlayer(URL: apath)
                            let videoPlayer = AVPlayerViewController()
                            videoPlayer.player = video
                            self.presentViewController(videoPlayer, animated: true, completion: {
                                video.play()
                            })
                        }
                    }
                }
            })
        }
    })
}
The video is downloaded and I can see it in the gallery, but AVPlayer does not play the video. What am I doing wrong here?
As iOS Geek told you, you are accessing the wrong path; you can do it with this approach:
if completed {
    dispatch_async(dispatch_get_main_queue(), {
        // Call UI related operations
        let apath = NSURL(fileURLWithPath: filePath)
        let video = AVPlayer(URL: apath)
        let videoPlayer = AVPlayerViewController()
        videoPlayer.player = video
        self.presentViewController(videoPlayer, animated: true, completion: {
            video.play()
        })
    })
}
Did you try to load the AVPlayer directly from the cached path? Like so:
if completed {
    let video = AVPlayer(URL: NSURL(fileURLWithPath: filePath))
    let videoPlayer = AVPlayerViewController()
    videoPlayer.player = video
    self.presentViewController(videoPlayer, animated: true, completion: {
        video.play()
    })
}
I would like to write an app in Swift 3 that plays queued audio files without any gap, crackle, or noise when passing from one to another.

My first try was using AVAudioPlayer and AVAudioPlayerDelegate (AVAudioPlayer using array to queue audio files - Swift), but I don't know how to preload the next song to avoid a gap. Even if I knew how to do it, I am not certain it is the best way to achieve my goal.

AVQueuePlayer seems to be a better candidate for the job, since it is made for that purpose, but I can't find any example to help me out.

Maybe it is only a problem of preloading or buffering? I am a bit lost in this ocean of possibilities.

Any suggestion is welcome.
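For what it's worth, a minimal AVQueuePlayer sketch in current Swift (the resource names are placeholders; whether it is truly gapless for your material needs testing):

import AVFoundation

// Queue bundled audio files; AVQueuePlayer preloads upcoming items itself.
// "song1"/"song2" are hypothetical resource names.
let urls = ["song1", "song2"].compactMap {
    Bundle.main.url(forResource: $0, withExtension: "m4a")
}
let items = urls.map { AVPlayerItem(url: $0) }
let queuePlayer = AVQueuePlayer(items: items)
queuePlayer.play()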
It is far from perfect, especially if you want to do it twice or more ("file exists" error), but it can serve as a base.

What it does is take two files (mine are .aif samples of roughly 4 seconds), encode them into one file, and play the resulting file. If you have hundreds of them, assembled randomly or not, it can make for great fun.

All credit for the mergeAudioFiles function goes to @Peyman and @Pigeon_39. Concatenate two audio files in Swift and play them
Swift 3
import Cocoa
import AVFoundation

var action = AVAudioPlayer()
let path = Bundle.main.path(forResource: "audiofile1.aif", ofType: nil)!
let url = URL(fileURLWithPath: path)
let path2 = Bundle.main.path(forResource: "audiofile2.aif", ofType: nil)!
let url2 = URL(fileURLWithPath: path2)
let array1 = NSMutableArray(array: [url, url2])

class ViewController: NSViewController, AVAudioPlayerDelegate {
    @IBOutlet weak var LanceStop: NSButton!

    override func viewDidLoad() {
        super.viewDidLoad()
    }

    override var representedObject: Any? {
        didSet {
            // Update the view, if already loaded.
        }
    }

    @IBAction func Lancer(_ sender: NSButton) {
        mergeAudioFiles(audioFileUrls: array1)
        let url3 = NSURL(string: "/Users/ADDUSERNAMEHERE/Documents/FinalAudio.m4a")
        do {
            action = try AVAudioPlayer(contentsOf: url3 as! URL)
            action.delegate = self
            action.numberOfLoops = 0
            action.prepareToPlay()
            action.volume = 1
            action.play()
        } catch {
            print("error")
        }
    }

    func audioPlayerDidFinishPlaying(_ player: AVAudioPlayer, successfully flag: Bool) {
        if flag == true {
        }
    }

    var mergeAudioURL = NSURL()

    func mergeAudioFiles(audioFileUrls: NSArray) {
        //audioFileUrls.adding(url)
        //audioFileUrls.adding(url2)
        let composition = AVMutableComposition()

        for i in 0 ..< audioFileUrls.count {
            let compositionAudioTrack: AVMutableCompositionTrack = composition.addMutableTrack(withMediaType: AVMediaTypeAudio, preferredTrackID: CMPersistentTrackID())
            let asset = AVURLAsset(url: (audioFileUrls[i] as! NSURL) as URL)
            let track = asset.tracks(withMediaType: AVMediaTypeAudio)[0]
            let timeRange = CMTimeRange(start: CMTimeMake(0, 600), duration: track.timeRange.duration)
            try! compositionAudioTrack.insertTimeRange(timeRange, of: track, at: composition.duration)
        }

        let documentDirectoryURL = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask).first! as NSURL
        self.mergeAudioURL = documentDirectoryURL.appendingPathComponent("FinalAudio.m4a")! as URL as NSURL

        let assetExport = AVAssetExportSession(asset: composition, presetName: AVAssetExportPresetAppleM4A)
        assetExport?.outputFileType = AVFileTypeAppleM4A
        assetExport?.outputURL = mergeAudioURL as URL
        assetExport?.exportAsynchronously(completionHandler: {
            switch assetExport!.status {
            case AVAssetExportSessionStatus.failed:
                print("failed \(assetExport?.error)")
            case AVAssetExportSessionStatus.cancelled:
                print("cancelled \(assetExport?.error)")
            case AVAssetExportSessionStatus.unknown:
                print("unknown \(assetExport?.error)")
            case AVAssetExportSessionStatus.waiting:
                print("waiting \(assetExport?.error)")
            case AVAssetExportSessionStatus.exporting:
                print("exporting \(assetExport?.error)")
            default:
                print("Audio Concatenation Complete")
            }
        })
    }
}
I'm new to Swift/programming and I couldn't find an answer to my question, so I'm going to give it a try here:

How do I download a video (mp4) from a URL and store it within the app?

How do I then display the video in a container?

I've already found this topic:

Swift - Downloading video with downloadTaskWithURL

But in my case, I wouldn't want the video to be saved to the camera roll, just within the app.

Thanks for any kind of help/hint!
You can use URLSession's dataTask or downloadTask to download any file from a URL (if it's downloadable).

Here's the way to use dataTask for downloading:
let videoUrl = "Some video url"
let docsUrl = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask).first!
let destinationUrl = docsUrl.appendingPathComponent("MyFileSaveName.mp4")

if FileManager().fileExists(atPath: destinationUrl.path) {
    print("\n\nfile already exists\n\n")
}
else {
    //DispatchQueue.global(qos: .background).async {
    var request = URLRequest(url: URL(string: videoUrl)!)
    request.httpMethod = "GET"
    let session = URLSession.shared // or your own preconfigured session
    _ = session.dataTask(with: request, completionHandler: { (data, response, error) in
        if error != nil {
            print("\n\nsome error occured\n\n")
            return
        }
        if let response = response as? HTTPURLResponse {
            if response.statusCode == 200 {
                DispatchQueue.main.async {
                    if let data = data {
                        if let _ = try? data.write(to: destinationUrl, options: Data.WritingOptions.atomic) {
                            print("\n\nurl data written\n\n")
                        }
                        else {
                            print("\n\nerror again\n\n")
                        }
                    } // end if let data
                } // end dispatch main
            } // end if response.statusCode
        }
    }).resume()
    //} // end dispatch global
} // end outer else
Now to play the saved file:
import UIKit
import AVKit
import AVFoundation

class MyViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        let baseUrl = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask).first!
        let assetUrl = baseUrl.appendingPathComponent("MyFileSaveName.mp4")
        let url = assetUrl
        print(url)

        let avAssest = AVAsset(url: url)
        let playerItem = AVPlayerItem(asset: avAssest)
        let player = AVPlayer(playerItem: playerItem)
        let playerViewController = AVPlayerViewController()
        playerViewController.player = player

        self.present(playerViewController, animated: true, completion: {
            player.play()
        })
    }
}
However, most sites do not provide a direct downloadable link for the video. You can get that link by playing the video in a UIWebView and registering the following observer to capture it:
NotificationCenter.default.addObserver(self, selector: #selector(videoPlayedInWebView), name: NSNotification.Name(rawValue: "AVPlayerItemBecameCurrentNotification"), object: nil)
@objc func videoPlayedInWebView(aNotification: NSNotification) {
    if let playerItem: AVPlayerItem = aNotification.object as? AVPlayerItem {
        let asset: AVURLAsset = playerItem.asset as! AVURLAsset
        let downloadableVideoUrl = asset.url
        print(downloadableVideoUrl)
    }
}
Here "downloadebaleVideoUrl" is the link that will be generated once the video plays in the webview.
If you have any questions, feel free to ask.
Note: This will work only for sites that have mp4 files. 'HLS' streams won't be downloaded with this method. For that you can refer to the following answer:
https://stackoverflow.com/a/54493233/10758374
Edit: this works only with UIWebView and it won't work with WKWebView.
You need to create a local URL, which will be a path in your app's file system, and write the video's data to it.
func writeToFile(urlString: String) {
    guard let videoUrl = URL(string: urlString) else {
        return
    }
    do {
        let videoData = try Data(contentsOf: videoUrl)
        let fm = FileManager.default
        guard let docUrl = fm.urls(for: .documentDirectory, in: .userDomainMask).first else {
            print("Unable to reach the documents folder")
            return
        }
        let localUrl = docUrl.appendingPathComponent("test.mp4")
        try videoData.write(to: localUrl)
    } catch {
        print("could not save data")
    }
}
Keep in mind to always call this function on a background thread, since Data(contentsOf:) blocks while it downloads.
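For example, a hypothetical call site:

// Keep the blocking download off the main thread (placeholder URL).
DispatchQueue.global(qos: .background).async {
    writeToFile(urlString: "https://example.com/video.mp4")
}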
So I'm using this custom class to record my video: https://github.com/piemonte/PBJVision. I am attempting to record video in my iOS app and I can't seem to get the code right to upload the file to my Parse server. A few things:

The PBJVision class allows you to use NSURL(fileURLWithPath: videoPath) to access the asset after the video has been recorded.

To access the data in the asset and save it to Parse, I use the following function:
func vision(vision: PBJVision, capturedVideo videoDict: [NSObject : AnyObject]?, error: NSError?) {
    if error != nil {
        print("Encountered error with video")
        isVideo = false
    } else {
        let currentVideo = videoDict
        let videoPath = currentVideo![PBJVisionVideoPathKey] as! String
        print("The video path is: \(videoPath)")
        self.player = Player()
        self.player.delegate = self
        self.player.view.frame = CGRect(x: cameraView.frame.origin.x, y: cameraView.frame.origin.y, width: cameraView.frame.width, height: cameraView.frame.height)
        self.player.playbackLoops = true
        videoUrl = NSURL(fileURLWithPath: videoPath)
        self.player.setUrl(videoUrl)
        self.cameraView.addSubview(self.player.view)
        self.player.playFromBeginning()
        nextButton.hidden = false
        isVideo = true

        let contents: NSData?
        do {
            contents = try NSData(contentsOfFile: videoPath, options: NSDataReadingOptions.DataReadingMappedAlways)
        } catch _ {
            contents = nil
        }
        print(contents)

        let videoObject = PFObject(className: "EventChatroomMessages")
        videoObject.setValue(user, forKey: "user")
        videoObject.setValue("uG7v2KWBQm", forKey: "eventId")
        videoObject.setValue(NSDate(), forKey: "timestamp")

        let videoFile: PFFile?
        do {
            videoFile = try PFFile(name: randomAlphaNumericString(26) + ".mp4", data: contents!, contentType: "video/mp4")
            print("VideoFile: \(videoFile)")
        } catch _ {
            videoFile = nil // keep videoFile initialized on the error path
            print("error")
        }
        print(videoFile)
        videoObject.setValue(videoFile, forKey: "image")
        videoObject.saveInBackgroundWithBlock { (success: Bool, error: NSError?) -> Void in
            if success == true {
                ProgressHUD.showSuccess("Video Saved.", interaction: false)
                dispatch_async(dispatch_get_main_queue()) {
                    ProgressHUD.dismiss()
                }
            } else {
                ProgressHUD.showError("Error Saving Video.", interaction: false)
                dispatch_async(dispatch_get_main_queue()) {
                    ProgressHUD.dismiss()
                }
            }
        }
    }
}
I am then using a UITableView to display my data from Parse. Here is how I retrieve my asset back from Parse and into my AVPlayer:
// Create Player for Reaction
let player = Player()
player.delegate = self
player.view.frame = CGRectMake(0.0, nameLabel.frame.origin.y + nameLabel.frame.size.height + 0.0, self.view.frame.width, 150)
player.view.backgroundColor = UIColor.whiteColor()

let video = message.objectForKey("image") as! PFFile
let urlFromParse = video.url!
print(urlFromParse)
let url = NSURL(fileURLWithPath: video.url!)
print(url)

let playerNew = AVPlayer(URL: url)
let playerLayer = AVPlayerLayer(player: playerNew)
playerLayer.frame = CGRectMake(0.0, nameLabel.frame.origin.y + nameLabel.frame.size.height + 0.0, self.view.frame.width, 150)
cell.layer.addSublayer(playerLayer)
playerLayer.backgroundColor = UIColor.whiteColor().CGColor
playerNew.play()
I copy the value returned from urlFromParse, which is (http://parlayapp.herokuapp.com/parse/files/smTrXDGZhlYQGh4BZcVvmZ2rYB9kA5EhPkGbj2R2/58c0648ae4ca9900f2d835feb77f165e_file.mp4), and paste it into my browser, and the video plays in the browser. Am I correct to assume the file has been saved correctly?

When I go to run my app, the video does not play. Any suggestion on what I'm doing wrong?
I have found that playing video using the PFFile's url does not work. You have to write the NSData from the PFFile to a local file using the right extension (mov) and then play the video using the local file as the source.
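A minimal sketch of that approach in current Swift, assuming the Parse SDK's getDataInBackground callback (the helper and file name are illustrative):

import UIKit
import AVKit
import AVFoundation
import Parse

// Fetch the PFFile's data, write it to a local .mov file, then play from
// the local URL instead of the remote pfFile.url.
func play(videoFile: PFFile, from viewController: UIViewController) {
    videoFile.getDataInBackground { (data, error) in
        guard let data = data, error == nil else { return }

        // Write to a temporary local file with the right extension.
        let localUrl = FileManager.default.temporaryDirectory
            .appendingPathComponent("reaction.mov") // illustrative file name
        do {
            try data.write(to: localUrl, options: .atomic)
        } catch {
            print("could not write video data: \(error)")
            return
        }

        // Play from the local file URL.
        let playerViewController = AVPlayerViewController()
        playerViewController.player = AVPlayer(url: localUrl)
        viewController.present(playerViewController, animated: true) {
            playerViewController.player?.play()
        }
    }
}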