So I'm using this custom class to record my video: https://github.com/piemonte/PBJVision. I am attempting to record video in my iOS app and I can't seem to get the code correct to upload the file to my Parse server. A few things:
The PBJVision class lets you use NSURL(fileURLWithPath: videoPath) to access the asset after the video has been recorded.
To access the data in the asset and save it to Parse, I use the following function:
func vision(vision: PBJVision, capturedVideo videoDict: [NSObject : AnyObject]?, error: NSError?) {
    if error != nil {
        print("Encountered error with video")
        isVideo = false
    } else {
        let currentVideo = videoDict
        let videoPath = currentVideo![PBJVisionVideoPathKey] as! String
        print("The video path is: \(videoPath)")
        self.player = Player()
        self.player.delegate = self
        self.player.view.frame = CGRect(x: cameraView.frame.origin.x, y: cameraView.frame.origin.y, width: cameraView.frame.width, height: cameraView.frame.height)
        self.player.playbackLoops = true
        videoUrl = NSURL(fileURLWithPath: videoPath)
        self.player.setUrl(videoUrl)
        self.cameraView.addSubview(self.player.view)
        self.player.playFromBeginning()
        nextButton.hidden = false
        isVideo = true

        let contents: NSData?
        do {
            contents = try NSData(contentsOfFile: videoPath, options: NSDataReadingOptions.DataReadingMappedAlways)
        } catch _ {
            contents = nil
        }
        print(contents)

        let videoObject = PFObject(className: "EventChatroomMessages")
        videoObject.setValue(user, forKey: "user")
        videoObject.setValue("uG7v2KWBQm", forKey: "eventId")
        videoObject.setValue(NSDate(), forKey: "timestamp")

        let videoFile: PFFile?
        do {
            videoFile = try PFFile(name: randomAlphaNumericString(26) + ".mp4", data: contents!, contentType: "video/mp4")
            print("VideoFile: \(videoFile)")
        } catch _ {
            videoFile = nil // assign in both branches so the constant is initialized before use
            print("error")
        }
        print(videoFile)
        videoObject.setValue(videoFile, forKey: "image")

        videoObject.saveInBackgroundWithBlock { (success: Bool, error: NSError?) -> Void in
            if success == true {
                ProgressHUD.showSuccess("Video Saved.", interaction: false)
                dispatch_async(dispatch_get_main_queue()) {
                    ProgressHUD.dismiss()
                }
            } else {
                ProgressHUD.showError("Error Saving Video.", interaction: false)
                dispatch_async(dispatch_get_main_queue()) {
                    ProgressHUD.dismiss()
                }
            }
        }
    }
}
I am then using a UITableView to display my data from Parse. Here is how I retrieve my asset back from Parse and load it into my AVPlayer:
// Create Player for Reaction
let player = Player()
player.delegate = self
player.view.frame = CGRectMake(0.0, nameLabel.frame.origin.y + nameLabel.frame.size.height + 0.0, self.view.frame.width, 150)
player.view.backgroundColor = UIColor.whiteColor()
let video = message.objectForKey("image") as! PFFile
let urlFromParse = video.url!
print(urlFromParse)
let url = NSURL(fileURLWithPath: video.url!)
print(url)
let playerNew = AVPlayer(URL: url)
let playerLayer = AVPlayerLayer(player: playerNew)
playerLayer.frame = CGRectMake(0.0, nameLabel.frame.origin.y + nameLabel.frame.size.height + 0.0, self.view.frame.width, 150)
cell.layer.addSublayer(playerLayer)
playerLayer.backgroundColor = UIColor.whiteColor().CGColor
playerNew.play()
I copy the value returned from urlFromParse (http://parlayapp.herokuapp.com/parse/files/smTrXDGZhlYQGh4BZcVvmZ2rYB9kA5EhPkGbj2R2/58c0648ae4ca9900f2d835feb77f165e_file.mp4), paste it into my browser, and the video plays in the browser. Am I correct to assume the file has been saved correctly?
When I run my app, the video does not play. Any suggestions on what I'm doing wrong?
I have found that playing video using the PFFile.url directly does not work. You have to write the NSData from the PFFile to a local file with the right extension (.mov or .mp4) and then play the video using the local file as the source.
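For instance, here is a minimal sketch of that approach using the same Swift 2 era APIs as the code above (the temporary file name is just an illustration):

    let videoFile = message.objectForKey("image") as! PFFile
    videoFile.getDataInBackgroundWithBlock { (data: NSData?, error: NSError?) in
        guard let data = data where error == nil else { return }
        // Write the PFFile's bytes to a local file with a proper video extension
        let localPath = NSTemporaryDirectory() + "downloadedVideo.mov"
        data.writeToFile(localPath, atomically: true)
        // Play from the local file URL instead of the remote PFFile.url
        let localUrl = NSURL(fileURLWithPath: localPath)
        let playerNew = AVPlayer(URL: localUrl)
        // ...attach to an AVPlayerLayer and call play() as in the cell code above
    }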
I need to be able to load an HEIC image and extract and output all of its sub-images as PNGs, similar to how Preview does it. For example, if you open a dynamic HEIC wallpaper in Preview, it shows all the images in the sidebar with their names.
How do you do this? I've tried to use NSImage like below, but that only outputs a single image:
let image = NSImage(byReferencing: url)
image.writePNG(toURL: newUrl)
You need to load the HEIC data, create a CGImageSource from it, and get its image count. Then loop from 0 to count-1 and get the image at each index. You can keep those CGImages in an array in memory or write them to disk (preferred). Note that this will take a while to execute because of the size of the HEIC file (186 MB); each extracted image will be from 19 MB to 28 MB.
func extractHeicImages(from url: URL) throws {
    let data = try Data(contentsOf: url)
    let location = url.deletingLastPathComponent()
    let pathExtension = url.pathExtension
    let fileName = url.deletingPathExtension().lastPathComponent
    let destinationFolder = location.appendingPathComponent(fileName)
    guard pathExtension == "heic", let imageSource = CGImageSourceCreateWithData(data as CFData, nil) else { return }
    let count = CGImageSourceGetCount(imageSource)
    try FileManager.default.createDirectory(at: destinationFolder, withIntermediateDirectories: false, attributes: nil)
    for index in 0..<count {
        try autoreleasepool {
            if let cgImage = CGImageSourceCreateImageAtIndex(imageSource, index, nil) {
                let number = String(format: "#%05d", index)
                let destinationURL = destinationFolder
                    .appendingPathComponent(fileName + number)
                    .appendingPathExtension(pathExtension)
                try NSImage(cgImage: cgImage, size: .init(width: cgImage.width, height: cgImage.height))
                    .heic?
                    .write(to: destinationURL)
                print("saved image " + number)
            }
        }
    }
}
You will need these helpers as well, to extract the CGImage from your image and to get an HEIC data representation of it:
extension NSImage {
    var heic: Data? { heic() }
    var cgImage: CGImage? {
        var rect = NSRect(origin: .zero, size: size)
        return cgImage(forProposedRect: &rect, context: .current, hints: nil)
    }
    func heic(compressionQuality: CGFloat = 1) -> Data? {
        guard
            let mutableData = CFDataCreateMutable(nil, 0),
            let destination = CGImageDestinationCreateWithData(mutableData, "public.heic" as CFString, 1, nil),
            let cgImage = cgImage
        else { return nil }
        CGImageDestinationAddImage(destination, cgImage, [kCGImageDestinationLossyCompressionQuality: compressionQuality] as CFDictionary)
        guard CGImageDestinationFinalize(destination) else { return nil }
        return mutableData as Data
    }
}
Playground testing. This assumes "Catalina.heic" is located on your desktop.
let catalinaURL = FileManager.default.urls(for: .desktopDirectory, in: .userDomainMask).first!.appendingPathComponent("Catalina.heic")
do {
    try extractHeicImages(from: catalinaURL)
} catch {
    print(error)
}
Each subimage is represented by an NSBitmapImageRep. Loop over the image reps, convert each to PNG, and save:
let imageReps = image.representations
for imageIndex in 0..<imageReps.count {
    if let imageRep = imageReps[imageIndex] as? NSBitmapImageRep {
        if let data = imageRep.representation(using: .png, properties: [:]) {
            do {
                let url = folderURL.appendingPathComponent("image \(imageIndex).png", isDirectory: false)
                try data.write(to: url, options: [])
            } catch {
                print("Unexpected error: \(error).")
            }
        }
    }
}
The conversion to PNG takes some time. Running the conversions in parallel is faster, but I'm not sure if it's safe:
DispatchQueue.concurrentPerform(iterations: imageReps.count) { iteration in
    if let imageRep = imageReps[iteration] as? NSBitmapImageRep {
        if let data = imageRep.representation(using: .png, properties: [:]) {
            do {
                let url = folderURL.appendingPathComponent("image \(iteration).png", isDirectory: false)
                try data.write(to: url, options: [])
            } catch {
                print("Unexpected error: \(error).")
            }
        }
    }
}
For starters, here is what I currently have:
func uploadToCloud(fileURL: URL) {
    let storage = Storage.storage()
    let storageRef = storage.reference()
    let localFile = fileURL
    let photoRef = storageRef.child("\(email)\(counter)")
    let uploadTask = photoRef.putFile(from: localFile, metadata: nil) { (metadata, err) in
        guard let metadata = metadata else {
            print(err?.localizedDescription)
            return
        }
    }
}
To upload the video to storage ^
func playVideo(url: URL) {
    let player = AVPlayer(url: url)
    let vc = AVPlayerViewController()
    vc.player = player
    self.present(vc, animated: true) { vc.player?.play() }
}
This is what I have to play the video ^
Originally I tried also saving the link to Firestore, but all it saved was the file's location on the phone, I believe, which is not what I am trying to do. Is there a way I can get the URL for the video saved in Storage, so I can pass it as the url in the playVideo function?
Thank you!
Try this to get the download URL once the upload completes:
// ....
photoRef.downloadURL { (url, error) in
    if let error = error {
        print("DEBUG: Upload: \(error.localizedDescription)")
        return
    }
    guard let videoUrl = url?.absoluteString else {
        print("DEBUG: Upload failed to get URL")
        return
    }
    completion(videoUrl)
}
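Putting it together, here is a rough sketch of how uploadToCloud could take a completion handler and feed the remote URL into playVideo (the completion parameter is an addition for illustration, not part of the original code):

    func uploadToCloud(fileURL: URL, completion: @escaping (String) -> Void) {
        let storageRef = Storage.storage().reference()
        let photoRef = storageRef.child("\(email)\(counter)")
        photoRef.putFile(from: fileURL, metadata: nil) { metadata, error in
            if let error = error {
                print("DEBUG: Upload: \(error.localizedDescription)")
                return
            }
            // Only ask for the download URL after the upload has finished
            photoRef.downloadURL { url, error in
                guard let videoUrl = url?.absoluteString else {
                    print("DEBUG: Upload failed to get URL")
                    return
                }
                completion(videoUrl)
            }
        }
    }

    // Usage: upload, then play from the remote URL
    uploadToCloud(fileURL: localFile) { urlString in
        if let url = URL(string: urlString) {
            self.playVideo(url: url)
        }
    }

At this point you can also store that URL string in Firestore, so the remote location (not the local file path) is what gets persisted.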
I am working on downloading and playing HLS content. To download the HLS I am using the following code:
func downloadTask() {
    let videoUrl = URL(string: "https://bitdash-a.akamaihd.net/content/MI201109210084_1/m3u8s/f08e80da-bf1d-4e3d-8899-f0f6155f6efa.m3u8")!
    configuration = URLSessionConfiguration.background(withIdentifier: downloadIdentifier)
    downloadSession = AVAssetDownloadURLSession(configuration: configuration!, assetDownloadDelegate: self, delegateQueue: OperationQueue.main)
    let documentsDirectoryURL = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask).first!
    destinationUrl = documentsDirectoryURL.appendingPathComponent(videoUrl.lastPathComponent)
    var urlComponents = URLComponents(
        url: videoUrl,
        resolvingAgainstBaseURL: false
    )!
    urlComponents.scheme = "https"
    do {
        let asset = try AVURLAsset(url: urlComponents.url!)
        asset.resourceLoader.setDelegate(self, queue: DispatchQueue(label: "com.example.AssetResourceLoaderDelegateQueue"))
        if #available(iOS 10.0, *) {
            assetDownloadTask = downloadSession!
                .makeAssetDownloadTask(
                    asset: asset,
                    assetTitle: "RG-TVVideo",
                    assetArtworkData: nil,
                    options: nil
                )
            APP_DELEGATE.isProgressRunning = true
            assetDownloadTask?.resume()
        } else {
            // Fallback on earlier versions
        }
    } catch { print("Error while parsing the URL.") }
}
Download finished
func urlSession(_ session: URLSession, assetDownloadTask: AVAssetDownloadTask, didFinishDownloadingTo location: URL) {
    if #available(iOS 11.0, *) {
        let storageManager = AVAssetDownloadStorageManager.shared()
        let newPolicy = AVMutableAssetDownloadStorageManagementPolicy()
        newPolicy.expirationDate = Date()
        newPolicy.priority = .important
        let baseURL = URL(fileURLWithPath: NSHomeDirectory())
        let assetURL = baseURL.appendingPathComponent(location.relativePath)
        storageManager.setStorageManagementPolicy(newPolicy, for: assetURL)
        UserDefaults.standard.set(location.relativePath, forKey: "videoPath")
        strDownloadStatus = "5"
        let dictVideoInfo = ["strDownloadStatus": "5", "VideoID": self.strID]
        // Here I am storing the downloaded location in the database
        DBManager.shared.updateVideoStatus(strVideoID: APP_DELEGATE.arrTempVideoIds.object(at: 0) as! String, strStatus: "5", strSavePath: location.relativePath) { (status) in }
        DispatchQueue.main.async {
            NotificationCenter.default.post(name: NSNotification.Name.init("UpdateProgress"), object: self.percentageComplete, userInfo: dictVideoInfo)
        }
    }
}
Now I am trying to get the video path from the location stored in the database and play it offline (without internet) using the following code:
func setLocalPlayer(strDownloadPath: String) {
    var strDownloadPath = ""
    // Getting path from database
    DBManager.shared.getDownloadedPath(videoID: VideoID) { (strPath) in
        strDownloadPath = strPath
    }
    activityIndicator.isHidden = false
    let baseURL = URL(fileURLWithPath: NSHomeDirectory())
    let assetURL = baseURL.appendingPathComponent(strDownloadPath)
    let asset = AVURLAsset(url: assetURL)
    // if let cache = asset.assetCache, cache.isPlayableOffline {
    // let videoAsset = AVURLAsset(url: assetURL)
    asset.resourceLoader.preloadsEligibleContentKeys = true
    asset.resourceLoader.setDelegate(self, queue: DispatchQueue(label: "com.example.AssetResourceLoaderDelegateQueue"))
    let playerItem = AVPlayerItem(asset: asset)
    avPlayer = AVPlayer(playerItem: playerItem)
    avPlayerLayer = AVPlayerLayer()
    avPlayerLayer.frame = CGRect(x: 0, y: 0, width: playerContainer.frame.width, height: playerContainer.frame.height)
    avPlayerLayer.videoGravity = .resize
    avPlayerLayer.player = avPlayer
    playerContainer.layer.addSublayer(avPlayerLayer)
    let interval = CMTime(seconds: 0.01, preferredTimescale: CMTimeScale(NSEC_PER_SEC))
    timeObserver = avPlayer?.addPeriodicTimeObserver(forInterval: interval, queue: DispatchQueue.main, using: { elapsedTime in
        self.updateVideoPlayerState()
        if self.avPlayer != nil {
            self.bufferState()
        }
    })
    self.slider.setThumbImage(UIImage(named: "slider_dot"), for: UIControl.State.normal)
    resetTimer()
    avPlayer.play()
    isPlaying = true
    // }
}
NOTE: This code works fine when the internet is on.
I have referred to the following links:
https://developer.apple.com/library/archive/documentation/AudioVideo/Conceptual/MediaPlaybackGuide/Contents/Resources/en.lproj/HTTPLiveStreaming/HTTPLiveStreaming.html
https://assist-software.net/snippets/how-play-encrypted-http-live-streams-offline-avfoundation-ios-using-swift-4
Downloading and playing offline HLS Content - iOS 10
Please guide me on what I am doing wrong.
Thanks
Well, I don't know if it's your error, but for future readers: don't do newPolicy.expirationDate = Date(), it's a mistake. According to the Advances in HTTP Live Streaming WWDC 2017 session, it will delete your file as soon as possible.
Before playing your offline download, you can check whether it's still on your device in Settings -> General -> Storage -> MyApp.
The expiration date property is there in case your asset at some point becomes no longer eligible to be played. For instance, you may find that you may be in a situation where a particular show may be leaving your catalog, and you no longer have rights to stream it. If that's the case you can set the expiration date and it will be sort of bumped up in the deletion queue. So, using it is fairly straightforward.
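So a safer version of the policy setup from the delegate above might look like this sketch (the far-future date is an assumption for illustration; the point is simply not to use Date()):

    if #available(iOS 11.0, *) {
        let storageManager = AVAssetDownloadStorageManager.shared()
        let newPolicy = AVMutableAssetDownloadStorageManagementPolicy()
        // Date() would put the asset at the front of the deletion queue;
        // a far-future date keeps the download around.
        newPolicy.expirationDate = .distantFuture
        newPolicy.priority = .important
        let assetURL = URL(fileURLWithPath: NSHomeDirectory())
            .appendingPathComponent(location.relativePath)
        storageManager.setStorageManagementPolicy(newPolicy, for: assetURL)
    }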
Audio file will not play after reducing it using AVAssetReader/ AVAssetWriter
At the moment, the whole function executes fine, with no errors thrown.
But for some reason, when I go into the simulator's documents directory via Terminal, the audio file will not play through iTunes, and QuickTime shows an error when trying to open it: "QuickTime Player can't open 'test1.m4a'".
Does anyone specialise in this area and understand why this isn't working?
protocol FileConverterDelegate {
    func fileConversionCompleted()
}

class WKAudioTools: NSObject {
    var delegate: FileConverterDelegate?
    var url: URL?
    var assetReader: AVAssetReader?
    var assetWriter: AVAssetWriter?

    func convertAudio() {
        let documentDirectory = try! FileManager.default.url(for: .documentDirectory, in: .userDomainMask, appropriateFor: nil, create: true)
        let exportURL = documentDirectory.appendingPathComponent(Assets.soundName1).appendingPathExtension("m4a")
        url = Bundle.main.url(forResource: Assets.soundName1, withExtension: Assets.mp3)
        guard let assetURL = url else { return }
        let asset = AVAsset(url: assetURL)

        // reader
        do {
            assetReader = try AVAssetReader(asset: asset)
        } catch let error {
            print("Error with reading >> \(error.localizedDescription)")
        }

        let assetReaderOutput = AVAssetReaderAudioMixOutput(audioTracks: asset.tracks, audioSettings: nil)
        //let assetReaderOutput = AVAssetReaderTrackOutput(track: track!, outputSettings: nil)

        guard let assetReader = assetReader else {
            print("reader is nil")
            return
        }
        if assetReader.canAdd(assetReaderOutput) == false {
            print("Can't add output to the reader ☹️")
            return
        }
        assetReader.add(assetReaderOutput)

        // writer
        do {
            assetWriter = try AVAssetWriter(outputURL: exportURL, fileType: .m4a)
        } catch let error {
            print("Error with writing >> \(error.localizedDescription)")
        }

        var channelLayout = AudioChannelLayout()
        memset(&channelLayout, 0, MemoryLayout.size(ofValue: channelLayout))
        channelLayout.mChannelLayoutTag = kAudioChannelLayoutTag_Stereo

        // use different values to affect the downsampling/compression
        let outputSettings: [String: Any] = [AVFormatIDKey: kAudioFormatMPEG4AAC,
                                             AVSampleRateKey: 44100.0,
                                             AVNumberOfChannelsKey: 2,
                                             AVEncoderBitRateKey: 128000,
                                             AVChannelLayoutKey: NSData(bytes: &channelLayout, length: MemoryLayout.size(ofValue: channelLayout))]

        let assetWriterInput = AVAssetWriterInput(mediaType: .audio, outputSettings: outputSettings)
        guard let assetWriter = assetWriter else { return }
        if assetWriter.canAdd(assetWriterInput) == false {
            print("Can't add asset writer input ☹️")
            return
        }
        assetWriter.add(assetWriterInput)
        assetWriterInput.expectsMediaDataInRealTime = false

        // MARK: - File conversion
        assetWriter.startWriting()
        assetReader.startReading()
        let audioTrack = asset.tracks[0]
        let startTime = CMTime(seconds: 0, preferredTimescale: audioTrack.naturalTimeScale)
        assetWriter.startSession(atSourceTime: startTime)

        // We need to do this on another thread, so let's set up a dispatch group...
        var convertedByteCount = 0
        let dispatchGroup = DispatchGroup()
        let mediaInputQueue = DispatchQueue(label: "mediaInputQueue")

        // ... and go
        dispatchGroup.enter()
        assetWriterInput.requestMediaDataWhenReady(on: mediaInputQueue) {
            while assetWriterInput.isReadyForMoreMediaData {
                let nextBuffer = assetReaderOutput.copyNextSampleBuffer()
                if nextBuffer != nil {
                    assetWriterInput.append(nextBuffer!) // FIXME: Handle this safely
                    convertedByteCount += CMSampleBufferGetTotalSampleSize(nextBuffer!)
                } else {
                    // done!
                    assetWriterInput.markAsFinished()
                    assetReader.cancelReading()
                    dispatchGroup.leave()
                    DispatchQueue.main.async {
                        // Notify delegate that conversion is complete
                        self.delegate?.fileConversionCompleted()
                        print("Process complete 🎧")
                        if assetWriter.status == .failed {
                            print("Writing asset failed ☹️ Error: ", assetWriter.error)
                        }
                    }
                    break
                }
            }
        }
    }
}
You need to call finishWriting on your AVAssetWriter to get the output completely written:
assetWriter.finishWriting {
    DispatchQueue.main.async {
        // Notify delegate that conversion is complete
        self.delegate?.fileConversionCompleted()
        print("Process complete 🎧")
        if assetWriter.status == .failed {
            print("Writing asset failed ☹️ Error: ", assetWriter.error)
        }
    }
}
If exportURL exists before you start the conversion, you should remove it, otherwise the conversion will fail:
try! FileManager.default.removeItem(at: exportURL)
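Or, if you'd rather not risk the force-try crashing when the file isn't there yet, a guarded variant:

    if FileManager.default.fileExists(atPath: exportURL.path) {
        try? FileManager.default.removeItem(at: exportURL)
    }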
As #matt points out, why do the buffer work when you could do the conversion more simply with an AVAssetExportSession? And why convert one of your own assets at runtime when you could distribute it already in the desired format?
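For comparison, here is a rough sketch of the AVAssetExportSession route, assuming the same asset and exportURL as above (note the preset doesn't give you the fine-grained bitrate control of the reader/writer pair):

    let exportSession = AVAssetExportSession(asset: asset, presetName: AVAssetExportPresetAppleM4A)
    exportSession?.outputURL = exportURL
    exportSession?.outputFileType = .m4a
    exportSession?.exportAsynchronously {
        DispatchQueue.main.async {
            switch exportSession?.status {
            case .completed?:
                // Notify delegate that conversion is complete
                self.delegate?.fileConversionCompleted()
                print("Process complete 🎧")
            case .failed?:
                print("Export failed ☹️ Error: ", exportSession?.error as Any)
            default:
                break
            }
        }
    }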
Is there a more elegant solution to load an external image on the watch than the following?
let image_url: String = "http://placehold.it/350x150"
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0)) {
    let url: NSURL = NSURL(string: image_url)!
    var data: NSData = NSData(contentsOfURL: url)!
    var placeholder = UIImage(data: data)!
    // update ui
    dispatch_async(dispatch_get_main_queue()) {
        self.imageView.setImage(placeholder)
    }
}
Loading NSData synchronously with contentsOfURL is a blocking call meant for local files, not network requests. Instead use NSURLSession. It's also useful to set the scale for the remote image.
import WatchKit

public extension WKInterfaceImage {
    public func setImageWithUrl(url: String, scale: CGFloat = 1.0) -> WKInterfaceImage? {
        NSURLSession.sharedSession().dataTaskWithURL(NSURL(string: url)!) { data, response, error in
            if (data != nil && error == nil) {
                let image = UIImage(data: data!, scale: scale)
                dispatch_async(dispatch_get_main_queue()) {
                    self.setImage(image)
                }
            }
        }.resume()
        return self
    }
}
Use it like this
self.imageView.setImageWithUrl(image_url, scale: 2.0)
Here is the extension
import WatchKit

public extension WKInterfaceImage {
    public func setImageWithUrl(url: String) -> WKInterfaceImage? {
        dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0)) {
            let url: NSURL = NSURL(string: url)!
            var data: NSData = NSData(contentsOfURL: url)!
            var placeholder = UIImage(data: data)!
            dispatch_async(dispatch_get_main_queue()) {
                self.setImage(placeholder)
            }
        }
        return self
    }
}
Use it like this
self.imageView.setImageWithUrl(image_url)
I think this solution is good because it can keep your application from lagging when you're trying to load images from the web.
You can make a new function like this:
func loadImage(url: String, forImageView: WKInterfaceImage) {
    // load image
    let image_url: String = url
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0)) {
        let url: NSURL = NSURL(string: image_url)!
        var data: NSData = NSData(contentsOfURL: url)!
        var placeholder = UIImage(data: data)!
        // update ui
        dispatch_async(dispatch_get_main_queue()) {
            forImageView.setImage(placeholder)
        }
    }
}
After that, anywhere you want to load an image from a URL string, you can use it like this:
loadImage("http://...", forImageView: self.myImageView)
Hope this helps.
I think with this solution you can store the image in the cache and also display it from the cache, so you can call this function and use it.
func loadImage(url: String, forImageView: WKInterfaceImage) {
    forImageView.setImageNamed("placeholder")
    let image_url: String = url
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0)) {
        let url: NSURL = NSURL(string: image_url)!
        print(url)
        // if image is already stored in cache
        if WKInterfaceDevice.currentDevice().cachedImages[image_url] != nil {
            dispatch_async(dispatch_get_main_queue()) {
                forImageView.setImageNamed(image_url)
            }
        } else {
            if let data = NSData(contentsOfURL: url) {
                // load image
                let image = UIImage(data: data)!
                // store image in cache
                WKInterfaceDevice.currentDevice().addCachedImage(image, name: image_url)
                dispatch_async(dispatch_get_main_queue()) {
                    forImageView.setImage(image)
                }
            }
        }
    }
}
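Use it like this (the URL is just an example):

    loadImage("http://placehold.it/350x150", forImageView: self.myImageView)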
Just had the same task; the answers here helped me, but I needed to make some modifications. So I wanted to share the updated version (without any forced unwraps) of the common answers here (should work with Swift 4.2):
public extension WKInterfaceImage {
    public func setBackgroundImage(url: String) {
        let asyncQueue = DispatchQueue(label: "backgroundImage")
        asyncQueue.async {
            do {
                if let url = URL(string: url) {
                    let data = try Data(contentsOf: url)
                    if let placeholder = UIImage(data: data) {
                        // hop back to the main queue for the UI update
                        DispatchQueue.main.async {
                            self.setImage(placeholder)
                        }
                    }
                }
            } catch let error {
                print("Could not set backgroundImage for WKInterfaceImage: \(error.localizedDescription)")
            }
        }
    }
}
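Use it like this (the URL is just an example):

    self.imageView.setBackgroundImage(url: "https://i.imgur.com/UZbLC0Q.jpg")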
public extension WKInterfaceImage {
    public func setImageWithUrl(url: String, scale: CGFloat = 1.0) -> WKInterfaceImage? {
        URLSession.shared.dataTask(with: NSURL(string: url)! as URL) { data, response, error in
            if (data != nil && error == nil) {
                let image = UIImage(data: data!, scale: scale)
                DispatchQueue.main.async {
                    self.setImage(image)
                }
            }
        }.resume()
        return self
    }
}
if let url = NSURL(string: "http://google.net/img/upload/photo2.png") {
    if let data = NSData(contentsOfURL: url) {
        imageWK.setImage(UIImage(data: data))
    }
}
Try this code.
Don't forget to add NSAppTransportSecurity in your plist.
Swift 4.2
Using URLSession, proper GCD, and @discardableResult to silence the "Result of call to '...' is unused" warning.
plist
App Transport Security Settings
Allow Arbitrary Loads - YES
You can set a fixed image size of 100x100 in the storyboard if you like.
let url = "https://i.imgur.com/UZbLC0Q.jpg"

public extension WKInterfaceImage {
    @discardableResult public func setImageWithUrl(url: String, scale: CGFloat = 1.0) -> WKInterfaceImage? {
        URLSession.shared.dataTask(with: NSURL(string: url)! as URL) { data, response, error in
            if (data != nil && error == nil) {
                let image = UIImage(data: data!, scale: scale)
                DispatchQueue.main.async {
                    self.setImage(image)
                }
            }
        }.resume()
        return self
    }
}
call
row.image.setImageWithUrl(url: url, scale: 1.0)