I want to add live channel streaming to my tvOS app. The plan was to use Apple's HLS protocol.
I tried Apple's HLS code example from:
https://developer.apple.com/videos/play/wwdc2016/504/
I added import AVFoundation and then added this code:
func setupAssetDownload() {
    let hlsAsset = AVURLAsset(url: assetURL)
    let backgroundConfiguration = URLSessionConfiguration.background(
        withIdentifier: "assetDownloadConfigurationIdentifier")
    let assetURLSession = AVAssetDownloadURLSession(configuration: backgroundConfiguration,
                                                    assetDownloadDelegate: self,
                                                    delegateQueue: OperationQueue.main)
    // Download a movie at 2 Mbps
    let assetDownloadTask = assetURLSession.makeAssetDownloadTask(asset: hlsAsset,
                                                                  assetTitle: "My Movie",
                                                                  assetArtworkData: nil,
                                                                  options: [AVAssetDownloadTaskMinimumRequiredMediaBitrateKey: 2000000])!
    assetDownloadTask.resume()
}
Xcode gave me the error: 'AVAssetDownloadURLSession' is unavailable.
When I try the same code in an iOS project, it recognizes the AVAssetDownloadURLSession class.
According to Apple's documentation, AVFoundation supports HLS on tvOS.
What might be the issue?
HLS playback is indeed supported in tvOS 10, for example when loading a stream with AVPlayer(url: URL(string: "https://example.com/123.m3u8")!).
Unfortunately, AVAssetDownloadURLSession (offline HLS download) is not (yet) available on tvOS. The Apple documentation mentioning HLS support is referring to playback as described above.
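For live channel streaming on tvOS, plain AVPlayer playback of the HLS playlist is all you need; here is a minimal sketch (the stream URL is a placeholder):
import AVKit
import AVFoundation
import UIKit

final class LiveChannelViewController: UIViewController {
    // Placeholder URL; substitute your own live HLS playlist.
    private let streamURL = URL(string: "https://example.com/live/channel1.m3u8")!

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)

        // AVPlayer handles HLS (including live playlists) natively on tvOS.
        let player = AVPlayer(url: streamURL)

        // AVPlayerViewController provides the standard tvOS playback UI.
        let playerViewController = AVPlayerViewController()
        playerViewController.player = player

        present(playerViewController, animated: true) {
            player.play()
        }
    }
}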
I want to be able to record audio and play back positional audio at the same time.
To do this I need to use the .playAndRecord audio session category, and simultaneous recording and playback works. However, with this category the audio is played without being positional (i.e. it's not spatial) when using Bluetooth headphones. It works as expected with wired headphones.
If I set the audio session category to .playback, the audio played is correctly positional for both wired and Bluetooth headphones, however I'm not able to record at the same time.
I've tried various audio session categories/options but have had no luck.
import AVFoundation
import Combine // for ObservableObject

class PlayerRecorder: ObservableObject {
    let engine = AVAudioEngine()
    let mixer = AVAudioEnvironmentNode()

    init() {
        let audioSession = AVAudioSession.sharedInstance()

        /*
         Using .playAndRecord, both recording and playback work, however
         the audio that is played is NOT positional. .allowBluetooth is needed
         so that Bluetooth headphones can be used.
         */
        try! audioSession.setCategory(.playAndRecord, mode: .default, options: .allowBluetooth)

        /*
         Using .playback, the positional audio DOES work, however we are not able to record.
         */
        // try! audioSession.setCategory(.playback)

        self.engine.attach(self.mixer)
        let stereoFormat = AVAudioFormat(standardFormatWithSampleRate: self.engine.outputNode.outputFormat(forBus: 0).sampleRate, channels: 2)
        self.engine.connect(self.mixer, to: self.engine.outputNode, format: stereoFormat)
        self.engine.prepare()
        try! self.engine.start()
    }

    func play() {
        let audioPlayer = AVAudioPlayerNode()
        self.engine.attach(audioPlayer)

        let monoFormat = AVAudioFormat(standardFormatWithSampleRate: self.engine.outputNode.outputFormat(forBus: 0).sampleRate, channels: 1)
        self.engine.connect(audioPlayer, to: self.mixer, format: monoFormat)

        // This file has to be in mono
        let url = Bundle.main.url(forResource: "your-mono-audio-file.mp3", withExtension: nil)
        let f = try! AVAudioFile(forReading: url!)

        audioPlayer.scheduleFile(f, at: nil, completionHandler: nil)
        audioPlayer.renderingAlgorithm = .HRTFHQ
        audioPlayer.position = AVAudio3DPoint(x: 20.0, y: 5.0, z: 0.0)
        audioPlayer.play()
    }
}
I want to be able to record audio and play back positional audio at the same time.
This is not possible over standard Bluetooth if you're both playing to and recording from the Bluetooth headset. The option allowBluetooth does not mean "allow Bluetooth." It means "prefer HFP if available." (It's the worst-named constant I know in AVAudioSession.) HFP is a low-bandwidth bidirectional audio protocol designed for phone calls.
If you switch to the high-bandwidth audio protocol, A2DP, you'll find it's unidirectional, and so you cannot record from the microphone.
There is no widely deployed Bluetooth protocol that gives you both high-quality audio and access to a microphone. If you control the firmware, you can develop your own proprietary microphone audio stream over BLE (or iAP2 if it's an MFi device). But otherwise, there isn't a current solution.
I keep hoping that LE Audio (LEA) will fix this, but I can't find any hint that it will. I also had hoped aptX might fix it (even though iPhones don't support it), but no luck there, either. I'm not certain why this use case isn't being worked on by the Bluetooth committee and vendors, but as best I know, nothing is on the horizon.
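If recording from the iPhone's built-in microphone (rather than the headset microphone) is acceptable for your use case, one compromise worth testing is to keep .playAndRecord but add the .allowBluetoothA2DP option, so output stays on the high-bandwidth A2DP link while input falls back to the device mic. A minimal sketch; whether spatialization then behaves the way you need is something to verify on your hardware:
import AVFoundation

// Sketch: high-quality A2DP output to Bluetooth headphones, recording from the built-in mic.
func configureSessionForA2DPPlayback() throws {
    let session = AVAudioSession.sharedInstance()

    // .allowBluetoothA2DP keeps output on the one-way, high-bandwidth A2DP link
    // instead of dropping the whole session down to HFP.
    try session.setCategory(.playAndRecord, mode: .default, options: [.allowBluetoothA2DP])
    try session.setActive(true)

    // Input will not be the headset mic (unreachable over A2DP); inspect the route to confirm.
    for input in session.currentRoute.inputs {
        print("Recording from:", input.portName)
    }
}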
When using AVCaptureSession to connect an AVCaptureDevice, the device gets locked by the application and cannot be released. The only way the device is truly released is by restarting the app.
We are using AVCaptureSession for previewing the camera and for audio level meters. But once we start the actual capture we might need to switch to an alternative capture SDK (in this case the DeckLink SDK). However, the device remains locked by AVFoundation and we can't seem to free it in any way. It all goes sideways as soon as we call AVCaptureSession.addInput on the AVCaptureSession with an input created from the AVCaptureDevice, and simply iterating the inputs and using AVCaptureSession.removeInput does not seem to work.
We set up the session like this:
do {
    try self.selectedVideoDevice.lockForConfiguration()
    try self.selectedAudioDevice?.lockForConfiguration()

    self.cameraSession = AVCaptureSession()
    self.cameraSession?.beginConfiguration()
    self.cameraSession?.sessionPreset = AVCaptureSession.Preset.high

    // Add some outputs... not relevant for the issue at hand?!

    // Add audio input
    if self.selectedAudioDevice != nil {
        let deviceInputAudio = try AVCaptureDeviceInput(device: self.selectedAudioDevice!)
        if self.cameraSession?.canAddInput(deviceInputAudio) ?? false {
            self.cameraSession?.addInput(deviceInputAudio)
        }
    }

    // Add video input
    let deviceInputVideo = try AVCaptureDeviceInput(device: self.selectedVideoDevice)
    if self.cameraSession?.canAddInput(deviceInputVideo) ?? false {
        self.cameraSession?.addInput(deviceInputVideo)
    }

    self.cameraSession?.commitConfiguration()
    self.cameraSession?.startRunning()

    self.selectedVideoDevice.unlockForConfiguration()
    self.selectedAudioDevice?.unlockForConfiguration()
} catch {
}
And try to release using something like this... one of the many tries...
self.cameraSession?.stopRunning()

for output in self.cameraSession?.outputs ?? [] {
    self.cameraSession?.removeOutput(output)
}

for input in self.cameraSession?.inputs ?? [] {
    self.cameraSession?.removeInput(input)
}

self.cameraSession = nil
However, we can't get the device to be recognized by the DeckLink SDK after using it in AVFoundation.
Any ideas would be great, as cleaning up or setting the variables to nil doesn't seem to do anything...
We chose to implement the Desktop Video SDK from Blackmagic and do all capturing from Blackmagic devices using that. This also solves other issues when capturing with the Blackmagic Mini Recorder (for example audio sync): AVFoundation does work with Blackmagic hardware, but not really well, and Blackmagic officially never answered the question "Do you support AVFoundation?". So in order to make it work I would recommend the Desktop Video SDK, which can be downloaded from their site under Support.
https://www.blackmagicdesign.com/support/
Also make sure you never load the video device into your AVFoundation workflow: AVFoundation will grab it and hold on to it. So first check whether it's a Blackmagic device, and only continue to AVFoundation if it is not; a sketch of such a check follows.
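A rough sketch of that check on macOS; the assumption that Blackmagic devices can be identified by their localized name is mine, so adjust the matching to your hardware:
import AVFoundation

// Returns true when the device looks like a Blackmagic device and should be
// handed to the Desktop Video SDK instead of AVFoundation.
func isBlackmagicDevice(_ device: AVCaptureDevice) -> Bool {
    return device.localizedName.localizedCaseInsensitiveContains("blackmagic")
}

func makeCaptureSession(for device: AVCaptureDevice) throws -> AVCaptureSession? {
    guard !isBlackmagicDevice(device) else {
        // Never add the device to an AVCaptureSession; route it to DeckLink instead.
        return nil
    }

    let session = AVCaptureSession()
    session.beginConfiguration()

    let input = try AVCaptureDeviceInput(device: device)
    if session.canAddInput(input) {
        session.addInput(input)
    }

    session.commitConfiguration()
    return session
}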
Okay so I'm looking to play film trailers in my app. The user will press a button, and then it plays the video. I have added the import AVKit and import AVFoundation lines to my file. This is the code I have so far for making the video play:
@IBAction func playTrailerPressed(sender: AnyObject) {
    let videoURL = NSURL(string: "https://youtu.be/d88APYIGkjk")
    let player = AVPlayer(URL: videoURL!)
    let playerViewController = AVPlayerViewController()
    playerViewController.player = player
    self.presentViewController(playerViewController, animated: true) {
        playerViewController.player!.play()
    }
}
This seems to launch an AVPlayerViewController, but it doesn't play the video from YouTube; playback never starts.
I have tried both the sharing and the embedding link from YouTube, but neither works. If I use a link that has the video file name at the end, it plays fine, for example "https://clips.vorwaerts-gmbh.de/big_buck_bunny.mp4", so I know the code works.
Does anyone know if it's possible to use a YouTube video this way? (I have also tried trailers from IMDb and Apple, but it's the same.)
Thanks for your help!
AVPlayer only plays movie files, and YouTube videos aren't directly exposed as movie files at their URL. It looks like the preferred way to handle YouTube videos is to embed a web view into your app. See this page for information from Google on how to do that.
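For illustration, here is a minimal sketch of the web-view approach using WKWebView and YouTube's embed URL format (the video ID is the one from the question; sizing and error handling are omitted):
import UIKit
import WebKit

final class TrailerViewController: UIViewController {
    private let webView = WKWebView()

    override func viewDidLoad() {
        super.viewDidLoad()

        webView.frame = view.bounds
        webView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        view.addSubview(webView)

        // Use YouTube's embed URL format; the share URL (youtu.be/...) won't load here either.
        if let embedURL = URL(string: "https://www.youtube.com/embed/d88APYIGkjk") {
            webView.load(URLRequest(url: embedURL))
        }
    }
}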
I've tried https://github.com/ap4y/OrigamiEngine but I can't get it to play FLAC files.
Here's a sample:
var url = "/Users/Simun/Downloads/track.flac"
url = url.stringByAddingPercentEncodingWithAllowedCharacters(NSCharacterSet.URLQueryAllowedCharacterSet())!
let trackUrl = NSURL(string:"file://localhost/\(url)")
player.playUrl(trackUrl)
I am not aware of other libraries for playing FLAC in Swift, but https://github.com/ap4y/OrigamiEngine does do the job. The most difficult part for me was simply getting this Objective-C library included in a Swift Xcode project using CocoaPods (importing the headers and linking). Once that is accomplished, the following code should play a file:
let origami = ORGMEngine()
origami.play(URL(fileURLWithPath: "/Users/user/xyz.flac"))
I have succeeded in previewing .flac files using two native approaches (a sketch of the first follows):
AVKit's AVPlayerViewController
QuickLook
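A minimal sketch of the AVPlayerViewController route (AVFoundation can decode FLAC on iOS 11 / macOS High Sierra and later; the bundled file name is a placeholder):
import AVKit
import AVFoundation
import UIKit

func presentFLACPreview(from presenter: UIViewController) {
    // Placeholder: point this at your own .flac file.
    guard let flacURL = Bundle.main.url(forResource: "track", withExtension: "flac") else {
        print("FLAC file not found in bundle")
        return
    }

    let player = AVPlayer(url: flacURL)
    let playerViewController = AVPlayerViewController()
    playerViewController.player = player

    presenter.present(playerViewController, animated: true) {
        player.play()
    }
}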
Can the Apple Watch use AVFoundation? More specifically, can AVAudioPlayer and AVAudioRecorder work?
I am trying to make an app that lets you record a sound on the Apple Watch and then play it back using AVAudioPlayer. Thanks
UPDATE 11/28/15
This answer applies to WatchKit and watchOS 1.x. Though I have not tested it myself, watchOS 2.x offers different support, including hardware access. I can confirm that as of Xcode 7.x, the compiler behaves correctly and WatchKit 1.x extensions with calls to AVFoundation won't build.
For watchOS 1.x and Xcode up to 6.x
Despite the fact that AVAudioRecorder and AVAudioPlayer both work in the Apple Watch simulator, they don't actually work on an actual device! I had opened a bug with Apple back on 5/5/15 (once I got to test an app I had written on my actual watch). My app (a wrist audio recorder) indeed worked beautifully in the simulator. Yet on the actual watch, it would install and run, but the AVAudioRecorder "record" message would simply never toggle into "recording". Interestingly, the code does not throw any exceptions anywhere. It simply does not record!
I received a reply to my Apple bug today, 5/28/15, saying that "there are no plans to address this" based on "AVFoundation isn't supported on the watch." No word on whether or not the simulator will be updated so that AVFoundation also fails to work there.
So, for now, the Apple Watch is limited to controlling recording "on the phone" via watch-extension-to-phone messaging, which is supported in Apple Watch extensions; a sketch of that round trip follows.
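For reference, the watch-to-phone round trip in WatchKit 1.x went through WKInterfaceController.openParentApplication; a rough sketch in the Swift syntax of that era (the "action"/"startRecording" keys and the phone-side handling are assumptions of mine):
import WatchKit

// Watch extension side: hand the work to the parent iPhone app, which receives the
// dictionary in application(_:handleWatchKitExtensionRequest:reply:) and can run
// AVAudioRecorder there.
func requestRecordingOnPhone() {
    WKInterfaceController.openParentApplication(["action": "startRecording"]) { reply, error in
        if error != nil {
            print("Parent app call failed: \(error)")
        } else {
            print("Parent app replied: \(reply)")
        }
    }
}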
Try using WKAudioFilePlayerItem?
To use this class to play audio, you need to declare 3 variables:
var audioFile: WKAudioFileAsset!
var audioItem: WKAudioFilePlayerItem!
var audioPlayer: WKAudioFilePlayer!
They have different roles inside the program. audioFile is used to wrap the file's NSURL. For example, if you have a file called "bicycle.mp3", you can get its path like this:
let soundPath = NSBundle.mainBundle().pathForResource("bicycle", ofType: "mp3")
print("load file")
Once you have the soundPath, you turn it into an NSURL and wrap it in the audioFile:
let soundPathURL = NSURL.fileURLWithPath(soundPath!)
audioFile = WKAudioFileAsset.init(URL: soundPathURL)
When you have an audioFile, you put it inside the audio player item:
audioItem = WKAudioFilePlayerItem.init(asset: audioFile)
When you have the audioItem, you put it inside the audioPlayer:
audioPlayer = WKAudioFilePlayer(playerItem: audioItem)
Finally, you can call audioPlayer.play() wherever you want to play the sound.
Enjoy!
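Putting the pieces together, a minimal sketch of the whole flow in the same era syntax (assuming "bicycle.mp3" is bundled with the watch extension):
import WatchKit

var audioFile: WKAudioFileAsset!
var audioItem: WKAudioFilePlayerItem!
var audioPlayer: WKAudioFilePlayer!

func playBicycleSound() {
    // Locate the bundled file and build the asset -> item -> player chain.
    let soundPath = NSBundle.mainBundle().pathForResource("bicycle", ofType: "mp3")
    let soundPathURL = NSURL.fileURLWithPath(soundPath!)

    audioFile = WKAudioFileAsset(URL: soundPathURL)
    audioItem = WKAudioFilePlayerItem(asset: audioFile)
    audioPlayer = WKAudioFilePlayer(playerItem: audioItem)

    audioPlayer.play()
}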
Short answer? No. WatchKit doesn't currently offer access to any of the hardware in the Watch.
I finally ended up with this:
@IBAction func playButtonTapped() {
    let options = [WKMediaPlayerControllerOptionsAutoplayKey: "true"]
    let filePath = NSBundle.mainBundle().pathForResource("1button", ofType: "wav")!
    let fileUrl = NSURL.fileURLWithPath(filePath)

    presentMediaPlayerControllerWithURL(fileUrl, options: options,
        completion: { didPlayToEnd, endTime, error in
            if let err = error {
                print(err.description)
            }
        })
}