Can the Apple Watch use AVFoundation? More specifically, can AVAudioPlayer and AVAudioRecorder work?
I am trying to make an app that records a sound on the Apple Watch and plays it back using the audio player. Thanks
UPDATE 11/28/15
This answer applies to WatchKit and watchOS 1.x. Though I have not tested it myself, watchOS 2.x offers different support, including hardware access. I can confirm that as of Xcode 7.x, the compiler behaves correctly, and WatchKit 1.x extensions that call AVFoundation won't build.
For watchOS 1.x and Xcode up to 6.x
Although AVAudioRecorder and AVAudioPlayer both work in the Apple Watch simulator, neither works on an actual device! I opened a bug with Apple back on 5/5/15 (once I got to test an app I had written on my actual watch). My app (a wrist audio recorder) indeed worked beautifully in the simulator. Yet on the actual watch, it would install and run, but the AVAudioRecorder "record" message would simply never toggle into "recording". Interestingly, the code does not throw any exceptions anywhere. It simply does not record!
I received a reply to my Apple bug today, 5/28/15, saying that "there are no plans to address this" because "AVFoundation isn't supported on the watch." No word on whether the simulator will be updated so that AVFoundation fails there as well.
So, for now, the Apple Watch is limited to controlling recording "on the phone" via watch-extension-to-phone messaging, which is supported in Apple Watch extensions.
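A minimal sketch of that messaging approach (WatchKit 1.x era Swift; the "action"/"startRecording" keys are hypothetical, and the parent iPhone app would have to handle them in application(_:handleWatchKitExtensionRequest:reply:)):

// In the watch extension: ask the parent iPhone app to start recording.
// "action"/"startRecording" are hypothetical keys; define your own protocol.
WKInterfaceController.openParentApplication(["action": "startRecording"]) { reply, error in
    if error != nil {
        print("Parent app request failed: \(error)")
    } else {
        print("Parent app replied: \(reply)")
    }
}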
Try using WKAudioFilePlayerItem?
To use this class to play audio, you need to declare three variables:
var audioFile: WKAudioFileAsset!
var audioItem: WKAudioFilePlayerItem!
var audioPlayer: WKAudioFilePlayer!
Each has a different role in the program. audioFile holds the asset created from an NSURL. For example, if you have a file called "bicycle.mp3", you can get its path like this:
let soundPath = NSBundle.mainBundle().pathForResource("bicycle", ofType: "mp3")
print("load file")
When you have the sound path, you turn it into an NSURL and wrap it in the audioFile:
let soundPathURL = NSURL.fileURLWithPath(soundPath!)
audioFile = WKAudioFileAsset.init(URL: soundPathURL)
When you have the audioFile, you put it inside the audioItem:
audioItem = WKAudioFilePlayerItem.init(asset: audioFile)
When you have the audioItem, you put it inside the audioPlayer:
audioPlayer = WKAudioFilePlayer(playerItem: audioItem)
Finally, you can call audioPlayer.play() wherever you want to play the sound.
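Putting the steps together, a minimal sketch (Swift 2 era; note WKAudioFilePlayer requires watchOS 2, and in practice you may need to wait until audioPlayer.status is ReadyToPlay before calling play()):

// Assemble asset -> player item -> player, then play ("bicycle.mp3" must be bundled)
if let soundPath = NSBundle.mainBundle().pathForResource("bicycle", ofType: "mp3") {
    let soundPathURL = NSURL.fileURLWithPath(soundPath)
    audioFile = WKAudioFileAsset.init(URL: soundPathURL)
    audioItem = WKAudioFilePlayerItem.init(asset: audioFile)
    audioPlayer = WKAudioFilePlayer(playerItem: audioItem)
    audioPlayer.play()
} else {
    print("cannot find file")
}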
Enjoy!
Short answer? No. WatchKit doesn't currently offer access to any of the hardware in the Watch.
I finally ended up with this:
@IBAction func playButtonTapped() {
    let options = [WKMediaPlayerControllerOptionsAutoplayKey: "true"]
    let filePath = NSBundle.mainBundle().pathForResource("1button", ofType: "wav")!
    let fileUrl = NSURL.fileURLWithPath(filePath)
    presentMediaPlayerControllerWithURL(fileUrl, options: options,
        completion: { didPlayToEnd, endTime, error in
            if let err = error {
                print(err.description)
            }
        })
}
I have this in an older version of my app:
var recSession: AVAudioSession!
recSession = AVAudioSession.sharedInstance()
try recSession.setCategory(AVAudioSession.Category.playAndRecord)
How should I translate the last line into Swift 4.2?
setCategory is deprecated, but what is the alternative?
Try this:
try recSession.setCategory(.playAndRecord, mode: .default)
It seems Apple recommends setting the category and mode at the same time.
Note
Instead of setting your category and mode properties independently, it's recommended that you set them at the same time using the setCategory:mode:options:error: method.
AVAudioSession's mode defaults to AVAudioSession.Mode.default, so if your app does not change it, the code above should work.
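For completeness, a minimal sketch of the full Swift 4.2 setup, wrapped in do/catch since both calls can throw:

import AVFoundation

let recSession = AVAudioSession.sharedInstance()
do {
    // Set category and mode in one call, as Apple recommends
    try recSession.setCategory(.playAndRecord, mode: .default)
    try recSession.setActive(true)
} catch {
    print("Failed to configure audio session: \(error)")
}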
I tried to play music in my SceneKit game, but it crashes for no apparent reason.
So I tried adding these 3 lines of code to the standard Apple game template (at the end of viewDidLoad):
let music = SCNAudioSource(fileNamed: "Music.mp3")
let action = SCNAction.playAudio(music!, waitForCompletion: false)
ship.runAction(action)
and, at runtime, Xcode shows me this message:
com.apple.scenekit.scnview-renderer (8): breakpoint 1.2
Where is my mistake?
I tried to compile and run my code on two different Macs: on my MacBook Pro Retina it runs fine with sound; on my 21.5-inch iMac it crashes.
So my code is correct; I will probably file a radar with Apple.
Note that both Macs are running the macOS Mojave beta (same version), and the Xcode used is also the beta.
You have to put the audio file into a SceneKit asset catalog (a folder with the .scnassets extension), like the art.scnassets catalog in the template. Then play the music like this:
if let source = SCNAudioSource(fileNamed: "art.scnassets/Music.mp3") {
    let action = SCNAction.playAudio(source, waitForCompletion: false)
    ship.runAction(action)
} else {
    print("cannot find file")
}
I think you are looking for something like this (note that SKAction.playSoundFileNamed is SpriteKit API; an SCNNode like ship needs SceneKit's SCNAction equivalent):
if let source = SCNAudioSource(fileNamed: "Music.mp3") {
    ship.runAction(SCNAction.playAudio(source, waitForCompletion: false))
}
When using AVCaptureSession to connect an AVCaptureDevice, the device gets locked by the application and cannot be released. The only way the device is truly released is by restarting the app.
We are using AVCaptureSession to preview the camera and drive audio level meters. But once we start the actual capture we might need to switch to an alternative capture SDK (in this case the DeckLink SDK). However, the device remains locked by AVFoundation and we can't seem to free it in any way... It all goes sideways as soon as we call AVCaptureSession.addInput with the input created from the AVCaptureDevice. And simply iterating the inputs and calling AVCaptureSession.removeInput does not seem to work.
We setup the session like this:
do {
    try self.selectedVideoDevice.lockForConfiguration()
    try self.selectedAudioDevice?.lockForConfiguration()
    self.cameraSession = AVCaptureSession()
    self.cameraSession?.beginConfiguration()
    self.cameraSession?.sessionPreset = AVCaptureSession.Preset.high
    // Add some outputs... not relevant for the issue at hand?!
    // Add audio input
    if self.selectedAudioDevice != nil {
        let deviceInputAudio = try AVCaptureDeviceInput(device: self.selectedAudioDevice!)
        if self.cameraSession?.canAddInput(deviceInputAudio) ?? false {
            self.cameraSession?.addInput(deviceInputAudio)
        }
    }
    // Add video input
    let deviceInputVideo = try AVCaptureDeviceInput(device: self.selectedVideoDevice)
    if self.cameraSession?.canAddInput(deviceInputVideo) ?? false {
        self.cameraSession?.addInput(deviceInputVideo)
    }
    self.cameraSession?.commitConfiguration()
    self.cameraSession?.startRunning()
    self.selectedVideoDevice.unlockForConfiguration()
    self.selectedAudioDevice?.unlockForConfiguration()
} catch {
    // Errors were silently swallowed here; log them at minimum
    print("Capture session setup failed: \(error)")
}
And we try to release it using something like this (one of many attempts):
self.cameraSession?.stopRunning()
for output in self.cameraSession?.outputs ?? [] {
    self.cameraSession?.removeOutput(output)
}
for input in self.cameraSession?.inputs ?? [] {
    self.cameraSession?.removeInput(input)
}
self.cameraSession = nil
However, we can't get the device to be recognized by the DeckLink SDK after using it in AVFoundation.
Any ideas would be great, as cleaning up or setting the variables to nil doesn't seem to do anything...
We chose to implement Blackmagic's Desktop Video SDK and do all capturing from Blackmagic devices with that. This also solves other issues when capturing with the Blackmagic Mini Recorder (for example, audio sync); somehow AVFoundation does work with Blackmagic devices, but not really well. And Blackmagic never officially answered the question "Do you support AVFoundation?". So in order to make it work I would recommend the Desktop Video SDK, which can be downloaded from their site under Support.
https://www.blackmagicdesign.com/support/
Also make sure you never load the video device into your AVFoundation workflow: AVFoundation will get hold of it and not let go. So first check whether the device is a Blackmagic one, and only hand it to AVFoundation if not.
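A minimal sketch of such a guard, assuming a Blackmagic device can be identified by its localized name (the "blackmagic" substring check is an assumption; adjust it to however your devices report themselves):

import AVFoundation

func attach(device: AVCaptureDevice, to session: AVCaptureSession) throws {
    // Assumption: Blackmagic devices contain "Blackmagic" in their localized name
    guard !device.localizedName.lowercased().contains("blackmagic") else {
        // Hand this device to the Desktop Video SDK capture path instead;
        // never create an AVCaptureDeviceInput for it
        return
    }
    let input = try AVCaptureDeviceInput(device: device)
    if session.canAddInput(input) {
        session.addInput(input)
    }
}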
I want to add live channel streaming to my tvOS app. The plan was to use Apple's HLS protocol.
I tried Apple's HLS code example from:
https://developer.apple.com/videos/play/wwdc2016/504/
I added import AVFoundation and then this code:
func setupAssetDownload() {
    let hlsAsset = AVURLAsset(url: assetURL)
    let backgroundConfiguration = URLSessionConfiguration.background(
        withIdentifier: "assetDownloadConfigurationIdentifier")
    // Note: OperationQueue.main is a property, not a function
    let assetURLSession = AVAssetDownloadURLSession(configuration: backgroundConfiguration,
        assetDownloadDelegate: self, delegateQueue: OperationQueue.main)
    // Download a movie at 2 Mbps
    let assetDownloadTask = assetURLSession.makeAssetDownloadTask(asset: hlsAsset, assetTitle: "My Movie",
        assetArtworkData: nil, options: [AVAssetDownloadTaskMinimumRequiredMediaBitrateKey: 2000000])!
    assetDownloadTask.resume()
}
Xcode gave me 'AVAssetDownloadURLSession' is unavailable.
When I try the same code in an iOS project, it recognizes the AVAssetDownloadURLSession class.
According to the Apple documentation, AVFoundation supports HLS on tvOS.
What might be the issue?
Currently in tvOS 10, HLS playback is indeed supported, for example when loading a stream with AVPlayer(url: URL(string: "https://example.com/123.m3u8")!).
Unfortunately, AVAssetDownloadURLSession is not (yet) available there. The Apple documentation mentioning HLS support is referring to playback, as above.
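So on tvOS you can still stream the playlist directly; a minimal sketch (the URL is a placeholder):

import AVKit
import AVFoundation

// Streaming HLS playback works on tvOS; downloading for offline use does not.
let url = URL(string: "https://example.com/123.m3u8")!   // placeholder stream URL
let player = AVPlayer(url: url)
let controller = AVPlayerViewController()
controller.player = player
// From a presenting view controller:
// present(controller, animated: true) { player.play() }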
I've tried https://github.com/ap4y/OrigamiEngine, but I can't get it to play FLAC files.
Here's a sample:
var url = "/Users/Simun/Downloads/track.flac"
url = url.stringByAddingPercentEncodingWithAllowedCharacters(NSCharacterSet.URLQueryAllowedCharacterSet())!
let trackUrl = NSURL(string:"file://localhost/\(url)")
player.playUrl(trackUrl)
I am not aware of other libraries for playing FLAC in Swift, but https://github.com/ap4y/OrigamiEngine does do the job. The most difficult part for me was simply getting this Objective-C library included in a Swift Xcode project using CocoaPods (importing the headers and linking). Once that is accomplished, the following code should play a file:
let origami = ORGMEngine()
origami.play(URL(fileURLWithPath: "/Users/user/xyz.flac"))
I have succeeded in previewing .flac files using two native approaches (a sketch of the first is below):
AVKit's AVPlayerViewController
QuickLook
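For example, a minimal sketch of the AVPlayerViewController route (AVFoundation decodes FLAC natively on recent OS versions; the file path is a placeholder):

import AVKit
import AVFoundation

// Assumption: a local FLAC file the app can read (FLAC is supported natively on iOS 11+ / macOS 10.13+)
let player = AVPlayer(url: URL(fileURLWithPath: "/Users/Simun/Downloads/track.flac"))
let controller = AVPlayerViewController()
controller.player = player
// From a presenting UIViewController:
// present(controller, animated: true) { player.play() }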