I tried to play music in my SceneKit game, but it crashes with no obvious reason.
So I tried adding these three lines of code to the standard Apple game template (at the end of viewDidLoad):
let music = SCNAudioSource(fileNamed: "Music.mp3")
let action = SCNAction.playAudio(music!, waitForCompletion: false)
ship.runAction(action)
and, at runtime, Xcode shows me this message:
com.apple.scenekit.scnview-renderer (8): breakpoint 1.2
Where is my mistake?
I tried to compile and run my code on two different Macs: on my MacBook Pro Retina it runs fine with sound; on my 21.5-inch iMac it crashes.
So my code is correct; I will probably file a radar with Apple.
Note that both Macs run the same macOS Mojave beta, and the Xcode used is also a beta.
You have to put the audio file into an .scnassets catalog, like the art.scnassets catalog in the template. Then play the music like this:
if let source = SCNAudioSource(fileNamed: "art.scnassets/Music.mp3") {
    let action = SCNAction.playAudio(source, waitForCompletion: false)
    ship.runAction(action)
} else {
    print("cannot find file")
}
I think you are looking for something like this (note that playSoundFileNamed is a SpriteKit API, so this works on an SKNode; in pure SceneKit, use SCNAction.playAudio as in the other answer):
ship.runAction(SKAction.playSoundFileNamed("Music.mp3", waitForCompletion: false))
I try to start a screen recording with RPScreenRecorder, and I get the following error:
Recording interrupted by multitasking and content resizing
func startRecording() {
    let recorder = RPScreenRecorder.shared()
    recorder.startRecording(handler: { (error) in
        if let unwrappedError = error {
            print(unwrappedError.localizedDescription)
        }
    })
}
Before iOS 12.0 everything worked fine; since the update I get the error above.
My app has been rejected from the App Store for the same reason. So far the only workaround is to reboot the device.
I had a similar problem, and here is how I solved it:
Go to your project, select the target, open Capabilities, switch on Background Modes, and enable Audio and VoIP. It should work.
I've done a lot of research on these errors and posted the solution here.
For now my screen recording feature is bug-free, but who knows what comes with new OS updates.
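None of this fixes the OS-level bug, but a defensive start rules out the avoidable failure modes; a minimal sketch (the log messages are my own):

```swift
import ReplayKit

func startRecordingSafely() {
    let recorder = RPScreenRecorder.shared()
    // Bail out if recording is currently impossible (another app may
    // hold the session, or the device state forbids recording).
    guard recorder.isAvailable else {
        print("Screen recording is not available right now")
        return
    }
    // Starting twice also produces errors, so guard against it.
    guard !recorder.isRecording else { return }
    recorder.startRecording { error in
        if let error = error {
            print("startRecording failed: \(error.localizedDescription)")
        }
    }
}
```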
We've been rejected for the same issue several times.
But we found a scenario to reproduce it, as below.
We reported it via the Resolution Center in App Store Connect, and then we passed review.
1. Connect an iOS (12.4) device to a host running Xcode 10.3 (regardless of whether the related project is open).
2. Cold-boot the iOS device.
3. Launch the app and start recording video as soon as possible (within 30 seconds of booting).
Now, on iOS 13, we don't face this error in the above scenario.
I got this in an older version of my app:
var recSession: AVAudioSession!
recSession = AVAudioSession.sharedInstance()
try recSession.setCategory(AVAudioSession.Category.playAndRecord)
How should I translate the last line into Swift 4.2?
setCategory is deprecated, but what is the alternative?
Try this:
try recSession.setCategory(.playAndRecord, mode: .default)
It seems Apple recommends setting the category and mode at the same time.
Note
Instead of setting your category and mode properties independently,
it's recommended that you set them at the same time using the
setCategory:mode:options:error: method.
AVAudioSession's mode defaults to AVAudioSession.Mode.default, so if your app does not change it, the code above should work.
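For completeness, here is a Swift 4.2 sketch that also activates the session (the .defaultToSpeaker option is just an example, not a requirement):

```swift
import AVFoundation

let session = AVAudioSession.sharedInstance()
do {
    // Set category, mode, and options in one call, as Apple recommends.
    try session.setCategory(.playAndRecord,
                            mode: .default,
                            options: [.defaultToSpeaker])
    // Activate the session before recording or playing.
    try session.setActive(true)
} catch {
    print("Audio session setup failed: \(error)")
}
```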
When using AVCaptureSession to connect an AVCaptureDevice, the device gets locked by the application and cannot be released. The only way the device is truly released is by restarting the app.
We are using AVCaptureSession for previewing the camera and for audio level meters, but once we start the actual capture we might need to switch to an alternative capture SDK (in this case the DeckLink SDK). However, the device remains locked by AVFoundation and we can't seem to free it in any way. It all goes sideways as soon as we call AVCaptureSession.addInput with the input created from the AVCaptureDevice, and simply iterating the inputs and calling AVCaptureSession.removeInput does not seem to work.
We setup the session like this:
do {
try self.selectedVideoDevice.lockForConfiguration()
try self.selectedAudioDevice?.lockForConfiguration()
self.cameraSession = AVCaptureSession()
self.cameraSession?.beginConfiguration()
self.cameraSession?.sessionPreset = AVCaptureSession.Preset.high
// Add some outputs... not relevant for the issue at hand?!
// Add audio input
if self.selectedAudioDevice != nil {
let deviceInputAudio = try AVCaptureDeviceInput(device: self.selectedAudioDevice!)
if self.cameraSession?.canAddInput(deviceInputAudio) ?? false {
self.cameraSession?.addInput(deviceInputAudio)
}
}
// Add video input
let deviceInputVideo = try AVCaptureDeviceInput(device: self.selectedVideoDevice)
if self.cameraSession?.canAddInput(deviceInputVideo) ?? false {
self.cameraSession?.addInput(deviceInputVideo)
}
self.cameraSession?.commitConfiguration()
self.cameraSession?.startRunning()
self.selectedVideoDevice.unlockForConfiguration()
self.selectedAudioDevice?.unlockForConfiguration()
} catch {
    print("Capture session setup failed: \(error)")
}
And we try to release it using something like this (one of many attempts):
self.cameraSession?.stopRunning()
for output in self.cameraSession?.outputs ?? [] {
self.cameraSession?.removeOutput(output)
}
for input in self.cameraSession?.inputs ?? [] {
self.cameraSession?.removeInput(input)
}
self.cameraSession = nil
However, we can't get the device recognized by the DeckLink SDK after it has been used in AVFoundation.
Any ideas would be great, as cleaning up or setting the variables to nil doesn't seem to do anything.
We chose to implement the Desktop Video SDK from Blackmagic and do all captures from Blackmagic devices using that. This solves several issues when capturing with the Blackmagic Mini Recorder (for example, audio sync); somehow AVFoundation does work with Blackmagic hardware, but not really well, and Blackmagic never officially answered the question "Do you support AVFoundation?". So in order to make it work, I would recommend the Desktop Video SDK, which can be downloaded from their site under Support:
https://www.blackmagicdesign.com/support/
Also make sure you never load the video device into your AVFoundation workflow; it will get stuck there and be held. So first check whether the device is a Blackmagic one, and only continue to AVFoundation if it is not.
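As a sketch of that last point, you can filter devices before they ever enter the capture graph. Matching on the localized name is an assumption; adjust the predicate to however your hardware identifies itself:

```swift
import AVFoundation

// Decide up front which SDK owns a device. If it looks like Blackmagic
// hardware, hand it to the Desktop Video / DeckLink SDK and keep it out
// of AVFoundation entirely, so it never gets locked.
func shouldUseAVFoundation(for device: AVCaptureDevice) -> Bool {
    return !device.localizedName.localizedCaseInsensitiveContains("Blackmagic")
}
```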
Can the Apple Watch use AVFoundation? More specifically, can AVAudioPlayer and AVAudioRecorder work?
I am trying to make an app that lets you record a sound on the Apple Watch and play it back using the audio player. Thanks!
UPDATE 11/28/15
This answer applies to WatchKit and watchOS 1.x. Though I have not tested it myself, watchOS 2.x offers different support, including hardware access. I can confirm that as of Xcode 7.x, the compiler behaves correctly and WatchKit 1.x extensions with calls to AVFoundation won't build.
For watchOS 1.x and Xcode up to 6.x
Despite the fact that AVAudioRecorder and AVAudioPlayer both work in the Apple Watch Simulator, they don't actually work on an actual device! I had opened a bug with Apple back on 5/5/15 (once I got to test an app I had written on my actual watch). My app (a wrist audio recorder) indeed worked beautifully in the Simulator. Yet on the actual watch, it would install and run, but the AVAudioRecorder "record" message would simply never toggle into "recording". Interestingly, the code does not throw any exceptions anywhere. It simply does not record!
I received a reply to my Apple Bug today 5/28/15 that "there are no plans to address this" based on "AVFoundation isn't supported on the watch." No word on whether or not Simulator will be updated so that AVFoundation also fails to work.
So, for now, Apple Watch is limited to controlling recording "on the phone" via watch extension to phone messaging which is supported in Apple Watch extensions.
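On WatchKit 1.x, that phone-side control can be triggered with openParentApplication; a minimal sketch (the "command" key and its value are assumptions, to be interpreted by the iOS app delegate):

```swift
import WatchKit

// Ask the parent iPhone app to start recording on the watch's behalf.
// The "command" dictionary is an assumption; the iOS app delegate must
// handle it in application(_:handleWatchKitExtensionRequest:reply:).
WKInterfaceController.openParentApplication(["command": "startRecording"]) { reply, error in
    if let error = error {
        print("Parent app call failed: \(error.localizedDescription)")
    }
}
```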
Try using WKAudioFilePlayerItem ?
To use this class to play audio, you need to declare three variables:
var audioFile: WKAudioFileAsset!
var audioItem: WKAudioFilePlayerItem!
var audioPlayer: WKAudioFilePlayer!
They have different roles inside the program. audioFile is used to define the NSURL. For example, if you have a file called "bicycle.mp3", you can get the sound path like this:
let soundPath = NSBundle.mainBundle().pathForResource("bicycle", ofType: "mp3")
print("load file")
When you have the soundPath, you make an NSURL from it and wrap it in the audioFile:
let soundPathURL = NSURL.fileURLWithPath(soundPath!)
audioFile = WKAudioFileAsset.init(URL: soundPathURL)
When you have the audioFile, you put it inside the audioItem:
audioItem = WKAudioFilePlayerItem.init(asset: audioFile)
When you have the audioItem, you put it inside the audioPlayer:
audioPlayer = WKAudioFilePlayer(playerItem: audioItem)
Finally, you can call audioPlayer.play() wherever you want to play the sound.
Enjoy!
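Putting those steps together (same pre-Swift-3 API as above; "bicycle.mp3" is the assumed bundle resource, and the player is kept in a property so it isn't deallocated mid-playback):

```swift
import WatchKit

class PlayerController: WKInterfaceController {
    // Keep a strong reference so playback isn't cut short.
    var audioPlayer: WKAudioFilePlayer!

    func playBicycleSound() {
        guard let soundPath = NSBundle.mainBundle()
            .pathForResource("bicycle", ofType: "mp3") else {
            print("cannot find file")
            return
        }
        let soundPathURL = NSURL.fileURLWithPath(soundPath)
        let audioFile = WKAudioFileAsset(URL: soundPathURL)
        let audioItem = WKAudioFilePlayerItem(asset: audioFile)
        audioPlayer = WKAudioFilePlayer(playerItem: audioItem)
        audioPlayer.play()
    }
}
```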
Short answer? No. WatchKit doesn't currently offer access to any of the hardware in the Watch.
I finally ended up with this:
@IBAction func playButtonTapped() {
    let options = [WKMediaPlayerControllerOptionsAutoplayKey: "true"]
    let filePath = NSBundle.mainBundle().pathForResource("1button", ofType: "wav")!
    let fileUrl = NSURL.fileURLWithPath(filePath)
    presentMediaPlayerControllerWithURL(fileUrl, options: options,
        completion: { didPlayToEnd, endTime, error in
            if let err = error {
                print(err.description)
            }
        })
}
So I have a very simple call to get the path to a video file in Swift. Assuming the video is named "Video.mp4":
let path = NSBundle.mainBundle().pathForResource("Video", ofType: "mp4") // Returns nil
This returns nil without throwing an exception. The same happens with URLForResource().
The weird thing is that an image file imported the same way (all assets are in the "Copy Bundle Resources" pane) works perfectly:
let path = NSBundle.mainBundle().pathForResource("Image", ofType: "png") // Returns the correct path
Anybody have hunches on what might be causing this?
Thanks!
I had the same problem too, and I finally solved it:
1. Switch the iOS Deployment Target to iOS 8.
2. Run the app; it works!!!
3. Switch the iOS Deployment Target back to iOS 9.
4. Run the app; it works again!!!
A weird problem and a solution :))
I hope it helps.
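If toggling the deployment target doesn't do it, it's worth verifying that the video actually made it into the built bundle; a small diagnostic sketch (same pre-Swift-3 API as the question):

```swift
import Foundation

// List every .mp4 that was actually copied into the main bundle.
// If "Video.mp4" is missing here, check its Target Membership and the
// "Copy Bundle Resources" build phase before suspecting the lookup call.
let videoPaths = NSBundle.mainBundle().pathsForResourcesOfType("mp4", inDirectory: nil)
print(videoPaths)
```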