How to access the camera in a macOS Mojave Swift playground

How do you access the camera in Swift playgrounds on macOS Mojave?
This used to work without problems on High Sierra, but the code no longer executes correctly: the playground's running indicator spins continuously and the camera light never turns on.
Below is an example of code that worked fine last year but no longer does. Does anyone know what's wrong and how to fix it?
import Cocoa
import AVFoundation
import AVKit
import QuartzCore
import PlaygroundSupport

let view = NSView(frame: NSRect(x: 0.0, y: 0.0, width: 640.0, height: 480.0))

let session = AVCaptureSession()
session.sessionPreset = AVCaptureSession.Preset.vga640x480
session.beginConfiguration()
session.commitConfiguration()

var input: AVCaptureDeviceInput
if let devices = AVCaptureDevice.devices() as? [AVCaptureDevice] {
    // Add the first video device that supports the 640x480 preset.
    for device in devices {
        if device.hasMediaType(.video) && device.supportsSessionPreset(.vga640x480) {
            do {
                input = try AVCaptureDeviceInput(device: device)
                if session.canAddInput(input) {
                    session.addInput(input)
                    break
                }
            } catch {
                print(error)
            }
        }
    }

    let output = AVCaptureVideoDataOutput()
    output.videoSettings = [kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_32BGRA)]
    output.alwaysDiscardsLateVideoFrames = true
    if session.canAddOutput(output) {
        session.addOutput(output)
    }

    let captureLayer = AVCaptureVideoPreviewLayer(session: session)
    view.wantsLayer = true
    view.layer = captureLayer

    session.startRunning()

    // View -> Assistant Editor -> Show Assistant Editor
    PlaygroundPage.current.liveView = view
}
I looked at Apple's documentation on requesting access to the camera and tried it, but it does not seem to work in playgrounds; I still have the same issue with the authorization code in place.
https://developer.apple.com/documentation/avfoundation/cameras_and_media_capture/requesting_authorization_for_media_capture_on_macos
switch AVCaptureDevice.authorizationStatus(for: .video) {
case .authorized: // The user has previously granted access to the camera.
    self.setupCaptureSession()
case .notDetermined: // The user has not yet been asked for camera access.
    AVCaptureDevice.requestAccess(for: .video) { granted in
        if granted {
            self.setupCaptureSession()
        }
    }
case .denied: // The user has previously denied access.
    return
case .restricted: // The user can't grant access due to restrictions.
    return
}
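For reference, here is a sketch of how that authorization flow might be adapted to a playground, where there is no self and the page must be kept alive for the async callback. setupCaptureSession() is a hypothetical wrapper around the session code above, and note that on Mojave the playground runs inside Xcode's processes, so Xcode itself may also need camera access under System Preferences > Security & Privacy > Camera; this sketch alone may not be enough:

import AVFoundation
import PlaygroundSupport

// Keep the playground alive so the asynchronous permission callback can fire.
PlaygroundPage.current.needsIndefiniteExecution = true

// Hypothetical wrapper: move the session/preview code from above into here.
func setupCaptureSession() {
    // ...
}

switch AVCaptureDevice.authorizationStatus(for: .video) {
case .authorized:
    setupCaptureSession()
case .notDetermined:
    AVCaptureDevice.requestAccess(for: .video) { granted in
        if granted {
            DispatchQueue.main.async {
                setupCaptureSession() // touch AppKit/session state on the main queue
            }
        }
    }
default:
    print("Camera access denied or restricted")
}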

Related

Swift - Recorded Video is Mirrored on Front Camera - How to flip?

I'm trying to mirror the recorded video from a capture session. The video preview for the front-facing camera shows a mirrored version; however, when I go to save the file and play it back, the captured video is actually mirrored. I'm using Apple's AVCam demo as a reference and can't seem to figure this out. Please help.
I've tried creating an AVCaptureConnection and setting the .isVideoMirrored parameter. However, I get this error:
"cannot be added to the session because the source and destination media types are incompatible"
I would have thought mirroring the video would be much easier. I think I may be creating my connection incorrectly. The code below never actually adds the connection, because the .canAddConnection check fails.
var captureSession: AVCaptureSession!

override func viewDidAppear(_ animated: Bool) {
    super.viewDidAppear(animated)
    captureSession = AVCaptureSession()

    // Set up the camera
    if let dualCameraDevice = AVCaptureDevice.default(.builtInDualCamera, for: .video, position: .front) {
        defaultVideoDevice = dualCameraDevice
    } else if let frontCameraDevice = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .front) {
        // If the dual camera isn't available, default to the front wide-angle camera.
        defaultVideoDevice = frontCameraDevice
    }
    guard let videoDevice = defaultVideoDevice else {
        print("Default video device is unavailable.")
        // setupResult = .configurationFailed
        captureSession.commitConfiguration()
        return
    }
    let videoDeviceInput = try! AVCaptureDeviceInput(device: videoDevice) // bare try in the original; forced here so the excerpt compiles
    if captureSession.canAddInput(videoDeviceInput) {
        captureSession.addInput(videoDeviceInput)
    }
    let movieOutput = AVCaptureMovieFileOutput()
    // Video input ports for the AVCaptureConnection
    let videoInput: [AVCaptureInput.Port] = videoDeviceInput.ports
    if captureSession.canAddOutput(movieOutput) {
        captureSession.beginConfiguration()
        captureSession.addOutput(movieOutput)
        captureSession.sessionPreset = .medium
Then I try to set up the AVCaptureConnection and set the parameters for mirroring. Please tell me if there is an easier way to mirror the output / playback.
        avCaptureConnection = AVCaptureConnection(inputPorts: videoInput, output: movieOutput)
        avCaptureConnection.isEnabled = true

        // Mirror the capture connection?
        avCaptureConnection.automaticallyAdjustsVideoMirroring = false
        avCaptureConnection.isVideoMirrored = false

        // Check if we can add a connection
        if captureSession.canAddConnection(avCaptureConnection) {
            // Add the connection
            captureSession.addConnection(avCaptureConnection)
        }
        captureSession.commitConfiguration()
        self.movieOutput = movieOutput
        setupLivePreview()
    }
}
Somewhere else in the code, connected to an IBAction, I initialize the recording:
// Start recording video to a temporary file.
let outputFileName = NSUUID().uuidString
let outputFilePath = (NSTemporaryDirectory() as NSString).appendingPathComponent((outputFileName as NSString).appendingPathExtension("mov")!)
print("Recording in tap function")
movieOutput.startRecording(to: URL(fileURLWithPath: outputFilePath), recordingDelegate: self)
I think I'm using AVCaptureConnection incorrectly, especially given the error stating the media types are incompatible. If there is a proper way to implement this, please let me know. I'm also open to suggestions for an easier way to mirror the playback. Thank you!
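No answer was captured for this question, but one common approach (my suggestion, not from the thread) is to skip the hand-built AVCaptureConnection entirely: addOutput(_:) forms a connection automatically, and mirroring can be configured on that. A minimal sketch using the names from the question:

// Sketch: configure mirroring on the connection that addOutput(_:) creates,
// instead of constructing an AVCaptureConnection manually.
captureSession.beginConfiguration()
if captureSession.canAddOutput(movieOutput) {
    captureSession.addOutput(movieOutput)
}
if let connection = movieOutput.connection(with: .video),
   connection.isVideoMirroringSupported {
    connection.automaticallyAdjustsVideoMirroring = false
    connection.isVideoMirrored = false // or true, depending on which orientation you want on disk
}
captureSession.commitConfiguration()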

What is wrong with my custom camera view?

I followed this video: https://www.youtube.com/watch?v=7TqXrMnfJy8&t=45s to a T. But when I open the camera view, all I see is a black screen and a white button. I get no error messages when I try to load the camera view. Can someone please assist me with what I'm doing wrong?
My code is below:
import UIKit
import AVFoundation

class CameraViewController: UIViewController {
    var captureSession = AVCaptureSession()
    var backCamera: AVCaptureDevice?
    var currentCamera: AVCaptureDevice?
    var photoOutput: AVCapturePhotoOutput?
    var cameraPreviewLayer: AVCaptureVideoPreviewLayer?

    override func viewDidLoad() {
        super.viewDidLoad()
        setupCaptureSession()
        setupDevice()
        setupInputOutput()
        setupPreviewLayer()
        startRunningCaptureSession()
    }

    func setupCaptureSession() {
        captureSession.sessionPreset = AVCaptureSession.Preset.photo
    }

    func setupDevice() {
        let deviceDiscoverySession = AVCaptureDevice.DiscoverySession(deviceTypes: [AVCaptureDevice.DeviceType.builtInWideAngleCamera], mediaType: AVMediaType.video, position: AVCaptureDevice.Position.unspecified)
        let devices = deviceDiscoverySession.devices
        for device in devices {
            if device.position == AVCaptureDevice.Position.back {
                backCamera = device
            }
        }
        currentCamera = backCamera
    }

    func setupInputOutput() {
        do {
            let captureDeviceInput = try AVCaptureDeviceInput(device: currentCamera!)
            captureSession.addInput(captureDeviceInput)
            photoOutput?.setPreparedPhotoSettingsArray([AVCapturePhotoSettings(format: [AVVideoCodecKey: AVVideoCodecType.jpeg])], completionHandler: nil)
        } catch {
            print(error)
        }
    }

    func setupPreviewLayer() {
        cameraPreviewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
        cameraPreviewLayer?.videoGravity = AVLayerVideoGravity.resizeAspectFill
        cameraPreviewLayer?.connection?.videoOrientation = AVCaptureVideoOrientation.portrait
        cameraPreviewLayer?.frame = self.view.frame
        self.view.layer.insertSublayer(cameraPreviewLayer!, at: 1)
    }

    func startRunningCaptureSession() {
        captureSession.startRunning()
    }

    override func didReceiveMemoryWarning() {
        super.didReceiveMemoryWarning()
        // Dispose of any resources that can be recreated.
    }
}
I ran your code and it worked almost perfectly! The only problem is that I had to add a Privacy - Camera Usage Description entry to the app's Info.plist; otherwise the app crashes.
Once I did that and ran your code, I saw the live camera view on my device.
So why isn't it working for you? Let's think of some possible reasons. You didn't give enough info to know for sure (seeing as the code itself works just fine), but here are some possibilities:
You don't have the Privacy - Camera Usage Description entry in the app's Info.plist.
You are testing on the Simulator. Maybe this code works only on a device.
There is something in your interface in front of the sublayer that you add when you say insertSublayer. To test this, try saying addSublayer instead; this will make the camera layer the frontmost layer (this is just for testing purposes, remember).
Maybe your code never runs at all? Perhaps we never actually go to this view controller. To test that theory, put a print statement in your viewDidLoad and see if it actually prints to the console.
Maybe your code runs too soon? To test that theory, move all those calls out of viewDidLoad and into something later, such as viewDidAppear. Remember, this is just for testing purposes.
Hopefully one of those will help you figure out what the problem is.
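To make the last two checks concrete, here is a testing-only sketch (my own illustration, using the controller from the question): log when the controller appears, and run the setup from viewDidAppear instead of viewDidLoad.

// Diagnostic sketch, for testing only: remove the setup calls from viewDidLoad first.
override func viewDidAppear(_ animated: Bool) {
    super.viewDidAppear(animated)
    print("CameraViewController appeared") // confirms this controller is actually reached
    setupCaptureSession()
    setupDevice()
    setupInputOutput()
    setupPreviewLayer()
    startRunningCaptureSession()
}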

Refreshing AVCaptureSession...?

I am having some difficulty with AVCaptureSession when popping view controllers. I have a view controller in a navigation controller where a user takes a photo. After the photo is captured, I segue to a "preview photo" view controller. If the user doesn't like the photo, they can go back and retake it. When I pop the preview photo view controller, the app crashes with the error "Multiple audio/video AVCaptureInputs are not currently supported".
I thought that maybe I could remove/refresh the session's inputs, but it's still crashing.
Any support/advice is greatly appreciated!
segue:
@IBAction func cancelPressed(_ sender: UIButton) {
    _ = self.navigationController?.popViewController(animated: true)
}
camera config (which works fine):
func setupCaptureSessionCamera() {
    // This makes sure to get the full resolution of the camera
    captureSession.sessionPreset = AVCaptureSession.Preset.photo
    // Query the available devices
    let devices = AVCaptureDevice.devices(for: .video)
    for device in devices {
        if device.position == .front {
            frontFacingCamera = device
        } else if device.position == .back {
            backFacingCamera = device
        }
    } // end iteration
    // Set a default device
    currentDevice = backFacingCamera
    // Configure the session with an output for capturing a still image
    stillImageOutput = AVCaptureStillImageOutput()
    stillImageOutput?.outputSettings = [AVVideoCodecKey: AVVideoCodecType.jpeg]
    do {
        let captureDeviceInput = try AVCaptureDeviceInput(device: currentDevice!)
        captureSession.addInput(captureDeviceInput)
        captureSession.addOutput(stillImageOutput!)
        // Set up the camera preview layer
        cameraPreviewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
        // Add the preview to our specified view in the UI
        view.layer.addSublayer(cameraPreviewLayer!)
        cameraPreviewLayer?.videoGravity = AVLayerVideoGravity.resizeAspectFill
        cameraPreviewLayer?.frame = cameraView.frame
        captureSession.startRunning()
    } catch let error {
        print(error)
    } // end do
}
What I tried (removing the inputs in viewWillAppear if the sender is the preview photo controller):
func refreshCamera() {
    captureSession.beginConfiguration()
    for input in captureSession.inputs {
        captureSession.removeInput(input)
    }
    captureSession.commitConfiguration()
}
It was much simpler than I imagined. All that is needed is to check whether there is already an input before calling the setupCaptureSessionCamera method:
if captureSession.inputs.isEmpty {
    setupCaptureSessionCamera()
}
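In context, that check might live in viewWillAppear of the camera controller. A sketch, where restarting the stopped session is my own assumption rather than part of the answer:

// Sketch: build the session only on first appearance, and resume it when
// returning from the preview screen.
override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)
    if captureSession.inputs.isEmpty {
        setupCaptureSessionCamera()
    } else if !captureSession.isRunning {
        captureSession.startRunning()
    }
}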

Custom camera in Xcode, Swift 3

I have an issue where I am trying to create a custom camera in Xcode; however, I cannot get it to use the front camera. No matter what I change in the code, it seems to use only the back camera. I was hoping someone might be generous enough to take a quick look at my code below and see whether there is something I am missing or somewhere I went wrong. Any help would be very much appreciated; thank you for your time.
func SelectInputDevice() {
    let devices = AVCaptureDevice.defaultDevice(withDeviceType: .builtInWideAngleCamera,
                                                mediaType: AVMediaTypeVideo, position: .front)
    if devices?.position == AVCaptureDevicePosition.front {
        print(devices?.position)
        frontCamera = devices
    }
    currentCameraDevice = frontCamera
    do {
        let captureDeviceInput = try AVCaptureDeviceInput(device: currentCameraDevice)
        captureSession.addInput(captureDeviceInput)
    } catch {
        print(error.localizedDescription)
    }
}
Here frontCamera and currentCameraDevice are AVCaptureDevices.
It seems a few things are missing from your code:
1) To change input devices you need to reconfigure the session by calling session.beginConfiguration() before adding the new device and ending with session.commitConfiguration(). All changes should also be made on a background queue (which hopefully you've created for the session) so that the UI isn't blocked while the session is configured.
2) It would be safer to check that the session will accept the new device, with session.canAddInput(captureDeviceInput), before adding it, and to remove the previous device (the back camera) first, since a front+back configuration isn't allowed.
3) It would also be cleaner to verify that the device has a working front camera (it might be broken) beforehand, to prevent crashes.
Full code for changing the capture device to the front camera would look like this:
func switchCameraToFront() {
    // session & sessionQueue are references to the capture session and its dispatch queue
    sessionQueue.async { [unowned self] in
        let currentVideoInput = self.videoDeviceInput // ref to the current videoInput as set up in the initial session config
        let preferredPosition: AVCaptureDevicePosition = .front
        let preferredDeviceType: AVCaptureDeviceType = .builtInWideAngleCamera
        let devices = self.videoDeviceDiscoverySession.devices!
        var newVideoDevice: AVCaptureDevice? = nil

        // First, look for a device with both the preferred position and device type.
        // Otherwise, look for a device with only the preferred position.
        if let device = devices.filter({ $0.position == preferredPosition && $0.deviceType == preferredDeviceType }).first {
            newVideoDevice = device
        } else if let device = devices.filter({ $0.position == preferredPosition }).first {
            newVideoDevice = device
        }

        if let videoDevice = newVideoDevice {
            do {
                let videoDeviceInput = try AVCaptureDeviceInput(device: videoDevice)
                self.session.beginConfiguration()
                // Remove the existing device input first, since using the front and back camera simultaneously is not supported.
                self.session.removeInput(currentVideoInput)
                if self.session.canAddInput(videoDeviceInput) {
                    self.session.addInput(videoDeviceInput)
                    self.videoDeviceInput = videoDeviceInput
                } else {
                    // Fall back to the current device
                    self.session.addInput(self.videoDeviceInput)
                }
                self.session.commitConfiguration()
            } catch {
                print("Could not create video device input: \(error)")
            }
        }
    }
}
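The snippet references a videoDeviceDiscoverySession and a sessionQueue that aren't shown. A minimal sketch of how those might be declared (my assumption, using the same Swift 3 era API as the answer):

// Sketch: properties the answer assumes exist on the owning object (Swift 3 API).
let videoDeviceDiscoverySession = AVCaptureDeviceDiscoverySession(
    deviceTypes: [.builtInWideAngleCamera],
    mediaType: AVMediaTypeVideo,
    position: .unspecified)
let sessionQueue = DispatchQueue(label: "session queue") // serial queue for all session work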

Apple Music & AVAudioEngine in Swift

How can we access songs in the Apple Music library with AVAudioPlayerNode/AVAudioEngine for playback and processing?
I have asked this question in the Apple forums as well.
"Apple Music" may refer to:
the music streaming service
its iOS client (known as "Music")
its macOS client, which doubles as a local media management and playback app (formerly known as iTunes, known as "Music" as of macOS Catalina).
Due to DRM restrictions, it is not possible to play tracks from the Apple Music catalogue that were downloaded onto your device through the Apple Music macOS or iOS clients. However, you can play audio files you own that you've synced onto your device using the macOS Music app or the Finder, as follows:
Add the NSAppleMusicUsageDescription key to your Info.plist file, along with its corresponding value
Set up your AVAudioSession and AVAudioEngine
Find the URL of the media item you want to play (you can use MPMediaPickerController as in the example below, or you can make your own MPMediaQuery)
Create an AVAudioFile from that URL
Create an AVAudioPlayerNode set to play that AVAudioFile
Connect the player node to the engine's output node
import UIKit
import AVFoundation
import MediaPlayer

class ViewController: UIViewController {
    let engine = AVAudioEngine()

    override func viewDidLoad() {
        super.viewDidLoad()
        let mediaPicker = MPMediaPickerController(mediaTypes: .music)
        mediaPicker.allowsPickingMultipleItems = false
        mediaPicker.showsItemsWithProtectedAssets = false // These items usually cannot be played back
        mediaPicker.showsCloudItems = false // MPMediaItems stored in the cloud don't have an assetURL
        mediaPicker.delegate = self
        mediaPicker.prompt = "Pick a track"
        present(mediaPicker, animated: true, completion: nil)
    }

    func startEngine(playFileAt: URL) {
        do {
            try AVAudioSession.sharedInstance().setCategory(.playback)
            let avAudioFile = try AVAudioFile(forReading: playFileAt)
            let player = AVAudioPlayerNode()
            engine.attach(player)
            engine.connect(player, to: engine.mainMixerNode, format: avAudioFile.processingFormat)
            try engine.start()
            player.scheduleFile(avAudioFile, at: nil, completionHandler: nil)
            player.play()
        } catch {
            assertionFailure(String(describing: error))
        }
    }
}
extension ViewController: MPMediaPickerControllerDelegate {
    func mediaPicker(_ mediaPicker: MPMediaPickerController, didPickMediaItems mediaItemCollection: MPMediaItemCollection) {
        guard let item = mediaItemCollection.items.first else {
            print("no item")
            return
        }
        print("picking \(item.title!)")
        guard let url = item.assetURL else {
            return print("no url")
        }
        dismiss(animated: true) { [weak self] in
            self?.startEngine(playFileAt: url)
        }
    }

    func mediaPickerDidCancel(_ mediaPicker: MPMediaPickerController) {
        dismiss(animated: true, completion: nil)
    }
}