AVFoundation Camera in view - swift

I'm having a lot of problems trying to get a view to show my back camera feed. I looked through Apple's docs and came up with the code below, but all it seems to do is show a black screen. I've added the camera usage permission to my plist and am running on a real device. I don't need it to take a photo or save anything, just show the live camera feed in a view.
import UIKit
import AVFoundation

class ViewController: UIViewController {

    @IBOutlet weak var cameraView: UIView!

    var captureSession = AVCaptureSession()
    var previewLayer = AVCaptureVideoPreviewLayer()

    override func viewDidLoad() {
        super.viewDidLoad()
        loadCamera()
    }

    func loadCamera() {
        let device = AVCaptureDevice.default(.builtInWideAngleCamera, for: AVMediaType.video, position: .back)
        do {
            let input = try AVCaptureDeviceInput(device: device!)
            if captureSession.canAddInput(input) {
                captureSession.addInput(input)
                previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
                previewLayer.videoGravity = AVLayerVideoGravity.resizeAspectFill
                previewLayer.connection?.videoOrientation = AVCaptureVideoOrientation.portrait
                cameraView.layer.addSublayer(previewLayer)
            }
        } catch {
            print(error)
        }
    }
}

Welcome!
The problem is that there is actually no data flow (of video frames) happening with your current setup. You need to attach at least one input and one output to your capture session. The preview layer doesn't count as an output by itself, since it only attaches to an existing connection between an input and an output.
So to fix it, you can just add an AVCapturePhotoOutput to the session (preferably before you add the layer) and never use it. The preview layer should then start displaying frames.
You probably also want to set the session's sessionPreset to .photo before you add the inputs and outputs. This causes the session to produce video frames at a size that is ideal for displaying on your device's screen.
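A rough sketch of how loadCamera() might look with those changes (as an illustration, not a drop-in fix; it also starts the session and gives the preview layer a frame, two things the original snippet never does):

func loadCamera() {
    // Configure the session before adding inputs/outputs.
    captureSession.sessionPreset = .photo

    guard let device = AVCaptureDevice.default(.builtInWideAngleCamera,
                                               for: .video,
                                               position: .back),
          let input = try? AVCaptureDeviceInput(device: device) else {
        print("Unable to create camera input")
        return
    }

    if captureSession.canAddInput(input) {
        captureSession.addInput(input)
    }

    // A "dummy" output so the session actually produces frames.
    let photoOutput = AVCapturePhotoOutput()
    if captureSession.canAddOutput(photoOutput) {
        captureSession.addOutput(photoOutput)
    }

    previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
    previewLayer.videoGravity = .resizeAspectFill
    previewLayer.connection?.videoOrientation = .portrait
    previewLayer.frame = cameraView.bounds
    cameraView.layer.addSublayer(previewLayer)

    captureSession.startRunning()
}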

Related

How to add a video in the UI in Swift

I want to be able to add a video (called "Logo-Animation4.mp4") to the UI in Swift, just like you can add a UIImage to a UIImageView.
Right now I've tried just putting an AVPlayer on the view, but it comes up with the default iOS player UI, which has the fullscreen, playback and volume controls that I don't want. I just want the video to play in the UI without any way for the user to interact with it.
I've thought about animating the video using a regular UIImageView's animation feature, but my video isn't that short, and it would be hard to extract every single frame and feed it into the code.
How should I go about doing this?
To achieve what you are asking, you'll need to use AVPlayerLayer.
Add a UIView outlet:
@IBOutlet weak var videoView: UIView!
Import AVFoundation
Create player variables
var player : AVPlayer!
var avPlayerLayer : AVPlayerLayer!
Add the following code to your viewDidLoad
guard let path = Bundle.main.path(forResource: "Logo-Animation4", ofType: "mp4") else {
    debugPrint("Logo-Animation4.mp4 not found")
    return
}

player = AVPlayer(url: URL(fileURLWithPath: path))
avPlayerLayer = AVPlayerLayer(player: player)
avPlayerLayer.videoGravity = AVLayerVideoGravity.resize
videoView.layer.addSublayer(avPlayerLayer)
player.play()
Add this method
override func viewDidLayoutSubviews() {
    super.viewDidLayoutSubviews()
    avPlayerLayer.frame = videoView.layer.bounds
}
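Put together, a minimal (hypothetical) view controller using those pieces could look like the sketch below. Note that the .resize gravity in the snippet above stretches the video to fill the view, so .resizeAspect or .resizeAspectFill may be preferable if you want to keep the aspect ratio:

import UIKit
import AVFoundation

class VideoViewController: UIViewController {

    @IBOutlet weak var videoView: UIView!

    var player: AVPlayer!
    var avPlayerLayer: AVPlayerLayer!

    override func viewDidLoad() {
        super.viewDidLoad()

        guard let path = Bundle.main.path(forResource: "Logo-Animation4", ofType: "mp4") else {
            debugPrint("Logo-Animation4.mp4 not found")
            return
        }

        player = AVPlayer(url: URL(fileURLWithPath: path))
        avPlayerLayer = AVPlayerLayer(player: player)
        avPlayerLayer.videoGravity = .resizeAspect   // keep the aspect ratio
        videoView.layer.addSublayer(avPlayerLayer)
        player.play()
    }

    override func viewDidLayoutSubviews() {
        super.viewDidLayoutSubviews()
        // AVPlayerLayer doesn't autoresize, so keep it matched to the view.
        avPlayerLayer?.frame = videoView.layer.bounds
    }
}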

macOS App using iPhone camera

I am trying to build a simple Swift (4) macOS app to use an iPhone camera connected to my Mac.
I have started with a blank macOS template app, turned on the sandbox to allow camera, mic and USB, and added the following code to my ViewController.
import Cocoa
import AVFoundation

class ViewController: NSViewController {

    @IBOutlet weak var camera: NSView!

    override func viewDidLoad() {
        super.viewDidLoad()

        camera.layer = CALayer()

        let session: AVCaptureSession = AVCaptureSession()
        session.sessionPreset = AVCaptureSession.Preset.high

        let device: AVCaptureDevice = (AVCaptureDevice.default(for: AVMediaType.video))!
        // let listdevices = (AVCaptureDevice.devices())

        do {
            try session.addInput(AVCaptureDeviceInput(device: device))

            // Preview
            let previewLayer: AVCaptureVideoPreviewLayer = AVCaptureVideoPreviewLayer(session: session)
            let myView: NSView = self.view
            previewLayer.frame = myView.bounds
            previewLayer.videoGravity = AVLayerVideoGravity.resizeAspectFill
            self.camera.layer?.addSublayer(previewLayer)

            session.startRunning()
            // print(listdevices)
            // print(device)
        } catch {
            print(device)
        }
    }

    override var representedObject: Any? {
        didSet {
            // Update the view, if already loaded.
        }
    }
}
In the storyboard I have dropped in a Custom View.
The app builds OK and uses the FaceTime camera no problem; however, with the iPhone connected I don't see it as a device that AVFoundation can use. I'm not sure what the next steps are to get the previewLayer to select the USB-connected camera, i.e. the iPhone.
P.S. The orientation needs to be landscape for all cameras.
According to this Apple Developer Forum post, capturing the camera of a connected iOS device from a macOS app is not supported.
The closest you can get (as the post suggests) is to capture the screen of the iOS device while the camera (Camera.app) is running, effectively capturing the live camera preview (or you can roll your own companion camera app on iOS if you want to remove the Camera app's UI from the captured screen).
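If you want to confirm what AVFoundation actually exposes on your Mac, one way (assuming a recent macOS SDK where AVCaptureDevice.DiscoverySession is available; this is an illustration, not part of the original answer) is to enumerate the video devices. A connected iPhone will not appear in this list:

import AVFoundation

// List every video-capable capture device AVFoundation exposes on this Mac.
let discovery = AVCaptureDevice.DiscoverySession(
    deviceTypes: [.builtInWideAngleCamera, .externalUnknown],
    mediaType: .video,
    position: .unspecified
)

for device in discovery.devices {
    print("\(device.localizedName) - \(device.deviceType.rawValue)")
}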

ARKit and AVCamera simultaneously

As there is no autofocus in ARKit, I wanted to load ARKit in a view that takes half the screen, while the second half would use AVFoundation's AVCamera.
Is it possible to load AVCamera and ARKit simultaneously in the same app?
Thanks.
Nope.
ARKit uses AVCapture internally (as explained in the WWDC talk introducing ARKit). Only one AVCaptureSession can be running at a time, so if you run your own capture session it’ll suspend ARKit’s session (and break tracking).
Update: However, in iOS 11.3 (aka "ARKit 1.5"), ARKit enables autofocus by default, and you can choose to disable it with the isAutoFocusEnabled option.
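For reference, a minimal sketch of controlling that option when starting the AR session (assuming sceneView is an ARSCNView outlet; isAutoFocusEnabled defaults to true on iOS 11.3+):

import ARKit

// Sketch: configure autofocus explicitly when running the AR session.
let configuration = ARWorldTrackingConfiguration()
if #available(iOS 11.3, *) {
    configuration.isAutoFocusEnabled = true   // or false to keep fixed focus
}
sceneView.session.run(configuration)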
Changing the camera focus would disrupt the tracking - so this is definitely not possible (right now at least).
Update: See @rickster's answer above.
I managed to use AVFoundation with ARKit by calling self.sceneView.session.run(self.sceneView.session.configuration!) right after taking the photo.
Use self.captureSession?.stopRunning() right after taking the photo to make the session resume faster.
self.takePhoto()
self.sceneView.session.run(self.sceneView.session.configuration!)
func startCamera() {
    captureSession = AVCaptureSession()
    captureSession.sessionPreset = AVCaptureSession.Preset.photo
    cameraOutput = AVCapturePhotoOutput()

    if let device = AVCaptureDevice.default(for: .video),
        let input = try? AVCaptureDeviceInput(device: device) {
        if (captureSession.canAddInput(input)) {
            captureSession.addInput(input)
            if (captureSession.canAddOutput(cameraOutput)) {
                captureSession.addOutput(cameraOutput)
                captureSession.startRunning()
            }
        } else {
            print("issue here : captureSession.canAddInput")
        }
    } else {
        print("some problem here")
    }
}

func takePhoto() {
    startCamera()
    let settings = AVCapturePhotoSettings()
    cameraOutput.capturePhoto(with: settings, delegate: self)
    self.captureSession?.stopRunning()
}

Swift Mac OS - How to play another video/change the URL of the video with AVPlayer upon a button click?

I am new to swift and I am trying to make a Mac OS app that loops a video from the app's resources using AVPlayer as the background of the window once the app has been launched. When the user selects a menu item/clicks a button the background video will instantly change to a different video from the app's resources and start looping that video as the window's background.
I was able to play the first video once the app launches following this tutorial: (https://youtu.be/QgeQc587w70) and I also successfully made the video loop itself seamlessly following this post: (Looping AVPlayer seamlessly).
The problem I am now facing is changing the video to the other one once a menu item is selected / a button is clicked. The approach I went for is to change the URL, create a new AVPlayer using the new URL, and assign it to playerView.player, following this post: (Swift How to update url of video in AVPlayer when clicking on a button?). However, every time the menu item is selected the app crashes with the error "thread 1 exc_bad_instruction (code=exc_i386_invop subcode=0x0)". This is apparently caused by the value of playerView being nil. I don't really understand the reason for this, as playerView is an AVPlayerView object that I created in the xib file and linked to the Swift file by control-dragging, and I couldn't find another appropriate way of doing what I wanted. If you know the reason for this and how to fix it, please help me out, or if you know a better method of doing what I've mentioned above, please tell me as well.
My code is as follow, the line that crashes the app is at the bottom:
import Cocoa
import AppKit
import AVKit
import AVFoundation

struct videoVariables {
    static var videoName = "Test_Video" //declaring the video name as a global variable
}

var videoIsPlaying = true
var theURL = Bundle.main.url(forResource: videoVariables.videoName, withExtension: "mp4") //creating the video url
var player = AVPlayer.init(url: theURL!)

class BackgroundWindow: NSWindowController {

    @IBOutlet weak var playerView: AVPlayerView! // AVPlayerView linked using control-drag from xib file
    @IBOutlet var mainWindow: NSWindow!
    @IBOutlet weak var TempBG: NSImageView!

    override var windowNibName: String! {
        return "BackgroundWindow"
    }

    //function used for resizing the temporary background image and the playerView to the window’s size
    func resizeBG() {
        var scrn: NSScreen = NSScreen.main()!
        var rect: NSRect = scrn.frame
        var height = rect.size.height
        var width = rect.size.width

        TempBG.setFrameSize(NSSize(width: Int(width), height: Int(height)))
        TempBG.frame.origin = CGPoint(x: 0, y: 0)
        playerView!.setFrameSize(NSSize(width: Int(width), height: Int(height)))
        playerView!.frame.origin = CGPoint(x: 0, y: 0)
    }

    override func windowDidLoad() {
        super.windowDidLoad()

        self.window?.titleVisibility = NSWindowTitleVisibility.hidden //hide window’s title
        self.window?.styleMask = NSBorderlessWindowMask //hide window’s border
        self.window?.hasShadow = false //hide window’s shadow
        self.window?.level = Int(CGWindowLevelForKey(CGWindowLevelKey.desktopWindow)) //set window’s layer as desktopWindow layer
        self.window?.center()
        self.window?.makeKeyAndOrderFront(nil)
        NSApp.activate(ignoringOtherApps: true)

        if let screen = NSScreen.main() {
            self.window?.setFrame(screen.visibleFrame, display: true, animate: false) //resizing the window to cover the whole screen
        }

        resizeBG() //resizing the temporary background image and the playerView to the window’s size
        startVideo() //start playing and loop the first video as the window’s background
    }

    //function used for starting the video again once it has been played fully
    func playerItemDidReachEnd(notification: NSNotification) {
        playerView.player?.seek(to: kCMTimeZero)
        playerView.player?.play()
    }

    //function used for starting and looping the video
    func startVideo() {
        //set the seeking time to be 2ms ahead to prevent a black screen every time the video loops
        let playAhead = CMTimeMake(2, 100);

        //loops the video
        NotificationCenter.default.addObserver(forName: .AVPlayerItemDidPlayToEndTime,
                                               object: playerView.player?.currentItem,
                                               queue: nil, using: { (_) in
            DispatchQueue.main.async {
                self.playerView.player?.seek(to: playAhead)
                self.playerView.player?.play()
            }
        })

        var playerLayer: AVPlayerLayer?
        playerLayer = AVPlayerLayer(player: player)
        playerView?.player = player
        print(playerView?.player)
        playerLayer?.videoGravity = AVLayerVideoGravityResizeAspectFill
        player.play()
    }

    //changing the url to the new url, creating a new AVPlayer and assigning it to playerView.player when the menu item is selected
    @IBAction func renderBG(_ sender: NSMenuItem) {
        videoVariables.videoName = "Test_Video_2"
        var theNewURL = Bundle.main.url(forResource: videoVariables.videoName, withExtension: "mp4")
        player = AVPlayer.init(url: theNewURL!)

        //!!this line crashes the app with the error "thread 1 exc_bad_instruction (code=exc_i386_invop subcode=0x0)" every time the menu item is selected!!
        playerView.player = player
    }
}
Additionally, the background video is not supposed to be interactive (e.g. the user cannot pause or fast-forward the video), so any issues that might be caused by user interactivity can be ignored. The purpose of the app is to play a video on the user's desktop, creating exactly the same effect as running the command:
"/System/Library/Frameworks/ScreenSaver.framework/Resources/ScreenSaverEngine.app/Contents/MacOS/ScreenSaverEngine -background"
in Terminal.
Any help would be much appreciated!
You don't need to create a new AVPlayer from the URL. Use the AVPlayerItem class to manipulate the player's playback queue instead:
let firstAsset = AVURLAsset(url: firstVideoUrl)
let firstPlayerItem = AVPlayerItem(asset: firstAsset)
let player = AVPlayer(playerItem: firstPlayerItem)
let secondAsset = AVURLAsset(url: secondVideoUrl)
let secondPlayerItem = AVPlayerItem(asset: secondAsset)
player.replaceCurrentItem(with: secondPlayerItem)
See the docs for AVPlayerItem.
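Applied to the code in the question, the menu action might look roughly like this sketch, which swaps the item on the existing global player instead of assigning a new AVPlayer to playerView.player (and therefore never touches the nil playerView):

@IBAction func renderBG(_ sender: NSMenuItem) {
    videoVariables.videoName = "Test_Video_2"

    guard let newURL = Bundle.main.url(forResource: videoVariables.videoName,
                                       withExtension: "mp4") else {
        debugPrint("\(videoVariables.videoName).mp4 not found")
        return
    }

    // Swap the item on the existing player instead of creating a new AVPlayer.
    let newItem = AVPlayerItem(asset: AVURLAsset(url: newURL))
    player.replaceCurrentItem(with: newItem)
    player.play()
}

One thing to keep in mind: the end-of-playback observer in startVideo() was registered with the old currentItem as its object, so if the new video should also loop, you would likely need to re-register that observer for the new item.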

Capturing Video with Swift using AVCaptureVideoDataOutput or AVCaptureMovieFileOutput

I need some guidance on how to capture video without having to use a UIImagePicker. The video needs to start and stop on a button click, and this data then needs to be saved to the NSDocumentDirectory. I am new to Swift, so any help will be useful.
The section of code that I need help with is starting and stopping a video session and turning that into data. I created a picture-taking version that runs captureStillImageAsynchronouslyFromConnection and saves the data to the NSDocumentDirectory. I have set up a video capturing session and have the code ready to save the data, but I do not know how to get the data from the session.
var previewLayer: AVCaptureVideoPreviewLayer?
var captureDevice: AVCaptureDevice?
var videoCaptureOutput = AVCaptureVideoDataOutput()

let captureSession = AVCaptureSession()

override func viewDidLoad() {
    super.viewDidLoad()
    captureSession.sessionPreset = AVCaptureSessionPreset640x480
    let devices = AVCaptureDevice.devices()

    for device in devices {
        if (device.hasMediaType(AVMediaTypeVideo)) {
            if device.position == AVCaptureDevicePosition.Back {
                captureDevice = device as? AVCaptureDevice
                if captureDevice != nil {
                    beginSession()
                }
            }
        }
    }
}

func beginSession() {
    var err: NSError? = nil
    captureSession.addInput(AVCaptureDeviceInput(device: captureDevice, error: &err))

    if err != nil {
        println("Error: \(err?.localizedDescription)")
    }

    videoCaptureOutput.videoSettings = [kCVPixelBufferPixelFormatTypeKey: kCVPixelFormatType_32BGRA]
    videoCaptureOutput.alwaysDiscardsLateVideoFrames = true
    captureSession.addOutput(videoCaptureOutput)

    previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
    self.view.layer.addSublayer(previewLayer)
    previewLayer?.frame = CGRectMake(0, 0, screenWidth, screenHeight)
    captureSession.startRunning()

    var startVideoBtn = UIButton(frame: CGRectMake(0, screenHeight/2, screenWidth, screenHeight/2))
    startVideoBtn.addTarget(self, action: "startVideo", forControlEvents: UIControlEvents.TouchUpInside)
    self.view.addSubview(startVideoBtn)

    var stopVideoBtn = UIButton(frame: CGRectMake(0, 0, screenWidth, screenHeight/2))
    stopVideoBtn.addTarget(self, action: "stopVideo", forControlEvents: UIControlEvents.TouchUpInside)
    self.view.addSubview(stopVideoBtn)
}
I can supply more code or explanation if needed.
For best results, read the Still and Video Media Capture section from the AV Foundation Programming Guide.
To process frames from AVCaptureVideoDataOutput, you will need a delegate that adopts the AVCaptureVideoDataOutputSampleBufferDelegate protocol. The delegate's captureOutput method will be called whenever a new frame is written. When you set the output’s delegate, you must also provide a queue on which callbacks should be invoked. It will look something like this:
let cameraQueue = dispatch_queue_create("cameraQueue", DISPATCH_QUEUE_SERIAL)
videoCaptureOutput.setSampleBufferDelegate(myDelegate, queue: cameraQueue)
captureSession.addOutput(videoCaptureOutput)
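For completeness, here is a rough sketch of what such a delegate could look like with current Swift SDKs (the exact method signature was slightly different in the Swift 1.x era this question was written in):

import AVFoundation

class FrameHandler: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {

    // Called on the queue passed to setSampleBufferDelegate(_:queue:)
    // every time the output produces a new video frame.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        // Process or encode the pixel buffer here.
        _ = pixelBuffer
    }
}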
NB: If you just want to save the movie to a file, you may prefer the AVCaptureMovieFileOutput class instead of AVCaptureVideoDataOutput. In that case, you won't need a queue. But you'll still need a delegate, this time adopting the AVCaptureFileOutputRecordingDelegate protocol instead. (The relevant method is still called captureOutput.)
Here's one excerpt from the part about AVCaptureMovieFileOutput from the guide linked to above:
Starting a Recording
You start recording a QuickTime movie using startRecordingToOutputFileURL:recordingDelegate:. You need to supply a file-based URL and a delegate. The URL must not identify an existing file, because the movie file output does not overwrite existing resources. You must also have permission to write to the specified location. The delegate must conform to the AVCaptureFileOutputRecordingDelegate protocol, and must implement the captureOutput:didFinishRecordingToOutputFileAtURL:fromConnections:error: method.
AVCaptureMovieFileOutput *aMovieFileOutput = <#Get a movie file output#>;
NSURL *fileURL = <#A file URL that identifies the output location#>;
[aMovieFileOutput startRecordingToOutputFileURL:fileURL recordingDelegate:<#The delegate#>];
In the implementation of captureOutput:didFinishRecordingToOutputFileAtURL:fromConnections:error:, the delegate might write the resulting movie to the Camera Roll album. It should also check for any errors that might have occurred.
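Translated into Swift, with a button-driven start/stop that writes into the Documents directory as the question asks, a sketch using current AVFoundation APIs (the class and file names here are placeholders, and the session is assumed to already have a camera input) might look like:

import UIKit
import AVFoundation

class RecorderViewController: UIViewController, AVCaptureFileOutputRecordingDelegate {

    let captureSession = AVCaptureSession()
    let movieOutput = AVCaptureMovieFileOutput()

    // Assumes the session already has a camera (and optionally microphone) input added.
    func setUpOutput() {
        if captureSession.canAddOutput(movieOutput) {
            captureSession.addOutput(movieOutput)
        }
    }

    @objc func startVideo() {
        // Record into a new file inside the app's Documents directory.
        let documents = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
        let fileURL = documents.appendingPathComponent("capture-\(Date().timeIntervalSince1970).mov")
        movieOutput.startRecording(to: fileURL, recordingDelegate: self)
    }

    @objc func stopVideo() {
        movieOutput.stopRecording()
    }

    // Called once the movie file has been fully written.
    func fileOutput(_ output: AVCaptureFileOutput,
                    didFinishRecordingTo outputFileURL: URL,
                    from connections: [AVCaptureConnection],
                    error: Error?) {
        if let error = error {
            print("Recording failed: \(error)")
        } else {
            print("Saved movie to \(outputFileURL)")
        }
    }
}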