macOS App using iPhone camera - swift

I am trying to build a simple Swift 4 macOS app that uses an iPhone camera connected to my Mac.
I have started from a blank macOS template app, turned on the sandbox entitlements for camera, microphone and USB, and added the following code to my ViewController.
import Cocoa
import AVFoundation

class ViewController: NSViewController {

    @IBOutlet weak var camera: NSView!

    override func viewDidLoad() {
        super.viewDidLoad()

        camera.layer = CALayer()

        let session = AVCaptureSession()
        session.sessionPreset = AVCaptureSession.Preset.high

        let device = AVCaptureDevice.default(for: AVMediaType.video)!
        // let listdevices = AVCaptureDevice.devices()

        do {
            try session.addInput(AVCaptureDeviceInput(device: device))

            // Preview
            let previewLayer = AVCaptureVideoPreviewLayer(session: session)
            previewLayer.frame = self.view.bounds
            previewLayer.videoGravity = AVLayerVideoGravity.resizeAspectFill
            self.camera.layer?.addSublayer(previewLayer)
            session.startRunning()
            // print(listdevices)
            // print(device)
        } catch {
            print(device)
        }
    }

    override var representedObject: Any? {
        didSet {
            // Update the view, if already loaded.
        }
    }
}
In the storyboard I have dropped in a Custom View.
The app builds OK and uses the FaceTime camera no problem; however, with an iPhone connected, I don't see it as a device that AVFoundation can use. I'm not sure of the next steps to get the previewLayer to select the USB camera, i.e. the iPhone.
P.S. It needs to be landscape orientation for all cameras.

According to this Apple Developer Forums post, capturing the camera of a connected iOS device from a macOS app is not supported.
The closest you can get (as the post suggests) is to capture the screen of the iOS device while the camera (Camera.app) is running, effectively capturing the live camera preview (or you can roll your own companion camera app on iOS, if you want to remove the Camera app's UI from the captured screen).
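For the screen-capture route, note that a connected iOS device does not show up as an AVFoundation capture device until you opt in to screen-capture (DAL) devices via CoreMediaIO. A minimal sketch of that opt-in; the .muxed query at the end is an assumption about how the device then surfaces:

import CoreMediaIO
import AVFoundation

// Allow screen-capture devices (such as connected iOS devices)
// to appear as capture devices.
var prop = CMIOObjectPropertyAddress(
    mSelector: CMIOObjectPropertySelector(kCMIOHardwarePropertyAllowScreenCaptureDevices),
    mScope: CMIOObjectPropertyScope(kCMIOObjectPropertyScopeGlobal),
    mElement: CMIOObjectPropertyElement(kCMIOObjectPropertyElementMaster))
var allow: UInt32 = 1
CMIOObjectSetPropertyData(CMIOObjectID(kCMIOObjectSystemObject),
                          &prop, 0, nil,
                          UInt32(MemoryLayout<UInt32>.size), &allow)

// The iPhone should now be discoverable, typically as a muxed device.
print(AVCaptureDevice.devices(for: .muxed))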

Related

AVFoundation Camera in view

I'm having a lot of problems trying to get a view to show my back camera feed. I looked through Apple's docs and came up with this, but all it seems to do is show a black screen. I have also added the permissions to my plist and am running on a real device. I don't need it to take a photo or save anything; I just want to show the live camera feed in a view.
import UIKit
import AVFoundation

class ViewController: UIViewController {

    @IBOutlet weak var cameraView: UIView!

    var captureSession = AVCaptureSession()
    var previewLayer = AVCaptureVideoPreviewLayer()

    override func viewDidLoad() {
        super.viewDidLoad()
        loadCamera()
    }

    func loadCamera() {
        let device = AVCaptureDevice.default(.builtInWideAngleCamera, for: AVMediaType.video, position: .back)
        do {
            let input = try AVCaptureDeviceInput(device: device!)
            if captureSession.canAddInput(input) {
                captureSession.addInput(input)
                previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
                previewLayer.videoGravity = AVLayerVideoGravity.resizeAspectFill
                previewLayer.connection?.videoOrientation = AVCaptureVideoOrientation.portrait
                cameraView.layer.addSublayer(previewLayer)
            }
        } catch {
            print(error)
        }
    }
}
Welcome!
The problem is that there is actually no data flow (of video frames) happening with your current setup. You need to at least attach one input and one output to your capture session. The preview layer doesn't count as an output itself since it will only attach to an existing connection between input and output.
So to fix it, you can just add an AVCapturePhotoOutput to the session (probably before you add the layer) but never use it. The preview layer should start displaying the frames then.
You probably also want to set the session's sessionPreset to .photo before you add the inputs and outputs. This will cause the session to produce video frames that have an ideal size for displaying on your device's screen.
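Putting both fixes together, a minimal sketch of the corrected loadCamera() might look like this (the startRunning() call and the preview layer's frame are assumptions; the original snippet omits them too, and the layer won't be visible without a frame):

func loadCamera() {
    // Set the preset before adding inputs and outputs.
    captureSession.sessionPreset = .photo

    guard let device = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back),
          let input = try? AVCaptureDeviceInput(device: device) else { return }

    if captureSession.canAddInput(input) {
        captureSession.addInput(input)
    }

    // An output is required for frames to flow; it never has to be used.
    let photoOutput = AVCapturePhotoOutput()
    if captureSession.canAddOutput(photoOutput) {
        captureSession.addOutput(photoOutput)
    }

    previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
    previewLayer.frame = cameraView.bounds
    previewLayer.videoGravity = .resizeAspectFill
    previewLayer.connection?.videoOrientation = .portrait
    cameraView.layer.addSublayer(previewLayer)

    captureSession.startRunning()
}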

Unsupported IOSurface format: 0x26424741 using twilio video in scenekit

I am using Twilio to send a video and use that video in SceneKit as a texture. The problem is that it works fine on an iPhone X, but gives the error Unsupported IOSurface format: 0x26424741 on the iPhone XR and XS.
This is what I am doing:
Get Video:
func subscribed(to videoTrack: TVIRemoteVideoTrack, publication: TVIRemoteVideoTrackPublication, for participant: TVIRemoteParticipant) {
    print("Participant \(participant.identity) added a video track.")
    let remoteView = TVIVideoView.init(frame: UIWindow().frame,
                                       delegate: self)
    videoTrack.addRenderer(remoteView!)
    delegate.participantAdded(with: remoteView!)
}
delegate:
func participantAdded(with videoView: UIView) {
    sceneView.addVideo(with: videoView)
}
and add the video to the plane:
func addVideo(with view: UIView) {
    videoPlane.geometry?.firstMaterial?.diffuse.contents = view
}
The problem was actually with the renderingType of remoteView. For older devices Metal was fine, but the newer devices needed OpenGL ES. I don't know why, but that was the fix.
I used this solution to find out the device type.
Next I determined which renderingType to use:
var renderingType: VideoView.RenderingType {
    get {
        let device = UIDevice()
        switch device.type {
        case .iPhoneXS, .iPhoneXR, .iPhoneXSMax:
            return .openGLES
        default:
            return .metal
        }
    }
}
And used it to initialize remoteView
func didSubscribeToVideoTrack(videoTrack: RemoteVideoTrack, publication: RemoteVideoTrackPublication, participant: RemoteParticipant) {
    print("Participant \(participant.identity) added a video track.")
    let remoteView = VideoView.init(frame: UIWindow().frame,
                                    delegate: self,
                                    renderingType: renderingType)
    videoTrack.addRenderer(remoteView!)
    delegate.participantAddedVideo(for: participant.identity, with: remoteView!)
}

ARKit and AVCamera simultaneously

As there is no autofocus in ARKit, I wanted to load ARKit in a view that takes half the screen, with the second half showing an AVFoundation camera feed.
Is it possible to load the AVFoundation camera and ARKit simultaneously in the same app?
Thanks.
Nope.
ARKit uses AVCapture internally (as explained in the WWDC talk introducing ARKit). Only one AVCaptureSession can be running at a time, so if you run your own capture session it’ll suspend ARKit’s session (and break tracking).
Update: However, in iOS 11.3 (aka "ARKit 1.5"), ARKit enables autofocus by default, and you can choose to disable it with the isAutoFocusEnabled option.
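A minimal sketch of that option, assuming a standard ARSCNView outlet named sceneView:

import ARKit

let configuration = ARWorldTrackingConfiguration()
// Autofocus is on by default in iOS 11.3+; set this to false to opt out.
configuration.isAutoFocusEnabled = false
sceneView.session.run(configuration)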
Changing the camera focus would disrupt the tracking, so this is definitely not possible (right now, at least).
Update: see @rickster's answer above.
I managed to use AVFoundation with ARKit by calling self.sceneView.session.run(self.sceneView.session.configuration!) right after taking the photo.
Use self.captureSession?.stopRunning() right after taking the photo to make the session resume faster.
self.takePhoto()
self.sceneView.session.run(self.sceneView.session.configuration!)
func startCamera() {
    captureSession = AVCaptureSession()
    captureSession.sessionPreset = AVCaptureSession.Preset.photo
    cameraOutput = AVCapturePhotoOutput()

    if let device = AVCaptureDevice.default(for: .video),
       let input = try? AVCaptureDeviceInput(device: device) {
        if captureSession.canAddInput(input) {
            captureSession.addInput(input)
            if captureSession.canAddOutput(cameraOutput) {
                captureSession.addOutput(cameraOutput)
                captureSession.startRunning()
            }
        } else {
            print("issue here : captureSession.canAddInput")
        }
    } else {
        print("some problem here")
    }
}

func takePhoto() {
    startCamera()
    let settings = AVCapturePhotoSettings()
    cameraOutput.capturePhoto(with: settings, delegate: self)
    self.captureSession?.stopRunning()
}

Swift Mac OS - How to play another video/change the URL of the video with AVPlayer upon a button click?

I am new to Swift and I am trying to make a macOS app that loops a video from the app's resources, using AVPlayer as the background of the window once the app has been launched. When the user selects a menu item/clicks a button, the background video should instantly change to a different video from the app's resources and start looping that video as the window's background.
I was able to play the first video once the app launches following this tutorial (https://youtu.be/QgeQc587w70), and I also successfully made the video loop itself seamlessly following this post (Looping AVPlayer seamlessly).
The problem I am now facing is changing the video to the other one once a menu item is selected/a button is clicked. The approach I went for is to change the URL, create a new AVPlayer from the new URL, and assign it to playerView.player, following this post (Swift How to update url of video in AVPlayer when clicking on a button?). However, every time the menu item is selected the app crashes with the error "thread 1 exc_bad_instruction (code=exc_i386_invop subcode=0x0)". This is apparently caused by the value of playerView being nil. I don't really understand the reason for this, as playerView is an AVPlayerView object that I created in the xib file and linked to the Swift file by control-dragging, and I couldn't find another appropriate method of doing what I wanted to do. If you know the reason for this and a way of fixing it, please provide me some help, or if you know a better method of doing what I've mentioned above, please tell me as well.
My code is as follow, the line that crashes the app is at the bottom:
import Cocoa
import AppKit
import AVKit
import AVFoundation

struct videoVariables {
    static var videoName = "Test_Video" //declaring the video name as a global variable
}

var videoIsPlaying = true
var theURL = Bundle.main.url(forResource: videoVariables.videoName, withExtension: "mp4") //creating the video url
var player = AVPlayer.init(url: theURL!)

class BackgroundWindow: NSWindowController {

    @IBOutlet weak var playerView: AVPlayerView! // AVPlayerView linked using control-drag from xib file
    @IBOutlet var mainWindow: NSWindow!
    @IBOutlet weak var TempBG: NSImageView!

    override var windowNibName: String! {
        return "BackgroundWindow"
    }

    //function used for resizing the temporary background image and the playerView to the window's size
    func resizeBG() {
        var scrn: NSScreen = NSScreen.main()!
        var rect: NSRect = scrn.frame
        var height = rect.size.height
        var width = rect.size.width
        TempBG.setFrameSize(NSSize(width: Int(width), height: Int(height)))
        TempBG.frame.origin = CGPoint(x: 0, y: 0)
        playerView!.setFrameSize(NSSize(width: Int(width), height: Int(height)))
        playerView!.frame.origin = CGPoint(x: 0, y: 0)
    }

    override func windowDidLoad() {
        super.windowDidLoad()
        self.window?.titleVisibility = NSWindowTitleVisibility.hidden //hide window's title
        self.window?.styleMask = NSBorderlessWindowMask //hide window's border
        self.window?.hasShadow = false //hide window's shadow
        self.window?.level = Int(CGWindowLevelForKey(CGWindowLevelKey.desktopWindow)) //set window's level to the desktopWindow layer
        self.window?.center()
        self.window?.makeKeyAndOrderFront(nil)
        NSApp.activate(ignoringOtherApps: true)
        if let screen = NSScreen.main() {
            self.window?.setFrame(screen.visibleFrame, display: true, animate: false) //resize the window to cover the whole screen
        }
        resizeBG() //resize the temporary background image and the playerView to the window's size
        startVideo() //start playing and looping the first video as the window's background
    }

    //function used for starting the video again once it has been played fully
    func playerItemDidReachEnd(notification: NSNotification) {
        playerView.player?.seek(to: kCMTimeZero)
        playerView.player?.play()
    }

    //function used for starting and looping the video
    func startVideo() {
        //seek 2ms ahead to prevent a black screen every time the video loops
        let playAhead = CMTimeMake(2, 100)
        //loop the video
        NotificationCenter.default.addObserver(forName: .AVPlayerItemDidPlayToEndTime,
                                               object: playerView.player?.currentItem, queue: nil, using: { (_) in
            DispatchQueue.main.async {
                self.playerView.player?.seek(to: playAhead)
                self.playerView.player?.play()
            }
        })
        var playerLayer: AVPlayerLayer?
        playerLayer = AVPlayerLayer(player: player)
        playerView?.player = player
        print(playerView?.player)
        playerLayer?.videoGravity = AVLayerVideoGravityResizeAspectFill
        player.play()
    }

    //change the url to the new url and create a new AVPlayer, then assign it to playerView.player once the menu item is selected
    @IBAction func renderBG(_ sender: NSMenuItem) {
        videoVariables.videoName = "Test_Video_2"
        var theNewURL = Bundle.main.url(forResource: videoVariables.videoName, withExtension: "mp4")
        player = AVPlayer.init(url: theNewURL!)
        //!!this line crashes the app with the error "thread 1 exc_bad_instruction (code=exc_i386_invop subcode=0x0)" every time the menu item is selected!!
        playerView.player = player
    }
}
Additionally, the background video is not supposed to be interactive (e.g. the user cannot pause or fast-forward it), so any issues that might be caused by user interactivity can be ignored. The purpose of the app is to play a video on the user's desktop, creating exactly the same effect as running the command:
"/System/Library/Frameworks/ScreenSaver.framework/Resources/
ScreenSaverEngine.app/Contents/MacOS/ScreenSaverEngine -background" in Terminal.
Any help would be much appreciated!
You don't need to create a new AVPlayer from a URL. There is the AVPlayerItem class for manipulating the player's playback queue:
let firstAsset = AVURLAsset(url: firstVideoUrl)
let firstPlayerItem = AVPlayerItem(asset: firstAsset)
let player = AVPlayer(playerItem: firstPlayerItem)
let secondAsset = AVURLAsset(url: secondVideoUrl)
let secondPlayerItem = AVPlayerItem(asset: secondAsset)
player.replaceCurrentItem(with: secondPlayerItem)
Docs about AVPlayerItem
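Since the original goal was a seamlessly looping background, one alternative worth mentioning (an assumption on my part, not from the linked docs) is to let AVFoundation drive the loop with AVQueuePlayer and AVPlayerLooper (macOS 10.12+) instead of the notification-based seek:

import AVKit
import AVFoundation

// Assumes playerView and secondPlayerItem from the snippets above.
// Keep a strong reference to the looper (e.g. in a property), or looping stops.
let queuePlayer = AVQueuePlayer()
let looper = AVPlayerLooper(player: queuePlayer, templateItem: secondPlayerItem)
playerView.player = queuePlayer
queuePlayer.play()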

In iOS, can an application with a huge demo-video file added to it impact RAM?

I am new to app development. I recorded a demo video of the application and attached it to the application project.
So now my application has grown from 8 MB to 38 MB (code size = 8 MB, video size = 30 MB).
My question is: when the application is running, will the entire 38 MB sit in RAM, or will the 30 MB video be loaded into RAM only while the demo video is playing?
And what is the maximum size Apple allows for an iOS application?
Thanks in advance... Here is how I play the video using MPMoviePlayerController:
var moviePlayer: MPMoviePlayerController?

@IBOutlet weak var playerView: UIView!

override func viewDidLoad() {
    super.viewDidLoad()
    playVideo()
}

func playVideo() {
    let videoPath = NSBundle.mainBundle().pathForResource("pjt", ofType: "mov")
    //Make a URL from your path
    let url = NSURL.fileURLWithPath(videoPath!)
    //Initialize the movie player
    moviePlayer = MPMoviePlayerController(contentURL: url)
    if let player = moviePlayer {
        player.view.frame = self.playerView.frame
        player.scalingMode = .AspectFit
        self.view.addSubview(player.view)
        //Play the video
        player.prepareToPlay()
    } else {
        println("Movie player failed")
    }
}