Custom camera in Xcode, Swift 3

So I have this issue where I am trying to create a custom camera in Xcode; however, for some reason I cannot get it to use the front camera. No matter what I change in the code, it only ever uses the back camera. I was hoping someone might be generous enough to take a quick look at my code below and see whether there is something I am missing or somewhere I went wrong. Any help would be very much appreciated. Thank you for your time.
func SelectInputDevice() {
    let devices = AVCaptureDevice.defaultDevice(withDeviceType: .builtInWideAngleCamera,
                                                mediaType: AVMediaTypeVideo, position: .front)
    if devices?.position == AVCaptureDevicePosition.front {
        print(devices?.position)
        frontCamera = devices
    }
    currentCameraDevice = frontCamera

    do {
        let captureDeviceInput = try AVCaptureDeviceInput(device: currentCameraDevice)
        captureSession.addInput(captureDeviceInput)
    } catch {
        print(error.localizedDescription)
    }
}
Here, frontCamera and currentCameraDevice are both of type AVCaptureDevice.

There seem to be a few things missing from your code:
1) To change input devices you need to reconfigure the session by calling session.beginConfiguration() before adding the new device and finishing with session.commitConfiguration(). All changes should also be made on a background queue (which hopefully you've created for the session) so that the UI isn't blocked while the session is configured.
2) It's safer to check that the session accepts the new device with session.canAddInput(captureDeviceInput) before adding it, and to remove the previous device (the back camera) first, since a front+back configuration isn't allowed.
3) It's also cleaner to check beforehand that your device has a working front camera (it might be broken), to prevent crashes.
The full code for switching the capture device to the front camera would look like this:
func switchCameraToFront() {
    // session & sessionQueue are references to the capture session and its dispatch queue
    sessionQueue.async { [unowned self] in
        let currentVideoInput = self.videoDeviceInput // ref to current video input as set up in the initial session config
        let preferredPosition: AVCaptureDevicePosition = .front
        let preferredDeviceType: AVCaptureDeviceType = .builtInWideAngleCamera
        let devices = self.videoDeviceDiscoverySession.devices!
        var newVideoDevice: AVCaptureDevice? = nil

        // First, look for a device with both the preferred position and device type.
        // Otherwise, look for a device with only the preferred position.
        if let device = devices.filter({ $0.position == preferredPosition && $0.deviceType == preferredDeviceType }).first {
            newVideoDevice = device
        } else if let device = devices.filter({ $0.position == preferredPosition }).first {
            newVideoDevice = device
        }

        if let videoDevice = newVideoDevice {
            do {
                let videoDeviceInput = try AVCaptureDeviceInput(device: videoDevice)
                self.session.beginConfiguration()
                // Remove the existing device input first, since using the front and back camera simultaneously is not supported.
                self.session.removeInput(currentVideoInput)
                if self.session.canAddInput(videoDeviceInput) {
                    self.session.addInput(videoDeviceInput)
                    self.videoDeviceInput = videoDeviceInput
                } else {
                    // fall back to the current device
                    self.session.addInput(self.videoDeviceInput)
                }
                self.session.commitConfiguration()
            } catch {
                print("Could not create video device input: \(error)")
            }
        }
    }
}

Related

Swift - Recorded Video is Mirrored on Front Camera - How to flip?

I'm trying to mirror the recorded video from a capture session. The video preview for front facing camera shows a mirrored version, however, when I go to save the file and play it back, the captured video is actually mirrored. I'm using Apple's AVCam demo as a reference and can't seem to figure this out! Please help.
I've tried creating an AVCaptureConnection and trying to set the .isVideoMirrored parameter. However, I get this error:
cannot be added to the session because the source and destination media types are incompatible'
I would have thought mirroring the video would be much easier. I think I may be creating my connection incorrectly; the code below doesn't actually add the connection when I call the .canAddConnection check.
var captureSession: AVCaptureSession!

override func viewDidAppear(_ animated: Bool) {
    super.viewDidAppear(animated)
    captureSession = AVCaptureSession()

    // Setup camera
    if let dualCameraDevice = AVCaptureDevice.default(.builtInDualCamera, for: .video, position: .front) {
        defaultVideoDevice = dualCameraDevice
    } else if let frontCameraDevice = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .front) {
        // If the front dual camera isn't available, default to the front wide angle camera.
        defaultVideoDevice = frontCameraDevice
    }

    guard let videoDevice = defaultVideoDevice else {
        print("Default video device is unavailable.")
        // setupResult = .configurationFailed
        captureSession.commitConfiguration()
        return
    }

    let videoDeviceInput = try! AVCaptureDeviceInput(device: videoDevice) // should really be wrapped in do/catch
    if captureSession.canAddInput(videoDeviceInput) {
        captureSession.addInput(videoDeviceInput)
    }

    let movieOutput = AVCaptureMovieFileOutput()
    // Video input ports for the AVCaptureConnection
    let videoInput: [AVCaptureInput.Port] = videoDeviceInput.ports

    if captureSession.canAddOutput(movieOutput) {
        captureSession.beginConfiguration()
        captureSession.addOutput(movieOutput)
        captureSession.sessionPreset = .medium
Then I try to set up the AVCaptureConnection and set the parameters for mirroring. Please tell me if there is an easier way to mirror the output / playback.
        avCaptureConnection = AVCaptureConnection(inputPorts: videoInput, output: movieOutput)
        avCaptureConnection.isEnabled = true

        // Mirror the capture connection?
        avCaptureConnection.automaticallyAdjustsVideoMirroring = false
        avCaptureConnection.isVideoMirrored = false

        // Check if we can add a connection
        if captureSession.canAddConnection(avCaptureConnection) {
            // Add the connection
            captureSession.addConnection(avCaptureConnection)
        }

        captureSession.commitConfiguration()
        self.movieOutput = movieOutput
        setupLivePreview()
    }
}
Somewhere else in the code, connected to an IBAction, I start the recording:
// Start recording video to a temporary file.
let outputFileName = NSUUID().uuidString
let outputFilePath = (NSTemporaryDirectory() as NSString).appendingPathComponent((outputFileName as NSString).appendingPathExtension("mov")!)
print("Recording in tap function")
movieOutput.startRecording(to: URL(fileURLWithPath: outputFilePath), recordingDelegate: self)
I think I'm using AVCaptureConnection incorrectly, especially given the error stating that the media types are incompatible. If there is a proper way to implement this, please let me know. I'm also open to suggestions for an easier way to mirror the playback. Thank you!
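One thing worth trying (a sketch, not a confirmed fix for this exact setup): when you call addOutput, the session automatically forms a connection between the camera input and the movie output, so instead of constructing an AVCaptureConnection by hand you can mirror that existing connection. The `movieOutput` name below is assumed to be the AVCaptureMovieFileOutput from the code above.

```swift
import AVFoundation

// After captureSession.addOutput(movieOutput), the session has already
// created a video connection for the output. Flipping that existing
// connection avoids the "source and destination media types are
// incompatible" error that a hand-built connection can produce.
func mirrorRecordedVideo(for movieOutput: AVCaptureMovieFileOutput) {
    guard let connection = movieOutput.connection(with: .video) else { return }
    if connection.isVideoMirroringSupported {
        connection.automaticallyAdjustsVideoMirroring = false
        connection.isVideoMirrored = true // record the same mirrored image the preview shows
    }
}
```

Call this after the output has been added and the session configuration committed; setting it before the connection exists would silently do nothing.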

How to access camera in Mac OS Mojave Swift playground

How do you access the camera in Swift playgrounds on macOS Mojave?
This used to work with no problem on High Sierra; however, the code no longer seems to execute correctly. The playground's running indicator spins continuously, and the camera light stays off.
Below is an example of code which worked fine last year but not any more today. Does anyone know what's wrong and how to fix it?
import Cocoa
import AVFoundation
import AVKit
import QuartzCore
import PlaygroundSupport
let view = NSView(frame: NSRect(x: 0.0, y: 0.0, width: 640.0, height: 480.0))

let session = AVCaptureSession()
session.sessionPreset = AVCaptureSession.Preset.vga640x480
session.beginConfiguration()
session.commitConfiguration()

var input: AVCaptureDeviceInput
if let devices: [AVCaptureDevice] = AVCaptureDevice.devices() as? [AVCaptureDevice] {
    for device in devices {
        if device.hasMediaType(AVMediaType.video) && device.supportsSessionPreset(AVCaptureSession.Preset.vga640x480) {
            do {
                input = try AVCaptureDeviceInput(device: device)
                if session.canAddInput(input) {
                    session.addInput(input)
                    break
                }
            } catch {
                print(error)
            }
        }
    }

    let output = AVCaptureVideoDataOutput()
    output.videoSettings = [kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_32BGRA)]
    output.alwaysDiscardsLateVideoFrames = true
    if session.canAddOutput(output) {
        session.addOutput(output)
    }

    let captureLayer = AVCaptureVideoPreviewLayer(session: session)
    view.wantsLayer = true
    view.layer = captureLayer

    session.startRunning()

    // View -> Assistant Editor -> Show Assistant Editor
    PlaygroundPage.current.liveView = view
}
I looked at Apple's documentation on how to request access to the camera and tried it; however, it does not seem to work in playgrounds, as I'm still having the same issue when this is implemented.
https://developer.apple.com/documentation/avfoundation/cameras_and_media_capture/requesting_authorization_for_media_capture_on_macos
switch AVCaptureDevice.authorizationStatus(for: .video) {
case .authorized: // The user has previously granted access to the camera.
    self.setupCaptureSession()
case .notDetermined: // The user has not yet been asked for camera access.
    AVCaptureDevice.requestAccess(for: .video) { granted in
        if granted {
            self.setupCaptureSession()
        }
    }
case .denied: // The user has previously denied access.
    return
case .restricted: // The user can't grant access due to restrictions.
    return
}

Refreshing AVCaptureSession...?

I am having some difficulty with AVCaptureSession when popping view controllers. I have a view controller in a navigation controller where a user takes a photo. After the photo is captured, I segue to a "preview photo" view controller. If the user doesn't like the photo, they can go back and retake it. When I pop the preview photo view controller, the app crashes with the error "Multiple audio/video AVCaptureInputs are not currently supported".
I thought that maybe I could remove/refresh the input session, but it's still crashing.
Any support/ advice is greatly appreciated!
segue:
@IBAction func cancelPressed(_ sender: UIButton) {
    _ = self.navigationController?.popViewController(animated: true)
}
camera config (which works fine):
func setupCaptureSessionCamera() {
    // this makes sure to get full res of camera
    captureSession.sessionPreset = AVCaptureSession.Preset.photo

    // query available devices
    let devices = AVCaptureDevice.devices(for: .video)
    for device in devices {
        if device.position == .front {
            frontFacingCamera = device
        } else if device.position == .back {
            backFacingCamera = device
        }
    } // end iteration

    // set a default device
    currentDevice = backFacingCamera

    // configure session w/ output for capturing still img
    stillImageOutput = AVCaptureStillImageOutput()
    stillImageOutput?.outputSettings = [AVVideoCodecKey: AVVideoCodecType.jpeg]

    do {
        let captureDeviceInput = try AVCaptureDeviceInput(device: currentDevice!)
        captureSession.addInput(captureDeviceInput)
        captureSession.addOutput(stillImageOutput!)

        // setup camera preview layer
        cameraPreviewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
        // add the preview to our specified view in the UI
        view.layer.addSublayer(cameraPreviewLayer!)
        cameraPreviewLayer?.videoGravity = AVLayerVideoGravity.resizeAspectFill
        cameraPreviewLayer?.frame = cameraView.frame

        captureSession.startRunning()
    } catch let error {
        print(error)
    } // end do
}
What I tried (removing the inputs in viewWillAppear if the sender is the preview photo controller):
func refreshCamera() {
    captureSession.beginConfiguration()
    for input in captureSession.inputs {
        captureSession.removeInput(input as! AVCaptureDeviceInput)
    }
    captureSession.commitConfiguration()
}
It was much simpler than I imagined. All that is needed is to first check whether there is already an input before calling the setupCaptureSessionCamera method:
if captureSession.inputs.isEmpty {
    setupCaptureSessionCamera()
}
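Putting that check into the view lifecycle, the re-entry path might look like the sketch below (the `captureSession` and `setupCaptureSessionCamera()` names are taken from the code above; the `isRunning` handling is an assumption about how the rest of the controller is set up):

```swift
override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)
    // Only build the inputs once; re-adding the same camera input on a
    // second appearance is what triggers "Multiple audio/video
    // AVCaptureInputs are not currently supported".
    if captureSession.inputs.isEmpty {
        setupCaptureSessionCamera()
    } else if !captureSession.isRunning {
        // Session already configured from the first visit; just resume it.
        captureSession.startRunning()
    }
}
```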

Ringer volume slider, swift

So my problem is that I'm creating an app which uses local notifications as an alarm. The only issue is that if the device has its ringer volume down or set to silent, it will not play any sound. I am aware that you are unable to change this from the background; however, is it possible to have a slider linked to the ringer volume (like a slider for MPVolume)? Also, is it possible to check whether the device is muted? Again, I want to point out that I am aware you cannot simply change it without the user knowing, but I was wondering whether the two things stated earlier are possible. Any help on the topic will be greatly appreciated.
It is not possible to detect whether an iPhone has its silence switch on, or to change the ringer volume (as opposed to the playback volume), as Apple does not provide access to them.
There used to be a workaround a while back using Objective-C, but I'm not quite sure it still works; you are welcome to try it if you want to:
AVSystemController
Also is it possible to check whether the device is muted?
This issue is really frustrating and I can't believe Apple makes it so hard :(
Checking the volume itself is rather simple:
let audioSession = AVAudioSession.sharedInstance()
let volume: Float = audioSession.outputVolume
If volume is 0.0, it's silent. But the real problem is the ringer switch.
My solution is to play a silent sound file (called "silence.mp3", which is 1.5 seconds long and completely silent) and check after 0.5 seconds whether it is still playing.
This is totally inspired by the SoundSwitch: https://github.com/moshegottlieb/SoundSwitch
In usage:
MyAudioPlayer.isLoudCheck { isLoud in
    print("%%% - device is loud = \(isLoud)")
}
I changed my audio player class to this (usually I just use it to play sound files):
import AVFoundation

class MyAudioPlayer: NSObject, AVAudioPlayerDelegate {

    private static let sharedPlayer: MyAudioPlayer = {
        return MyAudioPlayer()
    }()

    public var container = [String: AVAudioPlayer]()

    static func isLoudCheck(completionHandler: @escaping (Bool?) -> ()) {
        let name = "silence"
        let key = name
        var player: AVAudioPlayer?

        for (file, thePlayer) in sharedPlayer.container {
            if file == key {
                player = thePlayer
                break
            }
        }

        if player == nil, let resource = Bundle.main.path(forResource: name, ofType: "mp3") {
            do {
                player = try AVAudioPlayer(contentsOf: URL(fileURLWithPath: resource), fileTypeHint: AVFileTypeMPEGLayer3)
            } catch {
                print("%%% - audio error?")
            }
        }

        if let thePlayer = player {
            print("%%% - the player plays")
            thePlayer.delegate = sharedPlayer
            sharedPlayer.container[key] = thePlayer
            thePlayer.play()
        }

        DispatchQueue.main.asyncAfter(deadline: .now() + 0.5) {
            if let thePlayer = player {
                if thePlayer.isPlaying {
                    print("%%% - check: playing")
                    completionHandler(true)
                } else {
                    print("%%% - check: not playing")
                    completionHandler(false)
                }
            }
        }
    }

    static func isPlaying(key: String) -> Bool? {
        return sharedPlayer.container[key]?.isPlaying
    }
}
Hope this helps someone :)
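As for the slider half of the question: MPVolumeView from the MediaPlayer framework gives you a system volume slider, but note that it controls the playback (media) volume, not the ringer volume, for which there is no public API. A minimal sketch of embedding it (frame values are arbitrary):

```swift
import MediaPlayer
import UIKit

// Embeds the system volume slider in a view. This adjusts the playback
// volume route; the ringer/alert volume cannot be read or set publicly.
func addVolumeSlider(to view: UIView) {
    let volumeView = MPVolumeView(frame: CGRect(x: 20, y: 100,
                                                width: view.bounds.width - 40,
                                                height: 40))
    volumeView.showsRouteButton = false // hide the AirPlay picker, keep only the slider
    view.addSubview(volumeView)
}
```

Note that MPVolumeView renders as an empty placeholder in the simulator; test it on a real device.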

How to make phone calls in swift

First of all, I know there are some similar topics to this one, but because of my reputation I couldn't comment on those for help, and Stack Overflow warned me not to ask for help in the answers section; none of the similar posts have answered my question, so here I go.
As can be understood from the topic, I want to make a phone call on click.
I'm making an app for my business and I want to put in a call button so that people can call me from the app.
Here are the attempts I've tried, as read from the similar topics:
let phoneNumber = "1234567890"
if let phoneCallURL = NSURL(string: "tel:\(phoneNumber)") {
    let application = UIApplication.sharedApplication()
    if application.canOpenURL(phoneCallURL) {
        application.openURL(phoneCallURL)
    } else {
        print("failed")
    }
}
When I run the above code with a phone number, it prints the "failed" message to the console, so it seems to fail to open the URL.
The other code I've tried is very similar:
var url:NSURL = NSURL(string: "tel://phoneNumber")!
UIApplication.sharedApplication().openURL(url)
One other question: what is the correct syntax for the NSURL?
this
NSURL(string: "tel://\(phoneNumber)")
or this
NSURL(string: "tel//:\(phoneNumber)")
My last question is: if the app manages to make a call, does a calling screen appear in the simulator? I'm very new to Swift programming and I apologise if the questions seem stupid.
The simple format for a tel: URL is tel:######; / is not a number. You probably meant this to be:
NSURL(string: "tel:\(phoneNumber)")
assuming phoneNumber is a string containing the phone number.
The tel: scheme is defined in RFC2806. You can look there for details on all its expected features.
Note that phone calls are not possible in the simulator, so you'll receive an error if you try to open a tel: URL there (unless you handle the URL yourself by registering an NSURLProtocol).
If you want to return to your app after your call has been ended (which you should do) then you need to use telprompt:// instead of tel://. The tel:// will take you to the home screen after the call.
Better use this:
var url:NSURL = NSURL(string: "telprompt://1234567891")!
UIApplication.sharedApplication().openURL(url)
let phoneNumber = "0507712323"
if let aURL = NSURL(string: "telprompt://\(phoneNumber)") {
    if UIApplication.sharedApplication().canOpenURL(aURL) {
        UIApplication.sharedApplication().openURL(aURL)
    } else {
        print("error")
    }
} else {
    print("error")
}
I had this issue for different reasons, and I would like to share them with you.
First, don't try this in the simulator; you must try it on a real device.
Second, make sure the passed number doesn't contain spaces.
Here is an example:
private func callNumber(phoneNumber: String) {
    // I added this line to make sure the passed number is without spaces
    let cleanPhoneNumber = phoneNumber.stringByReplacingOccurrencesOfString(" ", withString: "")
    if let phoneCallURL = NSURL(string: "tel://\(cleanPhoneNumber)") {
        let application = UIApplication.sharedApplication()
        if application.canOpenURL(phoneCallURL) {
            application.openURL(phoneCallURL)
        }
    }
}
Your code looks correct. It will always fail if you test it in the simulator.
Try running it on your iPhone, and it will go to the dialer interface as you want.
I added some additional validation to the code:
import CoreTelephony // needed for CTTelephonyNetworkInfo

func makeCall(constactNumber: NSString) {
    if constactNumber.length == 0 {
        print("Contact number is not valid")
    } else {
        let cleanContactNumber = constactNumber.stringByReplacingOccurrencesOfString(" ", withString: "")
        if let phoneCallURL = NSURL(string: "tel://\(cleanContactNumber)") {
            if UIDevice.currentDevice().model.rangeOfString("iPad") != nil {
                print("Your device doesn't support this feature.")
            } else {
                let application = UIApplication.sharedApplication()
                if application.canOpenURL(phoneCallURL) {
                    let mobileNetworkCode = CTTelephonyNetworkInfo().subscriberCellularProvider?.mobileNetworkCode
                    if mobileNetworkCode == nil {
                        print("No SIM present, no cellular coverage, or phone is in airplane mode.")
                    } else {
                        application.openURL(phoneCallURL)
                    }
                }
            }
        }
    }
}
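For reference, the answers above use the Swift 2 API; on iOS 10 and later, openURL(_:) is deprecated in favor of open(_:options:completionHandler:). A sketch of the same call in Swift 3+ syntax (the function name and fallback message are illustrative):

```swift
import UIKit

// Swift 3+ / iOS 10+ version: strip spaces, use telprompt: so the user
// is returned to the app when the call ends, and fail gracefully on
// devices that cannot place calls (iPads, the simulator).
func call(phoneNumber: String) {
    let cleaned = phoneNumber.replacingOccurrences(of: " ", with: "")
    guard let url = URL(string: "telprompt:\(cleaned)"),
          UIApplication.shared.canOpenURL(url) else {
        print("Calling is not supported on this device")
        return
    }
    UIApplication.shared.open(url, options: [:], completionHandler: nil)
}
```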