Swift - Recorded Video is Mirrored on Front Camera - How to flip?

I'm trying to mirror the recorded video from a capture session. The video preview for the front-facing camera shows a mirrored image; however, when I go to save the file and play it back, the captured video is actually mirrored. I'm using Apple's AVCam demo as a reference and can't seem to figure this out! Please help.
I've tried creating an AVCaptureConnection and setting its .isVideoMirrored property. However, I get this error:
cannot be added to the session because the source and destination media types are incompatible'
I would have thought mirroring the video would be much easier. I think I may be creating my connection incorrectly. The code below doesn't actually add the connection, because the .canAddConnection check fails.
var captureSession: AVCaptureSession!

override func viewDidAppear(_ animated: Bool) {
    super.viewDidAppear(animated)
    captureSession = AVCaptureSession()

    // Set up the camera
    if let dualCameraDevice = AVCaptureDevice.default(.builtInDualCamera, for: .video, position: .front) {
        defaultVideoDevice = dualCameraDevice
    } else if let frontCameraDevice = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .front) {
        // If the dual camera isn't available, default to the front wide-angle camera.
        defaultVideoDevice = frontCameraDevice
    }

    guard let videoDevice = defaultVideoDevice else {
        print("Default video device is unavailable.")
        // setupResult = .configurationFailed
        captureSession.commitConfiguration()
        return
    }

    let videoDeviceInput = try AVCaptureDeviceInput(device: videoDevice)
    if captureSession.canAddInput(videoDeviceInput) {
        captureSession.addInput(videoDeviceInput)
    }

    let movieOutput = AVCaptureMovieFileOutput()
    // Video input ports for the AVCaptureConnection
    let videoInput: [AVCaptureInput.Port] = videoDeviceInput.ports

    if captureSession.canAddOutput(movieOutput) {
        captureSession.beginConfiguration()
        captureSession.addOutput(movieOutput)
        captureSession.sessionPreset = .medium
Then I try to set up the AVCaptureConnection and set the mirroring parameters. Please tell me if there is an easier way to mirror the output / playback.
        avCaptureConnection = AVCaptureConnection(inputPorts: videoInput, output: movieOutput)
        avCaptureConnection.isEnabled = true

        // Mirror the capture connection?
        avCaptureConnection.automaticallyAdjustsVideoMirroring = false
        avCaptureConnection.isVideoMirrored = false

        // Check if we can add a connection
        if captureSession.canAddConnection(avCaptureConnection) {
            // Add the connection
            captureSession.addConnection(avCaptureConnection)
        }

        captureSession.commitConfiguration()
        self.movieOutput = movieOutput
        setupLivePreview()
    }
}
Somewhere else in the code, connected to an IBAction, I initialize the recording:
// Start recording video to a temporary file.
let outputFileName = NSUUID().uuidString
let outputFilePath = (NSTemporaryDirectory() as NSString).appendingPathComponent((outputFileName as NSString).appendingPathExtension("mov")!)
print("Recording in tap function")
movieOutput.startRecording(to: URL(fileURLWithPath: outputFilePath), recordingDelegate: self)
I think I'm using AVCaptureConnection incorrectly, especially given the error stating the media types are incompatible. If there is a proper way to implement this, please let me know. I'm also open to suggestions for an easier way to mirror the playback. Thank you!
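For reference, one commonly suggested approach (a minimal sketch, not from the original post) is to skip the hand-built AVCaptureConnection entirely and instead configure the connection the session forms on its own when the movie output is added:

captureSession.beginConfiguration()
if captureSession.canAddOutput(movieOutput) {
    captureSession.addOutput(movieOutput)
}
// The session has now formed its own connection between the camera input and the movie output.
if let connection = movieOutput.connection(with: .video), connection.isVideoMirroringSupported {
    connection.automaticallyAdjustsVideoMirroring = false
    connection.isVideoMirrored = true // toggle to get the orientation you want in the saved file
}
captureSession.commitConfiguration()

This sketch assumes the captureSession, videoDeviceInput and movieOutput from the code above; because no second connection is ever added to the session, there is no addConnection call to fail.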

Related

Unexpectedly found nil while unwrapping an Optional value, while trying to play video

I am using the AVFoundation framework to play a local video.
I tried this code:
import UIKit
import AVFoundation

class ViewController: UIViewController {

    var player: AVPlayer?
    @IBOutlet weak var videoViewContainer: UIView!

    override func viewDidLoad() {
        super.viewDidLoad()
        initializeVideoPlayerWithVideo()
    }

    func initializeVideoPlayerWithVideo() {
        // get the path string for the video from assets
        let videoString: String? = Bundle.main.path(forResource: "Air Bike", ofType: "mov")
        guard let unwrappedVideoPath = videoString else { return }
        // convert the path string to a url
        let videoUrl = URL(fileURLWithPath: unwrappedVideoPath)
        // initialize the video player with the url
        self.player = AVPlayer(url: videoUrl)
        // create a video layer for the player
        let layer: AVPlayerLayer = AVPlayerLayer(player: player)
        // make the layer the same size as the container view
        layer.frame = videoViewContainer.bounds
        // make the video fill the layer as much as possible while keeping its aspect ratio
        layer.videoGravity = AVLayerVideoGravity.resizeAspectFill
        // add the layer to the container view
        videoViewContainer.layer.addSublayer(layer)
    }

    @IBAction func playVideoButtonTapped(_ sender: UIButton) {
        // play the video if the player is initialized
        player?.play()
    }
}
I've tried a few different approaches and I'm still getting the same error message. How can I resolve this issue?
func initializeVideoPlayerWithVideo() {
    guard let path = Bundle.main.path(forResource: "jagdeep", ofType: ".MOV") else {
        debugPrint("jagdeep.MOV not found")
        return
    }
    self.player = AVPlayer(url: URL(fileURLWithPath: path))
    // create a video layer for the player
    let layer: AVPlayerLayer = AVPlayerLayer(player: player)
    // make the layer the same size as the container view
    layer.frame = videoViewContainer.bounds
    // make the video fill the layer as much as possible while keeping its aspect ratio
    layer.videoGravity = AVLayerVideoGravity.resizeAspectFill
    // add the layer to the container view
    videoViewContainer.layer.addSublayer(layer)
    player?.play()
}
Here my video is named jagdeep with type .MOV, and it works on my side. The problem is in your code, where you use type mov. I just dragged a video named jagdeep.MOV into my project and used your code; it works fine.
Use
layer.frame = videoViewContainer.layer.bounds
instead of
layer.frame = videoViewContainer.bounds
Thank you for your response. It seems like the file cannot be found; here is a screenshot from my project. What could be the reason? The video is located in my main folder! Screenshot
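If the lookup still fails, a quick way to see what actually ships inside the app bundle is to list its contents at runtime (an illustrative snippet, not from the thread); if the file is missing from this list, the usual culprit is the file's Target Membership in Xcode:

// Print any .mov/.MOV files that were actually copied into the app bundle.
if let resourcePath = Bundle.main.resourcePath,
   let contents = try? FileManager.default.contentsOfDirectory(atPath: resourcePath) {
    print(contents.filter { $0.lowercased().hasSuffix(".mov") })
}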

Swift AVCaptureSession for barcode scanning is not working

I am trying to build a barcode scanner. I adapted some of it from this tutorial. The video capture session is working, but it is not detecting any barcode. I have gone through the code multiple times and still could not find what the problem might be. Here is my code for detecting the barcode:
class ScanController: UIViewController, AVCaptureMetadataOutputObjectsDelegate {

    var captureSession: AVCaptureSession?
    var videoPreviewLayer: AVCaptureVideoPreviewLayer?
    var qrCodeFrameView: UIView?

    let supportedCodeTypes = [AVMetadataObject.ObjectType.upce,
                              AVMetadataObject.ObjectType.code39,
                              AVMetadataObject.ObjectType.qr]

    override func viewDidLoad() {
        super.viewDidLoad()
        // Get an instance of the AVCaptureDevice class to initialize a device object, with video as the media type parameter.
        let captureDevice = AVCaptureDevice.default(for: AVMediaType.video)
        do {
            // Get an instance of the AVCaptureDeviceInput class using the previous device object.
            let input = try AVCaptureDeviceInput(device: captureDevice!)
            // Initialize the captureSession object.
            captureSession = AVCaptureSession()
            // Set the input device on the capture session.
            captureSession?.addInput(input)
            let captureMetadataOutput = AVCaptureMetadataOutput()
            captureSession?.addOutput(captureMetadataOutput)
            // Set the delegate and use the default dispatch queue to execute the callback.
            captureMetadataOutput.setMetadataObjectsDelegate(self, queue: DispatchQueue.main)
            captureMetadataOutput.metadataObjectTypes = supportedCodeTypes
            // Initialize the video preview layer and add it as a sublayer to the viewPreview view's layer.
            videoPreviewLayer = AVCaptureVideoPreviewLayer(session: captureSession!)
            videoPreviewLayer?.videoGravity = AVLayerVideoGravity.resizeAspectFill
            videoPreviewLayer?.frame = view.layer.bounds
            view.layer.addSublayer(videoPreviewLayer!)
            // Start video capture.
            captureSession?.startRunning()
            // Add the message label.
            self.view.addSubview(messageLabel)
            // Initialize the QR code frame to highlight the QR code.
            qrCodeFrameView = UIView()
            if let qrCodeFrameView = qrCodeFrameView {
                qrCodeFrameView.layer.borderColor = UIColor.green.cgColor
                qrCodeFrameView.layer.borderWidth = 2
                view.addSubview(qrCodeFrameView)
                view.bringSubview(toFront: qrCodeFrameView)
            }
        } catch {
            // If any error occurs, simply print it out and don't continue any more.
            print("THERE IS A PROBLEM WITH THE CAPTURE SESSION *****************")
            print(error)
            return
        }
    }
}
What am I missing?
Maybe you are missing the delegate method? In the tutorial, the delegate method is:
optional func metadataOutput(_ output: AVCaptureMetadataOutput, didOutput metadataObjects: [AVMetadataObject], from connection: AVCaptureConnection)
under the section Decoding the QR Code
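For completeness, a minimal sketch of that callback (assuming the videoPreviewLayer, qrCodeFrameView, messageLabel and supportedCodeTypes from the question; adapt to your setup):

func metadataOutput(_ output: AVCaptureMetadataOutput,
                    didOutput metadataObjects: [AVMetadataObject],
                    from connection: AVCaptureConnection) {
    // Nothing recognized in this frame: hide the highlight and reset the label.
    guard let codeObject = metadataObjects.first as? AVMetadataMachineReadableCodeObject,
          supportedCodeTypes.contains(codeObject.type) else {
        qrCodeFrameView?.frame = CGRect.zero
        messageLabel.text = "No barcode/QR code detected"
        return
    }
    // Convert the detected object's coordinates into the preview layer's coordinate space
    // so the highlight frame lines up with what is on screen.
    if let barcodeObject = videoPreviewLayer?.transformedMetadataObject(for: codeObject) {
        qrCodeFrameView?.frame = barcodeObject.bounds
    }
    messageLabel.text = codeObject.stringValue
}

Without an implementation of this method in ScanController, the session runs and the preview shows, but detection results are never delivered.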

Custom camera in Xcode, Swift 3

So I have this issue where I am trying to create a custom camera in Xcode; however, for some reason I cannot get it to use the front camera. No matter what I change in the code, it seems to only use the back camera. I was hoping someone might be generous enough to take a quick look at my code below and see whether there is something I am missing or somewhere I went wrong. Any help would be very much appreciated; thank you for your time.
func SelectInputDevice() {
    let devices = AVCaptureDevice.defaultDevice(withDeviceType: .builtInWideAngleCamera,
                                                mediaType: AVMediaTypeVideo, position: .front)
    if devices?.position == AVCaptureDevicePosition.front {
        print(devices?.position)
        frontCamera = devices
    }
    currentCameraDevice = frontCamera

    do {
        let captureDeviceInput = try AVCaptureDeviceInput(device: currentCameraDevice)
        captureSession.addInput(captureDeviceInput)
    } catch {
        print(error.localizedDescription)
    }
}
Here, frontCamera and currentCameraDevice are both AVCaptureDevices.
It seems there are a few things missing from your code:
1) In order to change input devices you need to reconfigure the session by calling session.beginConfiguration() before adding the new device and finishing with session.commitConfiguration(). All changes should also be made on the background queue (which hopefully you've created for the session) so that the UI isn't blocked while the session is configured.
2) The code would be safer if it checked that the session accepts the new device before adding it, via session.canAddInput(captureDeviceInput), and removed the previous device (the back camera), since a front+back configuration isn't allowed.
3) It would also be cleaner to check beforehand that your device has a working front camera (it might be broken), to prevent any crashes.
Full code for changing the capture device to the front camera would look like this:
func switchCameraToFront() {
    // session & sessionQueue are references to the capture session and its dispatch queue
    sessionQueue.async { [unowned self] in
        let currentVideoInput = self.videoDeviceInput // ref to current videoInput as set up in the initial session config
        let preferredPosition: AVCaptureDevicePosition = .front
        let preferredDeviceType: AVCaptureDeviceType = .builtInWideAngleCamera
        let devices = self.videoDeviceDiscoverySession.devices!
        var newVideoDevice: AVCaptureDevice? = nil

        // First, look for a device with both the preferred position and device type.
        // Otherwise, look for a device with only the preferred position.
        if let device = devices.filter({ $0.position == preferredPosition && $0.deviceType == preferredDeviceType }).first {
            newVideoDevice = device
        } else if let device = devices.filter({ $0.position == preferredPosition }).first {
            newVideoDevice = device
        }

        if let videoDevice = newVideoDevice {
            do {
                let videoDeviceInput = try AVCaptureDeviceInput(device: videoDevice)
                self.session.beginConfiguration()

                // Remove the existing device input first, since using the front and back camera simultaneously is not supported.
                self.session.removeInput(currentVideoInput)

                if self.session.canAddInput(videoDeviceInput) {
                    self.session.addInput(videoDeviceInput)
                    self.videoDeviceInput = videoDeviceInput
                } else {
                    // fall back to the current device
                    self.session.addInput(self.videoDeviceInput)
                }

                self.session.commitConfiguration()
            } catch {
                // handle or log the error here if needed
            }
        }
    }
}

Sound causing game to lag in Swift SpriteKit game?

New code
class SceneTwo: SKScene, SKPhysicsContactDelegate {
    let flap = SKAction.playSoundFileNamed("flap.caf", waitForCompletion: false)
    let whack = SKAction.playSoundFileNamed("whack.caf", waitForCompletion: false)
    let tap = SKAction.playSoundFileNamed("tap.caf", waitForCompletion: false)
Then I simply put
run(tap)
run(flap)
etc., where necessary.
Hi, just wondering if I am using the correct code to play sounds in my game. For some context, my game is similar to Flappy Bird. One sound is played each time the screen is touched (when the bird gets an upward impulse); the second sound plays when the bird collects a coin between walls.
I have noticed that both of these sounds are causing my game to lag.
Below is the relevant sound code for the game.
import AVFoundation

var flap: AVAudioPlayer?
var tap: AVAudioPlayer?

override func didMove(to view: SKView) {
    tap?.prepareToPlay()
    flap?.prepareToPlay()
}

func playFlap() {
    let url = Bundle.main.url(forResource: "flap", withExtension: "caf")!
    do {
        flap = try AVAudioPlayer(contentsOf: url)
        guard let flap = flap else { return }
        flap.play()
    } catch let error {
        print(error.localizedDescription)
    }
}

func playTap() {
    let url = Bundle.main.url(forResource: "tap", withExtension: "caf")!
    do {
        tap = try AVAudioPlayer(contentsOf: url)
        guard let tap = tap else { return }
        tap.play()
    } catch let error {
        print(error.localizedDescription)
    }
}
After this I simply call
playTap()
playFlap()
where they are needed.
The sound is clear; it just seems to make my spawning walls jump a little when the sound plays.
Is there something I am doing wrong?
You are getting lag because you are not preloading the sound files. You can preload them at app launch and then just play them when needed. For reference, look into this Stack Overflow post.
And if you still face the same issue, you can play the sound on a background queue, as demonstrated here:
let qualityOfServiceClass = QOS_CLASS_BACKGROUND
let backgroundQueue = dispatch_get_global_queue(qualityOfServiceClass, 0)
dispatch_async(backgroundQueue, {
    audioPlayer.play()
})
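That snippet uses Swift 2 GCD syntax; in Swift 3 and later the same idea (preload the player once, then trigger play() off the main thread) reads roughly like this (a minimal sketch with illustrative names, assuming import AVFoundation as in the question):

// Preload once, e.g. when the scene is set up, so the file isn't decoded on every tap.
let tapPlayer: AVAudioPlayer? = {
    guard let url = Bundle.main.url(forResource: "tap", withExtension: "caf") else { return nil }
    let player = try? AVAudioPlayer(contentsOf: url)
    player?.prepareToPlay()
    return player
}()

// Later, when the sound is needed:
DispatchQueue.global(qos: .background).async {
    tapPlayer?.play()
}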

Swift - Getting UIImages from Camera (AVCaptureSession)

Intro and background:
I have been working on a project for some time that lets the user do some custom manipulations on their camera's live feed.
At the moment, I start the capture session in the following way:
var session: AVCaptureSession?
var stillImageOutput: AVCaptureStillImageOutput?
var videoPreviewLayer: AVCaptureVideoPreviewLayer?

override func viewDidAppear(_ animated: Bool) {
    super.viewDidAppear(animated)
    videoPreviewLayer!.frame = CameraView.bounds
}

override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)
    session = AVCaptureSession()
    session!.sessionPreset = AVCaptureSessionPresetPhoto

    let backCamera = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeVideo)
    var error: NSError?
    var input: AVCaptureDeviceInput!
    do {
        input = try AVCaptureDeviceInput(device: backCamera)
    } catch let error1 as NSError {
        error = error1
        input = nil
    }

    if error == nil && session!.canAddInput(input) {
        session!.addInput(input)
        stillImageOutput = AVCaptureStillImageOutput()
        stillImageOutput?.outputSettings = [AVVideoCodecKey: AVVideoCodecJPEG]
        if session!.canAddOutput(stillImageOutput) {
            session!.addOutput(stillImageOutput)
            videoPreviewLayer = AVCaptureVideoPreviewLayer(session: session)
            videoPreviewLayer!.videoGravity = AVLayerVideoGravityResizeAspect
            videoPreviewLayer!.connection?.videoOrientation = AVCaptureVideoOrientation.portrait
            CameraView.layer.addSublayer(videoPreviewLayer!)
            session!.startRunning()
        }
    }
}
where CameraView is the UIView of my view controller. I now have a function called singleTapped() in which I want to get every frame of the capture, process it, and then put it into the CameraView frame (perhaps I should be using a UIImageView instead?)...
Research:
I have looked here and here, as well as many other places, for getting frames from the camera, yet these don't necessarily lead where I need. What's interesting is in the first link I provided: in their answer they have:
self.stillImageOutput.captureStillImageAsynchronouslyFromConnection(self.stillImageOutput.connectionWithMediaType(AVMediaTypeVideo)) { (buffer: CMSampleBuffer!, error: NSError!) -> Void in
    var image = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(buffer)
    var data_image = UIImage(data: image) // THEY EXTRACTED A UIIMAGE HERE
    self.imageView.image = data_image
}
which does indeed get a UIImage from the camera, but is this a viable method for 30fps?
Rationale and Constraints:
The reason I need a UIImage is that I am utilizing a library someone else wrote for transforming a UIImage in a custom way, quickly. I want to present this transformation to the user "live".
In conclusion
Please let me know if I am missing something, or if I should reword something. As said above, this is my first post, so I am not quite strong on SO nuances. Thanks, and cheers.
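For reference, the usual AVFoundation way to receive a continuous stream of frames (rather than repeated still captures) is AVCaptureVideoDataOutput with a sample-buffer delegate. A minimal sketch in current Swift, with illustrative names, not taken from the original post:

import AVFoundation
import CoreImage
import UIKit

// Illustrative helper: receives every frame from the session and hands back a UIImage.
final class FrameGrabber: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private let ciContext = CIContext()
    var onFrame: ((UIImage) -> Void)?

    func attach(to session: AVCaptureSession) {
        let output = AVCaptureVideoDataOutput()
        output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "camera.frames"))
        if session.canAddOutput(output) {
            session.addOutput(output)
        }
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Convert this frame's pixel buffer into a UIImage for further processing.
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
        guard let cgImage = ciContext.createCGImage(ciImage, from: ciImage.extent) else { return }
        let image = UIImage(cgImage: cgImage)
        DispatchQueue.main.async { self.onFrame?(image) }
    }
}

Creating a UIImage for every frame is still relatively expensive, so whether this holds 30 fps depends on how heavy the per-frame processing is.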
You should maybe reconsider using AVCaptureSession. For what you are doing (I assume), you should try using OpenCV. It's a great utility for image manipulation, especially if you are doing so at 30/60 fps (the actual frame rate after processing might, and I guarantee will, be less). Depending on what this manipulation you have been given is, you can easily port it over into Xcode using bridging headers or by converting everything entirely to C++ for use with OpenCV.
With OpenCV you can drive the camera with built-in functions, which can save you lots of processing time and therefore runtime. For example, take a look at this.
I have used OpenCV in situations similar to the one you just described, and I think you could benefit. Swift is nice, but sometimes certain things are handled better through other means...