Image capture is slower than the flash in iPhone photo capture using AVFoundation - iphone

We are using current Swift and the AVFoundation APIs to build a camera app with flash, reverse-camera, and photo-capture functionality. The code needs to support iOS 10 onwards.
The issue we are having is that the camera flash fires but doesn't get captured in the photo (the flash appears slightly before the photo is captured, or the capture is slightly slower than the flash), which makes our flash functionality useless.
Here is the code for our camera capture:
// Captures the image from the camera session; called from the view controller's OnCapture outlet action.
func capture() throws {
    guard captureSession.isRunning else {
        throw CameraRuntimeError.captureSessionIsMissing
    }
    let settings = AVCapturePhotoSettings()
    if getCurrentCamera().isFlashAvailable {
        settings.flashMode = self.flashMode
    }
    self.photoOutput?.capturePhoto(with: settings, delegate: self)
}
And here is the delegate:
extension CameraFunctions: AVCapturePhotoCaptureDelegate {
    private static let failedToConvertToJPEGErrorCode = "JPEGERROR"
    private static let failedToCaptureImage = "CAMERROR"

    public func photoOutput(_ captureOutput: AVCapturePhotoOutput,
                            didFinishProcessingPhoto photoSampleBuffer: CMSampleBuffer?,
                            previewPhoto previewPhotoSampleBuffer: CMSampleBuffer?,
                            resolvedSettings: AVCaptureResolvedPhotoSettings,
                            bracketSettings: AVCaptureBracketedStillImageSettings?,
                            error: Swift.Error?) {
        if error != nil {
            onPhotoCaptured(StringResult(error: ServicesError(CameraFunctions.failedToCaptureImage, error!.localizedDescription)))
        }
        if let buffer = photoSampleBuffer, let data = AVCapturePhotoOutput.jpegPhotoDataRepresentation(forJPEGSampleBuffer: buffer, previewPhotoSampleBuffer: nil) {
            let encodedString = // DO ENCODING OF THE PHOTO
            onPhotoCaptured(encodedString)
        } else {
            onPhotoCaptured(StringResult(error: ServicesError(CameraFunctions.failedToConvertToJPEGErrorCode, CameraRuntimeError.failedToConvertImageToJPEG.localizedDescription)))
        }
        closeCaptureSession()
    }
}
onPhotoCaptured is defined in a ViewController.
Please let us know if we are doing something wrong.

Setting the prepared settings on the photo output solved this issue:
func capture(_ delegate: AVCapturePhotoCaptureDelegate, _ onError: @escaping (Error) -> Void) throws {
    guard captureSession.isRunning else {
        throw CameraRuntimeError.captureSessionIsMissing
    }
    let settings: AVCapturePhotoSettings
    if #available(iOS 11.0, *) {
        settings = AVCapturePhotoSettings(format: [AVVideoCodecKey: AVVideoCodecType.jpeg])
        settings.isAutoStillImageStabilizationEnabled = true
    } else {
        settings = AVCapturePhotoSettings(format: [AVVideoCodecKey: AVVideoCodecJPEG])
    }
    if getCurrentCamera().isFlashAvailable {
        settings.flashMode = self.flashMode
    }
    // This statement did the magic
    self.photoOutput?.setPreparedPhotoSettingsArray([settings]) { (suc: Bool, err: Error?) -> Void in
        if suc {
            self.photoOutput?.capturePhoto(with: settings, delegate: delegate)
        }
        if err != nil {
            onError(err!)
        }
    }
}
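Unrelated to the latency fix, note that the sample-buffer didFinishProcessingPhoto callback used in the question is deprecated as of iOS 11. A minimal sketch of the AVCapturePhoto-based replacement, assuming the same onPhotoCaptured and closeCaptureSession plumbing from the question:
extension CameraFunctions {
    // iOS 11+ replacement for the deprecated sample-buffer callback.
    @available(iOS 11.0, *)
    public func photoOutput(_ output: AVCapturePhotoOutput,
                            didFinishProcessingPhoto photo: AVCapturePhoto,
                            error: Error?) {
        if let error = error {
            onPhotoCaptured(StringResult(error: ServicesError(CameraFunctions.failedToCaptureImage, error.localizedDescription)))
            closeCaptureSession()
            return
        }
        // fileDataRepresentation() replaces jpegPhotoDataRepresentation(forJPEGSampleBuffer:previewPhotoSampleBuffer:).
        if let data = photo.fileDataRepresentation() {
            _ = data // encode the data and pass it to onPhotoCaptured, as in the original delegate
        } else {
            onPhotoCaptured(StringResult(error: ServicesError(CameraFunctions.failedToConvertToJPEGErrorCode, CameraRuntimeError.failedToConvertImageToJPEG.localizedDescription)))
        }
        closeCaptureSession()
    }
}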

Related

func paste(itemProviders: [NSItemProvider])

I use the "drag" and "paste" functions of multiple imagesViews to multiple imageViews in the same application. As the function "drag" allows to know the imageView of origin, does the function "drop" makes it possible to know the imageView "target" at the end of the "drag" (coordinates? Tag? ...). Thank you for any suggestions.
// Right here, itemsForBeginning session: UIDragSession allows retrieving the imageView of origin.
func dragInteraction(_ interaction: UIDragInteraction, itemsForBeginning session: UIDragSession) -> [UIDragItem] {
    let touchPoint = session.location(in: self.view)
    print("\(touchPoint.x)")
    print("\(touchPoint.y)")
    //...
    guard let image = viewTaped!.image else { return [] }
    let item = UIDragItem(itemProvider: NSItemProvider(object: image))
    return [item]
}
// But with the "paste" function, Swift knows the target; how can I recover the information of imageView_yyy (coordinates? tag? ...)?
override func paste(itemProviders: [NSItemProvider]) {
    _ = itemProviders.first?.loadObject(ofClass: UIImage.self, completionHandler: { (image: NSItemProviderReading?, error: Error?) in
        DispatchQueue.main.async {
            self.imageView_yyy.image = image as? UIImage
        }
    })
}
You need to retain the Progress returned by the loading call. By assigning it to _ you are telling Swift that it's trash that can be discarded immediately.
// This retains the progress
var pasteProgress: Progress?

override func paste(itemProviders: [NSItemProvider]) {
    pasteProgress = itemProviders.first?.loadObject(ofClass: UIImage.self, completionHandler: { (image: NSItemProviderReading?, error: Error?) in
        DispatchQueue.main.async {
            self.imageView_yyy.image = image as? UIImage
        }
    })
}
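Regarding the original question of finding the target at the end of a drag: one option (a sketch, not from the answer above) is to adopt UIDropInteraction instead of the paste configuration, since the drop delegate receives the session and you can hit-test its location. Here, MyViewController and the imageViews property are hypothetical stand-ins, and the image views are assumed to be direct subviews of view:
import UIKit

extension MyViewController: UIDropInteractionDelegate {
    // Attach with: view.addInteraction(UIDropInteraction(delegate: self))
    func dropInteraction(_ interaction: UIDropInteraction, canHandle session: UIDropSession) -> Bool {
        return session.canLoadObjects(ofClass: UIImage.self)
    }

    func dropInteraction(_ interaction: UIDropInteraction, sessionDidUpdate session: UIDropSession) -> UIDropProposal {
        return UIDropProposal(operation: .copy)
    }

    func dropInteraction(_ interaction: UIDropInteraction, performDrop session: UIDropSession) {
        // The drop session carries the final location, so hit-test it
        // against the candidate image views to find the target.
        let point = session.location(in: view)
        guard let target = imageViews.first(where: { $0.frame.contains(point) }) else { return }
        // loadObjects(ofClass:) calls its completion on the main queue.
        _ = session.loadObjects(ofClass: UIImage.self) { objects in
            if let image = objects.first as? UIImage {
                target.image = image
            }
        }
    }
}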

ReplayKit: RPScreenRecorder.shared().startCapture() NOT WORKING

ReplayKit has really been frustrating me recently. For some reason
RPScreenRecorder.shared().startCapture(handler: { (sample, bufferType, error) in
does not actually work when I call it: I have a print() statement inside the handler, and it is never called.
My code in the ViewController is:
import UIKit
import AVFoundation
import SpriteKit
import ReplayKit
import AVKit

class ViewController: UIViewController, AVCaptureVideoDataOutputSampleBufferDelegate, UIImagePickerControllerDelegate, UINavigationControllerDelegate, RPPreviewViewControllerDelegate {
    var assetWriter: AVAssetWriter!
    var videoInput: AVAssetWriterInput!

    func startRecording(withFileName fileName: String) {
        if #available(iOS 11.0, *) {
            // Assumed: fileURL built from fileName; its construction was omitted in the original post.
            let fileURL = FileManager.default.temporaryDirectory.appendingPathComponent(fileName)
            assetWriter = try! AVAssetWriter(outputURL: fileURL, fileType: AVFileType.mp4)
            let videoOutputSettings: [String: Any] = [
                AVVideoCodecKey: AVVideoCodecType.h264,
                AVVideoWidthKey: UIScreen.main.bounds.size.width,
                AVVideoHeightKey: UIScreen.main.bounds.size.height
            ]
            videoInput = AVAssetWriterInput(mediaType: AVMediaType.video, outputSettings: videoOutputSettings)
            videoInput.expectsMediaDataInRealTime = true
            assetWriter.add(videoInput)
            print("HERE")
            RPScreenRecorder.shared().startCapture(handler: { (sample, bufferType, error) in
                print("RECORDING")
            }, completionHandler: nil)
        }
    }
    func stopRecording(handler: @escaping (Error?) -> Void) {
        if #available(iOS 11.0, *) {
            RPScreenRecorder.shared().stopCapture { (error) in
                handler(error)
                self.assetWriter.finishWriting {
                    print("STOPPED")
                }
            }
        }
    }
}
"HERE" is printed, but not "RECORDING"
[p.s. sorry for bad formatting in code, I'm sure you'll understand :)]
I have also tried a different method:
let recorder = RPScreenRecorder.shared()
recorder.startRecording { [unowned self] (error) in
    guard error == nil else {
        print("There was an error starting the recording.")
        return
    }
    print("Started Recording Successfully")
}
and to stop the recording...
recorder.stopRecording { [unowned self] (preview, error) in
    print("Stopped recording")
    guard preview != nil else {
        print("Preview controller is not available.")
        return
    }
    onGoingScene = true
    preview?.previewControllerDelegate = self
    self.present(preview!, animated: true, completion: nil)
}
This method does not stop when I call the recorder.stopRecording() function; "Stopped recording" is never printed.
Can someone please help me, because this is really frustrating me: how can you PROPERLY use ReplayKit to record your screen in iOS 11? I have searched all over the internet and none of the methods work for me, and I don't know why. P.S. I have the necessary permission keys in my Info.plist.
Thanks
A huge reminder that ReplayKit doesn't work in the simulator. I wasted hours on the exact same issue until I realized that ReplayKit will never trigger the startCapture handler, because it never records in the simulator.
Well, there are quite a few possible causes for this issue.
Some of them are here:
The shared ReplayKit recorder daemon might have crashed; restart your device and check again.
There might be an error that ReplayKit isn't surfacing. Conform to the RPScreenRecorderDelegate protocol, add the
screenRecorder:didStopRecordingWithPreviewViewController:error:
method to your class, and check whether any error shows up in that method.
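A sketch of that delegate hookup, plus an availability check (my addition; startCapture never fires its handlers when the recorder is unavailable, e.g. in the simulator, as noted above):
import UIKit
import ReplayKit

class RecordingViewController: UIViewController, RPScreenRecorderDelegate {
    func startIfPossible() {
        let recorder = RPScreenRecorder.shared()
        recorder.delegate = self
        // False in the simulator, while mirroring, or under parental controls.
        guard recorder.isAvailable else {
            print("ReplayKit is unavailable on this device/simulator")
            return
        }
        if #available(iOS 11.0, *) {
            recorder.startCapture(handler: { (sample, bufferType, error) in
                print("RECORDING")
            }, completionHandler: { error in
                if let error = error {
                    print("startCapture failed: \(error.localizedDescription)")
                }
            })
        }
    }

    // Fires when recording stops, e.g. due to an interruption; check `error` here.
    func screenRecorder(_ screenRecorder: RPScreenRecorder,
                        didStopRecordingWith previewViewController: RPPreviewViewController?,
                        error: Error?) {
        if let error = error {
            print("Recording stopped with error: \(error.localizedDescription)")
        }
    }

    func screenRecorderDidChangeAvailability(_ screenRecorder: RPScreenRecorder) {
        print("ReplayKit availability changed: \(screenRecorder.isAvailable)")
    }
}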

Alamofire background working on simulator but not on device

As the title mentions, I've set up a background URL session with Alamofire. It works like a charm in the simulator, but not on my device. I'm sure I'm missing something here, since I'm not that experienced with URL sessions.
Here's the code I have so far:
class NetworkManager {
    static let shared = NetworkManager()

    private lazy var backgroundManager: Alamofire.SessionManager = {
        let bundleIdentifier = MyStruct.identifier
        return Alamofire.SessionManager(configuration: URLSessionConfiguration.background(withIdentifier: bundleIdentifier))
    }()

    var backgroundCompletionHandler: (() -> Void)? {
        get {
            return backgroundManager.backgroundCompletionHandler
        }
        set {
            backgroundManager.backgroundCompletionHandler = newValue
        }
    }
}
func application(_ application: UIApplication, handleEventsForBackgroundURLSession identifier: String, completionHandler: @escaping () -> Void) {
    NetworkManager.shared.backgroundCompletionHandler = completionHandler
}
In my ViewController:
func populateArrays() {
    Alamofire.request("http://www.aps.anl.gov/Accelerator_Systems_Division/Accelerator_Operations_Physics/sddsStatus/mainStatus.sdds.gz").responseData { response in
        switch response.result {
        case .success:
            print("Validation Successful")
        case .failure(let error):
            print(error.localizedDescription)
        }
        if let data = response.result.value {
            // ... (parsing elided in the original post)
        }
    }
}
Solved it. For anyone else that has this problem, you need to add the following code to your AppDelegate.
func applicationDidEnterBackground(_ application: UIApplication) {
    var bgTask = 0
    let app = UIApplication.shared
    bgTask = app.beginBackgroundTask(expirationHandler: { () -> Void in
        app.endBackgroundTask(bgTask)
    })
}
It seems to me that you are not using the background manager you've created. Instead of
Alamofire.request("http://www.aps.anl.gov...")
which calls the default (non-background) session manager, you should use:
backgroundManager.request("http://www.aps.anl.gov...")
which Jon Shier mentioned in the comments, by the way.
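A sketch of what that can look like end to end, using the Alamofire 4 API from the question (the request wrapper on NetworkManager is my addition, not part of the original code):
import Alamofire

class NetworkManager {
    static let shared = NetworkManager()

    private lazy var backgroundManager: Alamofire.SessionManager = {
        return Alamofire.SessionManager(configuration: URLSessionConfiguration.background(withIdentifier: MyStruct.identifier))
    }()

    // Route requests through the background session so callers don't
    // silently fall back to Alamofire's default session manager.
    func request(_ url: String) -> DataRequest {
        return backgroundManager.request(url)
    }
}

// In the view controller:
NetworkManager.shared.request("http://www.aps.anl.gov/...").responseData { response in
    // handle response as before
}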

Debugging advice for WatchOS2

I've been going through the examples in the WatchOS 2 By Tutorial book by the team over at RayWenderlich, specifically chapter 18. They all work fine. In my own app, I am trying to send a button press from the Watch to fire a button in the iPhone app. Here's the relevant Swift code from the Watch and the phone:
Watch:
//
// InterfaceController.swift
// Wasted Time Extension
//
// Created by Michael Rowe on 7/21/15.
// Copyright © 2010-2015 Michael Rowe. All rights reserved.
//
import WatchKit
import WatchConnectivity
import Foundation
class InterfaceController: WKInterfaceController, WCSessionDelegate {
    @IBOutlet var wasteLabel: WKInterfaceLabel!
    @IBOutlet var costLabel: WKInterfaceLabel!
    @IBOutlet var counter: WKInterfaceLabel!
    @IBOutlet var statusButton: WKInterfaceButton!
    // our WatchConnectivity session
    var session: WCSession?

    override func awakeWithContext(context: AnyObject?) {
        super.awakeWithContext(context)
    }

    override func willActivate() {
        // This method is called when the watch view controller is about to be visible to the user
        super.willActivate()
        if WCSession.isSupported() {
            session = WCSession.defaultSession()
            session!.delegate = self
            session!.activateSession()
        }
    }

    override func didDeactivate() {
        // This method is called when the watch view controller is no longer visible
        super.didDeactivate()
    }

    func session(session: WCSession, didReceiveMessage message: [String: AnyObject], replyHandler: ([String: AnyObject]) -> Void) {
        print("Did receive message Watch \(message)")
    }

    @IBAction func addButtonPressed() {
        // Pull values from the phone for current meeting cost, waste costs, and people in meeting
        let prefs: NSUserDefaults = NSUserDefaults(suiteName: "a.b.c")!
        var counterd = prefs.doubleForKey("keyPeopleInMeeting")
        counterd++
        counter.setText(String(format: "%9.0f", counterd))
        // Sending data to the iPhone via interactive messaging
        if WCSession.isSupported() {
            // we have a watch-supporting iPhone
            let session = WCSession.defaultSession()
            // we can reach the phone
            if session.reachable {
                let message = ["add": "1"]
                print("Message \(message)")
                session.transferUserInfo(message)
                print("Send Message Add - People \(counterd)")
            }
        }
        if WCSession.isSupported() {
            let session = WCSession.defaultSession()
            if session.reachable {
                let message = ["add": "1"]
                session.sendMessage(message, replyHandler: { (reply: [String: AnyObject]) -> Void in
                    print("Reply: \(reply)")
                }, errorHandler: { (error: NSError) -> Void in
                    print("ERROR Watch: \(error.localizedDescription)")
                })
            } else { // not reachable
                self.showReachabilityError()
            }
        }
        print("Watch Add Button Pressed \(counterd)")
    }

    @IBAction func minusButtonPressed() {
        // Pull values from the phone for current meeting cost, waste costs, and people in meeting
        let prefs: NSUserDefaults = NSUserDefaults(suiteName: "a.b.c")!
        var counterd = prefs.doubleForKey("keyPeopleInMeeting")
        counterd--
        if counterd <= 1 {
            counterd = 1
        }
        counter.setText(String(format: "%9.0f", counterd))
        if WCSession.isSupported() {
            let session = WCSession.defaultSession()
            if session.reachable {
                let message = ["minus": "1"]
                session.sendMessage(message, replyHandler: { (reply: [String: AnyObject]) -> Void in
                    print("Reply: \(reply)")
                }, errorHandler: { (error: NSError) -> Void in
                    print("ERROR Watch: \(error.localizedDescription)")
                })
            } else { // not reachable
                self.showReachabilityError()
            }
        }
        print("Watch Minus Button Pressed \(counterd)")
    }

    func statusButtonPressed() {
        // Pull values from the phone for current meeting cost, waste costs, and people in meeting
        let prefs: NSUserDefaults = NSUserDefaults(suiteName: "a.b.c")!
        let status = statusButton.description
        if WCSession.isSupported() {
            let session = WCSession.defaultSession()
            if session.reachable {
                let message = ["status": status]
                session.sendMessage(message, replyHandler: { (reply: [String: AnyObject]) -> Void in
                    print("Reply: \(reply)")
                }, errorHandler: { (error: NSError) -> Void in
                    print("ERROR Watch: \(error.localizedDescription)")
                })
            } else { // not reachable
                self.showReachabilityError()
            }
        }
        print("Watch Status Button Pressed - Status \(statusButton)")
    }

    func session(session: WCSession, didReceiveApplicationContext applicationContext: [String: AnyObject]) {
        let prefs: NSUserDefaults = NSUserDefaults(suiteName: "a.b.c")!
        if let waste = applicationContext["waste"] as? Float {
            print("Watch Receive - Waste \(waste)")
        }
        if let cost = applicationContext["cost"] as? Float {
            print("Watch Receive - Cost \(cost)")
        }
        if let counternum = applicationContext["counter"] as? Float {
            print("Watch Receive - Counter \(counternum)")
        }
        if let status = applicationContext["status"] as? String {
            print("Watch Receive - Status \(status)")
            statusButton.setTitle(status)
        }
    }

    private func showReachabilityError() {
        let tryAgain = WKAlertAction(title: "Try Again", style: .Default, handler: { () -> Void in })
        let cancel = WKAlertAction(title: "Cancel", style: .Cancel, handler: { () -> Void in })
        self.presentAlertControllerWithTitle("Your iPhone is not reachable.", message: "You cannot adjust the status or number of attendees because your Watch is not currently connected to your iPhone. Please ensure your iPhone is on and within range of your Watch.", preferredStyle: WKAlertControllerStyle.Alert, actions: [tryAgain, cancel])
    }

    func session(session: WCSession, didFinishUserInfoTransfer userInfoTransfer: WCSessionUserInfoTransfer, error: NSError?) {
        print("Transfer User Info Error watch: \(error)")
    }
}
And the receiving code on the iPhone:
func session(session: WCSession,
             didReceiveMessage message: [String: AnyObject],
             replyHandler: ([String: AnyObject]) -> Void) {
    if let counterd = message["add"] as? Float {
        let reply = ["add": counterd]
        print("iPhone Receive Add \(counterd)")
        addButtonPressed(self)
        replyHandler(reply)
    }
    if let counterd = message["minus"] as? Float {
        let reply = ["minus": counterd]
        print("iPhone Receive minus \(counterd)")
        removeButtonPressed(self)
        replyHandler(reply)
    }
    if let status = message["status"] as? String {
        if status == "Start" {
            let reply = ["status": "Quorum"]
            meetingStartedButtonPressed(self)
            replyHandler(reply)
        }
        if status == "Quorum" {
            let reply = ["status": "Finish"]
            quorumButtonPressed(self)
            replyHandler(reply)
        }
        if status == "Finish" {
            let reply = ["status": "Reset"]
            meetingEndedButtonPressed(self)
            replyHandler(reply)
        }
        if status == "Reset" {
            let reply = ["status": "Start"]
            resetButtonPressed(self)
            replyHandler(reply)
        }
        print("iPhone Received Status Button \(status)")
    }
}
I get the messages firing fine on the Watch and see them in the debug log... But they do not seem to fire on the Phone. The phone is successfully sending its messages to the watch.
I have tested this code both in the simulator and on my own Watch and iPhone. Note that the messages from the iPhone to the Watch are sent via updateApplicationContext, whereas I am using sendMessage to send messages from the Watch to the iPhone. Here's a sample of the iPhone code for sending context:
if WCSession.isSupported() {
    if session.watchAppInstalled {
        let UserInfo = ["waste": Float((wastedAmount.text! as NSString).floatValue), "cost": Float((totalAmount.text! as NSString).floatValue), "counter": Float((peopleInMeeting.text! as NSString).floatValue), "status": "Start"]
        do {
            try session.updateApplicationContext(UserInfo as! [String: AnyObject])
        } catch {
            print("Updating the context failed: \(error)")
        }
    }
}
More information is needed regarding what specifically you're seeing on the Watch, when you say:
I get the messages firing fine on the Watch and see them in the debug log... But they do not seem to fire on the Phone. The phone is successfully sending its messages to the watch.
However, one common occurrence is that the iPhone code is actually working correctly, and the only thing you are not seeing is the debug statements printed to the console. This seems likely to be the case since you say you are seeing the expected Watch messages, presumably including those from print("Reply: \(reply)"). This indicates the message is being handled by the iPhone.
When that's the case, it's often simply that you are expecting to see debug console messages from both the Watch and iOS simulator processes at the same time, but in fact you're only connected to one or the other. There are (at least) two things you can do here:
Run the WatchKit app from Xcode, but then change to attach to the iPhone process instead. In Xcode, go Debug > Attach to Process... and select the iPhone app under "Likely Targets".
Start by running the iPhone app, which will mean you are already attached to that process. In the Apple Watch simulator, run the Watch app. You'll then be able to debug the iPhone side of the communication.
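Another way to confirm the iPhone side is running, without attaching to both processes (a sketch, not from the original answer): have the iPhone embed something observable in its reply, so the Watch's print("Reply: \(reply)") line alone proves the handler ran. Note the Watch sends ["add": "1"], a String value, so the cast below is to String:
// iPhone side (Swift 2 era, matching the question's code)
func session(session: WCSession,
             didReceiveMessage message: [String: AnyObject],
             replyHandler: ([String: AnyObject]) -> Void) {
    if let counterd = message["add"] as? String {
        // The timestamp comes back in the Watch's "Reply: ..." log line,
        // proving this handler executed on the phone.
        replyHandler(["add": counterd, "handledAt": NSDate().description])
    }
}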
To debug the watchOS side while running the iPhone app (and vice versa) in Xcode 8.1, the running process needs to be attached, as described above.

AVAudioRecorder delegate not assigned in Swift

I decided to rewrite my audio recorder class from Objective-C to Swift.
In Objective-C recording works, but in Swift the AVAudioRecorderDelegate methods are not called, even though the recorder starts successfully.
How can I fix this?
class VoiceRecorder: NSObject, AVAudioPlayerDelegate, AVAudioRecorderDelegate {
    var audioRecorder: AVAudioRecorder!
    var audioPlayer: AVAudioPlayer?

    override init() {
        super.init()
        var error: NSError?
        let audioRecordingURL = self.audioRecordingPath()
        audioRecorder = AVAudioRecorder(URL: audioRecordingURL,
            settings: self.audioRecordingSettings(),
            error: &error)
        audioRecorder.meteringEnabled = true
        /* Prepare the recorder and then start the recording */
        audioRecorder.delegate = self
        if audioRecorder.prepareToRecord() {
            println("Successfully prepared for record.")
        }
    }

    func audioRecorderDidFinishRecording(recorder: AVAudioRecorder!, successfully flag: Bool) {
        println("stop")
        if flag {
            println("Successfully stopped the audio recording process")
            if completionHandler != nil {
                completionHandler(success: flag)
            }
        } else {
            println("Stopping the audio recording failed")
        }
    }
}

func record() {
    audioRecorder.record()
}

// UPD.
func stop(#completion: StopCompletionHandler) {
    self.completionHandler = completion
    self.audioRecorder?.stop
}
The problem is this line:
self.audioRecorder?.stop
That is not a call to the stop method. It merely mentions the name of the method. You want to say this:
self.audioRecorder?.stop()
Those parentheses make all the difference.
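For completeness, a sketch of the corrected method in the question's Swift 1.x style; once stop() actually executes, the recorder finishes and audioRecorderDidFinishRecording fires, which in turn invokes the stored completion handler:
func stop(#completion: StopCompletionHandler) {
    self.completionHandler = completion
    // The parentheses perform the call; without them the line merely
    // references the method and does nothing, so the delegate never fires.
    self.audioRecorder?.stop()
}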