torchLevel not setting properly on iPhone XS Max - Swift

I made this app, dozing, that helps you meditate by adjusting the torch on your phone. It goes from a low-level brightness to a high-level brightness over a certain time period.
It worked great on the iPhone 6, 7, 8, and X. I just got the XS Max, and for whatever reason, instead of the brightness adjusting, the torch just stays on at maximum brightness.
It still works on the iPhone 7 with iOS 12. The weird thing is, it SOMETIMES works on the XS Max; I just can't figure out what causes it to adjust properly and what causes it to get stuck at maximum brightness.
func updateTorch() {
    guard let device = AVCaptureDevice.default(for: AVMediaType.video) else {
        return
    }
    if device.hasTorch && device.isTorchAvailable {
        do {
            try device.lockForConfiguration()
            if torchMode == 0 {
                device.torchMode = .off
            } else {
                try device.setTorchModeOn(level: torchMode) // HERE
            }
            device.unlockForConfiguration()
        } catch {
            print("Torch is not working.")
        }
    } else {
        print("Torch not compatible with device.")
    }
}
That's my main method that updates the torch. If I print torchMode where I marked "// HERE", it shows an adjusting Float value between 0 and 1. Also, no errors are thrown from the setTorchModeOn(level:) method.
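One thing worth trying while debugging (my assumption, not a confirmed XS Max fix): clamp the level into the documented (0, 1] range via AVCaptureDevice.maxAvailableTorchLevel and skip redundant hardware calls, so the log shows exactly which values actually reach setTorchModeOn(level:). A minimal sketch:

import AVFoundation

// Hypothetical hardened variant of updateTorch() for debugging on the XS Max.
private var lastAppliedLevel: Float = -1

func updateTorch(to requestedLevel: Float) {
    guard let device = AVCaptureDevice.default(for: .video),
        device.hasTorch, device.isTorchAvailable else { return }
    // setTorchModeOn(level:) only accepts values in (0, 1]; clamp defensively.
    let level = min(max(requestedLevel, 0), AVCaptureDevice.maxAvailableTorchLevel)
    guard level != lastAppliedLevel else { return } // skip no-op updates
    do {
        try device.lockForConfiguration()
        if level == 0 {
            device.torchMode = .off
        } else {
            try device.setTorchModeOn(level: level)
            print("applied torch level \(level)")
        }
        device.unlockForConfiguration()
        lastAppliedLevel = level
    } catch {
        print("Torch configuration failed: \(error)")
    }
}

If the printed levels change smoothly but the LED still stays at full brightness, that would point at the hardware or driver rather than at your code.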

Related

AKAmplitudeTracker amplitude getting 0.0 using AudioKit

I want to get the volume from AKAmplitudeTracker, but I'm getting -inf. (The tracker's amplitude stays at 0.0, and 20 * log10(0) is -inf.) What am I doing wrong? Please help out.
AKAudioFile.cleanTempDirectory()
AKSettings.audioInputEnabled = true
AKSettings.bufferLength = .medium
AKSettings.defaultToSpeaker = true
AKSettings.playbackWhileMuted = true
AKSettings.enableRouteChangeHandling = true
AKSettings.enableCategoryChangeHandling = true
AKSettings.enableLogging = true
do {
    try AKSettings.setSession(category: .playAndRecord, with: .allowBluetoothA2DP)
} catch {
    print("error \(error.localizedDescription)")
}
microphone = AKMicrophone()!
tracker = AKAmplitudeTracker(microphone)
booster = AKBooster(tracker, gain: 0)
AudioKit.output = booster
try AudioKit.start()
=================
extension AKAmplitudeTracker {
    var volume: Decibel {
        return 20.0 * log10(amplitude)
    }
}
=================
Output of print(tracker.amplitude):
0.0
I had a quick look, and it seems you followed the basic setup, but you aren't sampling the generated data over time correctly. Amplitude data is computed continuously from the microphone input, so to see how it evolves over time you can poll it with a timer, like this:
func reset() {
    // Timer.invalidate() doesn't throw, so no do/catch is needed here.
    self.timer?.invalidate()
    self.timer = nil
}

func microphoneTracker() {
    guard self.timer == nil else { return }
    self.watcher()
    let timer = Timer.scheduledTimer(withTimeInterval: 0.1, repeats: true) { _ in
        log.info(self.akMicrophoneAmplitudeTracker.amplitude)
    }
    self.timer = timer
}
Change the withTimeInterval to however frequently you want to check the amplitude.
I think what I put there is quite readable, but I'll break it down in a few words:
Keep a reference to the AKAmplitudeTracker in a property; here I've named it akMicrophoneAmplitudeTracker
Keep a reference to your timed event, which will check the amplitude value periodically
Compute the data in the closure body; the property holding the value is .amplitude
The computation in the example is a logger that prints .amplitude
When you're done, use the .invalidate() method to stop the timer
A few other things you may want to double-check in your code: make sure the tracker is part of the signal chain, as that's an AVAudioEngine requirement, and I've also noticed in other people's code a call to the .start() method on the AKAmplitudeTracker, as follows:
akMicrophoneAmplitudeTracker.start()
To finish, keep in mind that if you are testing in the Simulator, check the microphone settings of your host machine and expect amplitudes that may differ from the actual hardware.
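For reference, here is a minimal end-to-end sketch of the setup described above; the property name akMicrophoneAmplitudeTracker and the 0.1 s interval are just the conventions used in this answer:

import AudioKit

final class AmplitudeMonitor {
    var akMicrophoneAmplitudeTracker: AKAmplitudeTracker!
    var booster: AKBooster!
    var timer: Timer?

    func start() throws {
        let microphone = AKMicrophone()! // same force-unwrap as in the question
        akMicrophoneAmplitudeTracker = AKAmplitudeTracker(microphone)
        // A zero-gain booster keeps the tracker in the signal chain without audible output.
        booster = AKBooster(akMicrophoneAmplitudeTracker, gain: 0)
        AudioKit.output = booster
        try AudioKit.start()
        akMicrophoneAmplitudeTracker.start()
        timer = Timer.scheduledTimer(withTimeInterval: 0.1, repeats: true) { [weak self] _ in
            guard let tracker = self?.akMicrophoneAmplitudeTracker else { return }
            print(tracker.amplitude)
        }
    }

    func stop() {
        timer?.invalidate()
        timer = nil
    }
}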

Swift isLockingFocusWithCustomLensPositionSupported always returns false

I want to set the lens distance of my iPhone X to a constant.
In order to check whether that is supported, I check the isLockingFocusWithCustomLensPositionSupported property of my device, as described in the documentation here: https://developer.apple.com/documentation/avfoundation/avcapturedevice/2361529-islockingfocuswithcustomlensposi
The property always returns false, even when the device is locked for configuration, which means that calling setFocusModeLocked(lensPosition:completionHandler:) will always throw an error.
Why is this the case, and how do I call setFocusModeLocked() correctly?
Below is my approach:
let device = self.deviceInput.device
do {
    try device.lockForConfiguration()
    if device.isFocusPointOfInterestSupported && device.isFocusModeSupported(focusMode) {
        // this returns true
    }
    if device.isLockingFocusWithCustomLensPositionSupported {
        // this always returns false
        device.setFocusModeLocked(lensPosition: focusDistance, completionHandler: nil)
    }
    device.unlockForConfiguration()
} catch {
    print("Could not lock device for configuration: \(error)")
}
Tested on iPhone X, iOS 12.
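One thing worth checking, purely as a guess since the question doesn't show how deviceInput was created: on dual-camera hardware, AVCaptureDevice.default(for: .video) can return the virtual dual camera, and a virtual device may report false here even when a physical camera supports custom lens positions. A sketch that probes the physical devices directly:

import AVFoundation

// Diagnostic sketch: list the physical back cameras and their lens-position support.
let discovery = AVCaptureDevice.DiscoverySession(
    deviceTypes: [.builtInWideAngleCamera, .builtInTelephotoCamera],
    mediaType: .video,
    position: .back)

for device in discovery.devices {
    print(device.localizedName, "supports custom lens position:",
        device.isLockingFocusWithCustomLensPositionSupported)
}

// If one of them reports true, build your input from that device and lock focus.
if let device = discovery.devices.first(where: { $0.isLockingFocusWithCustomLensPositionSupported }) {
    do {
        try device.lockForConfiguration()
        device.setFocusModeLocked(lensPosition: 0.5, completionHandler: nil) // 0.5 is an arbitrary example position
        device.unlockForConfiguration()
    } catch {
        print("Could not lock device for configuration: \(error)")
    }
}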

AudioKit AKMicrophone not outputting any data

I am trying to capture FFT data from a microphone. I've managed to get this to work before with a similar codebase, but since macOS Mojave it's broken: the FFT data constantly stays 0.
Relevant Code:
var fft: AKFFTTap?

var inputDevice: AKDevice? {
    didSet {
        inputNode = nil
        updateAudioNode()
    }
}

var inputNode: AKNode? {
    didSet {
        if fft != nil {
            // According to the AKFFTTap class reference, it will always be on tap 0
            oldValue?.avAudioNode.removeTap(onBus: 0)
        }
        fft = inputNode.map { AKFFTTap($0) }
    }
}

[...]

guard let device = inputDevice else {
    inputNode = ViewController.shared.player.mixer
    return
}

do {
    try AudioKit.setInputDevice(device)
} catch {
    print("Error setting input device: \(error)")
    return
}

let microphoneNode = AKMicrophone()
do {
    try microphoneNode.setDevice(device)
} catch {
    print("Failed setting node input device: \(error)")
    return
}
microphoneNode.start()
microphoneNode.volume = 3
print("Switched Node: \(microphoneNode), started: \(microphoneNode.isStarted)")
inputNode = microphoneNode

try! AudioKit.start()
All the code is called and no errors are output, but the FFT simply stays blank. With some code reordering I get varying errors.
A full version of the class, for completeness, is here.
Finally, I also tried implementing the examples from the playgrounds one-to-one. Since Xcode playgrounds seem to crash with AudioKit, I tried them in my own codebase, but there's no difference there either. AKFrequencyTracker, for example, gets 0s for both amplitude and frequency.
I am not 100% positive of this, but I'd like you to try AudioKit v4.5.1 out. We definitely fixed a bug in AKMicrophone, and that could have downstream consequences. I'll withdraw this answer and keep looking if it is not fixed. Let me know.
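While you test the new version, a stripped-down repro might also help isolate the problem. This sketch assumes, as the amplitude answer above notes, that the tapped node has to reach AudioKit.output for the render loop to pull data:

import AudioKit

// Minimal FFT test rig: microphone -> silent booster -> output, with an FFT tap on the microphone.
let microphone = AKMicrophone()!
let fft = AKFFTTap(microphone)
let silence = AKBooster(microphone, gain: 0)
AudioKit.output = silence
try! AudioKit.start()
microphone.start()

// Poll the tap once the engine has been running for a moment.
DispatchQueue.main.asyncAfter(deadline: .now() + 1) {
    print(fft.fftData.prefix(8)) // should be non-zero once audio is flowing
}

If this minimal chain produces non-zero FFT data but your full class doesn't, the difference is likely in how the input node is swapped rather than in AKFFTTap itself.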

CIDetector face detection in real time: memory consumption increases linearly, how can I avoid this?

I have a question about how to call CIDetector correctly. I'm running face detection in real time, and it works very well. However, the app's memory consumption increases linearly over time, as you can see in the image below. I suspect objects are being created but never released. Can anyone advise how to do this correctly?
I have pinpointed the issue to this function: every time it's invoked, memory increases linearly, and when it's terminated, memory quickly drops back to almost 80 MB instead of rising toward 11 GB. I also checked for memory leaks, but none were found.
My target platform is macOS. I'm trying to extract the mouth position from the CIDetector and then use it to compute a delta in the mouse function for a game.
I also looked at this post and tried their approach, but it did not work for me:
CIDetector isn't releasing memory
fileprivate func faceDetection() {
    // setting up dispatchQueue
    dispatchQueue.async {
        // check whether the sample buffer is nil; if not, assign its value to sample
        if let sample = self.sampleBuffers {
            // if allFeatures is not nil, assign it to features; otherwise return
            guard let features = self.allFeatures(sample: sample) else { return }
            // loop to cycle through all features
            for feature in features {
                // check whether the feature is a CIFaceFeature; if yes, assign it to faceFeature and go on
                if let faceFeature = feature as? CIFaceFeature {
                    if !self.hasEnded {
                        if self.calX.count > 30 {
                            self.sens.append((self.calX.max()! - self.calX.min()!))
                            self.sens.append((self.calY.max()! - self.calY.min()!))
                            print((self.calX.max()! - self.calX.min()!))
                            self.hasEnded = true
                        } else {
                            self.calX.append(faceFeature.mouthPosition.x)
                            self.calY.append(faceFeature.mouthPosition.y)
                        }
                    } else {
                        self.mouse(position: CGPoint(x: (faceFeature.mouthPosition.x - 300) * 2, y: (faceFeature.mouthPosition.y + 20) * 2), faceFeature: faceFeature)
                    }
                }
            }
        }
        if !self.faceTrackingEnds {
            self.faceDetection()
        }
    }
}
This problem was caused by repeatedly calling the function without waiting for its completion. The fix was implementing a dispatch group and then re-calling the function in its completion handler, like this. The CIDetector now runs comfortably at around 200 MB of memory:
fileprivate func faceDetection() {
    let group = DispatchGroup()
    group.enter()
    // setting up dispatchQueue
    dispatchQueue.async {
        // leave the group on every exit path, including the early return below
        defer { group.leave() }
        // check whether the sample buffer is nil; if not, assign its value to sample
        if let sample = self.sampleBuffers {
            // if allFeatures is not nil, assign it to features; otherwise return
            guard let features = self.allFeatures(sample: sample) else { return }
            // loop to cycle through all features
            for feature in features {
                // check whether the feature is a CIFaceFeature; if yes, assign it to faceFeature and go on
                if let faceFeature = feature as? CIFaceFeature {
                    self.mouse(position: faceFeature.mouthPosition, faceFeature: faceFeature)
                }
            }
        }
    }
    group.notify(queue: .main) {
        if !self.faceTrackingEnds {
            self.faceDetection()
        }
    }
}
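If memory still creeps up between detections, an additional measure worth trying (my suggestion, not part of the original fix) is wrapping the per-frame work in autoreleasepool, since CIDetector and CIImage produce autoreleased objects that are only freed when the pool drains. The async closure body above would become:

dispatchQueue.async {
    // Leave the group on every exit path, then drain autoreleased objects per frame.
    defer { group.leave() }
    autoreleasepool {
        guard let sample = self.sampleBuffers,
            let features = self.allFeatures(sample: sample) else { return }
        for case let faceFeature as CIFaceFeature in features {
            self.mouse(position: faceFeature.mouthPosition, faceFeature: faceFeature)
        }
    }
}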

Crash (EXC_BAD_ACCESS) when using CharacterSet isSuperset

I'm trying to determine whether an input string is a valid phone number using CharacterSet. It seems that isSubset(of:) works fine, but isSuperset(of:) crashes.
I think this is a bug in Foundation.
let phoneNumberCharacterSet = CharacterSet(charactersIn: "01234567890,;*+#").union(CharacterSet.whitespaces)
let zeroCharacterSet = CharacterSet(charactersIn: "0")

if zeroCharacterSet.isSubset(of: phoneNumberCharacterSet) {
    print("zero is a subset of the phone number set")
}

if phoneNumberCharacterSet.isSuperset(of: zeroCharacterSet) {
    // will never get here due to crash
    print("is a superset of '0'")
}
According to this, it seems the current bridging of CharacterSet generates something weird which does not work with isSuperset(of:). (It internally calls CFCharacterSetIsSupersetOfSet(_:_:).)
You can replace

if phoneNumberCharacterSet.isSuperset(of: zeroCharacterSet) {
    // will never get here due to crash
    print("is a superset of '0'")
}

with:
let zeroString = "0"
if zeroString.rangeOfCharacter(from: phoneNumberCharacterSet.inverted) == nil {
print("is a superset of '0'")
}