ARKit – How to use ARImageTrackingConfiguration with ARFaceTrackingConfiguration? - swift

I have built an app that uses image tracking and swaps flat images. I am also using people occlusion (now) so people can get photos in front of those images. I really want this app to have a selfie mode, so people can take their own pictures in front of image swapped areas.
I'm reading the features on ARKit 3.5, but as far as I can tell, the only front-facing camera support is with ARFaceTrackingConfiguration, which doesn't support image tracking. ARImageTrackingConfiguration and ARWorldTrackingConfiguration only use the back camera.
Is there any way to make a selfie mode with people occlusion (and image tracking) using the front-facing camera?

The answer is NO, you can't use any ARConfiguration except ARFaceTrackingConfiguration with the front camera. However, you can simultaneously run ARFaceTrackingConfiguration on the front camera and ARWorldTrackingConfiguration on the rear camera, which lets users interact with AR content seen through the rear camera using their face as a sort of controller.
Look at this docs page to find out which configuration corresponds to which camera (rear or front).
Here's a table of the eight tracking configurations in ARKit 5.0:
ARConfiguration                        What Camera?
ARWorldTrackingConfiguration           Rear
ARBodyTrackingConfiguration            Rear
AROrientationTrackingConfiguration     Rear
ARImageTrackingConfiguration           Rear
ARFaceTrackingConfiguration            FRONT
ARObjectScanningConfiguration          Rear
ARPositionalTrackingConfiguration      Rear
ARGeoTrackingConfiguration             Rear
Simultaneous World and Face configs
To run World Tracking driven by Face Tracking, use the following code:
override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)

    guard ARFaceTrackingConfiguration.isSupported,
          ARFaceTrackingConfiguration.supportsWorldTracking
    else {
        fatalError("We can't do face tracking")
    }

    let config = ARFaceTrackingConfiguration()
    config.isWorldTrackingEnabled = true
    sceneView.session.run(config)
}
Or you can use Face Tracking as a secondary configuration:
override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)

    let config = ARWorldTrackingConfiguration()

    if ARFaceTrackingConfiguration.isSupported {
        config.userFaceTrackingEnabled = true
    }

    sceneView.session.run(config)
}
Note that both properties are available in iOS 13 and higher:
var userFaceTrackingEnabled: Bool { get set }
var isWorldTrackingEnabled: Bool { get set }
P.S.
At the moment, .userFaceTrackingEnabled = true still doesn't work. I tested it in Xcode 13.2.1 on a 4th-generation iPad Pro running iPadOS 15.3.

Related

Camera Zoom Issue on iPhone X, iPhone XS etc

The main issue I am having is with my camera: it is zoomed in too far on the iPhone X, Xs, Xs Max, and XR models.
My camera is a full-screen camera, which is fine on the smaller iPhones, but on the models mentioned above it seems to be stuck at the maximum zoom level. What I really want is behavior similar to Instagram's camera: full screen on all models up to the iPhone X series, and then either respecting the edge insets or, if it stays full screen, not being zoomed in as far as it is now.
My thought process is to use something like this:
Determine the device. I figure I can use something like DeviceGuru, which can be found here, to determine the type of device.
GitHub repo can be found here --> https://github.com/InderKumarRathore/DeviceGuru
Using this tool or a similar one, I should be able to get the screen dimensions for the device. Then I can do some math to determine the proper screen size for the camera view.
If DeviceGuru doesn't work, I would just use something like this to get the width and height of the screen:
// Screen width.
public var screenWidth: CGFloat {
    return UIScreen.main.bounds.width
}

// Screen height.
public var screenHeight: CGFloat {
    return UIScreen.main.bounds.height
}
This is the block of code I am using to fill the camera. However, I want to turn it into something that is based on the device size rather than simply filling the screen regardless of the phone model:
import Foundation
import UIKit
import AVFoundation

class PreviewView: UIView {

    var videoPreviewLayer: AVCaptureVideoPreviewLayer {
        guard let layer = layer as? AVCaptureVideoPreviewLayer else {
            fatalError("Expected `AVCaptureVideoPreviewLayer` type for layer. Check PreviewView.layerClass implementation.")
        }
        layer.videoGravity = AVLayerVideoGravity.resizeAspectFill
        layer.connection?.videoOrientation = .portrait
        return layer
    }

    var session: AVCaptureSession? {
        get {
            return videoPreviewLayer.session
        }
        set {
            videoPreviewLayer.session = newValue
        }
    }

    // MARK: UIView

    override class var layerClass: AnyClass {
        return AVCaptureVideoPreviewLayer.self
    }
}
I want my camera to look something like this
or this
Not this ( what my current camera looks like )
I have looked at many questions and nobody has a concrete solution, so please don't mark this as a duplicate, and please don't say it's just an issue with the iPhone X series.
Firstly, you should update your question with more precise information, since as written it gives only a vague idea to anyone with less experience.
Looking at the images, it is clear that the type of camera you are trying to access is quite different from the one you actually have. With the introduction of the iPhone 7 Plus and iPhone X, Apple introduced several different camera devices, all of which are accessible through AVCaptureDevice.DeviceType.
Looking at what you want to achieve, it is clear that you want a wider field of view on screen. This is available via the .builtInWideAngleCamera device type; switching to it should solve your problem.
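For illustration only, here is a minimal sketch of selecting that device type when building a capture session (makeWideAngleSession is my own name; adapt it to your existing setup):

import AVFoundation

/// Builds a session on the back wide-angle camera so the preview isn't cropped like a telephoto feed.
func makeWideAngleSession() -> AVCaptureSession? {
    let discovery = AVCaptureDevice.DiscoverySession(deviceTypes: [.builtInWideAngleCamera],
                                                     mediaType: .video,
                                                     position: .back)

    guard let camera = discovery.devices.first,
          let input = try? AVCaptureDeviceInput(device: camera) else { return nil }

    let session = AVCaptureSession()
    session.sessionPreset = .photo
    if session.canAddInput(input) {
        session.addInput(input)
    }
    return session
}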
Cheers

Adjust camera focus in ARKit

I want to adjust the device's physical camera focus while in augmented reality. (I'm not talking about the SCNCamera object.)
In an Apple Dev forum post, I've read that autofocus would interfere with ARKit's object detection, which makes sense to me.
Now, I'm working on an app where the users will be close to the object they're looking at. With the camera's default focus, everything looks very blurry when the device is closer than about 10 cm to an object.
Can I adjust the camera's focus before initializing the scene, or preferably while in the scene?
Update 20.01.2018:
Apparently, there's still no solution to this problem. You can read more about it in this Reddit post and this developer forum post covering private API workarounds and other (unhelpful) info.
Update 25.01.2018:
@AlexanderVasenin provided a useful update pointing to Apple's documentation. It shows that ARKit supports not just focusing but also autofocusing as of iOS 11.3.
See my usage sample below.
As stated by Alexander, iOS 11.3 brings autofocus to ARKit.
The corresponding documentation site shows how it is declared:
var isAutoFocusEnabled: Bool { get set }
You can access it this way:
var configuration = ARWorldTrackingConfiguration()
configuration.isAutoFocusEnabled = true // or false
However, as it is true by default, you should not even have to set it manually unless you choose to opt out.
UPDATE: Starting from iOS 11.3, ARKit supports autofocusing, and it's enabled by default (more info). Manual focusing still isn't available.
Prior to iOS 11.3, ARKit supported neither manual focus adjustment nor autofocus.
Here is Apple's reply on the subject (Oct 2017):
ARKit does not run with autofocus enabled as it may adversely affect plane detection. There is an existing feature request to support autofocus and no need to file additional requests. Any other focus discrepancies should be filed as bug reports. Be sure to include the device model and OS version. (source)
There is another thread on the Apple forums where a developer claims he was able to adjust focus by calling the AVCaptureDevice.setFocusModeLocked(lensPosition:completionHandler:) method on the private AVCaptureDevice used by ARKit, and it appears this does not affect tracking. Though the method itself is public, ARKit's AVCaptureDevice is not, so using this hack in production would most likely result in App Store rejection.
if #available(iOS 16.0, *) {
    // This property is nil on devices that aren't equipped with an ultra-wide camera.
    if let device = ARWorldTrackingConfiguration.configurableCaptureDeviceForPrimaryCamera {
        do {
            try device.lockForConfiguration()
            // Configure your focus mode here.
            // You need to change ARWorldTrackingConfiguration's isAutoFocusEnabled at the same time.
            device.unlockForConfiguration()
        } catch {
            // Handle the locking error.
        }
    }
} else {
    // Fallback on earlier versions.
}
Use the configurableCaptureDeviceForPrimaryCamera property; it is available only in iOS 16 and later.
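As a minimal sketch of my own (not Apple sample code), assuming iOS 16+ and that you re-run the session with the modified configuration, locking the lens for close-up work could look roughly like this; the lensPosition value is a placeholder you would need to tune:

import ARKit
import AVFoundation

// Sketch: disable ARKit-driven autofocus and lock the lens at a fixed position.
func lockFocus(for configuration: ARWorldTrackingConfiguration) {
    if #available(iOS 16.0, *),
       let device = ARWorldTrackingConfiguration.configurableCaptureDeviceForPrimaryCamera {
        configuration.isAutoFocusEnabled = false // change this at the same time, then re-run the session
        do {
            try device.lockForConfiguration()
            if device.isLockingFocusWithCustomLensPositionSupported {
                // 0.0...1.0 spans the device's focus range; tune the value for your working distance.
                device.setFocusModeLocked(lensPosition: 0.2, completionHandler: nil)
            }
            device.unlockForConfiguration()
        } catch {
            print("Could not lock the capture device: \(error)")
        }
    }
}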
Documentation: ARKit > Configuration Objects > ARConfiguration > configurableCaptureDeviceForPrimaryCamera

ARKit crashing on iPad Air 1

So I wanted to try ARKit. I installed iOS 11 on my iPad Air, but it keeps crashing.
Here's the code in my view controller
import UIKit
import ARKit

class ViewController: UIViewController {

    @IBOutlet weak var sceneView: ARSCNView!
    @IBOutlet weak var counterLabel: UILabel!

    override func viewDidLoad() {
        super.viewDidLoad()
        let scene = SCNScene()
        sceneView.scene = scene
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(true)
        let configuration = ARSessionConfiguration()
        sceneView.session.run(configuration)
    }
}
So I searched around a bit and came across this:
https://developer.apple.com/documentation/arkit/building_a_basic_ar_experience
which basically says that devices with a chip older than the A9 should use ARSessionConfiguration instead of ARWorldTrackingSessionConfiguration. However, I still keep getting crashes.
I tried the ARKit demo provided by Apple, same thing.
I also tried sceneView.antialiasingMode = .none but it didn't help either.
Here's the console log I get when it crashes
2017-06-26 21:44:16.539469+0200 ARKitGame[562:56168] [DYMTLInitPlatform] platform initialization successful
2017-06-26 21:44:18.630888+0200 ARKitGame[562:55915] Metal GPU Frame Capture Enabled
2017-06-26 21:44:18.633276+0200 ARKitGame[562:55915] Metal API Validation Enabled
2017-06-26 21:44:19.625366+0200 ARKitGame[562:56176] [MC] System group container for systemgroup.com.apple.configurationprofiles path is /private/var/containers/Shared/SystemGroup/systemgroup.com.apple.configurationprofiles
2017-06-26 21:44:19.628963+0200 ARKitGame[562:56176] [MC] Reading from public effective user settings.
2017-06-26 21:44:22.706910+0200 ARKitGame[562:56176] -[MTLTextureDescriptorInternal validateWithDevice:], line 778: error 'MTLTextureDescriptor has invalid pixelFormat (520).'
-[MTLTextureDescriptorInternal validateWithDevice:]:778: failed assertion `MTLTextureDescriptor has invalid pixelFormat (520).'
(lldb)
Apple changed the ARKit documentation with beta 2: it now unequivocally says that ARKit as a whole -- not just world tracking -- requires A9.
Perhaps that explains why even the basic session configuration seemed to never actually work on devices below A9...
I have downloaded the latest beta and run it on an iPad Air 1 and it still crashes, and as Apple mentions here: https://developer.apple.com/documentation/arkit
Important: ARKit requires an iOS device with an A9 or later processor. To make your app available only on devices supporting ARKit, use the arkit key in the UIRequiredDeviceCapabilities section of your app's Info.plist. If augmented reality is a secondary feature of your app, use the isSupported property to determine whether the current device supports the session configuration you want to use.
So it seems it only works on A9 processors.
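For completeness, a minimal guard of my own (using the current ARWorldTrackingConfiguration API rather than the pre-release ARWorldTrackingSessionConfiguration) that keeps unsupported devices from crashing could look like this:

import ARKit

// Sketch: only run the AR session when the device supports world tracking.
func startARSessionIfSupported(on sceneView: ARSCNView) {
    guard ARWorldTrackingConfiguration.isSupported else {
        // A8 and older chips (e.g. the iPad Air 1) land here; fall back to a non-AR experience.
        print("ARKit world tracking is not supported on this device")
        return
    }
    sceneView.session.run(ARWorldTrackingConfiguration())
}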
ARKit only works on A9 chips onward: http://www.iphonehacks.com/2017/06/list-iphone-ipad-compatible-arkit-ios-11.html
The iPad Air 1 has an A7 chip, so I would think that's why you are seeing the crash.
Update: I didn't notice your comment about older chips using ARSessionConfiguration instead of ARWorldTrackingSessionConfiguration. Maybe you could try changing that setting in the ARKit demo app.

Extracting a texture in watchOS 3

I am using the following code in iOS 10.0 in my GameScene.swift
//Shape storage
let playerShape = SKShapeNode(circleOfRadius: 10 )
...Color setup etc
//Get the texture from shape node
let playerTexture = view?.texture(from: playerShape)
The same code doesn't work in watchOS 3.0
Xcode 8.0 beta 2 complains about the view:
Use of unresolved identifier 'view'
Does anyone know what the equivalent of view is in watchOS?
Thank you.
As mentioned above, there are no views in Apple Watch's SpriteKit.
That is why you are unable to use view in your code.
Just use SKSpriteNode instead and do something like this if you want the texture for something else.
E.g. here I want to use my circleOriginal texture on circle2:
let circleOriginal = SKSpriteNode(imageNamed: "circle")
let circleTexture = circleOriginal.texture
let circle2 = SKSpriteNode(texture: circleTexture)
If you want an excellent overview of Apple Watch game tech, check out Apple's WWDC lecture on it (the slides below are from there). It explains a lot and provides great code examples.
https://developer.apple.com/videos/play/wwdc2016/612/
Here are the key differences:
[Slide: Apple's scene graph for non-watchOS]
[Slide: the corresponding scene graph for watchOS]
[Slide: recommended alternatives for SpriteKit/SceneKit on watchOS]

How to get the amount of available-light in the area surrounding the iPhone using Objective C? [duplicate]

I notice that on my iPhone, after a few seconds in direct sunlight, the screen adjusts to become brighter, dimmer, etc. I was wondering if there is a way to interact with this sensor?
I have an application which is used outside. When you go into direct light, it becomes very difficult to see the screen for a few moments before it adjusts. And even then, it's not always as bright as I'd like it to be. I would like to implement a high-contrast skin for outdoor viewing and a low-contrast one for indoor viewing.
Is it possible to read the light sensor data, and if so, how do I extract these sensor values?
I would assume there is a light sensor, though, since the camera knows when to use the flash.
On the other hand, here is a different idea (maybe a silly one): using the brightness of the device's screen you can get a rough measure of the external conditions, from 0.12 (dark) to 0.99 (bright).
The next line will get that value; give it a try and move a light on and off over the device to get different readings.
NSLog(@"Screen Brightness: %f", [[UIScreen mainScreen] brightness]);
Obviously the Auto-Brightness feature must be turned on for this to work.
Regards.
To read the ambient light sensor data, you need to use IOHID in the IOKit framework.
http://iphonedevwiki.net/index.php/AppleISL29003
http://iphonedevwiki.net/index.php/IOKit.framework
However, this requires private headers, so if you use it, Apple probably won't let your app into the app store.
I continually ask the iOS forums whether there will be support for ambient light sensor readings in the future, but to no avail.
You can actually use the camera to do this, which is independent of the user's screen brightness settings (and works even if Automatic Brightness is OFF).
You read the Brightness Value from the video frames' metadata as I explain in this Stack Overflow answer.
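For illustration, a rough sketch of that camera-based approach could look like the following; FrameBrightnessReader is a hypothetical name, and it assumes you attach it as the sample-buffer delegate of an AVCaptureVideoDataOutput on a running AVCaptureSession:

import AVFoundation
import ImageIO

/// Hypothetical delegate that reads the EXIF BrightnessValue from incoming video frames.
final class FrameBrightnessReader: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Each frame carries EXIF metadata, including a BrightnessValue for the scene.
        guard let attachments = CMCopyDictionaryOfAttachments(allocator: nil,
                                                              target: sampleBuffer,
                                                              attachmentMode: kCMAttachmentMode_ShouldPropagate) as? [String: Any],
              let exif = attachments[kCGImagePropertyExifDictionary as String] as? [String: Any],
              let brightness = exif[kCGImagePropertyExifBrightnessValue as String] as? Double
        else { return }

        // Higher values mean a brighter scene; thresholds must be tuned empirically.
        print("EXIF brightness value:", brightness)
    }
}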
Try using GSEventSetBacklightLevel(), which requires <GraphicsServices/GraphicsServices.h>. This is how one can programmatically adjust the brightness level. There is also a get option, so I think that may have the information you're after.
For Swift 5, here is how to use brightness-change detection, which indirectly gives you the luminosity of the surroundings:
/// A view controller (you can use any UIView or AnyObject)
class MyViewController: UIViewController {

    /// Remove observers on deinit
    deinit {
        removeObservers()
    }

    // MARK: - Observer management helpers

    /// Add my observers to the vc
    func addObservers() {
        NotificationCenter.default.addObserver(self,
                                               selector: #selector(onScreenBrightnessChanged(_:)),
                                               name: UIScreen.brightnessDidChangeNotification,
                                               object: nil)
    }

    /// Clean up observers
    func removeObservers() {
        NotificationCenter.default.removeObserver(self)
    }

    /// Set up the view and register the observers
    override func viewDidLoad() {
        super.viewDidLoad()
        addObservers()
    }

    /// Handles brightness changes
    @objc func onScreenBrightnessChanged(_ sender: Notification) {
        // Tweak as needed: 0.5 is a good threshold for me
        let isDark = UIScreen.main.brightness < 0.5 // brightness is in 0...1
        // Do whatever you want with the `isDark` flag: here I toggle the headlights
        vehicle.turnOnTheHeadlights(isDark)
    }
}
For iOS 14 and above, Apple has provided SensorKit (https://developer.apple.com/documentation/sensorkit/srsensor/3377673-ambientlightsensor ) for explicit access to all kinds of sensors and system logs (call logs, message logs, etc.). In addition to the raw lux value, you can also get the chromaticity of the ambient light and the orientation relative to the device sensor.
(From https://developer.apple.com/documentation/sensorkit/srambientlightsample )
Measuring Light Level

var chromaticity: SRAmbientLightSample.Chromaticity
    A coordinate pair that describes the sample's light brightness and tint.

struct SRAmbientLightSample.Chromaticity
    A coordinate pair that describes light brightness and tint.

var lux: Measurement<UnitIlluminance>
    An object that describes the sample's luminous flux.

var placement: SRAmbientLightSample.SensorPlacement
    The light's location relative to the sensor.

enum SRAmbientLightSample.SensorPlacement
    Directional values that describe light-source location with respect to the sensor.
However, you need to request approval from Apple for such an app to be accepted and published on the App Store.
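Purely as an illustration, and assuming the SensorKit entitlement and user authorization have been granted, reading those samples might be sketched roughly like this (AmbientLightReader is my own name; check the details against the current SensorKit documentation before relying on them):

import SensorKit

/// Hypothetical ambient-light reader built on SensorKit (iOS 14+).
final class AmbientLightReader: NSObject, SRSensorReaderDelegate {

    private let reader = SRSensorReader(sensor: .ambientLightSensor)

    func start() {
        // Ask the user for permission to read ambient-light samples.
        SRSensorReader.requestAuthorization(sensors: [.ambientLightSensor]) { [weak self] error in
            guard error == nil, let self = self else { return }
            self.reader.delegate = self
            self.reader.startRecording()
            // Recorded samples only become fetchable after SensorKit's holding period;
            // create an SRFetchRequest for the desired time window and call reader.fetch(_:).
        }
    }

    // Each fetched record arrives here; return true to keep receiving results.
    func sensorReader(_ reader: SRSensorReader,
                      fetching fetchRequest: SRFetchRequest,
                      didFetchResult result: SRFetchResult<AnyObject>) -> Bool {
        if let sample = result.sample as? SRAmbientLightSample {
            print("Ambient light:", sample.lux, "placement:", sample.placement.rawValue)
        }
        return true
    }
}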