The following piece of code returns false on an iPhone XR, even though person segmentation is working on the XR.
ARConfiguration.supportsFrameSemantics(.personSegmentation)
I want to know whether it officially supports person segmentation and person segmentation with depth on the XR. Just to point out, I have iOS 13.1.2 on the XR.
Try this variation:
override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)
    let config = ARWorldTrackingConfiguration()
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
        config.frameSemantics = .personSegmentationWithDepth
    }
    arView.session.run(config)
}
And make sure that your Xcode version is 11.2.1 and iOS version is 13.2.3.
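If you want to confirm exactly what the device reports, a minimal diagnostic like this (just a sketch; it checks the flags on the concrete ARWorldTrackingConfiguration class rather than on ARConfiguration) prints both values:
// Minimal diagnostic (iOS 13+): print which person-segmentation
// semantics the world-tracking configuration reports on this device.
let segmentation = ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentation)
let segmentationWithDepth = ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth)
print("personSegmentation supported: \(segmentation)")
print("personSegmentationWithDepth supported: \(segmentationWithDepth)")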
Related
Could this be related to SafeArea issues in iOS 12.4 (and actually 12.2 as well)?
I use the following function to tap a view during my UITests.
func tapAtLocation(_ element: XCUIElement) -> Self {
    let loc = element.coordinate(withNormalizedOffset: .init(dx: 0, dy: 0))
    loc.tap()
    return self
}
I am trying to tap at the location of a specific image, so I get the image and trigger a tap:
let myImage = App.images[myImageViewIdentifier].firstMatch
tapAtLocation(myImage)
It works on newer iOS versions and also on an iPhone 7 with iOS 12.4, but not on an iPhone X.
And I need it to work on the iPhone X :)
What do you propose I do? Maybe you have a nice debugging trick to see exactly where it tries to tap?
OK! After a day of trial and error, a simple solution worked out!
func tapAtLocation(_ element: XCUIElement) -> Self {
    let loc = element.coordinate(withNormalizedOffset: .init(dx: 0.5, dy: 0.5))
    loc.tap()
    return self
}
This works on all devices (iPhone X, 7, and 11) with iOS 12.4 and 13+. The normalized offset of (0.5, 0.5) taps the centre of the element instead of its top-left corner.
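If you still need to see exactly where the tap lands, one option (a minimal sketch; debugTapAtLocation is a hypothetical helper, not part of the original test) is to log the element's frame and the resolved screen point before tapping:
func debugTapAtLocation(_ element: XCUIElement) {
    // Where XCUITest thinks the element is on screen.
    print("element frame: \(element.frame)")
    // Where the normalized-offset coordinate actually resolves to.
    let loc = element.coordinate(withNormalizedOffset: .init(dx: 0.5, dy: 0.5))
    print("tap screen point: \(loc.screenPoint)")
    loc.tap()
}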
I am trying to load both people occlusion and motion capture on the same app.
Since ARBodyTrackingConfiguration does not support personSegmentationWithDepth, I am creating 2 ARViews, giving each a different configuration (ARWorldTrackingConfiguration and ARBodyTrackingConfiguration).
The problem is that, for some reason, only one of the delegates' callbacks is fired, and no depth data is available.
What am I doing wrong here?
Is it not OK to have more than one ARSession live at the same time?
In ARKit 4.0, both features can be run simultaneously in a single ARBodyTrackingConfiguration session. However, they are both CPU-intensive.
override func viewDidLoad() {
    super.viewDidLoad()

    guard ARBodyTrackingConfiguration.isSupported else {
        fatalError("MoCap is supported on devices with A12 and higher")
    }
    guard ARBodyTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) else {
        fatalError("People occlusion is not supported on this device.")
    }

    let config = ARBodyTrackingConfiguration()
    config.frameSemantics = .personSegmentationWithDepth
    config.automaticSkeletonScaleEstimationEnabled = true
    arView.session.run(config, options: [])
}
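If you also want to consume the motion-capture data from that same session, a hedged sketch of an ARSessionDelegate callback could look like this (the ViewController name and the arView.session.delegate = self wiring are assumptions, not shown above):
// Illustrative only: read the skeleton from ARBodyAnchor updates
// delivered by the same session that runs people occlusion.
extension ViewController: ARSessionDelegate {
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for anchor in anchors {
            guard let bodyAnchor = anchor as? ARBodyAnchor else { continue }
            let skeleton = bodyAnchor.skeleton
            if let rootTransform = skeleton.modelTransform(for: .root) {
                print("root joint transform:", rootTransform)
            }
        }
    }
}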
Unable to add more than three UILabels to a UIViewController's view in the iPad Playgrounds app. Is it me or the system?
Simplified code to show the issue. Hardware is a 2018 iPad Pro running iOS 12.3.1 with the Playgrounds app 3.0. Using UIViews, up to five can be added successfully.
import UIKit
import PlaygroundSupport

class MyViewController: UIViewController {
    let square50 = CGSize(width: 50, height: 50)

    override func viewDidLoad() {
        super.viewDidLoad()
        var myFrame = CGRect(origin: .zero, size: square50)
        for index in 0...4 {
            view.addSubview(UIView(frame: myFrame)) // fails at 6th!
            view.subviews[index].backgroundColor = .red
            myFrame.origin.y += 80
        }
    }
}
PlaygroundPage.current.liveView = MyViewController()
With the index range set as shown, the code worked as expected, displaying the set number of coloured rectangles. With the range increased to 0...5, the runtime stopped with the message “There was a problem encountered while running this playground. Check your code for mistakes”.
I have just been able to test the code with Xcode 10.3 under OS X 10.16.6 on my iMac Retina 5K 27" (late 2015). There is NO problem with the code there, and no such low limit on the number of subviews that can be created.
The problem rests with iOS Swift Playgrounds 3.0 running on an iPad Pro 12.9-inch (3rd generation) using iOS 12.3.1. This is therefore a bug!
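Since the title mentions UILabels while the sample loop adds UIViews, a variant of the same loop with UILabel (a sketch only; the label text is illustrative) makes it easy to check whether the limit of three applies specifically to UILabel:
// Same loop, but with UILabel instead of UIView, to check whether the
// reported limit of three is specific to UILabel.
override func viewDidLoad() {
    super.viewDidLoad()
    var myFrame = CGRect(origin: .zero, size: square50)
    for index in 0...4 {
        let label = UILabel(frame: myFrame)
        label.text = "Label \(index)"
        label.backgroundColor = .red
        view.addSubview(label)
        myFrame.origin.y += 80
    }
}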
I am using Twilio to send video and use that video as a texture in a SceneKit scene. The problem is that it works fine on an iPhone X, but it gives the error Unsupported IOSurface format: 0x26424741 on the iPhone XR and XS.
This is what I am doing.
Get the video:
func subscribed(to videoTrack: TVIRemoteVideoTrack,
                publication: TVIRemoteVideoTrackPublication,
                for participant: TVIRemoteParticipant) {
    print("Participant \(participant.identity) added a video track.")
    let remoteView = TVIVideoView(frame: UIWindow().frame, delegate: self)
    videoTrack.addRenderer(remoteView!)
    delegate.participantAdded(with: remoteView!)
}
delegate:
func participantAdded(with videoView: UIView) {
    sceneView.addVideo(with: videoView)
}
And add the video to the plane:
func addVideo(with view: UIView) {
    videoPlane.geometry?.firstMaterial?.diffuse.contents = view
}
The problem was actually with the renderingType of remoteView. For older devices, using Metal was fine, but newer devices needed OpenGL ES. I don't know why, but that was the fix.
I used this solution to find out the device type.
Next, I determined which renderingType to use:
var renderingType: VideoView.RenderingType {
    get {
        // `device.type` comes from the device-detection extension linked above.
        let device = UIDevice()
        switch device.type {
        case .iPhoneXS, .iPhoneXR, .iPhoneXSMax:
            return .openGLES
        default:
            return .metal
        }
    }
}
And I used it to initialize remoteView:
func didSubscribeToVideoTrack(videoTrack: RemoteVideoTrack,
                              publication: RemoteVideoTrackPublication,
                              participant: RemoteParticipant) {
    print("Participant \(participant.identity) added a video track.")
    let remoteView = VideoView(frame: UIWindow().frame,
                               delegate: self,
                               renderingType: renderingType)
    videoTrack.addRenderer(remoteView!)
    delegate.participantAddedVideo(for: participant.identity, with: remoteView!)
}
The following properties are returning false for me, and I am not able to understand why.
ARImageTrackingConfiguration.isSupported
ARWorldTrackingConfiguration.isSupported
I am testing it on an iPhone XS with iOS 12.1.1, with the code built with Xcode 10.1.
Note that ARConfiguration.isSupported does return true.
Any ideas why this might be happening?
Only one tracking configuration is supported at a given time.
You should write your code this way:
import UIKit
import ARKit

class ViewController: UIViewController, ARSCNViewDelegate {

    @IBOutlet var sceneView: ARSCNView!
    var configuration: ARConfiguration?

    //.........
    //.........

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)

        if ARWorldTrackingConfiguration.isSupported {
            configuration = ARWorldTrackingConfiguration()       // 6-DoF
        } else {
            configuration = AROrientationTrackingConfiguration() // 3-DoF
        }
        sceneView.session.run(configuration!)
    }
}
Also, read carefully about these 3 types of tracking configuration:
ARWorldTrackingConfiguration() (rotation and translation x-y-z) 6-DoF
AROrientationTrackingConfiguration() (only rotation x-y-z) 3-DoF
ARImageTrackingConfiguration() 6-DoF but image-only tracking lets you anchor virtual content to known images only when those images are in view of the camera.
Because 3-DoF tracking creates limited AR experiences, you should generally not use the AROrientationTrackingConfiguration() class directly. Instead, use the subclass ARWorldTrackingConfiguration() for tracking with six degrees of freedom (6-DoF), plane detection, and hit-testing. Use 3-DoF tracking only as a fallback in situations where 6-DoF tracking is temporarily unavailable.
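For completeness, a minimal image-tracking setup could look roughly like this (a sketch only; "AR Resources" is a placeholder asset-catalog group name, and sceneView is the ARSCNView outlet from above):
// Rough sketch of an image-only tracking session (iOS 12+).
let imageConfig = ARImageTrackingConfiguration()
if let referenceImages = ARReferenceImage.referenceImages(inGroupNamed: "AR Resources",
                                                          bundle: nil) {
    imageConfig.trackingImages = referenceImages
    imageConfig.maximumNumberOfTrackedImages = 1
}
sceneView.session.run(imageConfig)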
Hope this helps.