I am developing an ARKit app with OpenGL, so I'm working directly with ARKit and not using SceneKit.
By default, ARKit is set to landscape orientation, but I have been unable to track down any documentation or examples for rotating to portrait. The SceneKit example works in portrait, but the Metal example only works in landscape.
Is it possible to change the ARKit tracking orientation?
I was able to solve this in the application logic by multiplying the camera matrix by a quaternion that is rotated based on device orientation.
// arCameraTransform is the ARCamera's transform; minimapcamera is the app's own (minimap) camera object.
let cameraQuaternion = simd_quatf(arCameraTransform)
// Rotate 90° about the Z axis to compensate for portrait orientation.
minimapcamera.quaternion2 = cameraQuaternion * simd_quatf(angle: Float.pi/2, axis: SIMD3<Float>(0, 0, 1))
My solution, based on the accepted answer, for portrait only.
Sorry, it's in Objective-C.
Find and rewrite this:
uniforms->viewMatrix = [frame.camera viewMatrixForOrientation:UIInterfaceOrientationLandscapeRight];
uniforms->projectionMatrix = [frame.camera projectionMatrixForOrientation:UIInterfaceOrientationLandscapeRight viewportSize:_viewportSize zNear:0.001 zFar:1000];
to
uniforms->viewMatrix = [frame.camera viewMatrixForOrientation:[[UIApplication sharedApplication] statusBarOrientation]];
uniforms->projectionMatrix = [frame.camera projectionMatrixForOrientation:[[UIApplication sharedApplication] statusBarOrientation] viewportSize:_viewportSize zNear:0.001 zFar:1000];
and
CGAffineTransform displayToCameraTransform = CGAffineTransformInvert([frame displayTransformForOrientation:UIInterfaceOrientationLandscapeRight viewportSize:_viewportSize]);
to
CGAffineTransform displayToCameraTransform = CGAffineTransformInvert([frame displayTransformForOrientation:[[UIApplication sharedApplication] statusBarOrientation] viewportSize:_viewportSize]);
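In Swift, the same change would look roughly like the sketch below. This is not the sample's exact code: the renderer class, the viewportSize property, and the CameraUniforms struct are assumed stand-ins for the Metal template's own types.
import ARKit
import UIKit

// Hypothetical stand-in for the renderer's uniforms struct.
struct CameraUniforms {
    var viewMatrix = matrix_identity_float4x4
    var projectionMatrix = matrix_identity_float4x4
}

final class OrientationAwareRenderer {
    var viewportSize: CGSize = .zero
    var uniforms = CameraUniforms()

    func updateUniforms(from frame: ARFrame) {
        // Query the current interface orientation instead of hard-coding landscape.
        let orientation = UIApplication.shared.statusBarOrientation

        uniforms.viewMatrix = frame.camera.viewMatrix(for: orientation)
        uniforms.projectionMatrix = frame.camera.projectionMatrix(for: orientation,
                                                                  viewportSize: viewportSize,
                                                                  zNear: 0.001,
                                                                  zFar: 1000)

        // The same orientation has to be used when mapping display points back
        // into camera-image coordinates (e.g. for the background texture).
        let displayToCameraTransform = frame.displayTransform(for: orientation,
                                                              viewportSize: viewportSize).inverted()
        _ = displayToCameraTransform
    }
}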
Currently, in the iOS 11 beta, plane detection only tracks horizontal surfaces.
By default, plane detection is off. If you enable horizontal plane detection, the session adds ARPlaneAnchor objects and notifies your ARSessionDelegate, ARSCNViewDelegate, or ARSKViewDelegate object whenever its analysis of captured video images detects an area that appears to be a flat surface.
If you need vertical surfaces, you can handle them yourself with the hitTest(_:types:) function, which lets you check against surfaces or objects in the real world (see the sketch after the quote below).
Hit testing searches for real-world objects or surfaces detected through the AR session's processing of the camera image. A 2D point in the image coordinates can refer to any point along a 3D line that starts at the device camera and extends in a direction determined by the device orientation and camera projection. This method searches along that line, returning all objects that intersect it in order of distance from the camera.
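For example, here is a minimal Swift sketch. It assumes an ARSCNView called sceneView, and the result types chosen here are just one reasonable option, not the only one.
import ARKit

// Hit-test a screen point against whatever ARKit has detected along that ray.
// Results come back ordered by distance from the camera.
func worldTransform(at screenPoint: CGPoint, in sceneView: ARSCNView) -> simd_float4x4? {
    let results = sceneView.hitTest(screenPoint,
                                    types: [.existingPlaneUsingExtent, .featurePoint])
    return results.first?.worldTransform
}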
I am using RealityKit for human body detection.
let configuration = ARBodyTrackingConfiguration()
arView.session.run(configuration, options: [.resetTracking, .removeExistingAnchors])
A person always needs to stand quite far in front of the back camera, more than 4 m away. Can I apply a negative zoom? If I use the system camera app, I see that a person only needs to stand about 3 m away.
Thank you in advance.
Sadly, you can't zoom out with RealityKit. RealityKit uses the camera feed provided by ARKit, and it's fixed at a focal length of 28 mm.
I am developing an augmented reality application that tracks an object via the camera (a real object, using Vuforia); my aim is to measure the distance it travels.
I am using unity + Vuforia.
For each frame, I calculate the distance between the first position and the current position (a vector calculation).
But I get wrong position details, and camera movements affect the result.
(I don't want to take the camera offset into account.)
Any solution?
To make it clearer, I want to implement this experience (video):
https://youtu.be/-c5GiXuATh4
From the comments and the question, I understood the problem is using the camera as the origin. This means that in every frame of your application the camera is the origin, and the positions of all trackables are calculated relative to the camera. Therefore, even if you do not move your target, its position will change because of camera movement.
To eliminate this problem, I would recommend using extended tracking. This will minimize the impact of camera movement on the position of your target. You can test this by adding a trail renderer to your image target; you will see the image stays at a fixed position regardless of camera movement.
I am making a viewer for a big model in Unity3D for mobile.
How do I rotate the camera around itself with one touch?
Here is the documentation about touch input: Unity Doc. For example, you can map the touch position across the screen width and height to an angle and rotate the camera object. Or you can use one of the assets from the Asset Store, e.g. Simple Touch Camera Script.
I'm working on an app that performs geolocalization over the camera stream of the iPhone. I use the compass to figure out where to put the icons and information onto the camera layer. If I rotate the device around the yaw axis, everything works fine.
However, when I roll the iPhone, all the information on screen goes away. That's because when you roll the device, the compass orientation also changes. However, there are apps like Layar or Wikitude that allow roll rotation without losing focus on the visual items you have on the camera layer. That way, these apps allow a smooth transition between portrait and landscape orientation.
How do they achieve that? How can I compensate for the roll rotation of the device to keep the information on screen?
By the way, the ARKit framework has the same problem.
Thanks.
If you are in 2D, it may be enough to take the point you are calculating from the camera field of view and heading offset, compute its distance to the center of the screen, and use that distance as the radius of a circle, doing x += r*cos, y += r*sin with -roll as the angle, so the object moves in a circle against the roll. Then you just have to counter-rotate the image itself with a transform (CGAffineTransformMakeRotation) to keep it vertical. A sketch of this is below.
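A rough Swift sketch of that idea, assuming the icon is a UIView, its nominal on-screen position has already been computed from the heading, and roll comes from CoreMotion in radians; names and parameters here are illustrative, not from the original answer.
import UIKit

// Move the overlay on a circle against the roll, then counter-rotate it so it stays upright.
func compensateRoll(for iconView: UIView, basePosition: CGPoint, screenCenter: CGPoint, roll: CGFloat) {
    // Radius = distance from the icon's nominal position to the screen center.
    let dx = basePosition.x - screenCenter.x
    let dy = basePosition.y - screenCenter.y
    let radius = hypot(dx, dy)
    let baseAngle = atan2(dy, dx)

    // Rotate the position against the roll so it keeps pointing at the real-world direction.
    let angle = baseAngle - roll
    iconView.center = CGPoint(x: screenCenter.x + radius * cos(angle),
                              y: screenCenter.y + radius * sin(angle))

    // Counter-rotate the view itself so its content stays vertical.
    iconView.transform = CGAffineTransform(rotationAngle: -roll)
}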
I have a scene divided into 3 different planes, and I want to move these planes left or right relative to the rotation of the device on its axis, so that when the device is at 0 degrees to the surface the planes are centered. I saw this effect on the home screen of a game on my iPod touch (first generation).
Which sensor must I work with to create a similar effect?
To achieve a parallax effect, add the CoreMotion framework to your project and construct a CMMotionManager. Then for a device that has a gyroscope, you can use startDeviceMotionUpdatesToQueue:withHandler: and inspect motion.attitude.roll in your handler block.
For a device that doesn't have a gyroscope, you can use startAccelerometerUpdatesToQueue:withHandler: and inspect accelerometerData.acceleration.x, or you can use UIAccelerometer and implement UIAccelerometerDelegate. Either way, you'll probably want to create a low-pass filter to help distinguish gravity from linear acceleration. The GLGravity project has an example of this.
See the section on Motion Events in the Event Handling Guide for iOS.
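A minimal Swift sketch of the gyroscope path described above; the plane views, the update interval, and the parallaxStrength constant are assumptions for illustration, not part of the original answer.
import CoreMotion
import UIKit

final class ParallaxController {
    private let motionManager = CMMotionManager()
    private let parallaxStrength: CGFloat = 60  // points of offset per radian of roll (tuned by eye)

    func start(updating planes: [UIView]) {
        guard motionManager.isDeviceMotionAvailable else { return }
        motionManager.deviceMotionUpdateInterval = 1.0 / 60.0
        motionManager.startDeviceMotionUpdates(to: .main) { [weak self] motion, _ in
            guard let self = self, let motion = motion else { return }
            // Device motion is already gravity-compensated, so no low-pass filter is needed here.
            let offset = CGFloat(motion.attitude.roll) * self.parallaxStrength
            for (index, plane) in planes.enumerated() {
                // Shift each plane by a different amount to get the depth illusion.
                plane.transform = CGAffineTransform(translationX: offset * CGFloat(index + 1), y: 0)
            }
        }
    }

    func stop() {
        motionManager.stopDeviceMotionUpdates()
    }
}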
Try this UIView category:
https://github.com/Przytua/UIView-MWParallax
You get the parallax effect on chosen UIView (or subclass) objects by simply setting one property value.