Camera Zoom Issue on iPhone X, iPhone XS, etc. - Swift

The main issue I am having is that the camera is zoomed in too far on the iPhone X, XS, XS Max, and XR models.
My camera is a full-screen camera, which is fine on the smaller iPhones, but on the models mentioned above it seems to be stuck at the maximum zoom level. What I really want is behavior similar to Instagram's camera: full screen on all models up to the iPhone X series, and then either respect the edge insets or, if it is going to be full screen, not be zoomed in as far as it is now.
My thought process is to use something like this:
Determine the device. I figure I can use something like DeviceGuru to determine the type of device.
The GitHub repo can be found here --> https://github.com/InderKumarRathore/DeviceGuru
Using this tool or a similar one, I should be able to get the screen dimensions for the device. Then I can do some math to determine the proper size for the camera view.
If DeviceGuru didn't work, I would just use something like this to get the width and height of the screen:
// Screen width.
public var screenWidth: CGFloat {
    return UIScreen.main.bounds.width
}

// Screen height.
public var screenHeight: CGFloat {
    return UIScreen.main.bounds.height
}
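Alternatively, if the goal is just to respect the notch rather than fill the whole screen, a simpler sketch (assuming the PreviewView class shown further below is added inside a view controller; previewView is a placeholder name) is to pin the preview to the safe area instead of computing sizes per device:

// Pin the preview to the safe area so the iPhone X-series notch and
// home indicator are respected; no per-device math needed.
previewView.translatesAutoresizingMaskIntoConstraints = false
view.addSubview(previewView)
NSLayoutConstraint.activate([
    previewView.topAnchor.constraint(equalTo: view.safeAreaLayoutGuide.topAnchor),
    previewView.bottomAnchor.constraint(equalTo: view.safeAreaLayoutGuide.bottomAnchor),
    previewView.leadingAnchor.constraint(equalTo: view.leadingAnchor),
    previewView.trailingAnchor.constraint(equalTo: view.trailingAnchor)
])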
This is the block of code I am using to fill the camera. However, I want to turn it into something that is based on the device size, as opposed to just filling the screen regardless of the phone:
import Foundation
import UIKit
import AVFoundation

class PreviewView: UIView {

    var videoPreviewLayer: AVCaptureVideoPreviewLayer {
        guard let layer = layer as? AVCaptureVideoPreviewLayer else {
            fatalError("Expected `AVCaptureVideoPreviewLayer` type for layer. Check PreviewView.layerClass implementation.")
        }
        layer.videoGravity = AVLayerVideoGravity.resizeAspectFill
        layer.connection?.videoOrientation = .portrait
        return layer
    }

    var session: AVCaptureSession? {
        get {
            return videoPreviewLayer.session
        }
        set {
            videoPreviewLayer.session = newValue
        }
    }

    // MARK: UIView

    override class var layerClass: AnyClass {
        return AVCaptureVideoPreviewLayer.self
    }
}
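For reference, a minimal usage sketch (captureSession here is a placeholder for an already-configured AVCaptureSession):

let previewView = PreviewView(frame: view.bounds)
previewView.session = captureSession
view.addSubview(previewView)
// Start the session off the main thread to avoid blocking the UI.
DispatchQueue.global(qos: .userInitiated).async {
    captureSession.startRunning()
}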
I want my camera to look something like this
or this
Not this (what my current camera looks like)
I have looked at many questions and nobody has a concrete solution, so please don't mark this as a duplicate, and please don't say it's just an issue with the iPhone X series.

Firstly, you need to update your question with the relevant information, since otherwise it gives only a vague idea to anyone else with less experience.
Looking at the images, it is clear that the type of camera you are trying to access is quite different from the one you have. With the introduction of the iPhone 7 Plus and iPhone X, Apple introduced several different camera devices. All of these are accessible through AVCaptureDevice.DeviceType.
So, looking at what you want to achieve, it is clear that you want more field of view within the screen. This is what the .builtInWideAngleCamera device type gives you. Changing to it will solve your problem.
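For example, a minimal sketch of picking the wide-angle camera when configuring the session (captureSession is a placeholder for your own session):

// Explicitly request the wide-angle back camera instead of whatever
// default (e.g. telephoto) the session may otherwise resolve to.
guard let wideCamera = AVCaptureDevice.default(.builtInWideAngleCamera,
                                               for: .video,
                                               position: .back) else {
    fatalError("No wide-angle camera available")
}
do {
    let input = try AVCaptureDeviceInput(device: wideCamera)
    if captureSession.canAddInput(input) {
        captureSession.addInput(input)
    }
} catch {
    print("Could not create camera input: \(error)")
}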
Cheers

Related

Xcode AR Reference Image is too similar

I have an augmented reality app that plays a video on top of images, and it was working very well, but as soon as I neared 50+ images, I started getting an error on some of the images: "AR reference image 'x' is too similar to 'y'." I am panicking because I need this done quickly, and this error appears at random for no apparent reason. In the linked picture, the reference images are clearly not similar in any way, and when I even change the name of one of the pictures, it resolves itself at first, until more of the same error comes up on different reference images. If anyone has any theories or questions, please post them here! Thank you so much to anyone who can shine some light on this issue!
Image of AR Reference Image folder with error on pictures:
https://imgur.com/a/U3dlFef
Update: I changed every image to be numbered 1-39, and the same images that had errors in the last picture still had errors, so it must be something related to the pictures themselves. Still confused how, though. I tried deleting every reference image and re-uploading the exact same images, and after giving each its physical dimensions, the same error popped up for 2 images still.
Is it possible to upload this update to Apple with this error and have them allow it through? I did a test build on my device and tested all the images with errors, and they all work as intended. I currently have no solution to a problem that seems very superficial. Thanks again!
An ARReferenceImage is created from these images, and even though your human eyes see fully different images, after parsing it might be that the detected structure is too similar (because ARKit is not looking for exact pixels but rather for characteristics of a given image). This is why your problem appears as the number of images increases (a bigger chance of similar characteristics). So your best bet might be to use fewer images or, if that is not possible, change images until all warnings disappear.
If you don't use these images at the same time (in the same AR session), for example if you have some kind of selection in your application, you can use a simple asset catalog for the images and load the reference images from code. I use this method because our application downloads plain images for markers, and I create the reference images programmatically:
private func startTracking(WithMarkerImage marker: UIImage) {
    let configuration = ARWorldTrackingConfiguration()
    configuration.planeDetection = [.horizontal, .vertical]

    // Map UIImage.Orientation onto CGImagePropertyOrientation.
    var orientation: CGImagePropertyOrientation
    switch marker.imageOrientation {
    case .up:
        orientation = .up
    case .down:
        orientation = .down
    case .left:
        orientation = .left
    case .right:
        orientation = .right
    case .upMirrored:
        orientation = .upMirrored
    case .downMirrored:
        orientation = .downMirrored
    case .leftMirrored:
        orientation = .leftMirrored
    case .rightMirrored:
        orientation = .rightMirrored
    @unknown default:
        orientation = .up
    }

    // physicalWidth is the marker's real-world width in meters.
    let referenceImage = ARReferenceImage(marker.cgImage!, orientation: orientation, physicalWidth: 0.15)
    configuration.detectionImages = [referenceImage]
    configuration.environmentTexturing = .none
    configuration.isLightEstimationEnabled = true
    _arSession.run(configuration, options: [.removeExistingAnchors, .resetTracking])
}
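A hypothetical call site, once a marker image has been downloaded (downloadedMarker is a placeholder UIImage, not part of the original answer):

startTracking(WithMarkerImage: downloadedMarker)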

ARKit – How to use ARImageTrackingConfiguration with ARFaceTrackingConfiguration?

I have built an app that uses image tracking and swaps flat images. I am also using people occlusion (now), so people can get photos in front of those images. I really want this app to have a selfie mode, so people can take their own pictures in front of image-swapped areas.
I'm reading about the features in ARKit 3.5, but as far as I can tell, the only front-facing camera support is ARFaceTrackingConfiguration, which doesn't support image tracking. ARImageTrackingConfiguration and ARWorldTrackingConfiguration only use the back camera.
Is there any way to make a selfie mode with people occlusion (and image tracking) using the front-facing camera?
The answer is NO: you can't use any ARConfiguration except ARFaceTrackingConfiguration with the front camera. You can, however, simultaneously use ARFaceTrackingConfiguration on the front camera and ARWorldTrackingConfiguration on the rear camera. This allows users to interact with AR content in the rear camera feed using their face as a sort of controller.
Look at this docs page to find out which configuration corresponds to which camera (rear or front).
Here's a table containing ARKit 5.0's eight tracking configurations:

ARConfiguration                        What Camera?
ARWorldTrackingConfiguration           Rear
ARBodyTrackingConfiguration            Rear
AROrientationTrackingConfiguration     Rear
ARImageTrackingConfiguration           Rear
ARFaceTrackingConfiguration            Front
ARObjectScanningConfiguration          Rear
ARPositionalTrackingConfiguration      Rear
ARGeoTrackingConfiguration             Rear
Simultaneous World and Face configs
To run World Tracking driven by Face Tracking, use the following code:
override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)

    guard ARFaceTrackingConfiguration.isSupported,
          ARFaceTrackingConfiguration.supportsWorldTracking
    else {
        fatalError("We can't do face tracking")
    }

    let config = ARFaceTrackingConfiguration()
    config.isWorldTrackingEnabled = true
    sceneView.session.run(config)
}
Or you can use Face Tracking as a secondary configuration:
override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)

    let config = ARWorldTrackingConfiguration()
    if ARFaceTrackingConfiguration.isSupported {
        config.userFaceTrackingEnabled = true
    }
    sceneView.session.run(config)
}
Note that both properties are available on iOS 13 and higher.
var userFaceTrackingEnabled: Bool { get set }
var isWorldTrackingEnabled: Bool { get set }
P.S.
At the moment, .userFaceTrackingEnabled = true still doesn't work. I tested it in Xcode 13.2.1 on an iPad Pro 4th Gen with iPadOS 15.3 installed.

How to have SpriteKit automatically resize depending on the dimensions of the phone

At the moment, to handle the different screen sizes of the iPhone 4 and 5, I use the following:
if ((int)[[UIScreen mainScreen] bounds].size.width == 568)
{
    -----
} else {
    -----
}
But with the addition of the 6 and 6 Plus, this just seems like too much code to write for everything that requires a position on the screen. Is there an easier way to do this, where it automatically resizes for you?
I wrote an application using iPhone 6 dimensions in my SpriteKit .sks files. When switching scenes I used the following code, and the game scaled automatically when testing on different devices:

// YourScene is your SKScene subclass; SKScene(fileNamed:) loads the .sks file.
if let yourScene = YourScene(fileNamed: "YourScene") {
    yourScene.scaleMode = .aspectFill
    self.view?.presentScene(yourScene, transition: transition)
}
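A slightly fuller sketch of the same idea at app startup (assuming a GameScene.sks authored at a single design resolution, e.g. iPhone 6; names are illustrative):

import SpriteKit
import UIKit

class GameViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        guard let skView = view as? SKView,
              let scene = SKScene(fileNamed: "GameScene") else { return }
        // Author the scene once, then let .aspectFill scale it to every
        // device, cropping whatever overflows the screen's aspect ratio.
        scene.scaleMode = .aspectFill
        skView.presentScene(scene)
    }
}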

How to get the amount of available light in the area surrounding the iPhone using Objective-C? [duplicate]

I notice on my iPhone that after a few seconds of being in direct sunlight, the screen will adjust to become brighter, dimmer, etc. I was wondering if there is a way to interact with this sensor?
I have an application which is used outside. When you go into direct light, it becomes very difficult to see the screen for a few moments before it adjusts. And even then, it's not always as bright as I'd like it to be. I would like to implement a high-contrast skin for outdoor viewing and a low-contrast one for indoor viewing.
Is it possible to read light sensor data, and if so, how do I extract these sensor values?
I would assume there is a light sensor, however, as the camera knows when to use the flash.
On the other hand, here is a different idea (maybe a silly one): using the brightness of the device's screen, you can get some measure of the external conditions.
From 0.12 (dark) to 0.99 (light).
The next line will get those values; give it a try, and put some light on and off over the device to get different values.
NSLog(@"Screen Brightness: %f", [[UIScreen mainScreen] brightness]);
Obviously, the Automatic Brightness feature has to be turned on in order to get this to work.
Regards.
To read the ambient light sensor data, you need to use IOHID in the IOKit framework.
http://iphonedevwiki.net/index.php/AppleISL29003
http://iphonedevwiki.net/index.php/IOKit.framework
However, this requires private headers, so if you use it, Apple probably won't let your app into the App Store.
I have repeatedly asked on the iOS forums whether there will be support for ambient light sensor readings in the future, but to no avail.
You can actually use the camera to do this, which is independent of the user's screen brightness settings (and works even if Automatic Brightness is OFF).
You read the Brightness Value from the video frames' metadata as I explain in this Stack Overflow answer.
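A minimal sketch of that approach (assuming an AVCaptureSession with an AVCaptureVideoDataOutput is already set up; the class name is a placeholder):

import AVFoundation
import ImageIO

// Hypothetical delegate class; wire it to an AVCaptureVideoDataOutput on your session.
final class FrameBrightnessReader: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Each video frame carries EXIF metadata, including BrightnessValue.
        guard let metadata = CMCopyDictionaryOfAttachments(allocator: nil,
                                                           target: sampleBuffer,
                                                           attachmentMode: kCMAttachmentMode_ShouldPropagate) as? [String: Any],
              let exif = metadata[kCGImagePropertyExifDictionary as String] as? [String: Any],
              let brightness = exif[kCGImagePropertyExifBrightnessValue as String] as? Double
        else { return }
        // Higher values mean a brighter scene; calibrate the threshold empirically.
        print("EXIF BrightnessValue: \(brightness)")
    }
}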
Try using GSEventSetBacklightLevel(), which requires <GraphicsServices/GraphicsServices.h> (a private framework). This is how one can programmatically adjust the brightness level. There is also a get variant, so I think that may have the information you're after.
For Swift 5, here is how to use brightness detection, which indirectly gives you the luminosity of the surroundings:

/// A view controller (you can use any UIView or AnyObject)
class MyViewController: UIViewController {

    /// Remove observers on deinit
    deinit {
        removeObservers()
    }

    // MARK: - Observer management helpers

    /// Add my observers to the vc
    func addObservers() {
        NotificationCenter.default.addObserver(self,
                                               selector: #selector(onScreenBrightnessChanged(_:)),
                                               name: UIScreen.brightnessDidChangeNotification,
                                               object: nil)
    }

    /// Clean up observers
    func removeObservers() {
        NotificationCenter.default.removeObserver(self)
    }

    /// Set up after the view loads
    override func viewDidLoad() {
        super.viewDidLoad()
        // Add my observers to the vc
        addObservers()
    }

    /// Handles brightness changes
    @objc func onScreenBrightnessChanged(_ sender: Notification) {
        // Tweak as needed: 0.5 is a good value for me
        let isDark = UIScreen.main.brightness < 0.5 // in 0...1
        // Do whatever you want with the `isDark` flag: here I toggle a vehicle's headlights
        vehicle.turnOnTheHeadlights(isDark)
    }
}
For iOS 14 and above, Apple provides SensorKit (https://developer.apple.com/documentation/sensorkit/srsensor/3377673-ambientlightsensor) for explicit access to all kinds of sensors and system logs (call logs, message logs, etc.). In addition to the raw lux value, you can also get the chromaticity of the ambient light and the light source's orientation relative to the device sensor.
(From https://developer.apple.com/documentation/sensorkit/srambientlightsample)
Measuring Light Level
var chromaticity: SRAmbientLightSample.Chromaticity
    A coordinate pair that describes the sample's light brightness and tint.
struct SRAmbientLightSample.Chromaticity
    A coordinate pair that describes light brightness and tint.
var lux: Measurement<UnitIlluminance>
    An object that describes the sample's luminous flux.
var placement: SRAmbientLightSample.SensorPlacement
    The light's location relative to the sensor.
enum SRAmbientLightSample.SensorPlacement
    Directional values that describe light-source location with respect to the sensor.
However, you need to request approval for such an app to be accepted and published on the App Store.
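As a rough sketch of what reading these samples can look like (hedged: SensorKit needs a special entitlement plus per-sensor user authorization, data is withheld for a 24-hour quarantine, and the class and method names below are taken from the SensorKit documentation, not from the original answer):

import SensorKit

final class AmbientLightReader: NSObject, SRSensorReaderDelegate {
    private let reader = SRSensorReader(sensor: .ambientLightSensor)

    func start() {
        reader.delegate = self
        // Ask the user for SensorKit access to the ambient light sensor.
        SRSensorReader.requestAuthorization(sensors: [.ambientLightSensor]) { error in
            guard error == nil else { return }
            self.reader.startRecording()
        }
    }

    /// Fetch samples from the last two days (the most recent 24 hours stay quarantined).
    func fetchRecentSamples() {
        let request = SRFetchRequest()
        let now = SRAbsoluteTimeGetCurrent()
        request.from = now - 48 * 60 * 60
        request.to = now
        reader.fetch(request)
    }

    /// Called once per fetched sample; return true to keep receiving results.
    func sensorReader(_ reader: SRSensorReader,
                      fetching fetchRequest: SRFetchRequest,
                      didFetchResult result: SRFetchResult<AnyObject>) -> Bool {
        if let sample = result.sample as? SRAmbientLightSample {
            print("Ambient light: \(sample.lux), placement: \(sample.placement)")
        }
        return true
    }
}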

Is it possible to prevent iPhone/iPad orientation changing in the browser?

I've seen similar questions on this issue, but they are related to native apps.
I build web apps for the iPhone/iPad that run in the browser (Safari).
I was wondering if there is a way to prevent orientation change in the browser, perhaps via some meta tag.
I know of the viewport meta tag which allows you to specify scale and zooming capabilities, so I figured maybe there is something similar for orientation.
I doubt it is possible, but I thought I'd just pop a question on here to see.
You can detect orientation changes with onorientationchange.
JavaScript:
/* window.orientation returns a value that indicates whether the
   iPhone is in portrait mode or landscape mode */
window.onorientationchange = function() {
    var orientation = window.orientation;
    switch (orientation) {
        case 0:
            // Portrait mode
            break;
        case 90:
            // Landscape left
            break;
        case -90:
            // Landscape right
            break;
    }
};
It's described in the iPhone docs.
It does not appear to be possible to prevent orientation change, but you can detect it with the code mentioned above.
If you want to prevent orientation change, it appears as though you'll need to build a native app using WebKit that quashes the event.
You could be really tricky and rotate the body of the website to counteract the orientation change if nothing else works...
Why can't you just put an event.preventDefault() inside the function?
You cannot prevent it, but you can change the orientation of the body according to the orientation of the iPad.
Use medopal's answer to detect the orientation and change the orientation of your body element.
For example:
document.getElementsByTagName("body")[0].style.webkitTransform = "rotate(-90deg)";
This bit of code will keep the orientation at -90 degrees (nice for landscape mode; the iPad 2's cover will be hanging over the back). Put this in a script below the closing body tag, or wrap the last two lines in an onready call.
function orient() {
    var keepOrientationAt = -90;
    var rotate = "rotate(" + (keepOrientationAt - window.orientation) + "deg)";
    document.getElementsByTagName("body")[0].style.webkitTransform = rotate;
}
window.onorientationchange = orient;
orient();