I'm trying to retrieve the map view's current zoom level in Google Maps using Swift. The documentation states that it should be mapView.camera.zoom, but Xcode reports that no such member exists.
let camera = self.mapView.camera.zoom
This is the error I receive:
"Value of type '(GMSCoordinateBounds, UIEdgeInsets) -> GMSCameraPosition?' has no member 'zoom'"
You can get the zoom level this way:
let zoom = self.mapView.camera.zoom
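The error message above suggests the compiler resolved camera to the camera(for:insets:) method reference rather than the camera property, so make sure self.mapView really is a concrete GMSMapView instance. For reference, here is a small sketch (the view controller name is mine) that reads the same value from the GMSMapViewDelegate callback whenever the camera moves; remember to set mapView.delegate first:
import GoogleMaps

extension MapViewController: GMSMapViewDelegate {
    // Called continuously while the camera moves; position.zoom is a Float.
    func mapView(_ mapView: GMSMapView, didChange position: GMSCameraPosition) {
        print("Current zoom: \(position.zoom)")
    }
}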
The main issue I am having is with my camera: it is zoomed in too far on the iPhone X, Xs, Xs Max, and XR models.
My camera is full screen, which is fine on the smaller iPhones, but on the models mentioned above the camera seems to be stuck at the maximum zoom level. What I really want is behavior similar to how Instagram's camera works: full screen on all models up to the iPhone X series, and then either respect the safe-area edge insets or, if it stays full screen, not be zoomed in as far as it is now.
My thought process is to use something like this.
Determine the device. I figure I can use something like DeviceGuru to determine the type of device.
The GitHub repo can be found here: https://github.com/InderKumarRathore/DeviceGuru
Using this tool or a similar tool I should be able to get the screen dimensions for the device. Then I can do some type of math to determine the proper screen size for the camera view.
If DeviceGuru didn't work, I would just use something like this to get the width and height of the screen:
// Screen width.
public var screenWidth: CGFloat {
    return UIScreen.main.bounds.width
}

// Screen height.
public var screenHeight: CGFloat {
    return UIScreen.main.bounds.height
}
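For what it's worth, one way the "some type of math" step could look (my own sketch, not from the question) is to size the preview from the safe area at runtime, which handles the iPhone X-series insets automatically:
import UIKit

// Hypothetical helper: returns a 4:3 preview rect centered inside the safe area,
// so notched devices don't force the preview to over-zoom to fill the screen.
func previewFrame(in view: UIView) -> CGRect {
    let safeBounds = view.bounds.inset(by: view.safeAreaInsets)
    let targetHeight = safeBounds.width * 4.0 / 3.0   // back camera's photo aspect ratio
    let height = min(targetHeight, safeBounds.height)
    return CGRect(x: safeBounds.minX,
                  y: safeBounds.midY - height / 2,
                  width: safeBounds.width,
                  height: height)
}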
This is the block of code I am using to fill the camera. However, I want to turn it into something that is based on the device size, as opposed to just filling the screen regardless of the phone:
import Foundation
import UIKit
import AVFoundation

class PreviewView: UIView {

    var videoPreviewLayer: AVCaptureVideoPreviewLayer {
        guard let layer = layer as? AVCaptureVideoPreviewLayer else {
            fatalError("Expected `AVCaptureVideoPreviewLayer` type for layer. Check PreviewView.layerClass implementation.")
        }
        layer.videoGravity = AVLayerVideoGravity.resizeAspectFill
        layer.connection?.videoOrientation = .portrait
        return layer
    }

    var session: AVCaptureSession? {
        get {
            return videoPreviewLayer.session
        }
        set {
            videoPreviewLayer.session = newValue
        }
    }

    // MARK: UIView

    override class var layerClass: AnyClass {
        return AVCaptureVideoPreviewLayer.self
    }
}
I want my camera to look something like this:
or this:
Not this (this is what my current camera looks like):
I have looked at many questions and nobody has a concrete solution, so please don't mark this as a duplicate, and please don't just say it's an issue with the iPhone X series.
Firstly, you should update your question with more precise information, since as it stands it only gives a vague idea to anyone else with less experience.
Looking at the images, it is clear that the type of camera you are trying to access is quite different from the one you are currently using. With the introduction of the iPhone 7 Plus and iPhone X, Apple introduced several different camera devices, all of which are accessible through AVCaptureDevice.DeviceType.
Looking at what you want to achieve, it is clear that you want more field of view within the screen. That is what the .builtInWideAngleCamera device type gives you; switching to it should solve your problem, as sketched below.
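For illustration, a minimal sketch of picking the wide-angle back camera with a discovery session and plugging it into a capture session (the surrounding session setup is assumed, not taken from your code):
import AVFoundation

// Find the back wide-angle camera explicitly instead of whatever default is in use.
let discovery = AVCaptureDevice.DiscoverySession(deviceTypes: [.builtInWideAngleCamera],
                                                 mediaType: .video,
                                                 position: .back)
if let camera = discovery.devices.first {
    let session = AVCaptureSession()
    session.sessionPreset = .photo
    if let input = try? AVCaptureDeviceInput(device: camera), session.canAddInput(input) {
        session.addInput(input)
    }
    // previewView.session = session   // hand the session to the PreviewView above
}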
Cheers
I notice that on my iPhone, after a few seconds in direct sunlight, the screen adjusts to become brighter, dimmer, etc. I was wondering whether there is a way to interact with this sensor?
I have an application which is used outside. When you go into direct light, it becomes very difficult to see the screen for a few moments before it adjusts, and even then it's not always as bright as I'd like it to be. I would like to implement a high-contrast skin for outdoor viewing and a low-contrast one for indoor viewing.
Is it possible to read the light sensor data, and if so, how do I extract these sensor values?
I would assume there is a light sensor, since the camera knows when to use the flash.
On the other hand, here is a different idea (maybe a silly one): using the brightness of the device's screen you can get some measure of the external conditions.
It ranges from 0.12 (dark) to 0.99 (light).
The next line will read that value; give it a try, and put some light on and off over the device to get different values.
NSLog(@"Screen Brightness: %f", [[UIScreen mainScreen] brightness]);
Obviously, the Auto-Brightness feature must be turned on in order for this to work.
Regards.
To read the ambient light sensor data, you need to use IOHID in the IOKit framework.
http://iphonedevwiki.net/index.php/AppleISL29003
http://iphonedevwiki.net/index.php/IOKit.framework
However, this requires private headers, so if you use it, Apple probably won't let your app into the app store.
I continually ask the iOS forums whether there will be support for ambient light sensor readings in the future, but to no avail.
You can actually use the camera to do this, which is independent of the user's screen brightness settings (and works even if Automatic Brightness is OFF).
You read the Brightness Value from the video frames' metadata as I explain in this Stack Overflow answer.
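A rough sketch of that approach (the class name and wiring are mine; it assumes an AVCaptureVideoDataOutput is attached to a running session with this object as its sample-buffer delegate):
import AVFoundation
import ImageIO

final class BrightnessReader: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Copy the frame's attachments and pull the EXIF BrightnessValue out of them.
        guard let attachments = CMCopyDictionaryOfAttachments(allocator: kCFAllocatorDefault,
                                                              target: sampleBuffer,
                                                              attachmentMode: kCMAttachmentMode_ShouldPropagate) as? [String: Any],
              let exif = attachments[kCGImagePropertyExifDictionary as String] as? [String: Any],
              let brightness = exif[kCGImagePropertyExifBrightnessValue as String] as? Double
        else { return }
        print("EXIF BrightnessValue: \(brightness)")   // roughly tracks ambient light
    }
}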
Try GSEventSetBacklightLevel(), which requires <GraphicsServices/GraphicsServices.h>. This is how one can programmatically adjust the brightness level. There is also a corresponding get call, so I think that may have the information you're after.
For Swift 5, here is how to use brightness detection, which indirectly gives you the ambient luminosity:
/// A view controller (you can use any UIView or AnyObject)
class MyViewController: UIViewController {

    /// Remove observers on deinit
    deinit {
        removeObservers()
    }

    // MARK: - Observer management helpers

    /// Add my observers to the view controller
    func addObservers() {
        NotificationCenter.default.addObserver(self,
                                               selector: #selector(onScreenBrightnessChanged(_:)),
                                               name: UIScreen.brightnessDidChangeNotification,
                                               object: nil)
    }

    /// Clean up observers
    func removeObservers() {
        NotificationCenter.default.removeObserver(self)
    }

    /// Load the view
    override func loadView() {
        super.loadView()
        // Add my observers to the view controller
        addObservers()
    }

    /// Handles brightness changes
    @objc func onScreenBrightnessChanged(_ sender: Notification) {
        // Tweak as needed: 0.5 is a good value for me
        let isDark = UIScreen.main.brightness < 0.5 // brightness is in 0...1
        // Do whatever you want with the `isDark` flag: here I turn the headlights on when it's dark
        vehicle.turnOnTheHeadlights(isDark)
    }
}
For iOS 14 and above, Apple provides SensorKit (https://developer.apple.com/documentation/sensorkit/srsensor/3377673-ambientlightsensor) for explicit access to all kinds of sensors and system logs (call logs, message logs, etc.). In addition to the raw lux value, you can also get the chromaticity of the ambient light and the placement of the light source relative to the device's sensor.
(From https://developer.apple.com/documentation/sensorkit/srambientlightsample )
Measuring Light Level
var chromaticity: SRAmbientLightSample.Chromaticity
    A coordinate pair that describes the sample's light brightness and tint.
struct SRAmbientLightSample.Chromaticity
    A coordinate pair that describes light brightness and tint.
var lux: Measurement
    An object that describes the sample's luminous flux.
var placement: SRAmbientLightSample.SensorPlacement
    The light's location relative to the sensor.
enum SRAmbientLightSample.SensorPlacement
    Directional values that describe light-source location with respect to the sensor.
However, you need to request approval from Apple for such an app to be accepted and published on the App Store.
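If it helps, here is a heavily hedged sketch of the SensorKit flow; the class and method names are mine, it needs a SensorKit entitlement granted by Apple plus per-sensor user consent, and samples only become fetchable after a holding period:
import SensorKit

@available(iOS 14.0, *)
final class AmbientLightReader: NSObject, SRSensorReaderDelegate {
    private let reader = SRSensorReader(sensor: .ambientLightSensor)

    func start() {
        reader.delegate = self
        SRSensorReader.requestAuthorization(sensors: [.ambientLightSensor]) { error in
            guard error == nil else { return }
            self.reader.startRecording()   // begin logging ambient light samples
        }
    }

    // Delivered for each stored sample after calling reader.fetch(SRFetchRequest()).
    func sensorReader(_ reader: SRSensorReader,
                      fetching fetchRequest: SRFetchRequest,
                      didFetchResult result: SRFetchResult<AnyObject>) -> Bool {
        if let sample = result.sample as? SRAmbientLightSample {
            print("lux: \(sample.lux), placement: \(sample.placement)")
        }
        return true
    }
}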
I am making an iPhone application in which I am using MapKit. I have a static pointer (arrow) in the middle of the screen, and behind that pointer I am showing the iOS map with user interaction enabled, meaning the user can move the map and zoom in or out.
My task is to get the longitude and latitude of the location that sits under my screen pointer (arrow).
Can anyone help me with how to achieve this? I have a good command of the map APIs; I just need an idea.
You can get the middle point of the screen using:
CGPoint screenCenterPoint = self.view.center;
and then convert it to a coordinate using (where self.mapView is your MKMapView):
CLLocationCoordinate2D mapCenterPoint = [self.mapView convertPoint:screenCenterPoint toCoordinateFromView:self.view];
If you use Google Maps, then you can use this, too:
CLLocationCoordinate2D mapCenterPoint = [self.mapView.projection coordinateForPoint: screenCenterPoint];
To learn more look at GMSProjection and GMSMapView.
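For completeness, the MapKit conversion in Swift (assuming mapView is your MKMapView and the arrow sits at the visual center of the map):
import MapKit

let centerPoint = CGPoint(x: mapView.bounds.midX, y: mapView.bounds.midY)
let coordinate = mapView.convert(centerPoint, toCoordinateFrom: mapView)
print("Pointer is at \(coordinate.latitude), \(coordinate.longitude)")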
The necessary functionality is built into MKMapView.
CLLocationCoordinate2D pinCoordinate = [self.pickupLocationView convertPoint:CGPointMake(self.pickupLocationView.frame.size.width / 2, self.pickupLocationView.frame.size.height) toCoordinateFromView:self.view];
This uses MapKit, as the poster requested. I believe Akash's solution would work if the question were about Google's map.
I am showing a map view using the MapKit framework in my iPhone application. I am also showing pins on the map for my friends, based on where they are. I would now like to enhance this to show road directions from my location to a selected friend's location. I know the latitude and longitude of both source and destination, but I need to draw the route line along the roads; it should follow the road directions. Could someone help me with this?
Thank you!
MapKit does not expose a means of performing driving directions. So, it's not as simple as asking the map to display a course from location A to location B. You have two options:
1) Integrate with Google's API to get the driving directions, and overlay your own lines onto the MapKit map.
or
2) Simply direct your users out of your app and delegate this functionality to the built-in Maps app.
I have no experience with the former, but the latter is very easy. Simply:
CLLocationCoordinate2D location = [[map userLocation] location].coordinate;
double currentLat = location.latitude;
double currentLong = location.longitude;
NSString *googleUrl = [[NSString alloc] initWithFormat:@"http://maps.google.com/maps?saddr=%f,%f&daddr=%f,%f", currentLat, currentLong, item.latitude, item.longitude];
NSLog(@"%@", googleUrl);
[[UIApplication sharedApplication] openURL:[[NSURL alloc] initWithString:googleUrl]];
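If you prefer Swift, the same "hand off to Maps" option looks roughly like this; destination stands in for the friend's item.latitude/item.longitude in the snippet above:
import UIKit
import CoreLocation

func openDirections(from current: CLLocationCoordinate2D, to destination: CLLocationCoordinate2D) {
    let urlString = "http://maps.google.com/maps?saddr=\(current.latitude),\(current.longitude)&daddr=\(destination.latitude),\(destination.longitude)"
    if let url = URL(string: urlString) {
        UIApplication.shared.open(url)   // hands the request off to Maps / Safari
    }
}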
Actually, there is no API in the iPhone SDK for drawing a route on the map. There is a repo on GitHub which uses the Google Maps API to draw a route on the map using a map overlay. It has some limitations, but you can get help from this repo: https://github.com/kishikawakatsumi/MapKit-Route-Directions
I am using map functionality in my iPhone app. I am showing stores on the map for the user's current location.
Whenever the user scrolls the map, he needs to be shown stores for the new location. For example, suppose the user is in New York: at first the app will show New York stores, but when he scrolls the map to Texas, the app should fire a web-service request for the Texas location. My problems are:
1) If a web-service request fires on every map scroll, the app may crash or have to wait each time for the response with the new set of stores. (For this I am going to use some hard-coded radius before sending a request.) How do I handle this properly?
2) I want to know the distance between two locations, so that I can send a request to the server only if the distance between them is greater than some specific value.
I am using the map view delegates for the above functionality. Please suggest a proper way to handle it.
Thanks
Well, to find the distance between two points I use:
CLLocation *location1 = [[CLLocation alloc] initWithLatitude:[[dict valueForKey:@"lat"] doubleValue] longitude:[[dict valueForKey:@"lon"] doubleValue]];
float distance = [mUserCurrentLocation distanceFromLocation:location1] / 1000; // kilometers
float distanceInMeters = [mUserCurrentLocation distanceFromLocation:location1];
NSString *distanceStr = [NSString stringWithFormat:@"%.2f KM", distance];
See if this can help you.
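For the first part (throttling requests while the user scrolls), here is a sketch of the usual approach: remember the center of the last request and only hit the web service again once the map has moved further than a threshold. fetchStores(around:) is a hypothetical stand-in for your web-service call.
import MapKit

final class StoreMapViewController: UIViewController, MKMapViewDelegate {
    @IBOutlet private var mapView: MKMapView!
    private var lastRequestedCenter: CLLocation?
    private let refreshThreshold: CLLocationDistance = 5_000   // meters, tune to taste

    func mapView(_ mapView: MKMapView, regionDidChangeAnimated animated: Bool) {
        let center = CLLocation(latitude: mapView.centerCoordinate.latitude,
                                longitude: mapView.centerCoordinate.longitude)
        // Skip the request if we haven't moved far enough from the last fetch.
        if let last = lastRequestedCenter, last.distance(from: center) < refreshThreshold {
            return
        }
        lastRequestedCenter = center
        fetchStores(around: center)
    }

    private func fetchStores(around location: CLLocation) {
        // Call your backend here and drop the returned stores onto the map.
    }
}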