How to detect motion of an iPhone 6 device? (determine whether the device has moved at all, even the smallest possible motion on x, y, or z)

I'm working on a task to determine when an iPhone 6 is moving (the smallest possible movement, not even a shake!) in any direction (x, y, or z).
What is the best way to achieve that?

I used the code below and found it useful; it contains four functions:
- Start motion manager
- Stop motion manager
- Update motion manager
- magnitudeFromAttitude
import CoreMotion
import UIKit

let motionManager = CMMotionManager()
var initialAttitude: CMAttitude!

// Start motion manager
func startMotionManager() {
    if !motionManager.isDeviceMotionActive {
        motionManager.deviceMotionUpdateInterval = 1
        motionManager.startDeviceMotionUpdates()
    }
}

// Stop motion manager
func stopMotionManager() {
    if motionManager.isDeviceMotionActive {
        motionManager.stopDeviceMotionUpdates()
    }
}

// Update motion manager
func updateMotionManager(_ x: UIViewController) {
    if motionManager.isDeviceMotionAvailable {
        initialAttitude = motionManager.deviceMotion?.attitude
        motionManager.startDeviceMotionUpdates(to: OperationQueue.current ?? .main) { [weak x] data, error in
            guard let data = data, error == nil else { return }
            // Express the current attitude relative to the initial attitude
            data.attitude.multiply(byInverseOf: initialAttitude)
            // Calculate the magnitude of the change from the initial attitude
            let magnitude = magnitudeFromAttitude(data.attitude)
            if magnitude > 0.1 { // threshold
                // Device has moved!
                // Put the code that should fire when the device moves here,
                // then reset the reference attitude.
                initialAttitude = motionManager.deviceMotion?.attitude
            }
        }
        print(motionManager.isDeviceMotionActive) // prints false here
    }
}

// Get the magnitude of the attitude vector via the Pythagorean theorem
func magnitudeFromAttitude(_ attitude: CMAttitude) -> Double {
    return sqrt(pow(attitude.roll, 2) + pow(attitude.yaw, 2) + pow(attitude.pitch, 2))
}
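
If the goal is to catch tiny translations as well as rotations, another option (not part of the snippet above) is to threshold the user acceleration, which CMDeviceMotion reports with gravity already removed. Below is a minimal sketch along those lines; the MotionDetector name, the 0.02 g threshold, and the 60 Hz update rate are my own assumptions and would need tuning:

import CoreMotion

// Hypothetical helper: fires a callback whenever the user-acceleration magnitude
// (gravity removed) exceeds a small threshold, i.e. the device was nudged.
final class MotionDetector {
    private let manager = CMMotionManager()

    // Threshold is in g; an assumption to tune for your definition of "smallest motion".
    func start(threshold: Double = 0.02, onMove: @escaping () -> Void) {
        guard manager.isDeviceMotionAvailable else { return }
        manager.deviceMotionUpdateInterval = 1.0 / 60.0
        manager.startDeviceMotionUpdates(to: .main) { motion, _ in
            guard let a = motion?.userAcceleration else { return }
            let magnitude = (a.x * a.x + a.y * a.y + a.z * a.z).squareRoot()
            if magnitude > threshold {
                onMove()
            }
        }
    }

    func stop() {
        manager.stopDeviceMotionUpdates()
    }
}

You would keep a strong reference to the detector, call start(onMove:) when the screen appears, and stop() when it disappears.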

Related

iOS RealityKit. Changing Entity's translation causes unexpected behaviour

I am trying to create an AR experience.
I load a model with animations as an Entity. Let's call it a Toy.
I create an AnchorEntity.
I attach the Toy to the AnchorEntity. Up to this point everything works great.
I want the Toy to walk in random directions, and it does, the first time. Then it gets interesting; allow me to share my code.
The first method creates a new Transform for the Toy with a modified translation (x and z) to make the Toy move, and that is it:
func walk(completion: @escaping () -> Void) {
    guard let robot = robot else {
        return
    }
    let currentTransform = robot.transform
    guard let path = randomPath(from: currentTransform) else {
        return
    }
    let (newTranslation, travelTime) = path
    let newTransform = Transform(scale: currentTransform.scale,
                                 rotation: currentTransform.rotation,
                                 translation: newTranslation)
    robot.move(to: newTransform, relativeTo: nil, duration: travelTime)
    DispatchQueue.main.asyncAfter(deadline: .now() + travelTime + 1) {
        completion()
    }
}
We get that new Transform from the method below.
func randomPath(from currentTransform: Transform) -> (SIMD3<Float>, TimeInterval)? {
    // Get the robot's current translation
    let robotTranslation = currentTransform.translation
    // Generate random distances for the model to cross, relative to origin
    let randomXTranslation = Float.random(in: 0.1...0.4) * [-1.0, 1.0].randomElement()!
    let randomZTranslation = Float.random(in: 0.1...0.4) * [-1.0, 1.0].randomElement()!
    // Create a translation relative to the current transform
    let relativeXTranslation = robotTranslation.x + randomXTranslation
    let relativeZTranslation = robotTranslation.z + randomZTranslation
    // Find the path length
    var path = (randomXTranslation * randomXTranslation + randomZTranslation * randomZTranslation).squareRoot()
    // Path only positive
    if path < 0 { path = -path }
    // Calculate the walking time based on the distance and the default speed
    let timeOfWalking: Float = path / settings.robotSpeed
    // Based on the old translation, calculate the new one
    let newTranslation: SIMD3<Float> = [relativeXTranslation,
                                        Float(0),
                                        relativeZTranslation]
    return (newTranslation, TimeInterval(timeOfWalking))
}
The problem is that the value of Entity.transform.translation.y grows from 0 to some random value < 1, always after the second time the walk() method is called.
As you can see, every time the method is called, newTranslation sets the y value to 0. And yet the Toy's translation ends up with a non-zero y.
I am out of ideas; any help is appreciated. I can share the whole code if needed.
I managed to fix the issue by specifying the relativeTo parameter as the Toy's AnchorEntity:
toy.move(to: newTransform, relativeTo: anchorEntity, duration: travelTime)
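
For reference, a minimal sketch of the walk(completion:) method with only that change applied, assuming the anchor is kept in an optional anchorEntity property as in the fix above:

func walk(completion: @escaping () -> Void) {
    guard let robot = robot, let anchorEntity = anchorEntity else { return }
    let currentTransform = robot.transform
    guard let (newTranslation, travelTime) = randomPath(from: currentTransform) else { return }
    let newTransform = Transform(scale: currentTransform.scale,
                                 rotation: currentTransform.rotation,
                                 translation: newTranslation)
    // Interpret the target transform relative to the Toy's anchor instead of
    // world space (relativeTo: nil), so the y translation stays at the value we set.
    robot.move(to: newTransform, relativeTo: anchorEntity, duration: travelTime)
    DispatchQueue.main.asyncAfter(deadline: .now() + travelTime + 1) {
        completion()
    }
}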

Swift Charts delay for realtime data

I am using Swift 5 with Charts 3.6.0 (line chart, cubic lines) to plot real-time watchOS Core Motion data. The goal is to display watch movement as quickly as possible. Since my sample rate is high, I suspect there will be a bottleneck in updating the view, and as such I would only like to display the N most recent items.
Here is the watch function that sends the data immediately, as described in the Apple docs and numerous tutorials:
motion.deviceMotionUpdateInterval = 1.0 / 120.0
motion.startDeviceMotionUpdates(using: .xArbitraryZVertical, to: queue) { [self] (deviceMotion: CMDeviceMotion?, _: Error?) in
    guard let motion = deviceMotion else { return }
    self.sendDataToPhone(quaternion: motion.attitude.quaternion, time: Double(Date().timeIntervalSince1970))
}

private func sendDataToPhone(quaternion: CMQuaternion, time: Double) {
    if WCSession.default.isReachable {
        WCSession.default.sendMessageData(try! NSKeyedArchiver.archivedData(withRootObject: [quaternion.x, quaternion.y, quaternion.z, quaternion.w, time], requiringSecureCoding: false), replyHandler: nil, errorHandler: nil)
    }
}
Once received, the packets are interpreted by the session() function on the iPhone:
override func viewDidLoad() {
    super.viewDidLoad()
    self.lineChartView.leftAxis.axisMinimum = -1
    self.lineChartView.leftAxis.axisMaximum = 1
}

func session(_ session: WCSession, didReceiveMessageData messageData: Data) {
    let record: [Double] = try! NSKeyedUnarchiver.unarchivedObject(ofClasses: [NSArray.self], from: messageData) as! [Double]
    laggyFunction(quaternions: [simd_quatd(ix: record[0], iy: record[1], iz: record[2], r: record[3])], quaternionTimes: [record[4]])
}
private func laggyFunction(quaternions: [simd_quatd], quaternionTimes: [Double]) {
    DispatchQueue.main.sync {
        let dataset = self.lineChartView.data!.getDataSetByIndex(0)!
        var x = dataset.entryCount + 1
        for quaternion in quaternions {
            let _ = dataset.addEntry(ChartDataEntry(x: Double(x), y: quaternion.vector.w))
            x += 1
        }
        // Limit the number of points
        while dataset.entryCount > 5 {
            let _ = dataset.removeFirst()
        }
        // Re-index so entries start from 1
        for startIdx in 1..<dataset.entryCount {
            dataset.entryForIndex(startIdx - 1)!.x = Double(startIdx)
        }
        self.lineChartView.data!.notifyDataChanged()
        self.lineChartView.notifyDataSetChanged()
    }
}
Logic flow:
As packets come in, new entries are added to the initial dataset of the lineChartView. If there are more than 5, the first N are removed. Then the x values are re-indexed so the chart keeps a sequential x axis.
The problem:
The delay in updating the UI chart is very high. The elapsed time for both functions to complete is plotted below. At the 73rd percentile, the laggy function can keep up with the sample rate of incoming packets (< 1/120 ≈ 0.008 s); the session function seems to keep up throughout. The CDF plot, in my opinion, does not do it justice: visually, the chart is very "sluggish". As an experiment, if I throw the watch against the wall, I can see it hit the concrete well before the chart is updated.
My goal is to update the chart as quickly as possible to observe watch motion, and to discard new entries until the UI has been updated. What is the correct way to do this with my chart choice?
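
One direction (not from the post above) would be to rate-limit the redraws on the main queue and drop any samples that arrive while the chart cannot keep up, which also avoids blocking the WCSession delegate queue with DispatchQueue.main.sync. A minimal sketch under those assumptions; lastChartUpdate, minRedrawInterval, and nextX are illustrative names, and the per-update re-indexing is replaced by a monotonically increasing x value:

private var lastChartUpdate = Date.distantPast           // hypothetical: time of the last redraw
private let minRedrawInterval: TimeInterval = 1.0 / 30.0 // redraw at most ~30 times per second
private var nextX: Double = 1                            // monotonically increasing x value

func session(_ session: WCSession, didReceiveMessageData messageData: Data) {
    let record = try! NSKeyedUnarchiver.unarchivedObject(ofClasses: [NSArray.self],
                                                         from: messageData) as! [Double]
    // Hop to the main queue without blocking the WCSession delegate queue.
    DispatchQueue.main.async { [weak self] in
        guard let self = self else { return }
        // Drop this sample if we redrew very recently; the chart cannot keep up with 120 Hz anyway.
        guard Date().timeIntervalSince(self.lastChartUpdate) >= self.minRedrawInterval else { return }
        self.lastChartUpdate = Date()
        guard let dataset = self.lineChartView.data?.getDataSetByIndex(0) else { return }
        _ = dataset.addEntry(ChartDataEntry(x: self.nextX, y: record[3])) // record[3] is the quaternion's w component
        self.nextX += 1
        // Keep only the most recent points
        while dataset.entryCount > 5 {
            _ = dataset.removeFirst()
        }
        self.lineChartView.data?.notifyDataChanged()
        self.lineChartView.notifyDataSetChanged()
    }
}

The 30 fps cap is arbitrary; anything at or below the rate the chart can actually render should keep the display close to real time.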

MKTileOverlay, tiles at 180.0 / -180.0 not drawn consistently

I use the MKTileOverlay class to cover the map with self-generated tile images.
Everything works well, except for the tiles on the border at longitude 180 / -180 degrees. Along this line, tiles are drawn only sometimes... can anybody give me a hint to solve this?
You can see the effect on this screenshot.
This particular area of the map should be covered completely by these "default" tiles. The tile images themselves should be fine, as they are displayed correctly for the other tiles.
I use this loadTile(at:result:) function to provide the generated tile images. The print statements show that the function is called for all tiles and that the result callback gets a valid image. It's just that the tiles are not drawn, and I use the standard MKTileOverlayRenderer.
override func loadTile(at path: MKTileOverlayPath, result: @escaping (_ data: Data?, _ error: Error?) -> Void) {
    let x: Int = path.x
    let y: Int = path.y
    let zoomLevel: Int = path.z
    // Calculate the x index of the tile at longitude 180 degrees
    let xMax = (1 << zoomLevel) - 1
    if (x == 0) || (x == xMax) {
        print("\(zoomLevel)/\(x)/\(y) requested")
    }
    // Local variable to hold the image of the tile
    var localUIImage: UIImage = tileImageForDefaultImage
    // ... lots of stuff to generate the tile image ...
    // Check if we have a valid image
    if let resultImage = localUIImage.pngData() {
        if (x == 0) || (x == xMax) {
            print("resultImage: \(resultImage.debugDescription)")
        }
        result(resultImage, nil)
    } else {
        let noResultImage = tileImageForDefaultImage.pngData()
        if (x == 0) || (x == xMax) {
            print("noResultImage: \(noResultImage.debugDescription)")
        }
        result(noResultImage, nil)
    }
}
Any hint is welcome ;-)
In short: Apple confirmed that this is a bug in iOS MapKit. At least iOS versions 11 and 12 are affected. There is no known workaround so far.
Long version: I spent a DTS ticket on this and got in contact with a really good Apple engineer. After some work together, he could easily reproduce the issue. He asked me to file a bug report (49270907). With that, he was able to talk to the MapKit team, and they confirmed the bug.

Audiokit logging - 55: EXCEPTION (-1): "" - repeatedly

I updated to the latest version of AudioKit (4.5), and my AudioKit class that is meant to listen to the microphone amplitude is now printing 55: EXCEPTION (-1): "" endlessly to the console. The app doesn't crash or anything, but it keeps logging that.
My app is a video camera app that records using the GPUImage library.
The logs appear only when I start recording, for some reason.
In addition to that, my onAmplitudeUpdate callback no longer outputs anything, just 0.0 values. This didn't happen before updating AudioKit. Any ideas here?
Here is my class:
//  G8Audiokit.swift
//  GenerateToolkit
//
//  Created by Omar Juarez Ortiz on 2017-08-03.
//  Copyright © 2017 All rights reserved.
//

import Foundation
import AudioKit

class G8Audiokit {
    // Variables for audio analysis
    var microphone: AKMicrophone!             // Device microphone
    var amplitudeTracker: AKAmplitudeTracker! // Tracks the amplitude of the microphone
    var signalBooster: AKBooster!             // Boosts the signal
    var audioAnalysisTimer: Timer?            // Continuously calls the audioAnalysis function
    let amplitudeBuffSize = 10                // A smaller buffer yields more amplitude responsiveness but more instability; a higher value responds more slowly but is smoother
    var amplitudeBuffer: [Double]             // Rolling window of amplitude values, used to get the average amplitude

    public var onAmplitudeUpdate: ((_ value: Float) -> ())?

    static let sharedInstance = G8Audiokit()

    private init() { // private so the class can only be initialized by itself
        self.amplitudeBuffer = [Double](repeating: 0.0, count: amplitudeBuffSize)
        startAudioAnalysis()
    }

    // public override init() {
    //     // Initialize the audio buffer with zeros
    // }

    /// Set up the AudioKit processing pipeline and start the audio analysis.
    func startAudioAnalysis() {
        stopAudioAnalysis()
        // Settings
        AKSettings.bufferLength = .medium // Sets the audio signal buffer size
        do {
            try AKSettings.setSession(category: .playAndRecord)
        } catch {
            AKLog("Could not set session category.")
        }
        // ----------------
        // Input + pipeline
        // Initialize the built-in microphone
        microphone = AKMicrophone()
        // Pre-processing
        signalBooster = AKBooster(microphone)
        signalBooster.gain = 5.0 // When video recording starts, the signal gets boosted to the equivalent of 5.0, so we set it to 5.0 here and change it to 1.0 when video recording starts.
        // Filter out anything outside the human voice range
        let highPass = AKHighPassFilter(signalBooster, cutoffFrequency: 55) // Lowered a bit to be more sensitive to bass drums
        let lowPass = AKLowPassFilter(highPass, cutoffFrequency: 255)
        // At this point you don't have much signal left, so balance it against the original signal
        let rebalanced = AKBalancer(lowPass, comparator: signalBooster)
        // Track the amplitude of the rebalanced signal; we use this value for audio reactivity
        amplitudeTracker = AKAmplitudeTracker(rebalanced)
        // Mute the audio that gets routed to the device output, preventing feedback
        let silence = AKBooster(amplitudeTracker, gain: 0)
        // Complete the chain by routing the silenced audio to the output
        AudioKit.output = silence
        // Start the chain and the timer callback
        do { try AudioKit.start() }
        catch {}
        audioAnalysisTimer = Timer.scheduledTimer(timeInterval: 0.01,
                                                  target: self,
                                                  selector: #selector(audioAnalysis),
                                                  userInfo: nil,
                                                  repeats: true)
        // Put the timer on the main run loop so UI updates don't interrupt it
        RunLoop.main.add(audioAnalysisTimer!, forMode: RunLoopMode.commonModes)
    }

    // Call this when closing the app or going to the background
    public func stopAudioAnalysis() {
        audioAnalysisTimer?.invalidate()
        AudioKit.disconnectAllInputs() // Disconnect all AudioKit components so they can be relinked when we call startAudioAnalysis()
    }

    // Called by audioAnalysisTimer
    @objc func audioAnalysis() {
        writeToBuffer(val: amplitudeTracker.amplitude) // Write an amplitude value to the rolling buffer
        let val = getBufferAverage()
        onAmplitudeUpdate?(Float(val))
    }

    // Writes amplitude values to a rolling window buffer: writes to index 0, pushes the previous
    // values to the right, and removes the last value to preserve the buffer length.
    func writeToBuffer(val: Double) {
        for (index, _) in amplitudeBuffer.enumerated() {
            if index == 0 {
                amplitudeBuffer.insert(val, at: 0)
                _ = amplitudeBuffer.popLast()
            } else if index < amplitudeBuffer.count - 1 {
                amplitudeBuffer.rearrange(from: index - 1, to: index + 1) // `rearrange` is a custom Array extension elsewhere in the project
            }
        }
    }

    // Returns the average of amplitudeBuffer, resulting in a smoother audio-reactivity signal
    func getBufferAverage() -> Double {
        var avg: Double = 0.0
        for val in amplitudeBuffer {
            avg = avg + val
        }
        avg = avg / Double(amplitudeBuffer.count)
        return avg
    }
}
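
Unrelated to the exception itself, the rolling-buffer bookkeeping in writeToBuffer can be expressed more compactly. A minimal sketch of the same rolling-average idea, independent of AudioKit; RollingAverage is my own name, not part of the class above:

// Hypothetical circular-buffer helper: keeps the last `size` samples and averages them.
struct RollingAverage {
    private var samples: [Double]
    private var nextIndex = 0

    init(size: Int) {
        samples = [Double](repeating: 0.0, count: size)
    }

    // Overwrite the oldest sample with the newest one.
    mutating func append(_ value: Double) {
        samples[nextIndex] = value
        nextIndex = (nextIndex + 1) % samples.count
    }

    // Average over the whole window.
    var average: Double {
        return samples.reduce(0, +) / Double(samples.count)
    }
}

Usage would mirror the timer callback above: append(amplitudeTracker.amplitude) on each tick and hand average to onAmplitudeUpdate.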

iPhone GPS CoreLocation to get accurate reading quickly

I'm new to GPS and am using CoreLocation to narrow a user's position down to, ideally, 2 m. I've read somewhere that iPhone GPS coordinates are accurate to about 4 m, but I'm trying a few things to improve the accuracy reading.
My current solution
involves setting a 70-second window to average a list of up to 3 coordinates, replacing them as newer, more accurate readings come in. The averaged coordinate jumps around for about a minute before landing within ~8 m (observed) of the true location.
Selected code blocks below:
...
locationManager.desiredAccuracy = kCLLocationAccuracyBestForNavigation
locationManager.distanceFilter = kCLDistanceFilterNone
...

var timer = 70

func startTracking() {
    _ = Timer.scheduledTimer(timeInterval: 1.0, target: self, selector: #selector(countdown), userInfo: nil, repeats: true)
}

@objc func countdown() { self.timer -= 1 }

func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
    // 70 seconds to pin down / verify the location
    if self.timer <= 1 { return }
    if locationDataService.initialLocationEstimate != nil {
        updateInitialLocationEstimate(newLocation: self.locationManager.location!)
    }
}

...

func updateInitialLocationEstimate(newLocation: CLLocation) {
    let newLocationAccuracy = newLocation.horizontalAccuracy
    if self.initialLocationAccuracy != nil {
        // Skip readings that are both less accurate than the current best and worse than 8 m
        if newLocationAccuracy > self.initialLocationAccuracy! && newLocationAccuracy > 8 {
            return
        } else if newLocation.horizontalAccuracy < initialLocationAccuracy! {
            // Eliminate the less accurate readings
            trimByAccuracy(locations: &self.initialLocationEstimates, newAccuracy: newLocationAccuracy)
            self.initialLocationAccuracy = newLocationAccuracy
        }
        if self.initialLocationEstimates.count > 4 {
            initialLocationEstimates.removeFirst()
        }
    } else {
        self.initialLocationAccuracy = newLocationAccuracy
    }
    self.initialLocationEstimates.append(newLocation)
    self.initialLocationEstimate = estimateCentralCoordinate(locations: initialLocationEstimates, accuracy: initialLocationAccuracy!)
}
I'm struggling with a few things:
1) Reducing the time it takes to get a ~4 m (observed) accuracy reading. Currently, it takes about a minute to get to ~8 m. Is there anything I can do?
2) Improving the observed accuracy. Any suggestions would be helpful here.
3) It has been suggested that I use Kalman filters, but the velocity given to me by CoreLocation is iffy. While I'm standing still, it frequently fluctuates between +/- 0-1 m/s. Is there some setting I'm not aware of to stabilize the readings?
4) Should I be averaging/filtering at all? I've read that good GPS software already does averaging, but I've come across nothing about it in the iOS CoreLocation documentation.
Really, any help would be appreciated. Thanks.
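
For point 4, one common pattern (not something CoreLocation is documented to do for you) is to drop stale or low-quality fixes and accuracy-weight the rest before averaging. A minimal sketch under those assumptions; weightedAverage and the 10 s / 20 m thresholds are illustrative, not CoreLocation API:

import CoreLocation

// Hypothetical helper: keep only fresh, reasonably accurate fixes and
// average them weighted by 1 / horizontalAccuracy (better fixes count more).
func weightedAverage(of locations: [CLLocation],
                     maxAge: TimeInterval = 10,
                     maxAccuracy: CLLocationAccuracy = 20) -> CLLocationCoordinate2D? {
    let usable = locations.filter {
        $0.horizontalAccuracy > 0 &&
        $0.horizontalAccuracy <= maxAccuracy &&
        -$0.timestamp.timeIntervalSinceNow <= maxAge
    }
    guard !usable.isEmpty else { return nil }

    var totalWeight = 0.0
    var lat = 0.0
    var lon = 0.0
    for location in usable {
        let weight = 1.0 / location.horizontalAccuracy
        lat += location.coordinate.latitude * weight
        lon += location.coordinate.longitude * weight
        totalWeight += weight
    }
    return CLLocationCoordinate2D(latitude: lat / totalWeight,
                                  longitude: lon / totalWeight)
}

Something like this could slot in where estimateCentralCoordinate is used above, with the caveat that averaging raw degrees is only a reasonable approximation over distances of a few metres, which is the scale in question here.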