Extracting accelerometer data from Microsoft Band

I am trying to do an analysis of data from various fitness tracker devices, and I need to extract the accelerometer data from the Microsoft Band, Jawbone, and Basis Peak to do so.
I know they show the analyzed data in their apps, so the data must be somewhere on my phone, but how do I access it?
Does anyone know how I can do this?

Jawbone does not provide a way to access the raw accelerometer data. However, you can programmatically access the events (e.g., steps, sleep, heart rate) generated from the aggregated accelerometer data by using the Jawbone UP API.

It is not stored on the phone in an accessible way. You need to use the Band API to communicate directly with the band and subscribe to this data.

As for the Microsoft Band, you can use the SDK to get live accelerometer data from the device. As the Microsoft Health SDK website puts it, "The Microsoft Band SDK gives developers access to the sensors available on the Band."

How to subscribe to sensor data is described in the Microsoft Band SDK documentation. You should refer to section 5 "Subscribing to Band Sensors" (page 17).
The documentation can be found here:
https://developer.microsoftband.com/Content/docs/Microsoft%20Band%20SDK.pdf
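For illustration, here is a hedged Swift sketch of that subscription flow against the Band SDK for iOS. The class and method names (MSBClientManager, startAccelerometerUpdatesToQueue, etc.) are recalled from the SDK's Objective-C headers and should be checked against the PDF above before use:

```swift
// Hedged sketch of section 5, "Subscribing to Band Sensors": connect to a
// paired Band and stream accelerometer samples. Names below are recalled
// from the Band SDK for iOS headers -- verify against the linked PDF.
class BandAccelerometerReader: NSObject, MSBClientManagerDelegate {
    var client: MSBClient?

    func start() {
        let manager = MSBClientManager.sharedManager()
        manager.delegate = self
        guard let client = manager.attachedClients().first as? MSBClient else {
            print("No paired Band found")
            return
        }
        self.client = client
        manager.connectClient(client)
    }

    // MSBClientManagerDelegate
    func clientManager(cm: MSBClientManager, clientDidConnect client: MSBClient) {
        var error: NSError?
        // data.x / data.y / data.z are accelerations in g's
        client.sensorManager.startAccelerometerUpdatesToQueue(nil, errorRef: &error) { data, _ in
            if let d = data { print("x: \(d.x) y: \(d.y) z: \(d.z)") }
        }
    }
    func clientManager(cm: MSBClientManager, clientDidDisconnect client: MSBClient) {}
    func clientManager(cm: MSBClientManager, client: MSBClient, didFailToConnectWithError error: NSError) {}
}
```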

Related

iOS API for real-time traffic data (like Geoloqi and Waze)

I have been visiting some services' API websites but can't find what I am looking for. My question is aimed more at what resources to use than at how exactly it should be done.
My iPhone app needs to be able to track users who are nearby and commuting, and locate them on a map. Additional requirements may be texting them, calling them, having a video session with them, etc. At a high level, this translates to something like:
get user details based on longitude and latitude
find out if they are registered, subscribed users of the service
send a message to a user/users
call a user using the iPhone phone API or a dedicated app session
video call
Waze is one of them. While it is open source, there is little documentation on how one can use it as a backend for real-time traffic data.
Then there is Geoloqi, which is paid but has an iOS SDK as well as a rich API. However, I cannot find sections that are useful to me when I check it against the requirements listed above. I believe there must be many apps already relying on such a useful service; if any of them are open source, or if there are tutorials, they would be the most useful resource for judging the feasibility of Geoloqi. Geoloqi also charges users for using its API, so it is also important for me to know which features come at what price.
For the level of data/information you're interested in, and the functionality, you should just build your own app; I don't think you need those APIs.
You can send the coordinates of the people who are using your app to your server. Then you need to determine the distance between them, to see whether they are in the zone for talking, or whatever other functionality you have listed above.
To determine the distance between two people, this answer should be helpful: Calculate distance between 2 GPS coordinates
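For the distance check itself, Core Location already does great-circle distance for you, so a minimal Swift sketch (the coordinates and 100 m threshold here are made up) is:

```swift
import CoreLocation

// Distance between two users' reported positions, using Core Location's
// built-in great-circle distance (returned in meters).
let userA = CLLocation(latitude: 40.7812, longitude: -73.9665)
let userB = CLLocation(latitude: 40.7794, longitude: -73.9632)

let meters = userA.distance(from: userB)
if meters < 100 {
    print("Users are within talking range (\(Int(meters)) m apart)")
}
```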

Reading NMEA data from iPhone GPS receiver

I am looking for a way to get NMEA data from the built-in GPS receiver of an iPhone.
Is there a way to access this data without using the Core Location framework?
No, Apple does not allow you to access the GPS in any other way than via the Core Location framework.
This also means that devices without a GPS chip can still support location (they will use triangulation via WiFi).
I don't see why you wouldn't be able to forge your own $GPGGA sentence based on information that can be retrieved from Core Location. Obviously you won't be able to set the number of satellites, and the HDOP value will be approximated, but if you must have NMEA 0183 sentences, you can create them yourself.
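A Swift sketch of that approach: format a CLLocation as a GPGGA sentence, hard-code the satellite count (08 here, since Core Location doesn't expose it), and derive a rough HDOP from horizontalAccuracy. The accuracy-to-HDOP scaling is an arbitrary assumption.

```swift
import CoreLocation

// Builds a $GPGGA sentence from a CLLocation. Satellite count is faked and
// HDOP is approximated from horizontalAccuracy, as noted above.
func gpggaSentence(from location: CLLocation) -> String {
    let formatter = DateFormatter()
    formatter.dateFormat = "HHmmss.SS"
    formatter.timeZone = TimeZone(identifier: "UTC")
    let time = formatter.string(from: location.timestamp)

    // NMEA wants latitude as ddmm.mmmm and longitude as dddmm.mmmm
    func nmeaCoord(_ degrees: CLLocationDegrees, width: Int) -> String {
        let absDeg = abs(degrees)
        let d = Int(absDeg)
        let m = (absDeg - Double(d)) * 60
        return String(format: "%0\(width)d%07.4f", d, m)
    }
    let lat = nmeaCoord(location.coordinate.latitude, width: 2)
    let latHemi = location.coordinate.latitude >= 0 ? "N" : "S"
    let lon = nmeaCoord(location.coordinate.longitude, width: 3)
    let lonHemi = location.coordinate.longitude >= 0 ? "E" : "W"

    // Rough HDOP guess: arbitrary scaling of horizontalAccuracy (meters)
    let hdop = max(location.horizontalAccuracy / 5.0, 0.5)
    let body = String(format: "GPGGA,%@,%@,%@,%@,%@,1,08,%.1f,%.1f,M,0.0,M,,",
                      time, lat, latHemi, lon, lonHemi, hdop, location.altitude)

    // NMEA checksum: XOR of all bytes between '$' and '*'
    let checksum = body.utf8.reduce(UInt8(0)) { $0 ^ $1 }
    return String(format: "$%@*%02X", body, checksum)
}
```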

iPhone indoor location based app

I am researching how to create an app for my work that allows clients to download the app (preferably via the App Store) and, using some sort of WiFi triangulation/fingerprinting, determine their location for what is essentially an interactive tour.
Now, my question specifically is: what is the best route to take for the iPhone? None of the clients will be expected to have jailbroken iPhones.
To my understanding this requires the use of WiFi data, which is a private API, therefore not meeting the App Store requirements. The biggest question I have is: how does the American Museum of Natural History get away with using the same technology while still being available on the App Store?
If you're unfamiliar with the American Museum of Natural History interactive tour app, see here:
http://itunes.apple.com/us/app/amnh-explorer/id381227123?mt=8
Thank you for any clarification you can provide.
I'm one of the developers of the AMNH Explorer app you're referencing.
Explorer uses the Cisco "Mobility Services Engine" (MSE) behind the scenes to determine its location. This is part of their Cisco wifi installation. The network itself listens for devices in the museum and estimates their position via Wifi triangulation. We do a bit of work in the app to "ask" the MSE for our current location.
Doing this work on the network side was (and still is) the only available option for iOS since, as you've found, the wifi scanning functions are considered to be private APIs.
If you'd like to build your own system and mobile app for doing something similar, you might start with the MSE.
Alternatively, we've built the same tech from Explorer into a new platform called Meridian which provides location-based services on both iOS and Android. Definitely get in touch with us via the website if you're interested in building on that.
Update 6/1/2017
Thought I would update this old answer - AMNH is no longer using the Wifi-based system I describe above, as of a few years ago. They now use an installation of a few hundred battery-powered Bluetooth Beacons (also provided by Meridian). The device (iOS or Android) scans for nearby beacons and, based on their known locations and RSSI values, triangulates a position. You can read more about it in this article.
Navizon offers an indoor positioning solution that works for iOS as well as any other platform. You can check it out here:
http://www.navizon.com/product-navizon-indoor-triangulation-system
It works by triangulating the WiFi signals transmitted by the device. Since it doesn't require an app to run on the phone, it bypasses the iOS limitations and can locate any other WiFi device for that matter.
Google recently launched the Google Maps Geolocation API. You can use it for indoor tracking of devices, which essentially can be used to achieve something similar to what AMNH's app does.
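For illustration, the Geolocation API is a plain JSON-over-HTTPS endpoint. A hedged Swift sketch follows; YOUR_API_KEY and the MAC addresses are placeholders, and note that public iOS APIs can't scan for nearby BSSIDs, so on-device use typically falls back to considerIp:

```swift
import Foundation

// POST scanned WiFi access points (placeholders here) to the Maps
// Geolocation API and print the estimated position.
let url = URL(string: "https://www.googleapis.com/geolocation/v1/geolocate?key=YOUR_API_KEY")!
var request = URLRequest(url: url)
request.httpMethod = "POST"
request.setValue("application/json", forHTTPHeaderField: "Content-Type")

let body: [String: Any] = [
    "considerIp": true,
    "wifiAccessPoints": [
        ["macAddress": "01:23:45:67:89:AB", "signalStrength": -65],
        ["macAddress": "01:23:45:67:89:AC", "signalStrength": -71]
    ]
]
request.httpBody = try! JSONSerialization.data(withJSONObject: body)

URLSession.shared.dataTask(with: request) { data, _, _ in
    // Response shape: {"location": {"lat": ..., "lng": ...}, "accuracy": ...}
    if let data = data, let json = String(data: data, encoding: .utf8) {
        print(json)
    }
}.resume()
```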
I would do this using augmented reality. There is a system sort of in place for this: the idea is that you place physical markers that have virtual information associated with them. I believe the system I saw was a type of barcode. When a user holds up the phone with the app, the app uses the camera to read the code and then displays the information.
This could easily be used to make a virtual-tour-type app that is distributable through the App Store and doesn't even require a WiFi or 3G/4G connection, assuming you simply load your information and store it locally with your app. To update it, you simply push an update through the App Store. Another solution is to use a SOAP/REST service and provide the information that way; this does not use private APIs, though it does require some form of internet connection. For this you can see a question I asked about this topic a little while ago:
SOAP/XML Tutorials Question
In addition, you could load a map of your tour location, and based on what code is scanned you can locate the user on the map and give suggested routes based on interests etc.
I found this tutorial recently on augmented reality. I haven't gone through it, but if it's anything like the rest of Ray's tutorials, it will be extremely helpful.
http://www.raywenderlich.com/3997/introduction-to-augmented-reality-on-the-iphone
I'll stick around to clarify any questions or other concerns you may have with your app.
To augment the original answer for devs who were using Cisco MSE for indoor location: they now have an iOS and Android SDK which enables you to do indoor location using the MSE. A simulator can be used as well to develop the app without implementing the infrastructure to start with: https://developer.cisco.com/site/cmx-mobility-services/downloads/
For indoor location you can use Bluetooth LE beacons, since it's a very accessible technology nowadays. There are several methods (a Core Location ranging sketch follows the links below):
Trilateration: it uses 3 beacons, but with the noise and attenuation of Bluetooth signals it gets quite difficult to determine the exact position, and it's not easy to use more than 3 beacons to increase accuracy.
Levenberg–Marquardt method: used to solve non-linear least squares problems; it has shown good results in indoor positioning.
Dead reckoning method: using the motion co-processor of the device and an initial position, you can calculate the device's path as it moves. Not that easy to implement, though.
I wrote a post on the topic; you can find more info here: http://bits.citrusbyte.com/indoor-positioning-with-beacons/
And you can use this iOS app for your own indoor positioning experiments: https://github.com/citrusbyte/beacons-positioning
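If you want to experiment with the ranging input to any of the methods above, Core Location exposes per-beacon RSSI and a rough distance estimate out of the box. A minimal sketch (the UUID is a placeholder for your own deployment, and the ranging APIs shown are the pre-iOS 13 names):

```swift
import CoreLocation

// Minimal iBeacon ranging sketch. Core Location's `accuracy` is its own
// rough distance estimate in meters; feed these per-beacon distances into
// your trilateration / least-squares solver.
class BeaconRanger: NSObject, CLLocationManagerDelegate {
    let manager = CLLocationManager()
    // Placeholder UUID -- use the one your beacons are configured with.
    let region = CLBeaconRegion(
        proximityUUID: UUID(uuidString: "E2C56DB5-DFFB-48D2-B060-D0F5A71096E0")!,
        identifier: "my-deployment")

    func start() {
        manager.delegate = self
        manager.requestWhenInUseAuthorization()
        manager.startRangingBeacons(in: region)
    }

    func locationManager(_ manager: CLLocationManager,
                         didRangeBeacons beacons: [CLBeacon],
                         in region: CLBeaconRegion) {
        for beacon in beacons {
            print("major \(beacon.major) minor \(beacon.minor): " +
                  "~\(beacon.accuracy) m, RSSI \(beacon.rssi)")
        }
    }
}
```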
I doubt the American Museum is actually using private APIs; you'll probably find that the routers that have been set up serve different responses to each other, so the app can detect its position in the museum.
If you are looking for a cheaper way to do the same task, you could put up signs with QR codes and use an open-source library to let users scan these barcodes as they move through the museum, updating the on-screen content accordingly. On an even lower-tech level, you could just tag each area with unique numbers and distinguish them that way.
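For the QR route, you don't even need a third-party library anymore: AVFoundation has decoded QR codes natively since iOS 7. A minimal sketch (the "exhibit-42" payload is just an example of how you might key tour content):

```swift
import AVFoundation
import UIKit

// Minimal QR scanner using AVFoundation's built-in metadata detection.
class QRScannerViewController: UIViewController, AVCaptureMetadataOutputObjectsDelegate {
    let session = AVCaptureSession()

    override func viewDidLoad() {
        super.viewDidLoad()
        guard let camera = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: camera) else { return }
        session.addInput(input)

        let output = AVCaptureMetadataOutput()
        session.addOutput(output)
        output.setMetadataObjectsDelegate(self, queue: .main)
        output.metadataObjectTypes = [.qr]

        let preview = AVCaptureVideoPreviewLayer(session: session)
        preview.frame = view.bounds
        view.layer.addSublayer(preview)
        session.startRunning()
    }

    func metadataOutput(_ output: AVCaptureMetadataOutput,
                        didOutput metadataObjects: [AVMetadataObject],
                        from connection: AVCaptureConnection) {
        for case let code as AVMetadataMachineReadableCodeObject in metadataObjects
        where code.stringValue != nil {
            // e.g. "exhibit-42" -> look up that area's tour content locally
            print("Scanned: \(code.stringValue!)")
        }
    }
}
```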

Scan/take a picture of a check and populate data

Hey, can I scan/take a picture of a check and identify the account number and routing number using PhoneGap on iPhone?
Probably not, at least not directly.
You are going to need to look into some open-source OCR software and integrate it into PhoneGap.
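As one example of what such an integration would wrap (shown natively in Swift rather than as a PhoneGap plugin), Apple's Vision framework (iOS 13+) can do the raw text recognition; picking the routing/account numbers out of the results is still up to you, and the MICR font on real checks may trip up general-purpose OCR:

```swift
import Vision
import UIKit

// Hedged sketch: run Vision's text recognizer over a check photo, then
// filter for digit runs that could be routing/account numbers. Treat this
// as a starting point, not a production check reader.
func recognizeCheckNumbers(in image: UIImage) {
    guard let cgImage = image.cgImage else { return }
    let request = VNRecognizeTextRequest { request, _ in
        let lines = (request.results as? [VNRecognizedTextObservation])?
            .compactMap { $0.topCandidates(1).first?.string } ?? []
        // Routing numbers are 9 digits; account number lengths vary by bank.
        let digitRuns = lines.flatMap {
            $0.split(whereSeparator: { !$0.isNumber }).map(String.init)
        }
        print("Candidate numbers:", digitRuns.filter { $0.count >= 6 })
    }
    request.recognitionLevel = .accurate
    try? VNImageRequestHandler(cgImage: cgImage).perform([request])
}
```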

Accessing iPhone camera data using only allowed APIs

I would like to access the camera data in real time to get the hue of several points, in order to guide the user (inform them when the best moment to take the picture is).
The application will probably be available on the App Store, so I want to use only allowed APIs. I've seen a lot of similar topics, some of them saying this is possible, but none of them showing a solution.
Do you have any ideas?
Thanks in advance :)
You need the undocumented UIGetScreenImage() function; an Apple representative recently stated their approval of the use thereof in the iPhone developer forums.
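For a route that uses only allowed APIs (which is what the question asks for), AVFoundation's AVCaptureVideoDataOutput, available since iOS 4, delivers raw camera frames you can sample directly. A minimal sketch that reads the hue of the center pixel of each frame (sampling just the center is an arbitrary example; extend it to your several points):

```swift
import AVFoundation
import UIKit

// Stream camera frames and sample the hue of the center pixel of each one.
class HueSampler: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let session = AVCaptureSession()

    func start() {
        guard let camera = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: camera) else { return }
        session.addInput(input)

        let output = AVCaptureVideoDataOutput()
        output.videoSettings = [kCVPixelBufferPixelFormatTypeKey as String:
                                kCVPixelFormatType_32BGRA]
        output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "frames"))
        session.addOutput(output)
        session.startRunning()
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let buffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        CVPixelBufferLockBaseAddress(buffer, .readOnly)
        defer { CVPixelBufferUnlockBaseAddress(buffer, .readOnly) }

        let width = CVPixelBufferGetWidth(buffer)
        let height = CVPixelBufferGetHeight(buffer)
        let bytesPerRow = CVPixelBufferGetBytesPerRow(buffer)
        let base = CVPixelBufferGetBaseAddress(buffer)!
            .assumingMemoryBound(to: UInt8.self)

        // Center pixel; the buffer is BGRA per videoSettings above.
        let p = base + (height / 2) * bytesPerRow + (width / 2) * 4
        let color = UIColor(red: CGFloat(p[2]) / 255,
                            green: CGFloat(p[1]) / 255,
                            blue: CGFloat(p[0]) / 255, alpha: 1)
        var hue: CGFloat = 0, sat: CGFloat = 0, bri: CGFloat = 0, a: CGFloat = 0
        color.getHue(&hue, saturation: &sat, brightness: &bri, alpha: &a)
        print("Center hue: \(hue)")
    }
}
```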