Retrieving heart rate data from an Android watch face service

Does anyone have links to sample code that shows how to set up heart-rate sensor data retrieval directly from CanvasWatchFaceService or CanvasWatchFaceService.Engine?
(Most of the code I've seen so far performs sensor data retrieval from a wearable's Activity class, not from a watch face environment.)
I've tried setting things up this way:
private class Engine extends CanvasWatchFaceService.Engine implements SensorEventListener {
    ...
    ... // implement SensorEventListener methods (onSensorChanged, onAccuracyChanged)
}
But I keep getting a null object for my sensor.
My manifest contains:
<uses-feature android:name="android.hardware.type.watch" />
<uses-permission android:name="android.permission.BODY_SENSORS" />

Check that you requested the android.permission.BODY_SENSORS permission in both your mobile and wearable AndroidManifest files.
Also, look at the logcat of your wearable device and grep for android.permission.BODY_SENSORS.
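As a hedged sketch of how the registration could look inside the engine (this follows the standard Android SensorManager API; field names are illustrative, and getDefaultSensor returns null both when no heart-rate sensor exists and, on many devices, when BODY_SENSORS has not been granted, which matches the symptom described):

```java
// Sketch: registering a heart-rate listener from inside a watch face engine.
// CanvasWatchFaceService is a Service, so getSystemService is available here.
private class Engine extends CanvasWatchFaceService.Engine implements SensorEventListener {
    private SensorManager sensorManager;
    private Sensor heartRateSensor;

    @Override
    public void onCreate(SurfaceHolder holder) {
        super.onCreate(holder);
        sensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
        // Null if there is no heart-rate sensor or the permission is missing.
        heartRateSensor = sensorManager.getDefaultSensor(Sensor.TYPE_HEART_RATE);
        if (heartRateSensor != null) {
            sensorManager.registerListener(this, heartRateSensor,
                    SensorManager.SENSOR_DELAY_NORMAL);
        }
    }

    @Override
    public void onDestroy() {
        sensorManager.unregisterListener(this);
        super.onDestroy();
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        float bpm = event.values[0]; // heart rate in beats per minute
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}
```

Remember to unregister the listener when the watch face becomes invisible to save battery.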

Related

Flutter Geolocator.isLocationServiceEnabled() isn't working as intended

I'm trying to access a location in Flutter via the Geolocator package.
This works well but the location permission check has some sort of bug.
I check if the location service is enabled. Once it's accessed, I check the user's permission and then return the location. This works as intended, but there is a problem with the first step.
I use
Geolocator.isLocationServiceEnabled()
to check if the location service is enabled. Unfortunately, in some cases this function simply never returns, which leads to the whole surrounding function never returning, even though the location service is activated.
I already searched different forums for answers, and it seems to be a known problem, although I haven't found a solution yet.
Attempts I made to get access to GPS, all of which failed:
turning GPS off and on again
closing and opening the app again
inserted
<uses-permission android:name="android.permission.ACCESS_FINE_LOCATION"/>
in AndroidManifest
hot restart in IDE
I could access the location only if I re-ran debugging, and even then not every time...
Does anybody have a clue how to get this built-in function to work as intended?
Thanks in advance!
I found a workaround.
I used the permission_handler package and replaced
Geolocator.isLocationServiceEnabled()
with
final hasPermission = await Permission.locationWhenInUse.serviceStatus.isEnabled;
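Put together, the workaround flow could look like the sketch below (this assumes the permission_handler and geolocator package APIs; the function name is illustrative):

```dart
import 'package:geolocator/geolocator.dart';
import 'package:permission_handler/permission_handler.dart';

Future<Position?> getLocation() async {
  // Workaround: use permission_handler's service-status check instead of
  // Geolocator.isLocationServiceEnabled(), which sometimes never completes.
  final serviceEnabled =
      await Permission.locationWhenInUse.serviceStatus.isEnabled;
  if (!serviceEnabled) return null;

  // Request the permission if it has not been granted yet.
  final status = await Permission.locationWhenInUse.request();
  if (!status.isGranted) return null;

  return Geolocator.getCurrentPosition();
}
```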

NFC and Flutter implementation

I have been searching and googling for how NFC works with mobile apps, but I was not able to get the whole picture.
I want to make an NFC-based device with an Arduino, something like this:
https://arduinogetstarted.com/tutorials/arduino-rfid-nfc
So basically I want my app to be able to communicate with my Arduino, so that when my mobile is close to it, the app will identify it and do something based on my code.
I have heard of packages on pub such as nfc_manager that could help me achieve that.
My extra question: is there something I can do to make the app recognize only the Arduino NFC reader that I made?
Looks like you can't, but you can assert on the tag technology and only execute your logic if the tag tech is the same as the one you put on your Arduino.
From the article you posted, it looks like your tag is a MiFare tag, so you can use this static method of the MiFare class:
static MiFare? from(NfcTag tag) => $GetMiFare(tag);
The result is a nullable MiFare object, so in your code you can do:
final tech = MiFare.from(tag);
if (tech != null) {
// the tag is a MiFare tag: enter your code here
}
(Checking tech is MiFare works too, since an is check is false for null.)
This makes your app react only to the same kind of NFC tag as the one you have. If you want more fine-grained control, you might want to assert on the data of the tag as well.
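A minimal sketch of how this check could sit inside an nfc_manager read session (this assumes the package's startSession/stopSession API; verify the exact signatures against the nfc_manager docs):

```dart
import 'package:nfc_manager/nfc_manager.dart';

void listenForArduinoTag() {
  NfcManager.instance.startSession(onDiscovered: (NfcTag tag) async {
    final tech = MiFare.from(tag); // null when the tag is not MiFare
    if (tech != null) {
      // Tag technology matches the one on the Arduino side.
      // Optionally compare the tag's data here for finer-grained matching.
    }
    await NfcManager.instance.stopSession();
  });
}
```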

How can I get sensor data into Unity while using Google Cardboard?

I have the basic Google Cardboard Unity application loaded and working great.
I want to utilize a few sensors on my phone that aren't available in Unity. For example, I want to get access to the STEP_DETECTOR sensor.
I created my own UnityPlayerActivity, and without Cardboard, it seems to work well. My problem is that I have no idea how to use my custom UnityPlayerActivity WITH cardboard.
From what I can tell, the cardboard demo uses a "UnityCardboardActivity" class as the main activity. I took a look at UnityCardboardActivity.jar and it looks like the UnityCardboardActivity class inherits from CardboardActivity, NOT UnityActivity.
So my guess is that UnityCardboardActivity is manually starting the default UnityPlayerActivity somewhere in its code, but I can't change that to start my own custom UnityPlayerActivity in any way that I can tell.
Is there any way to get that sensor data without using a UnityPlayerActivity?
I tried making my activity extend UnityCardboardActivity instead, but for some reason I don't have access to the getSystemService method when I do that, so I can't get access to any sensors.
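For reference, the listener registration being described would normally follow the standard Android pattern below. This is only a sketch: it assumes a plain Activity context, the class name is hypothetical, and it does not resolve the Cardboard activity inheritance problem itself.

```java
// Sketch: registering a STEP_DETECTOR listener from a custom activity.
public class SensorBridgeActivity extends UnityPlayerActivity
        implements SensorEventListener {
    private SensorManager sensorManager;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        sensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
        Sensor stepDetector =
                sensorManager.getDefaultSensor(Sensor.TYPE_STEP_DETECTOR);
        if (stepDetector != null) {
            sensorManager.registerListener(this, stepDetector,
                    SensorManager.SENSOR_DELAY_NORMAL);
        }
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        // Each event is one detected step; it can be forwarded to Unity,
        // e.g. via UnityPlayer.UnitySendMessage (receiver name is up to you).
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}
```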

Multiple live stream video publishers using FMS, Wowza, etc.?

I need to develop a web portal with multiple live stream publishers (up to 4), and many viewers, using RTMP.
Live video publishers are well known and always the same, so in the case of using FMS (since I have some experience with Flash and Influxis), I would have no problem using FMLE for the video publishers. But the problem is how to synchronize all 4 connections on the media server so they show properly on the client side. I have tested the one-connection live example that ships with FMS, and it works fine.
Video resolution is not an issue, since we don't mind a low resolution such as 320x240. Also, we need to develop the platform ourselves, without depending on external live-streaming platforms. Is there any tutorial or example to use as a starting point?
What would you suggest? Thanks!
OK, I have now found the solution and, I have to say, it was extremely easy. I'm writing it up in case someone else has the same problem.
I finally solved it with Flash Media Live Encoder. You have to create 4 (in my case) video objects in your web page like the one below, replacing localhost with your hostname.
<object width='640' height='377' id='StrobeMediaPlayback' name='StrobeMediaPlayback' type='application/x-shockwave-flash' classid='clsid:d27cdb6e-ae6d-11cf-96b8-444553540000'>
<param name='movie' value='swfs/StrobeMediaPlayback.swf' />
<param name='quality' value='high' />
<param name='bgcolor' value='#000000' />
<param name='allowfullscreen' value='true' />
<param name='flashvars' value='&src=rtmp://localhost/live/livestream&autoHideControlBar=true&streamType=live&autoPlay=true' />
<embed src='swfs/StrobeMediaPlayback.swf' width='640' height='377' id='StrobeMediaPlayback' quality='high' bgcolor='#000000' name='StrobeMediaPlayback' allowfullscreen='true' pluginspage='http://www.adobe.com/go/getflashplayer' flashvars='&src=rtmp://localhost/live/livestream&autoHideControlBar=true&streamType=live&autoPlay=true' type='application/x-shockwave-flash'> </embed>
</object>
As you can see, the stream name defaults to "livestream"; you have to change it in every object so each one is different. Ensure the "live" folder exists (when you install FMS on localhost it creates this folder by default, but on Influxis you have to create it manually).
Every video publisher then opens Flash Media Live Encoder and changes the "Stream" value in the Output panel to the stream name of their respective video object.
That's it! It works perfectly, with great resolution and performance, better than expected. Hope it helps!

Creating dynamic manifest files that work with ipads and iphones

Problem: The browsers on Apple's iPad and iPhone don't seem to like dynamically generated manifest files (we constantly get errors involving missing images, .aspx pages that cannot be accessed from the device, or "Application Cache manifest could not be fetched"). We originally had a manifest.ashx acting as our manifest that would dynamically create and pull some pieces from the web server for offline app functionality. This process worked fine for the majority of browsers and mobile devices but failed on the Apple products.
Thoughts: For some reason Safari doesn't seem to register the manifest.ashx correctly (this is where we dynamically create the manifest file) and just gives up trying to open it. We truly need a dynamic manifest file for the requirements of the project, so switching to a static manifest file is not an option. Does anyone have suggestions for alternative ways to create dynamic manifest files?
Code:
manifest.ashx
public class Manifest : IHttpHandler
{
    public void ProcessRequest( HttpContext context )
    {
        ManifestGenerator generator = new ManifestGenerator();
        context.Response.ContentType = "text/cache-manifest";
        // Create the dynamic manifest file here (returns the manifest as a string)
        context.Response.Write( generator.GenerateManifest() );
        context.Response.Flush();
    }

    // Required by IHttpHandler; the handler holds no per-request state here.
    public bool IsReusable
    {
        get { return true; }
    }
}
Thanks,
Updated Thoughts v1: Leaning towards thinking this may be a device-specific manifest fault, as all the other mobile and desktop devices are accessing the app just fine (including being able to go offline). Currently I have moved back to a dynamically generated manifest (within the manifest.ashx), and the iPad/iPhone still dies when trying to fetch it, but it does get further than it did before (the error was: "Application Cache update failed, because "file path goes here" could not be fetched"). A strange aside is that the desktop version of Safari handles the web app just fine (and an install of Chrome on the iPad had no trouble accessing the site on/offline), while the mobile versions do not.
Updated Thoughts v2: It seems this issue is mobile-Safari specific, as I have the web app running online/offline with Chrome on the Apple products (iPhone/iPad). Still looking for a fix or workaround for the Safari browsers though...
For Safari/iPad, the manifest file must end with .manifest. At least, that's what my tests determined.
So, in order to make this work, you will have to dynamically generate the .manifest file using an HttpHandler, plus some changes in web.config to map cache.manifest to the handler. The idea is that the request for the non-existent cache.manifest actually gets mapped to the handler, which then sends back the dynamic content.
This is currently the part I'm stuck at, so I cannot help you here yet.
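For what it's worth, a hedged sketch of the web.config mapping described above could look like this (the handler name, type, and assembly are assumptions, not from the original post, and would need to match your project):

```xml
<!-- IIS 7+ integrated pipeline: map requests for cache.manifest to the handler. -->
<system.webServer>
  <handlers>
    <add name="CacheManifestHandler"
         path="cache.manifest"
         verb="GET"
         type="MyApp.Manifest, MyApp" />
  </handlers>
</system.webServer>
```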