Hi, I'm trying to write a code sample that shows the heart rate from a Polar H6 heart rate sensor on an HTC One M8, and I have some problems. When I write:
characteristic.setValue(new byte[]{0x01});
Log.d(TAG, "x ok");
gatt.writeCharacteristic(characteristic);
Log.d(TAG, "Writechar ok");
The log says that the writeCharacteristic() call completed, but onCharacteristicWrite() in the callback never reacts.
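For completeness, here is a sketch of the guarded write flow I would expect to work; SERVICE_UUID and CHAR_UUID are placeholders, not the actual Polar UUIDs:

    // Sketch of a guarded write (SERVICE_UUID / CHAR_UUID are placeholders).
    // Note: BluetoothGatt allows only one outstanding GATT operation at a time,
    // and writeCharacteristic() returning true only means the write was queued.
    BluetoothGattCharacteristic characteristic =
            gatt.getService(SERVICE_UUID).getCharacteristic(CHAR_UUID);
    characteristic.setValue(new byte[]{0x01});
    boolean queued = gatt.writeCharacteristic(characteristic);
    Log.d(TAG, "write queued: " + queued);  // false: the write was never issued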
Thank you for your help!
I offer a service which needs the correct location and heading of the mobile device the user accesses my service with. It was relatively easy to get that information, but now I am stuck at finding out how accurate the heading I received is.
Since I'm not using any framework for this, and I'm not developing in Android/iOS where there are solutions for that problem, I need a solution that depends only on JavaScript (plus third-party libraries).
I've found the Generic Sensor API from W3C but couldn't find any sensor which holds that information: neither the Accelerometer nor the AbsoluteOrientationSensor held what I need.
My current code looks like this:
let accelerometer = null;
try {
  accelerometer = new Accelerometer({ frequency: 60 });
  accelerometer.addEventListener('error', event => {
    // Handle runtime errors.
    if (event.error.name === 'NotAllowedError') {
      console.log('Permission to access sensor was denied.');
    } else if (event.error.name === 'NotReadableError') {
      console.log('Cannot connect to the sensor.');
    }
  });
  accelerometer.addEventListener('reading', (event) => {
    console.log(accelerometer);
    console.log(event);
  });
  accelerometer.start();
} catch (error) {
  // The constructor itself can throw, e.g. when the sensor is not supported.
  console.log(error.name, error.message);
}
.. but since the problem is more of a 'finding the right tool for the job' type, the code won't help much in depicting it. Further, Google uses roughly the same functionality I'm aiming for in Google Maps, so theoretically there must be a way to get the heading accuracy.
So, in conclusion: is there any way to retrieve the heading accuracy in a plain JavaScript environment?
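For illustration, here is a sketch using the legacy DeviceOrientation API instead; the webkitCompassHeading/webkitCompassAccuracy fields are non-standard and, as far as I know, only iOS Safari exposes an accuracy value at all:

window.addEventListener('deviceorientation', (event) => {
  if (typeof event.webkitCompassHeading === 'number') {
    // iOS Safari only: heading in degrees plus an accuracy estimate in degrees.
    console.log('heading:', event.webkitCompassHeading,
                'accuracy:', event.webkitCompassAccuracy);
  } else if (event.absolute && event.alpha !== null) {
    // Other browsers: a heading can be derived from alpha, but no accuracy is exposed.
    console.log('heading:', 360 - event.alpha);
  }
});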
Thanks!
I am trying to implement the button sample from simplepio. I have made the connections as shown in the schematics, but after pressing the button I do not get the GPIO callback.
The code I am using is the same as the sample's. There are no exceptions; only "Starting ButtonActivity" gets printed in the log:
@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    Log.i(TAG, "Starting ButtonActivity");
    PeripheralManagerService service = new PeripheralManagerService();
    try {
        String pinName = BoardDefaults.getGPIOForButton();
        mButtonGpio = service.openGpio(pinName);
        mButtonGpio.setDirection(Gpio.DIRECTION_IN);
        mButtonGpio.setEdgeTriggerType(Gpio.EDGE_FALLING);
        mButtonGpio.registerGpioCallback(new GpioCallback() {
            @Override
            public boolean onGpioEdge(Gpio gpio) {
                Log.i(TAG, "GPIO changed, button pressed");
                // Return true to continue listening to events
                return true;
            }
        });
    } catch (IOException e) {
        Log.e(TAG, "Error on PeripheralIO API", e);
    }
}
What I have tried so far:
Verified that the circuit and button are functional by running a Python button program on Raspbian Jessie with the following code:
#!/usr/bin/env python
import os
from time import sleep
import RPi.GPIO as GPIO

GPIO.setmode(GPIO.BCM)
GPIO.setup(21, GPIO.IN, pull_up_down=GPIO.PUD_UP)

while True:
    if GPIO.input(21) == False:
        print("Button Clicked")
    sleep(0.1)
The above code prints "Button Clicked" when the button is pressed, so I am sure that the button and the GPIO pins on my Pi are not the issue.
To make sure there is no issue with logging, I also modified the original program to contain a TextView and a counter, so that when the button is clicked the counter is incremented and shown in the TextView; but again the callback wasn't received and the TextView wasn't updated.
Tried different edge trigger types, but onGpioEdge is never called.
Following is a picture of my setup:
Is it just me, or is your resistor in the wrong breadboard row?
The arrow shows where it is, the circle shows where it should be.
According to the Fritzing diagram:
I found the button driver to be quite unreliable on the Raspberry Pi with Android Things; after all, the driver is pretty much the same code you have.
However, ButtonInputDriver worked flawlessly.
In fact, you do not need to address the GPIO directly and can use the driver layers instead, which is simpler. The button driver is here: https://github.com/androidthings/contrib-drivers/tree/master/cap12xx
I suggest you give ButtonInputDriver a try.
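A minimal sketch of that approach, assuming the com.google.android.things.contrib.driver.button dependency is added and the same BoardDefaults helper as in the question:

// import com.google.android.things.contrib.driver.button.Button;
// import com.google.android.things.contrib.driver.button.ButtonInputDriver;

// In onCreate(): the driver translates edges on the pin into key events.
try {
    ButtonInputDriver buttonDriver = new ButtonInputDriver(
            BoardDefaults.getGPIOForButton(),    // same pin as the question
            Button.LogicState.PRESSED_WHEN_LOW,  // matches the falling-edge setup
            KeyEvent.KEYCODE_SPACE);             // key code to emit on press
    buttonDriver.register();
} catch (IOException e) {
    Log.e(TAG, "Error configuring ButtonInputDriver", e);
}

// The press then arrives as an ordinary key event:
@Override
public boolean onKeyDown(int keyCode, KeyEvent event) {
    if (keyCode == KeyEvent.KEYCODE_SPACE) {
        Log.i(TAG, "Button pressed");
        return true;
    }
    return super.onKeyDown(keyCode, event);
}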
It might be that I didn't connect the circuit as per the schematics, or that the resistor isn't making good contact. The best way to debug this, as Dave McKelvie suggested, is to measure the voltage with a voltmeter.
The reason the Python code was working is that the Raspberry Pi 3 has an internal pull-up resistor, which that script enables (PUD_UP), as Dave McKelvie pointed out in the comments.
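A software-side check along the same lines (a sketch, reusing the mButtonGpio handle from the question): poll the raw pin level and watch the log while pressing the button; if the level never changes, the wiring or pull-up is the problem rather than the callback registration.

final Handler handler = new Handler();
handler.post(new Runnable() {
    @Override
    public void run() {
        try {
            // Log the raw pin level a few times a second.
            Log.i(TAG, "Pin level: " + mButtonGpio.getValue());
        } catch (IOException e) {
            Log.e(TAG, "Error reading GPIO", e);
        }
        handler.postDelayed(this, 200);
    }
});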
Another reason the button might not work is if the GPIO pin is already in use by another application. The log shows the following error in that scenario:
Error on PeripheralIO API
com.google.android.things.pio.PioException: android.os.ServiceSpecificException: BCM21 is already in use
at com.google.android.things.pio.GpioImpl.<init>(GpioImpl.java:53)
at com.google.android.things.pio.PeripheralManagerService.openGpio(PeripheralManagerService.java:169)
at com.example.androidthings.simplepio.ButtonActivity.onCreate(ButtonActivity.java:129)
at android.app.Activity.performCreate(Activity.java:6662)
at android.app.Instrumentation.callActivityOnCreate(Instrumentation.java:1118)
at android.app.ActivityThread.performLaunchActivity(ActivityThread.java:2599)
at android.app.ActivityThread.handleLaunchActivity(ActivityThread.java:2707)
at android.app.ActivityThread.-wrap12(ActivityThread.java)
at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1460)
at android.os.Handler.dispatchMessage(Handler.java:102)
at android.os.Looper.loop(Looper.java:154)
at android.app.ActivityThread.main(ActivityThread.java:6077)
at java.lang.reflect.Method.invoke(Native Method)
at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:865)
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:755)
Caused by: android.os.ServiceSpecificException: BCM21 is already in use
at android.os.Parcel.readException(Parcel.java:1697)
at android.os.Parcel.readException(Parcel.java:1636)
at com.google.android.things.pio.IPeripheralManagerClient$Stub$Proxy.OpenGpio(IPeripheralManagerClient.java:776)
at com.google.android.things.pio.GpioImpl.<init>(GpioImpl.java:51)
at com.google.android.things.pio.PeripheralManagerService.openGpio(PeripheralManagerService.java:169)
at com.example.androidthings.simplepio.ButtonActivity.onCreate(ButtonActivity.java:129)
at android.app.Activity.performCreate(Activity.java:6662)
at android.app.Instrumentation.callActivityOnCreate(Instrumentation.java:1118)
at android.app.ActivityThread.performLaunchActivity(ActivityThread.java:2599)
at android.app.ActivityThread.handleLaunchActivity(ActivityThread.java:2707)
at android.app.ActivityThread.-wrap12(ActivityThread.java)
at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1460)
at android.os.Handler.dispatchMessage(Handler.java:102)
at android.os.Looper.loop(Looper.java:154)
at android.app.ActivityThread.main(ActivityThread.java:6077)
at java.lang.reflect.Method.invoke(Native Method)
at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:865)
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:755)
I need to detect when the currently playing audio/video is paused. I cannot find anything for GStreamer 1.0. My app is a bit complex, but here is the condensed code:
/* This function is called when the pipeline changes states. We use it to
 * keep track of the current state. */
static void state_changed_cb(GstBus *bus, GstMessage *msg, CustomData *data)
{
    GstState old_state, new_state, pending_state;
    gst_message_parse_state_changed(msg, &old_state, &new_state, &pending_state);
    if (GST_MESSAGE_SRC(msg) == GST_OBJECT(data->playbin))
    {
        g_print("State set to %s\n", gst_element_state_get_name(new_state));
    }
}
gst_init(&wxTheApp->argc, &argv);
m_playbin = gst_element_factory_make("playbin", "playbin");
if (!m_playbin)
{
    g_printerr("Not all elements could be created.\n");
    exit(1);
}
CustomData* data = new CustomData(xid, m_playbin);
GstBus *bus = gst_element_get_bus(m_playbin);
gst_bus_set_sync_handler(bus, (GstBusSyncHandler) create_window, data, NULL); /* here I do the video overlay stuff */
g_signal_connect(G_OBJECT(bus), "message::state-changed", (GCallback) state_changed_cb, data); /* pass data itself, not &data: the callback expects a CustomData* */
What am I doing wrong? I cannot find a working example of connecting such events in GStreamer 1.0, and 0.x seems a bit different from 1.0, so the many examples for it don't help.
UPDATE
I have found a way to get messages. I run a wxWidgets timer with a 500 ms interval, and each time the timer fires I call:
GstMessage* msg = gst_bus_pop(m_bus);
if (msg != NULL)
{
    g_print("New Message -- %s\n", gst_message_type_get_name(msg->type));
    gst_message_unref(msg); /* gst_bus_pop() transfers ownership of the message */
}
Now I get a lot of state-changed messages. Still, I want to know whether a given message means Pause, Stop, Play, or End of Media (that is, a way to differentiate which message it is) so that I can notify the UI.
So while I get messages now, the basic problem, getting specific events, remains unsolved.
You have to call gst_bus_add_signal_watch() (like in 0.10) to enable emission of the signals. Without that you can only use the other ways of being notified about GstMessages on that bus.
Also, just to be sure: you need a running GLib main loop on the default main context for this to work. Otherwise you need to do things a bit differently.
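A minimal sketch of that wiring, reusing the bus, data, and callback from your code:

/* Enable signal emission on the bus, then connect as before.
 * This requires a running GLib main loop on the default context. */
GstBus *bus = gst_element_get_bus(m_playbin);
gst_bus_add_signal_watch(bus);
g_signal_connect(G_OBJECT(bus), "message::state-changed",
                 (GCallback) state_changed_cb, data);
gst_object_unref(bus);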
For the updated question:
Check the documentation: gst_message_parse_state_changed() can be used to parse the old, new, and pending state from the message. This is still the same as in 0.10; from the application's point of view, conceptually not much has really changed between 0.10 and 1.0.
Also, you shouldn't do this timeout-based polling, as it will block your wxWidgets main loop. The easiest solution would be to use a sync bus handler (which you already have) and dispatch all messages from there to some callback on the wxWidgets main loop.
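A sketch of that dispatch, reusing your m_bus and m_playbin names, showing how to tell Pause/Play apart from End of Media:

GstMessage *msg = gst_bus_pop(m_bus);
if (msg != NULL) {
    switch (GST_MESSAGE_TYPE(msg)) {
    case GST_MESSAGE_STATE_CHANGED:
        /* Only react to state changes of the playbin itself. */
        if (GST_MESSAGE_SRC(msg) == GST_OBJECT(m_playbin)) {
            GstState old_state, new_state, pending_state;
            gst_message_parse_state_changed(msg, &old_state, &new_state, &pending_state);
            if (new_state == GST_STATE_PAUSED)
                g_print("Paused\n");
            else if (new_state == GST_STATE_PLAYING)
                g_print("Playing\n");
        }
        break;
    case GST_MESSAGE_EOS:
        g_print("End of media\n");
        break;
    default:
        break;
    }
    gst_message_unref(msg);
}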
I used the following sample code to test the CodeReader component:
function Page1_TextButton1_OnPressed(e) {
    Pages.Page1.CodeReader1.visible = true;
    Pages.Page1.CodeReader1.readCode(SMF.UI.CodeType.linear, function () {
        alert(Pages.Page1.CodeReader1.value);
    }, function () {
        alert("There is an error");
    });
}
When I press the button, the control displays the camera stream, but nothing ever happens afterwards: it reaches neither the onSuccess nor the onFailure callback function.
The camera does not focus either.
Is there any extra code I should add? Should I explicitly call the phone camera? How?
Thanks,
nico
P.S.: I have tested on 2 different Android phones.
I guess that the CodeReader object is too small to read a barcode.
I suggest you read the article below:
http://www.smartface.io/developer/guides/controls/codereader/
Try this:
1 - Drag a CodeReader into your design area.
2 - Write this code in the script for that page:
function MyPage_Self_OnShow(e) {
    Pages.MyPage.CodeReader1.readCode("[CODE TYPE (e.g. 'qr')]",
        function () {
            alert(Pages.MyPage.CodeReader1.value); // show the decoded value, not a literal string
        },
        function () {
            alert("fail");
        });
}
Try reading the documentation if you have any other questions!
Okay guys, I've read many things about the FFT stuff, but it seems to be a bit more complicated than building a tableView.
I am searching for a way to analyze the playing audio (from the iPod library) in three ranges (low, mid, high). I think an FFT would do the job, but I'm not sure if I could instead filter the audio (low-pass, band-pass, and high-pass) and analyze the peaks as well.
So if anyone knows the best way to do this (by best I mean fastest in CPU terms), please help me. There will be no front-end, so I won't draw the FFT in a window (I guess the drawing would eat a lot of CPU).
Then I have no idea how I could analyze the audio. All the FFT sample code I found uses the mic, and I do not want to use the mic. I saw something about getting the audio file and exporting it to an uncompressed file, but I need live analysis.
I've had a look at aurioTouch2, but I don't get how I could change the input from the mic to the iPod library.
I think the part I'm searching for is here:
// Initialize our remote i/o unit
inputProc.inputProc = PerformThru;
inputProc.inputProcRefCon = self;

CFURLRef url = NULL;
try {
    url = CFURLCreateWithFileSystemPath(kCFAllocatorDefault, CFStringRef([[NSBundle mainBundle] pathForResource:@"button_press" ofType:@"caf"]), kCFURLPOSIXPathStyle, false);
    XThrowIfError(AudioServicesCreateSystemSoundID(url, &buttonPressSound), "couldn't create button tap alert sound");
    CFRelease(url);

    // Initialize and configure the audio session
    XThrowIfError(AudioSessionInitialize(NULL, NULL, rioInterruptionListener, self), "couldn't initialize audio session");

    UInt32 audioCategory = kAudioSessionCategory_PlayAndRecord;
    XThrowIfError(AudioSessionSetProperty(kAudioSessionProperty_AudioCategory, sizeof(audioCategory), &audioCategory), "couldn't set audio category");
    XThrowIfError(AudioSessionAddPropertyListener(kAudioSessionProperty_AudioRouteChange, propListener, self), "couldn't set property listener");

    Float32 preferredBufferSize = .005;
    XThrowIfError(AudioSessionSetProperty(kAudioSessionProperty_PreferredHardwareIOBufferDuration, sizeof(preferredBufferSize), &preferredBufferSize), "couldn't set i/o buffer duration");

    UInt32 size = sizeof(hwSampleRate);
    XThrowIfError(AudioSessionGetProperty(kAudioSessionProperty_CurrentHardwareSampleRate, &size, &hwSampleRate), "couldn't get hw sample rate");

    XThrowIfError(AudioSessionSetActive(true), "couldn't set audio session active\n");
    XThrowIfError(SetupRemoteIO(rioUnit, inputProc, thruFormat), "couldn't setup remote i/o unit");
    unitHasBeenCreated = true;

    drawFormat.SetAUCanonical(2, false);
    drawFormat.mSampleRate = 44100;
(...)
But I'm quite new to all of these Audio Units, so I can't figure out where an input is loaded. The code above also uses the old C Audio Session API; a little birdie told me this will be deprecated (in favor of AVAudioSession), so what is the alternative?
So, basically:
How can I get the currently playing audio in order to analyze it? Can I just use an MPMusicPlayerController and get the samples, or do I have to build an entire Audio Unit chain that plays the library?
What is the fastest (CPU-wise) way to analyze lows, mids, and highs? Filtering? FFT? Something else?
Will I get in trouble with the copyright on purchased music? I tried to convert the playing file to PCM samples, and sometimes I get this error:
VTM_AViPodReader[7666:307] *** Terminating app due to uncaught exception
'NSInvalidArgumentException', reason: '*** -[AVAssetReader initWithAsset:error:]
invalid parameter not satisfying: asset != ((void *)0)'
What is the "new" way to do an FFT if the whole AVAudioSession stuff won't work in the future?
You can't get the currently playing audio on iOS (the security sandbox prevents this), unless your app is the one playing the audio using certain select APIs (Audio Queue, RemoteIO, etc.).
3 bandpass filters (made with IIR biquads) will be faster than an FFT. But even a full FFT will use a very small percentage of CPU time.
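For the filtering route, a sketch of one band-pass biquad stage (coefficients from the RBJ Audio EQ Cookbook); run three of these at different center frequencies and compare the output energies:

#include <math.h>

typedef struct {
    double b0, b1, b2, a1, a2;  /* normalized coefficients (a0 divided out) */
    double x1, x2, y1, y2;      /* filter state */
} Biquad;

/* Band-pass with 0 dB peak gain: fs = sample rate, f0 = center frequency, q = quality. */
static void biquad_bandpass_init(Biquad *f, double fs, double f0, double q)
{
    double w0 = 2.0 * M_PI * f0 / fs;
    double alpha = sin(w0) / (2.0 * q);
    double a0 = 1.0 + alpha;
    f->b0 =  alpha / a0;
    f->b1 =  0.0;
    f->b2 = -alpha / a0;
    f->a1 = -2.0 * cos(w0) / a0;
    f->a2 = (1.0 - alpha) / a0;
    f->x1 = f->x2 = f->y1 = f->y2 = 0.0;
}

/* Direct form I; feed samples one at a time, e.g. from a RemoteIO render callback. */
static double biquad_process(Biquad *f, double x)
{
    double y = f->b0 * x + f->b1 * f->x1 + f->b2 * f->x2
             - f->a1 * f->y1 - f->a2 * f->y2;
    f->x2 = f->x1; f->x1 = x;
    f->y2 = f->y1; f->y1 = y;
    return y;
}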
An app can't convert or play protected music from the iTunes library in a form where samples can be captured.
The FFT is in the Accelerate framework, not in the audio session.