Suptronics X400 Raspberry Pi 3 with Android Things

I've been trying to build a car head unit using a Raspberry Pi and Android Things. To power the car audio I bought the Suptronics X400 amp, but I haven't been able to use it as the default audio output, and I'm trying to integrate the Spotify SDK. I tried to create the driver, but most of the documentation has been removed from the libraries. I'm a bit lost.

The audio user driver is no longer available in Android Things. The right way is to use the AudioTrack class and set the preferred device, as is done in this sample project.
You may need to specify the audio bus you want to send sound to:
private AudioDeviceInfo findAudioDevice(int deviceFlag, int deviceType) {
    AudioManager manager = (AudioManager) this.getSystemService(Context.AUDIO_SERVICE);
    AudioDeviceInfo[] adis = manager.getDevices(deviceFlag);
    for (AudioDeviceInfo adi : adis) {
        if (adi.getType() == deviceType) {
            return adi;
        }
    }
    return null;
}
Then find the I2S bus:
mAudioInputDevice = findAudioDevice(AudioManager.GET_DEVICES_INPUTS, AudioDeviceInfo.TYPE_BUS);
mAudioOutputDevice = findAudioDevice(AudioManager.GET_DEVICES_OUTPUTS, AudioDeviceInfo.TYPE_BUS);
Then you can call audioTrack.setPreferredDevice(mAudioOutputDevice); on your playback track.
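For example, a playback track can be created and routed to that bus before playing. This is only a minimal sketch: the 48 kHz stereo 16-bit PCM format, the streaming transfer mode, and the method name are assumptions on my part, not something from the original answer; adapt them to whatever your audio source actually delivers.
private AudioTrack createTrackOnBus() {
    // Format values here are assumptions; match them to your actual PCM source.
    AudioFormat format = new AudioFormat.Builder()
            .setSampleRate(48000)
            .setEncoding(AudioFormat.ENCODING_PCM_16BIT)
            .setChannelMask(AudioFormat.CHANNEL_OUT_STEREO)
            .build();

    int bufferSize = AudioTrack.getMinBufferSize(
            48000, AudioFormat.CHANNEL_OUT_STEREO, AudioFormat.ENCODING_PCM_16BIT);

    AudioTrack track = new AudioTrack.Builder()
            .setAudioFormat(format)
            .setBufferSizeInBytes(bufferSize)
            .setTransferMode(AudioTrack.MODE_STREAM)
            .build();

    // Route playback to the I2S bus found above instead of the default output.
    track.setPreferredDevice(mAudioOutputDevice);
    track.play();
    // Feed PCM data with track.write(buffer, 0, buffer.length) from your audio source.
    return track;
}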

Related

flutter_reactive_ble scanned devices not including device UUIDs

_scanStream = flutterReactiveBle
    .scanForDevices(withServices: <Uuid>[]).listen((device) {
  _foundBleUARTDevices.add(device);
  for (var element in _foundBleUARTDevices) {
    print(
        "${element.serviceUuids}---${element.name}---${element.id}---${element.serviceData}---${element.rssi}");
  }
});
I am using the above code snippet to scan for BLE devices. On discovering and printing them out, the UUID list is empty, and when I try to scan with a specific UUID no device is found. I am using an ESP32 as the BLE device and Android as the platform to run/debug the app.
After waiting for hours I got no answer from anyone, so I kept trying, and what I found out is that I can actually connect to the BLE device without the service UUID, using just the id, which I believe is the BLE device's MAC address.
I used the below snippet to establish the connection:
void _connectToDevice() {
  // We're done scanning, we can cancel it
  // _scanStream.cancel();
  // Let's listen to our connection so we can make updates on a state change
  Stream<ConnectionStateUpdate> _currentConnectionStream = flutterReactiveBle
      .connectToAdvertisingDevice(
          id: _ubiqueDevice.id,
          prescanDuration: const Duration(seconds: 3),
          withServices: <Uuid>[]);
  _currentConnectionStream.listen((event) {
    print(":::::::: ${event.connectionState.name}");
  });
}
So I will now proceed to see how I can read from and write to the characteristics. I hope someone finds this helpful.

How to integrate the Hand Controller HC1 in my car simulator game

I'm building a car simulator game using Unity. For input I'm using a Logitech G29 steering wheel. Now I need to use a hand controller to accelerate or brake.
This is my hand controller:
Hand Controller HC1
Link
Now, how can I interpret its input? The device is recognized by my Windows 10 system, but if I start the game with it I cannot accelerate or brake the car.
I configured this in my Unity InputController:
And in my IRDSPlayerControls.cs file I wrote these lines of code:
if (Input.anyKey)
{
    foreach (KeyCode kcode in Enum.GetValues(typeof(KeyCode)))
    {
        Debug.Log("Joystick pressed " + kcode);
    }
}
Debug.Log("Input debug acc: " + Input.GetAxis("Vertical3"));
Debug.Log("Input debug frenata: " + Input.GetAxis("Vertical4"));
In the Unity Console, I see this:
Input debug acc: -1
Input debug frenata: -1
You can detect a specific button on a specific joystick ("joystick 1 button 0", "joystick 1 button 1", "joystick 2 button 0", …)
or a specific button on any joystick ("joystick button 0", "joystick button 1", "joystick button 2", …).
Check out the Input Manager documentation.
I could explain this step by step here, but it won't be as good as some tutorials online. I recommend this video as a good tutorial.
UPDATE:
I think your hand controller gives analog values, and the acceleration/brake controls are not actually buttons but analog axes with a range of values.
To check this, use Input.GetJoystickNames:
using UnityEngine;

public class Example : MonoBehaviour
{
    // Prints a joystick name if movement is detected.
    void Update()
    {
        // requires you to set up axes "Joy0X" - "Joy3X" and "Joy0Y" - "Joy3Y" in the Input Manager
        for (int i = 0; i < 4; i++)
        {
            if (Mathf.Abs(Input.GetAxis("Joy" + i + "X")) > 0.2 ||
                Mathf.Abs(Input.GetAxis("Joy" + i + "Y")) > 0.2)
            {
                Debug.Log(Input.GetJoystickNames()[i] + " is moved");
            }
        }
    }
}
I would suggest first checking whether those inputs are being received at all:
if (Input.anyKey)
{
    foreach (KeyCode kcode in Enum.GetValues(typeof(KeyCode)))
    {
        Debug.Log(kcode);
    }
}
This way you can tell whether the game recognizes the keycodes of your controller and, if it does, which names are assigned to them.
Once you have that, you only need to check the keycodes as you would with a regular keyboard!
Not every joystick, steering wheel, etc. maps its inputs to the same axes.
There is a Unity forum thread about that topic (and other related problems), and I found some Unity plugins that could probably solve your problem:
https://github.com/speps/XInputDotNet
https://github.com/JISyed/Unity-XboxCtrlrInput
There are programs you can use to list all input axes and see which one you are currently affecting. I used one of them but don't remember its name. It might help you see which axes your brake and throttle are mapped to.
Some of them also allow you to remap them, if that is what you want.
The most probable cause is that Unity3D does not support this device.
Unity3D uses a mix of XInput, GameInput, and USB HID processing for its input on Windows.
It is unclear (Unity is closed source) whether GameInput is used on Windows; it is required on the modern Xboxes.
I cannot provide a definitive answer, since I do not have this controller to test, and the documentation on the controller is sparse.
The best I can do is point you in the right direction.
Does the device exist in Unity3D:
See if the Input System identifies the device when plugged in while running (make sure the game window has focus):
Adapted from https://docs.unity3d.com/Packages/com.unity.inputsystem@1.4/manual/HowDoI.html
InputSystem.onDeviceChange +=
    (device, change) =>
    {
        switch (change)
        {
            case InputDeviceChange.Added:
                // New Device.
                Debug.Log("New device added.");
                break;
            case InputDeviceChange.Disconnected:
                // Device got unplugged.
                break;
            case InputDeviceChange.Connected:
                // Plugged back in.
                break;
            case InputDeviceChange.Removed:
                // Remove from Input System entirely; by default, Devices stay in the system once discovered.
                break;
            default:
                // See InputDeviceChange reference for other event types.
                break;
        }
    };
A lack of log output when the device is plugged in means it was not identified as a potential input device. Skip to "All else fails" below.
Identification at this level does not imply support, as it may flag all HID devices.
Look at all low-level input events while pressing the buttons (also adapted from the Unity documentation linked above):
var trace = new InputEventTrace(); // Can also give device ID to only
                                   // trace events for a specific device.
trace.Enable();
//…run stuff
var current = new InputEventPtr();
while (trace.GetNextEvent(ref current))
{
    Debug.Log("Got some event: " + current);
}
// Trace consumes unmanaged resources. Make sure to dispose.
trace.Dispose();
The chances of getting here with responses (given the edited output) are slim, but if it happens, explore the output for hints about the device associations and fix your mappings accordingly.
All else fails
Request device support through the Unity3D.com website. Highly recommended.
You can write your own support for the device, either using raw USB HID (which may be flagged by virus scanners and has limited documentation) or by implementing a custom GameInput interface. The device's inclusion in Windows Game Controllers makes this the most probable solution.

Using multiple audio devices simultaneously on OS X

My aim is to write an audio app for low-latency realtime audio analysis on OS X. This will involve connecting to one or more USB interfaces and taking specific channels from these devices.
I started with the Learning Core Audio book, writing this in C. As I went down this path it came to light that a lot of the old frameworks have been deprecated. It appears that the majority of what I would like to achieve can be written using AVAudioEngine, connecting AVAudioUnits and digging down to the Core Audio level only for lower-level things like configuring the hardware devices.
I am confused about how to access two devices simultaneously. I do not want to create an aggregate device, as I would like to treat the devices individually.
Using Core Audio I can list the audio device IDs for all devices and change the default system output device, as shown here (and the input device using similar methods). However, this only gives me one physical device, and it will always track the device selected in System Preferences.
static func setOutputDevice(newDeviceID: AudioDeviceID) {
    let propertySize = UInt32(MemoryLayout<UInt32>.size)
    var deviceID = newDeviceID
    var propertyAddress = AudioObjectPropertyAddress(
        mSelector: AudioObjectPropertySelector(kAudioHardwarePropertyDefaultOutputDevice),
        mScope: AudioObjectPropertyScope(kAudioObjectPropertyScopeGlobal),
        mElement: AudioObjectPropertyElement(kAudioObjectPropertyElementMaster))
    AudioObjectSetPropertyData(AudioObjectID(kAudioObjectSystemObject), &propertyAddress, 0, nil, propertySize, &deviceID)
}
I then found that kAudioUnitSubType_HALOutput is the way to go for specifying a fixed device, which is only accessible through this property. I can create a component of this type using:
var outputHAL = AudioComponentDescription(componentType: kAudioUnitType_Output, componentSubType: kAudioUnitSubType_HALOutput, componentManufacturer: kAudioUnitManufacturer_Apple, componentFlags: 0, componentFlagsMask: 0)
let component = AudioComponentFindNext(nil, &outputHAL)
guard component != nil else {
    print("Can't get input unit")
    exit(-1)
}
However I am confused about how you create a description of this component and then find the next device that matches the description. Is there a property where I can select the audio device ID and link the AUHAL to this?
I also cannot figure out how to assign an AUHAL to an AVAudioEngine. I can create a node for the HAL but cannot attach this to the engine. Finally is it possible to create multiple kAudioUnitSubType_HALOutput components and feed these into the mixer?
I have been researching this for the last week but am nowhere closer to an answer. I have read up on channel mapping and everything I need to know down the line, but getting at the audio at this lower level seems pretty undocumented, especially when using Swift.

I can't get data by I2C bus in Android Things

I can't get data over the I2C bus in Android Things on Raspberry Pi 3.
I connected a DS18B20 (temperature sensor) to the RPi running Android Things and ran the I2C address scanner app (https://github.com/dennisg/i2c-address-scanner), but it can't find an available address.
for (int address = 0; address < 256; address++) {
    //auto-close the devices
    try (final I2cDevice device = peripheralManagerService.openI2cDevice(BoardDefaults.getI2cBus(), address)) {
        try {
            device.readRegByte(TEST_REGISTER);
            Log.i(TAG, String.format(Locale.US, "Trying: 0x%02X - SUCCESS", address));
        } catch (final IOException e) {
            Log.i(TAG, String.format(Locale.US, "Trying: 0x%02X - FAIL", address));
        }
    } catch (final IOException e) {
        //in case the openI2cDevice(name, address) fails
    }
}
How do I get data over I2C?
The linked test project appears to use the I2C protocol poorly in several ways.
First, it makes the assumption that there is only a single I2C bus on a particular board. Although that is true now, it may not scale to additional SOMs that support Android Things in the future.
Second, it assumes the register address to read from (0x00). This may have worked for whatever device the developer started with, but a large number of I2C peripherals may not respond to that address.
You should take a look at the datasheet for this device. After a cursory examination, it seems that there is no register corresponding to 0x00. Additionally, it has a custom read flow that would make it difficult to use the snippet above. The microcontroller getting these values on your sensor probably throws them away and returns no signal.
It may be useful to read through the datasheet again. It seems the sensor is 1-Wire rather than I2C. While the two protocols may be similar, the built-in readByte method makes assumptions about the data transmission that may not match the peripheral's protocol exactly.
As Nick Felker wrote in his answer, the DS18B20 has a 1-Wire interface and you can't connect it to a Raspberry Pi running Android Things directly. You should use an industrial 1-Wire <-> I2C converter (e.g. DS2482-100), a custom MCU-based one (like in that project), or another kind of converter (e.g. USB <-> 1-Wire, UART <-> 1-Wire).
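Once the sensor sits behind an I2C-capable bridge, reading raw bytes from Android Things follows the usual Peripheral I/O pattern. Below is a minimal sketch that reuses the PeripheralManagerService and BoardDefaults helpers from the scanner snippet above (newer Android Things releases expose the same calls through PeripheralManager.getInstance()); the bridge address, register, and byte count are hypothetical placeholders, so take the real values from your converter's datasheet.
// Hypothetical address/register for an I2C bridge in front of the sensor;
// check the converter's datasheet for the real values.
private static final int BRIDGE_ADDRESS = 0x18;
private static final int DATA_REGISTER = 0x01;

private void readRawBytes() {
    PeripheralManagerService service = new PeripheralManagerService();
    try (I2cDevice device = service.openI2cDevice(BoardDefaults.getI2cBus(), BRIDGE_ADDRESS)) {
        byte[] buffer = new byte[2];
        // Read two raw bytes starting at the given register.
        device.readRegBuffer(DATA_REGISTER, buffer, buffer.length);
        Log.i(TAG, "Raw reading: " + Arrays.toString(buffer));
    } catch (IOException e) {
        Log.e(TAG, "Unable to read from the I2C device", e);
    }
}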

How to calculate voice and data usage in a BlackBerry 10 application

Hi, I am developing an application on the BlackBerry 10 platform which will calculate the data usage and voice usage of the user's device for a particular duration.
There are some feasibility checks that need to be done for this, which are as follows:
Does the BB10 API support data usage calculation? If yes, can I differentiate 3G/cellular data from WiFi data? If yes, how can I achieve this?
How can I calculate voice usage in a BB10 application? Voice usage is simply the duration of all calls that happened within a particular timespan.
Is there any API BB10 provides through which I can check whether the device is currently roaming or not?
Please let me know if this can be done in a BB10 application.
Does the BB10 API support data usage calculation?
Yes, there are a few APIs for this.
Can I differentiate 3G/cellular data from WiFi data?
Yup, you can.
1) Add the following line to your .pro file:
LIBS += -lbbdevice
2) Make sure you include:
#include <bb/device/NetworkDataUsage>
3) Getting data usage for the cellular network only:
bb::device::NetworkDataUsage *nduCell = new bb::device::NetworkDataUsage("cellular0");
nduCell->update();
quint64 bytesSent = nduCell->bytesSent();
quint64 bytesReceived = nduCell->bytesReceived();
4) Getting data usage for WiFi only:
bb::device::NetworkDataUsage *nduWifi = new bb::device::NetworkDataUsage("tiw_sta0");
nduWifi->update();
quint64 bytesSent = nduWifi->bytesSent();
quint64 bytesReceived = nduWifi->bytesReceived();
That will give you the data usage since the device started.
You will need to call ndu->update() regularly to get the most recent data usage statistics.
Extra Info:
Changing the parameter passed to NetworkDataUsage changes the interface it monitors:
cellular0 == Cellular
tiw_sta0 == Wifi
ecm0 == USB
To find out which interfaces are available on your device:
1) Add the following line to your .pro file:
QT += network
2) Make sure you include:
#include <QNetworkInterface>
3) Displaying available interfaces
QList<QNetworkInterface> interfaces = QNetworkInterface::allInterfaces();
for (int i = 0; i < interfaces.size(); i++) {
    qDebug() << QString::number(i) + ": " + interfaces.value(i).humanReadableName();
}
How can I calculate voice usage in a BB10 application? Voice usage is simply the duration of all calls that happened within a particular timespan.
This can be done using the Phone class.
There is a signal, void callUpdated(const bb::system::phone::Call &call), which lets us know when an incoming call is received or an outgoing call is initiated.
By combining this with a Timer class we can calculate the device's voice usage. (This code is not tested.)