We are trying to adapt our navigation application for Android Automotive OS, and we see that if we use STREAM_MUSIC for navigation notifications, they don't mix with FM radio.
If we use STREAM_NOTIFICATION, the navigation audio does mix with FM radio, but there is no way in Android Car Settings to change the STREAM_NOTIFICATION volume.
Do we understand correctly that streams don't map to AudioAttributes automatically in Automotive OS, and that all navigation applications will have to move to USAGE_ASSISTANCE_NAVIGATION_GUIDANCE to run properly on Automotive OS?
You should use usage and content audio attributes in your application instead of legacy stream types.
Stream types like STREAM_MUSIC, STREAM_NOTIFICATION, etc. are superseded by AudioAttributes for defining the behavior of audio playback, and most of the stream-based API is already deprecated. One exception in regular Android is volume control (refer to AudioManager and methods like setStreamVolume, adjustStreamVolume, etc.).
In Android Automotive implementations, however, the Android framework is configured to use a fixed-volume policy, which means there is no stream-based volume control either. Actual volume control is expected to be implemented by the Audio HAL or a hardware amplifier (refer to the Android Automotive audio documentation), and for that case there is a separate volume-control API based on volume groups in CarAudioManager.
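For illustration, a minimal sketch of the volume-group API. This assumes the app runs on an Automotive build and holds the privileged android.car.permission.CAR_CONTROL_AUDIO_VOLUME permission (an assumption; ordinary third-party apps typically cannot hold it):

import android.car.Car;
import android.car.media.CarAudioManager;

// Hedged sketch: requires the privileged CAR_CONTROL_AUDIO_VOLUME permission.
Car car = Car.createCar(context);
CarAudioManager carAudioManager =
        (CarAudioManager) car.getCarManager(Car.AUDIO_SERVICE);

int groupCount = carAudioManager.getVolumeGroupCount();
for (int groupId = 0; groupId < groupCount; groupId++) {
    int max = carAudioManager.getGroupMaxVolume(groupId);
    int current = carAudioManager.getGroupVolume(groupId);
    // Raise each group's volume by one step, clamped to its maximum.
    carAudioManager.setGroupVolume(groupId, Math.min(current + 1, max), 0);
}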
Please also remember that before starting an audio stream, the application should request audio focus using the same audio attributes as it will use for its audio stream.
Simplified example:
AudioAttributes playbackAttributes = new AudioAttributes.Builder()
        .setUsage(AudioAttributes.USAGE_MEDIA)
        .setContentType(AudioAttributes.CONTENT_TYPE_MUSIC)
        .build();

MediaPlayer player = new MediaPlayer();
player.setDataSource( /* ... */ );
player.setAudioAttributes(playbackAttributes);
player.prepare();

AudioFocusRequest focusRequest = new AudioFocusRequest.Builder(AudioManager.AUDIOFOCUS_GAIN)
        .setAudioAttributes(playbackAttributes)
        .setOnAudioFocusChangeListener( /* ... */ )
        .build();

int result = mAudioManager.requestAudioFocus(focusRequest);
if (result == AudioManager.AUDIOFOCUS_REQUEST_GRANTED) {
    // Respect audio focus
    player.start();
}
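For the navigation case in the question, a hedged variant of the same pattern: declare USAGE_ASSISTANCE_NAVIGATION_GUIDANCE and request transient focus with ducking, so other audio can keep playing underneath the prompt (how streams actually mix is ultimately decided by the car's audio configuration):

AudioAttributes navAttributes = new AudioAttributes.Builder()
        .setUsage(AudioAttributes.USAGE_ASSISTANCE_NAVIGATION_GUIDANCE)
        .setContentType(AudioAttributes.CONTENT_TYPE_SPEECH)
        .build();

// Transient focus with ducking: the prompt is short-lived, and other
// audio may lower its volume rather than stop.
AudioFocusRequest navFocusRequest =
        new AudioFocusRequest.Builder(AudioManager.AUDIOFOCUS_GAIN_TRANSIENT_MAY_DUCK)
                .setAudioAttributes(navAttributes)
                .setOnAudioFocusChangeListener( /* ... */ )
                .build();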
Related
I'm writing a small app that will work with Bluetooth devices. I want it to manage only peripheral devices (mouse, keyboard, trackpad, etc.).
I'm using IOBluetoothDevice.pairedDevices() to get a list of devices, and I need to filter it to include only peripheral devices. I'm trying to use the IOBluetooth instance properties deviceClassMajor and deviceClassMinor to do that. Unfortunately, Apple's documentation does not say what each value means.
Is there some document or spec that describes which deviceClassMajor and deviceClassMinor values refer to a keyboard, mouse, or trackpad? Or is there another way to check whether a particular device is a peripheral?
I'm using the AltBeacon Android library for beacon detection in my app. It runs a foreground service as a feature by default, but I only want to run a foreground service when I enter a beacon region or when a beacon is detected. This doesn't appear to be documented anywhere. Would you be able to help me?
Using AndroidBeaconLibrary 2.19+, autobind APIs can be used to easily switch to using a foreground service after a detection.
The basic steps are:
1. In a custom Application class, call beaconManager.startMonitoring(region) in the onCreate() method. This sets up initial beacon detection using scheduled jobs.
2. When you get a callback to didEnterRegion(), call beaconManager.stopMonitoring(...) and beaconManager.stopRanging(...) for all monitored and ranged regions.
3. After step 2, configure the library to use a foreground service, as sketched below.
4. Start monitoring/ranging any desired regions again.
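A minimal sketch of this flow, assuming library 2.19+. The region identifier and the notification id 456 are placeholder assumptions, and building the Notification itself is omitted:

BeaconManager beaconManager = BeaconManager.getInstanceForApplication(this);
Region region = new Region("all-beacons", null, null, null);

beaconManager.addMonitorNotifier(new MonitorNotifier() {
    @Override
    public void didEnterRegion(Region r) {
        // Step 2: stop job-based monitoring/ranging before reconfiguring.
        beaconManager.stopMonitoring(region);
        // Step 3: switch the library to foreground-service scanning.
        beaconManager.enableForegroundServiceScanning(notification /* built elsewhere */, 456);
        beaconManager.setEnableScheduledScanJobs(false);
        beaconManager.setBackgroundBetweenScanPeriod(0);
        beaconManager.setBackgroundScanPeriod(1100);
        // Step 4: start monitoring/ranging again, now inside the service.
        beaconManager.startMonitoring(region);
        beaconManager.startRangingBeacons(region);
    }
    @Override
    public void didExitRegion(Region r) { }
    @Override
    public void didDetermineStateForRegion(int state, Region r) { }
});

// Step 1: initial detection via scheduled jobs.
beaconManager.startMonitoring(region);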
Be careful with the above approach, as stock Android blocks starting a foreground service from the background in some cases on Android 12, and some non-standard OEM builds already do this on earlier Android versions.
In general, the recommended practice is to set up the foreground service only if (a) the app is in the foreground, or (b) you know the user has recently interacted with your app's UI. If Android blocks your app from starting the foreground service, starting monitoring/ranging from the background with a foreground service configured will crash your app. Because the conditions that cause this are complex and hard to predict, this technique can lead to unexpected crashes and related bugs.
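One hedged way to implement check (a), assuming your app already depends on the AndroidX lifecycle-process artifact (an assumption; the beacon library itself does not require it):

import androidx.lifecycle.Lifecycle;
import androidx.lifecycle.ProcessLifecycleOwner;

// True when the app process currently has a started (visible) activity.
boolean appInForeground = ProcessLifecycleOwner.get()
        .getLifecycle()
        .getCurrentState()
        .isAtLeast(Lifecycle.State.STARTED);
if (appInForeground) {
    // Safer point at which to reconfigure for a foreground service.
}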
One alternative to the above is to use the new IntentScanStrategy introduced in the library, which allows faster background scanning without the need for a foreground service.
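If I read the 2.19 release notes correctly, enabling it is a one-liner (a hedged sketch; check the library docs for the exact setup):

// Enables the Intent-based scan strategy instead of a foreground service.
beaconManager.setIntentScanningStrategyEnabled(true);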
I'm planning to develop a simple audio player, but currently I'm stuck on notification support. I want a notification that is persistent while audio is playing, and I also need some controls (like play/pause and next/previous) and possibly cover art visible in the notification, similar to most other players (AIMP, Google Play Music, etc.).
I was hoping to find a good example of how to build this with Dart/Flutter. Is that even possible without writing native code for each platform (Android/iOS)? Is there any plugin that supports this kind of notification on both platforms? Plus, of course, a foreground service bound to it so that audio playback isn't killed when the screen is off.
You might be interested in package:audio_service, which seems like it might handle the majority of the work for you, including background execution and notifications.
Also, for notifications I've had success with package:flutter_local_notifications, but you shouldn't need it if you're going to use package:audio_service.
How can I control camera retakes in Ziggeo for Ionic 3? Ziggeo takes the user to the camera app, and depending on the device, the user can do any number of retakes. Is it possible to stop camera retakes, or to return the user to the Ionic app as soon as they finish recording (press the stop-recording button)?
I tried to find this in the Ziggeo documentation but didn't succeed.
Let me first mention that I work at Ziggeo. Now with that being said, let's get cracking :)
When the camera is requested on desktop systems, the browser talks to the OS and the OS talks to the drivers. The drivers talk to the camera and provide the video data. On mobile devices this is slightly different.
The mobile browser asks the system, which responds by activating the camera app. The camera app differs between system versions and between systems themselves, but in general it ignores any parameters sent to it from the browser.
This is why you might see the option to retake the recording on mobile devices.
The purpose of Ziggeo, however, is to provide many ways of using the camera and mic. As such, there is a way to skip the native app and use a different way of recording videos.
This is accomplished by adding the webrtc_on_mobile parameter when you create your app:
var ziggeoApp = new ZiggeoApi.V2.Application({
    token: "APPLICATION_TOKEN",
    webrtc_streaming_if_necessary: true,
    webrtc_on_mobile: true
});
Now, the above is just the HTML version. Ionic is a bit different: currently this is not possible there, however it will be possible in the next update.
Edit 2020:
To support iOS, webrtc_streaming_if_necessary: true was created. This is because the WebRTC implementation on those systems is built for streaming, not standard WebRTC. By using this parameter, you make sure that WebRTC streaming is not used unless it is actually necessary.
I have added the way you would use it to the code above.
You can always find the latest options on the Ziggeo header-building docs page: https://ziggeo.com/docs/sdks/javascript/browser-integration/header
Is it possible for me to programmatically access a smartphone's sensors (e.g. accelerometer, compass, etc. on an Android or iPhone device) through a browser webpage and JavaScript? I know that the W3C Devices standard can allow access to the camera.
HTML5 is likely to contain a sensor API. Until this is fully standardized, vendors provide their own APIs, as Apple does for mobile Safari.
There's no need for full-blown solutions like PhoneGap if you're OK with restricting yourself to a specific vendor/device. If not, frameworks like PhoneGap provide a unified, device-independent API.
You should be aware of the performance constraints that apply to JavaScript applications running inside the browser of a mobile device. Depending on your type of application and the amount of processing you intend to do on the sensor data, you may be better off writing a native application.
See https://developer.apple.com/library/safari/iPad/#documentation/SafariDOMAdditions/Reference/DeviceMotionEventClassRef/DeviceMotionEvent/DeviceMotionEvent.html for some reference documentation.
The answer is both "yes" and "no". Each phone manufacturer/OS combination behaves as it sees fit here; for example, the GPS on an iPhone can be accessed, but the compass cannot:
accessing iPhone compass with JavaScript
You can use something like PhoneGap to do this, I believe.
Check out the chapter "Controlling the iPhone with JavaScript" from the book Building iPhone Apps with HTML, CSS, and JavaScript.
This demo tracks the iPhone's movement on the three axes using the event.accelerationIncludingGravity object:
http://www.omiod.com/iphone/acceleration-demo.php
So far, Safari on the iPhone is the first to implement it, but I expect Android to fill this gap very soon.