I have a messaging app that receives push notifications from Firebase Cloud Messaging and, when one arrives, pushes it onto a stream. To keep track of the different streams, I have a streamtable, which is a global variable. This is what I have:
Map<String, StreamController<dynamic>> streamtable;

Future<dynamic> myBackgroundMessageHandler(Map<String, dynamic> message) async {
  log('FCM onBackground: $message');
  log('streamTable: $streamtable');
  addToStream(message);
  await showNotification(message);
  return;
}
bool addToStream(Map<String, dynamic> message) {
  var data;
  if (Platform.isAndroid) {
    data = message['data'];
  } else if (Platform.isIOS) {
    data = message;
  }
  bool status;
  try {
    switch (data['streamType']) {
      case "Message":
        streamtable['MessageStream'].add(message);
        status = true;
        break;
      default:
        log("streamType: ${data['streamType']}");
        status = false;
        break;
    }
  } catch (e) {
    status = false;
    log('addToStream: error($e)');
  }
  return status;
}
When the app is in the foreground, everything works great. However, when the app is in the background, streamtable becomes null, and nothing works.
First off, why is that happening? If I bring the app back to foreground, streamtable is not null.
How do I store data I receive when the app is in the background?
Mobile systems work differently from desktop systems. On a desktop system, "in the background" just means the UI is not on top, but the program keeps running. On a mobile system, "in the background" means the app is not being used and can therefore be killed by the operating system any time it sees fit.
Different systems have different callbacks to notify the app of impending shutdown so it can write its last data to storage and restore from that state when it is launched again.
The assumption that your app keeps its global data in memory because it is still running, just "in the background", is not correct. It won't. On mobile systems that memory can be gone at any time.
You will need to actually persist the data you want to restore later; in-memory state only lasts as long as the app is in the foreground or the operating system feels generous.
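For example, here is a minimal sketch of that idea for your FCM handler, assuming the shared_preferences plugin is usable in the background isolate; the key name and the drain function are made up for illustration. The handler persists the raw message instead of touching streamtable, and the messages are replayed into your streams once the app is in the foreground again.

import 'dart:convert';

import 'package:shared_preferences/shared_preferences.dart';

// Hypothetical key under which messages received in the background are queued.
const String kPendingMessagesKey = 'pending_fcm_messages';

Future<dynamic> myBackgroundMessageHandler(Map<String, dynamic> message) async {
  // Persist the message instead of relying on in-memory globals,
  // because streamtable may not exist in this isolate.
  final prefs = await SharedPreferences.getInstance();
  final pending = prefs.getStringList(kPendingMessagesKey) ?? <String>[];
  pending.add(jsonEncode(message));
  await prefs.setStringList(kPendingMessagesKey, pending);
}

// Later, e.g. from initState() after the app is foregrounded again,
// drain the queue back into your streams.
Future<void> drainPendingMessages() async {
  final prefs = await SharedPreferences.getInstance();
  final pending = prefs.getStringList(kPendingMessagesKey) ?? <String>[];
  for (final raw in pending) {
    // addToStream is the function from the question.
    addToStream(jsonDecode(raw) as Map<String, dynamic>);
  }
  await prefs.remove(kPendingMessagesKey);
}

You could call drainPendingMessages() whenever you recreate streamtable, for example in the initState of your root widget.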
Seems like it's a bug in FlutterFire: (https://github.com/FirebaseExtended/flutterfire/issues/1878)
Related
I'm using SignalR for push notifications in my Flutter app and that works OK. I get the message from the backend and show a notification using flutter_local_notifications. The problem is that the SignalR service shuts down after some time.
How can I make my app keep running in the background, and even start on reboot?
Here's my code:
import 'dart:async';
import 'dart:convert';
import 'package:flutter/material.dart';
import 'package:isolate_test/Model/UserMessageModel.dart';
import 'package:signalr_core/signalr_core.dart';
import 'EndPointService.dart';
import 'NotificationService.dart';
class SignalRProvider {
  static String appName = "NOTIFICATION";
  static String? userName = "";
  static String deviceName = "android_app";
  static List<UserMessageModel> messages = <UserMessageModel>[];

  HubConnection connection = HubConnectionBuilder()
      .withUrl(
          'my_url',
          HttpConnectionOptions(
            logging: (level, message) => print(message),
          ))
      .withAutomaticReconnect()
      .withHubProtocol(JsonHubProtocol())
      .build();

  Function(bool update)? onMessagesUpdateCallback;

  SignalRProvider({
    this.onMessagesUpdateCallback,
  });

  setUsername(String username) {
    userName = username;
  }

  Future initSignalR(BuildContext context) async {
    WidgetsFlutterBinding.ensureInitialized();
    await NotificationService().init();

    connection.on('SignalRUserReceiveMessage', (message) async {
      var data = message!.first;
      if (data != null) {
        UserMessageModel msg = UserMessageModel.fromJson(data);
        messages.add(msg);
        msg.showNotification();
      }
      if (onMessagesUpdateCallback != null) {
        onMessagesUpdateCallback!(true);
      }
    });

    connection.on('SignalRMonitoringMessage', (message) async {
      var data = message!.first;
      if (data != null) {
        UserMessageModel msg = UserMessageModel.fromJson(data);
        messages.add(msg);
        msg.showNotification();
      }
      if (onMessagesUpdateCallback != null) {
        onMessagesUpdateCallback!(true);
      }
    });

    connection.on("SignalRReceiveConnectedMessage", (message) async {
      await connection.send(methodName: 'SignalRInit', args: [
        userName,
        appName,
        connection.connectionId,
      ]);
    });

    connection.on("SignalRReceiveDisconnectedMessage", (message) async {
      if (connection.state == HubConnectionState.disconnected) {
        connection.start();
      }
    });

    await connection.start();
  }

  List<UserMessageModel> getMessages() {
    return messages;
  }

  Future deleteMessage(UserMessageModel _msg) async {
    if (_msg == null) return;
    var response =
        await EndPointService().SetupApi("Message", "", []).httpDelete(
      HeaderEnum.BasicHeaderEnum,
      ResponseEnum.ResponseModelEnum,
      jsonEncode(_msg),
    );
  }

  addOrUpdateMessage(UserMessageModel _msg) {
    if (_msg == null) return;
    if (messages != null) {
      var found =
          messages.firstWhere((e) => e.user == _msg.user && e.id == _msg.id);
      var index =
          messages.indexWhere((e) => e.user == _msg.user && e.id == _msg.id);
      if (found != null) {
        messages[index] = _msg;
      } else {
        messages.add(_msg);
      }
      if (onMessagesUpdateCallback != null) {
        onMessagesUpdateCallback!(true);
      }
    }
  }

  setMessagesUpdateCallback(Function(bool update) func) {
    onMessagesUpdateCallback = func;
  }
}
SignalR problems
SignalR for Flutter uses WebSockets and SSE to receive messages from the SignalR service. If the app is terminated because the user restarted their phone or the OS shut it down to save battery, these messages will not be received by the app.
To overcome this, app developers (and SignalR) have to use FCM on Android and APNs on iOS (or FCM, which will itself use APNs on iOS). All other approaches will be more limited because the operating systems do not allow apps to keep background processes running indefinitely. This was allowed years ago, but the operating systems changed it to save battery: they force all apps through the same push notification medium, FCM on Android and APNs on iOS.
SignalR for Flutter uses neither FCM nor APNs. In its current state, SignalR is not well suited for background messaging on Android or iOS; take a look at the comments from people struggling with similar problems on How to use signalr in Android.
Alternative solution
The simplest / easiest way to get started is to use Firebase Cloud Messaging.
On Android, it will be used directly to send messages to devices, and on iOS, FCM will use APNs to reach devices reliably.
Caveat: On Android, there is a more complicated alternative called unifiedpush, but the limitations include showing a notification to the user at all times to handle background notifications.
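As a rough illustration of the firebase_messaging route (this is not part of the SignalR package, and it assumes the usual FlutterFire platform setup such as google-services.json and APNs configuration is already in place), the wiring looks roughly like this:

import 'package:firebase_core/firebase_core.dart';
import 'package:firebase_messaging/firebase_messaging.dart';
import 'package:flutter/material.dart';

// The background handler must be a top-level function (not a closure),
// because it may run in its own isolate while the app is terminated.
Future<void> _onBackgroundMessage(RemoteMessage message) async {
  await Firebase.initializeApp();
  // Persist or process message.data here; the UI may not be running.
  print('Background message: ${message.data}');
}

Future<void> main() async {
  WidgetsFlutterBinding.ensureInitialized();
  await Firebase.initializeApp();

  FirebaseMessaging.onBackgroundMessage(_onBackgroundMessage);

  // Messages received while the app is in the foreground arrive here.
  FirebaseMessaging.onMessage.listen((RemoteMessage message) {
    print('Foreground message: ${message.data}');
  });

  runApp(const MaterialApp(home: Scaffold()));
}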
My analysis: this is based on a quick investigation (reading the pubspec.yaml, the GitHub issues on the original repo, and the SignalR documentation), plus some experience implementing push notifications for Flutter.
Disclosure: two days ago I released a push notification library called push, which would be well suited to packages like this one migrating to FCM on Android and APNs on iOS. However, as an app developer, in most cases you should use firebase_messaging, not push.
I have worked with SignalR, but on the native platforms (iOS & Android): I made a stock app that receives real-time prices. When the app goes to the background, I disconnect from the SignalR server after 5 seconds, and when the app comes back to the foreground, I check whether it is still connected to the SignalR server and reconnect if it is not. I don't think it is a good idea for your app to stay connected and keep receiving data from the SignalR server while it is in the background.
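In Flutter, a rough sketch of that pattern (disconnect when the app is backgrounded, reconnect on resume) could use a WidgetsBindingObserver; the HubConnection here is assumed to be the signalr_core connection from the question:

import 'package:flutter/widgets.dart';
import 'package:signalr_core/signalr_core.dart';

class SignalRLifecycleHandler with WidgetsBindingObserver {
  SignalRLifecycleHandler(this.connection);

  final HubConnection connection;

  void attach() => WidgetsBinding.instance.addObserver(this);
  void detach() => WidgetsBinding.instance.removeObserver(this);

  @override
  void didChangeAppLifecycleState(AppLifecycleState state) {
    if (state == AppLifecycleState.paused) {
      // App went to the background: drop the connection.
      connection.stop();
    } else if (state == AppLifecycleState.resumed) {
      // App is back in the foreground: reconnect if needed.
      if (connection.state == HubConnectionState.disconnected) {
        connection.start();
      }
    }
  }
}

You would attach() this in initState and detach() it in dispose of a long-lived widget.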
I'm working on a Flutter mobile app and I want to detect the first app launch so I can show a little tutorial to the user. I have tried the shared_preferences module, but when I start the app for the first time the console tells me the key is not recognized. I think that's normal, because the key doesn't exist yet! Is there another method for checking that?
Thank you guys
Use shared preferences to store a value that indicates whether the user has ever been on this page or not,
like this
try {
  final SharedPreferences sharedPreferences = await SharedPreferences.getInstance();
  splash = sharedPreferences.getString("SPLASH_DONE");
} catch (e) {
  print("this is first time");
}

if (splash == null) {
  page = AppSplashScreen();
} else {
  page = LoginPage();
}
and after completing your first-time operation, call
sharedPreferences.setString("SPLASH_DONE", "DONE");
I need to send a .txt file from a device to my app (worst case almost 2 MB). The BLE device divides the file into packets. I don't know if my method is correct, but I run a loop of characteristic.write/characteristic.read, telling the device each time which packet it has to send.
Here's my code:
for (int i = 0; i < packNumber.length; i++) {
  initialValue = '9,50,100,$i,0,$checksumId,0/';
  await characteristic
      .write(utf8.encode(initialValue)).then((wValue) async {
    await Future.delayed(Duration(milliseconds: 100)).then((value) async {
      await characteristic.read().then((rValue) {
        // do something with rValue
      });
    });
  });
}
It works, but is it the best solution? And if not, how can I speed up the transfer (for now I have to add a delay before reading, waiting for characteristic.write to finish)?
Thank you guys
You should change the remote device to send notifications instead of using reads. That gives maximum throughput.
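For example, with a flutter_blue-style API (assuming the characteristic actually supports notifications; onPacket is a placeholder callback), the receiving side could look roughly like this:

import 'dart:async';

import 'package:flutter_blue/flutter_blue.dart';

// Subscribe to notifications so the device can push packets as fast as it
// can produce them, instead of the app polling with read().
Future<StreamSubscription<List<int>>> listenForPackets(
    BluetoothCharacteristic characteristic,
    void Function(List<int> packet) onPacket) async {
  await characteristic.setNotifyValue(true);
  return characteristic.value.listen((packet) {
    if (packet.isNotEmpty) {
      onPacket(packet);
    }
  });
}

This removes the fixed 100 ms delay per packet, since the device sends each packet as soon as it is ready.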
I need to ensure that a certain HTTP request was sent successfully. Therefore, I'm wondering if a simple way exists to move such a request into a background service task.
The background of my question is the following:
We're developing a survey application using Flutter. Unfortunately, the app is intended to be used in an environment where no mobile internet connection can be guaranteed. Therefore, I'm not able to simply post the result of the survey once; I have to retry if it fails due to network problems. My current code looks like the following. The problem with my current solution is that it only works while the app stays active the entire time. If the user minimizes or closes the app, the data I want to upload is lost.
Therefore, I'm looking for a way to wrap the upload process in a background service task so that it is processed even when the user closes the app. I found several posts and plugins (namely https://medium.com/flutter-io/executing-dart-in-the-background-with-flutter-plugins-and-geofencing-2b3e40a1a124 and https://pub.dartlang.org/packages/background_fetch) but they don't help in my particular use case. The first describes how the app can be notified when a certain event occurs (namely a geofence event), and the second only runs every 15 minutes and addresses a different scenario as well.
Does somebody know a simple way to ensure that a request gets processed even when the internet connection is bad (or currently nonexistent), while still allowing users to minimize or even close the app?
Future _processUploadQueue() async {
  int retryCounter = 0;
  Future.doWhile(() {
    if (retryCounter == 10) {
      print('Aborted after 10 tries');
      return false;
    }
    if (_request.uploaded) {
      print('Upload ready');
      return false;
    }
    if (!_request.uploaded) {
      _networkService.sendRequest(request: _request.entry)
          .then((id) {
        print(id);
        setState(() {
          _request.uploaded = true;
        });
      }).catchError((e) {
        retryCounter++;
        print(e);
      });
    }
    // e ^ retryCounter, min 0 sec, max 10 minutes
    int waitTime = min(max(0, exp(retryCounter)).round(), 600);
    print('Waiting $waitTime seconds till next try');
    return new Future.delayed(new Duration(seconds: waitTime), () {
      print('waited $waitTime seconds');
      return true;
    });
  })
      .then(print)
      .catchError(print);
}
You can use the shared_preferences plugin to save each pending HTTP request on the device until the upload completes successfully. Like this:
requests: [
  {
    id: 8eh1gc,
    request: "..."
  },
  ...
],
Then, whenever the app is launched, check whether any requests are in the list, retry them, and delete them once they complete. You could also use background_fetch to do this every 15 minutes.
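Here is a minimal sketch of that queue, assuming shared_preferences and a caller-supplied send function; the key name and JSON shape are placeholders:

import 'dart:convert';

import 'package:shared_preferences/shared_preferences.dart';

// Hypothetical key under which pending requests are queued.
const String kPendingRequestsKey = 'pending_requests';

Future<void> queueRequest(Map<String, dynamic> request) async {
  final prefs = await SharedPreferences.getInstance();
  final pending = prefs.getStringList(kPendingRequestsKey) ?? <String>[];
  pending.add(jsonEncode(request));
  await prefs.setStringList(kPendingRequestsKey, pending);
}

// Call this on app launch (and optionally from a background_fetch task).
Future<void> retryPendingRequests(
    Future<void> Function(Map<String, dynamic> request) sendRequest) async {
  final prefs = await SharedPreferences.getInstance();
  final pending = prefs.getStringList(kPendingRequestsKey) ?? <String>[];
  final remaining = <String>[];
  for (final raw in pending) {
    try {
      await sendRequest(jsonDecode(raw) as Map<String, dynamic>);
    } catch (_) {
      remaining.add(raw); // Keep it in the queue and try again later.
    }
  }
  await prefs.setStringList(kPendingRequestsKey, remaining);
}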
I'm using the Plugin.Media from #JamesMontemagno version 2.4.0-beta (which fixes picture orientation). It's working on Android 4.1.2 (Jelly Bean) and Marshmallow, but NOT on my Galaxy S5 Neo with Android version 5.1.1.
Basically, when I take a picture it never returns to the page from where I started the process; it always returns to the initial home page.
On devices where it works, when I take a picture, I see that first of all the application fires OnSleep, then after taking the picture it fires OnResume.
On my device, where it is NOT working, it fires OnSleep and after taking the picture it doesn't fire OnResume; it fires the initialization page and then OnStart.
For this reason it doesn't open the page where I was when taking the picture.
What should I do to make sure it fires OnResume, returning to the correct page, and not OnStart, which returns to the initial home page?
In addition, when I take a picture it takes almost 30 seconds to get back to the code after awaiting the TakePhotoAsync process, and that's too slow!
Following my code:
MyTapGestureRecognizerEditPicture.Tapped += async (sender, e) =>
{
    //Display action sheet
    String MyActionResult = await DisplayActionSheet(AppLocalization.UserInterface.EditImage,
                                                     AppLocalization.UserInterface.Cancel,
                                                     AppLocalization.UserInterface.Delete,
                                                     AppLocalization.UserInterface.TakePhoto,
                                                     AppLocalization.UserInterface.PickPhoto);

    //Execute action result
    if (MyActionResult == AppLocalization.UserInterface.TakePhoto)
    {
        //-----------------------------------------------------------------------------------------------------------------------------------------------
        //Take photo
        await CrossMedia.Current.Initialize();
        if (!CrossMedia.Current.IsCameraAvailable || !CrossMedia.Current.IsTakePhotoSupported)
        {
            await DisplayAlert(AppLocalization.UserInterface.Alert, AppLocalization.UserInterface.NoCameraAvailable, AppLocalization.UserInterface.Ok);
        }
        else
        {
            var MyPhotoFile = await CrossMedia.Current.TakePhotoAsync(new Plugin.Media.Abstractions.StoreCameraMediaOptions
            {
                Directory = "MyApp",
                Name = "MyAppProfile.jpg",
                SaveToAlbum = true,
                PhotoSize = Plugin.Media.Abstractions.PhotoSize.Small
            });
            if (MyPhotoFile != null)
            {
                //Render image
                MyProfilePicture.Source = ImageSource.FromFile(MyPhotoFile.Path);

                //Save image on database
                MemoryStream MyMemoryStream = new MemoryStream();
                MyPhotoFile.GetStream().CopyTo(MyMemoryStream);
                byte[] MyArrBytePicture = MyMemoryStream.ToArray();
                await SaveProfilePicture(MyArrBytePicture);

                MyPhotoFile.Dispose();
                MyMemoryStream.Dispose();
            }
        }
    }
    if (MyActionResult == AppLocalization.UserInterface.PickPhoto)
    {
        //-----------------------------------------------------------------------------------------------------------------------------------------------
        //Pick photo
        await CrossMedia.Current.Initialize();
        if (!CrossMedia.Current.IsPickPhotoSupported)
        {
            await DisplayAlert(AppLocalization.UserInterface.Alert, AppLocalization.UserInterface.PermissionNotGranted, AppLocalization.UserInterface.Ok);
        }
        else
        {
            var MyPhotoFile = await CrossMedia.Current.PickPhotoAsync();
            if (MyPhotoFile != null)
            {
                //Render image
                MyProfilePicture.Source = ImageSource.FromFile(MyPhotoFile.Path);

                //Save image on database
                MemoryStream MyMemoryStream = new MemoryStream();
                MyPhotoFile.GetStream().CopyTo(MyMemoryStream);
                byte[] MyArrBytePicture = MyMemoryStream.ToArray();
                await SaveProfilePicture(MyArrBytePicture);

                MyPhotoFile.Dispose();
                MyMemoryStream.Dispose();
            }
        }
    }
};
Please help!! We need to deploy this app but we cannot do it with this problem.
Thank you in advance!
It is perfectly normal for the Android OS to terminate and restart an Activity. As you are seeing, your app's Activity will be automatically restarted when the camera app exits and the OS returns control to your app. The odds are it just needed more memory in order to take that photo with the Neo's 16 MP camera; you can watch the logcat output to confirm that.
Restarted – It is possible for an activity that is anywhere from paused to stopped in the lifecycle to be removed from memory by Android. If the user navigates back to the activity it must be restarted, restored to its previously saved state, and then displayed to the user.
What to do:
So in the Xamarin.Forms OnStart lifecycle method you need to restore your application to a valid running state (initializing variables, performing any bindings, etc...).
Plugin code:
The Android platform code for the TakePhotoAsync method looks fine to me, but remember that the memory for the image that is passed back via the Task will be doubled as it is marshaled from the ART VM back to the Mono VM. Calling GC.Collect() as soon as possible after the return will help (but your Activity is restarting anyway...)
public async Task<MediaFile> TakePhotoAsync(StoreCameraMediaOptions options)
{
    ~~~
    var media = await TakeMediaAsync("image/*", MediaStore.ActionImageCapture, options);
In turn calls:
this.context.StartActivity(CreateMediaIntent(id, type, action, options));
Not much else you can really do within the Android OS to pop up the Camera.
In addition, when I take a picture it takes almost 30 seconds to get back to the code after awaiting TakePhotoAsync process, and it's too slow!
Is that on your Neo? Or all devices?
I would call that very suspect (i.e. a bug), as even flushing all the Java memory after the native Camera Intent/Activity plus the restart time for your app's Activity should not take 30 seconds on an octa-core 1.6 GHz Cortex... but I do not have your device, app and code in front of me....