How to retrieve logcat in Flutter?

How do you get the output written to logcat back into the Flutter app that caused it? Or, asked more simply: how do you read logcat from Flutter?
The problem is this:
The app uses a stack of Android plugins to communicate with some custom hardware through Bluetooth. Those Android plugins write extensively to logcat. Now, for debugging, it would be very helpful to be able to read all the messages the app (including the native plugins) has written to logcat. The question is: is this somehow possible?
How would you tackle that?

Check out the plugin called logcat on pub.dev.
Sadly, it seems to be no longer maintained and hasn't been updated for null safety.
But you can check out the source code here and see how the plugin gets access to the Android logcat.
Because logcat is a native Android facility, you'll have to use a MethodChannel to call a Java/Kotlin function:
// Define the MethodChannel (requires import 'package:flutter/services.dart';).
final platform = const MethodChannel('app.channel.logcat');
// Call the native method (must be awaited inside an async function).
final logs = await platform.invokeMethod('execLogcat');
And the native part:
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;

public class LogcatPlugin implements MethodCallHandler {

    // Plugin registration (the old v1 embedding API used by the original plugin).
    public static void registerWith(Registrar registrar) {
        final MethodChannel channel = new MethodChannel(registrar.messenger(), "app.channel.logcat");
        channel.setMethodCallHandler(new LogcatPlugin());
    }

    @Override
    public void onMethodCall(MethodCall call, Result result) {
        if (call.method.equals("execLogcat")) {
            String logs = getLogs();
            if (logs != null) {
                result.success(logs);
            } else {
                result.error("UNAVAILABLE", "logs not available.", null);
            }
        } else {
            result.notImplemented();
        }
    }

    // Dumps the current logcat buffer ("logcat -d") and returns it as a single string.
    String getLogs() {
        try {
            Process process = Runtime.getRuntime().exec("logcat -d");
            BufferedReader bufferedReader = new BufferedReader(new InputStreamReader(process.getInputStream()));
            StringBuilder log = new StringBuilder();
            String line;
            while ((line = bufferedReader.readLine()) != null) {
                log.append(line).append("\n"); // keep line breaks between log entries
            }
            return log.toString();
        } catch (IOException e) {
            return "EXCEPTION" + e.toString();
        }
    }
}
The code samples are from github.com/pharshdev/logcat.
Maybe you can just fork the git repo and migrate it to null safety if needed.
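If you do migrate it yourself, the null-safe Dart side might look roughly like this sketch (the readLogcat wrapper and its error handling are my own additions, not part of the plugin):

import 'package:flutter/foundation.dart';
import 'package:flutter/services.dart';

const MethodChannel _platform = MethodChannel('app.channel.logcat');

// Returns the dumped logcat output, or null if the native side reports an error.
Future<String?> readLogcat() async {
  try {
    return await _platform.invokeMethod<String>('execLogcat');
  } on PlatformException catch (e) {
    debugPrint('Could not read logcat: ${e.message}');
    return null;
  }
}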

Check the plugin called logcat_monitor on pub.dev.
Its biggest advantage over the other logcat plugin is that it allows continuous monitoring of logcat messages.
[Screenshot: example output]
How to use it:
Add the dependencies:
dependencies:
  logcat_monitor: ^0.0.4
Create a buffer and a function to consume the logcat messages:
final StringBuffer _logBuffer = StringBuffer();

void _mylistenStream(dynamic value) {
  if (value is String) {
    _logBuffer.writeln(value);
  }
}
Register your function as a listener to receive the logs; you can then use them in any way within your app.
LogcatMonitor.addListen(_mylistenStream);
Start the logcat monitor, passing filter parameters as defined by the logcat tool.
await LogcatMonitor.startMonitor("*.*");
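Putting the steps together, a minimal sketch of a widget that displays the collected logs might look like this (the widget itself and the mounted check are my own additions, not from the plugin's documentation):

import 'package:flutter/material.dart';
import 'package:logcat_monitor/logcat_monitor.dart';

class LogcatView extends StatefulWidget {
  const LogcatView({Key? key}) : super(key: key);

  @override
  State<LogcatView> createState() => _LogcatViewState();
}

class _LogcatViewState extends State<LogcatView> {
  final StringBuffer _logBuffer = StringBuffer();

  @override
  void initState() {
    super.initState();
    // Collect every logcat line the plugin forwards to us.
    LogcatMonitor.addListen((dynamic value) {
      if (value is String && mounted) {
        setState(() => _logBuffer.writeln(value));
      }
    });
    LogcatMonitor.startMonitor("*.*");
  }

  @override
  Widget build(BuildContext context) {
    return SingleChildScrollView(child: Text(_logBuffer.toString()));
  }
}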

Related

Unity3D Firebase Auth not working on Android: Firebase doesn't connect or something

In the editor it works fine. I am trying to add Firebase to my project, and there it works, but when I start testing on Android, the login window doesn't react. This is what I am trying to do:
public class FirebaseINIT : MonoBehaviour
{
    public static bool firebaseReady;

    void Start()
    {
        CheckIfReady();
    }

    void Update()
    {
        if (firebaseReady == true)
        {
            SceneManager.LoadScene("LoginScene");
        }
    }

    public static void CheckIfReady()
    {
        Firebase.FirebaseApp.CheckAndFixDependenciesAsync().ContinueWith(task => {
            Firebase.DependencyStatus dependencyStatus = task.Result;
            if (dependencyStatus == Firebase.DependencyStatus.Available)
            {
                Firebase.FirebaseApp app = Firebase.FirebaseApp.DefaultInstance;
                firebaseReady = true;
                Debug.Log("Firebase is ready for use.");
            }
            else
            {
                firebaseReady = false;
                UnityEngine.Debug.LogError(System.String.Format(
                    "Could not resolve all Firebase dependencies: {0}", dependencyStatus));
            }
        });
    }
}
But the next scene is not loaded, and I am confused about why this happens; I can't continue because I don't understand the problem. Could this be caused by the Android resolver? Because in order to build the project, I must remove the resolved libraries.
Otherwise, if I don't remove Assets => Android => resolve libraries, I get the following errors:
Configure project :launcher
WARNING: The option setting 'android.enableR8=false' is deprecated.
Execution failed for task ':launcher:processReleaseResources'.
A failure occurred while executing com.android.build.gradle.internal.tasks.Workers$ActionFacade
Android resource linking failed
The project also has an additional advertising plugin (Appodeal + AdMob). At the very least, I'd appreciate a pointer on where to investigate the issue.
I am integrating this for the first time, so I'm not sure I'm on the right track.

Flutter SignalR listener is not connected on the 2nd screen after migrating to the null-safety stable version

Chat was working perfectly with SignalR before migrating to null safety, but after migrating it no longer works in the chat part.
The scenario: there are 2 screens where I am using SignalR:
1) Chat list.
2) Chatting with a person.
The listener on the chat list screen works fine, but on the 2nd screen it does not (it only worked the first time I installed and ran the app). Weird issue.
Everything worked in the old version. I am using bloc for state management and have also migrated from yield to emit.
A piece of the code looks like this:
void listenOnMessageReceived(
  HubConnection hubConnection,
  Function(Message? chatMessageReceive) onMessageReceived,
) {
  final SocketResponseCallBack chatMessageReceived =
      (response) => onMessageReceived(Message.fromJson(response));
  final hubMethod = HubMethod(
      CHAT_RECEIVED_MESSAGE_METHOD_NAME,
      SignalRHelper.toSocketFunction(
          CHAT_RECEIVED_MESSAGE_METHOD_NAME, chatMessageReceived));
  bool exists = listenOnHubMethod
      .any((method) => method.methodName == CHAT_RECEIVED_MESSAGE_METHOD_NAME);
  if (exists) {
    listenOnHubMethod.removeWhere((element) =>
        element.methodName == CHAT_RECEIVED_MESSAGE_METHOD_NAME);
    SignalRHelper(hubConnection: hubConnection).on(
      hubMethod.methodName,
      hubMethod.methodFunction,
    );
    listenOnHubMethod.add(hubMethod);
  } else {
    SignalRHelper(hubConnection: hubConnection).on(
      hubMethod.methodName,
      hubMethod.methodFunction,
    );
    listenOnHubMethod.add(hubMethod);
  }
}
I have two variants of the above code on different screens, but it works on only one screen and does not listen on the 2nd one.
Here is a piece of the SignalR listener code:
static MethodInvocationFunc toSocketFunction(
    String methodName, SocketResponseCallBack responseCallBack) {
  return (arguments) {
    try {
      if (arguments!.isEmpty) {
        throw SocketEmptyResponseException(methodName);
      }
      final response = arguments.first;
      responseCallBack(response);
    } on FormatException {
      throw SocketResponseException(methodName);
    }
  };
}
Are there any limitations in the migration to the stable version, or is it something else? Any help is appreciated.
Thank you.
Do not use SignalR: on iOS it is impossible to run the listener in the background or when the app is closed, so you will miss messages. Use FCM instead.
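If you do switch, a minimal sketch of receiving chat pushes with the firebase_messaging package might look like this (the 'text' payload key is my own assumption; your backend defines the actual payload):

import 'package:firebase_core/firebase_core.dart';
import 'package:firebase_messaging/firebase_messaging.dart';
import 'package:flutter/foundation.dart';

Future<void> setupChatPush() async {
  await Firebase.initializeApp();

  // Messages delivered while the app is in the foreground.
  FirebaseMessaging.onMessage.listen((RemoteMessage message) {
    final String? text = message.data['text']; // assumed payload key
    debugPrint('Chat message received: $text');
  });

  // Notification taps that bring the app back from the background.
  FirebaseMessaging.onMessageOpenedApp.listen((RemoteMessage message) {
    debugPrint('Opened from notification: ${message.data}');
  });
}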

How to get audio buffers from the flutter webrtc plugin?

I am using the flutter-webrtc plugin and would like to record both the local and remote audio streams. Is there any way for me to get audio buffers from the media streams? I have tried using the AudioFileRenderer in the unified-plan branch. In the startRecording function of MediaRecorderImpl.java, I supplied the file storage path, e.g. "storage/emulated/0/Android/data"; a file is successfully created every time I end my call, but the recording file is broken and can't be played. There are no errors coming from the terminal. I'm using Flutter v1.22.6 and forked flutter-webrtc from 0.5.8. I added the AudioFileRenderer file to flutter-webrtc 0.5.8; my code is below:
public void startRecording(File file) throws Exception {
    recordFile = file;
    if (isRunning)
        return;
    isRunning = true;
    //noinspection ResultOfMethodCallIgnored
    file.getParentFile().mkdirs();
    if (videoTrack != null) {
        System.out.println("try123 1");
        videoFileRenderer = new VideoFileRenderer(
                file.getAbsolutePath(),
                EglUtils.getRootEglBaseContext(),
                audioInterceptor != null
        );
        videoTrack.addSink(videoFileRenderer);
        if (audioInterceptor != null)
            audioInterceptor.attachCallback(id, videoFileRenderer);
    } else {
        Log.e(TAG, "Video track is null");
        if (audioInterceptor != null) {
            //TODO(rostopira): audio only recording
            // throw new Exception("Audio-only recording not implemented yet");
            Log.d(TAG, "Try to use onWebrtcSamplesReady");
            audioFileRenderer = new AudioFileRenderer(file);
            audioInterceptor.attachCallback(id, audioFileRenderer);
        }
    }
}
Any help is appreciated! Thanks!
I am also looking for the same solution; none has been found so far.
So I am using a WebView for the RTC part (communication and recording), while keeping the Firebase messaging and EventSource/SSE (I'm not using sockets) in Flutter.
This does not directly answer your question; it just provides an alternative solution, which is better than having no solution at all. Hopefully, when flutter-webrtc is updated to support voice-only recording, we can update the apps we develop.
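As an illustration of that workaround, here is a minimal sketch using the webview_flutter package (the pre-4.0 API, which matches that Flutter era, is assumed); the page URL is a placeholder, and granting camera/microphone permissions to the WebView needs extra platform-specific work that is not shown:

import 'package:flutter/material.dart';
import 'package:webview_flutter/webview_flutter.dart'; // pre-4.0 API assumed

class RtcWebViewPage extends StatelessWidget {
  const RtcWebViewPage({Key? key}) : super(key: key);

  @override
  Widget build(BuildContext context) {
    return Scaffold(
      appBar: AppBar(title: const Text('Call')),
      // The web page itself handles WebRTC communication and recording;
      // the URL below is a placeholder for your own page.
      body: WebView(
        initialUrl: 'https://example.com/rtc-call',
        javascriptMode: JavascriptMode.unrestricted,
      ),
    );
  }
}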

Flutter embedded POS printer (like the Android Q2 device)

I have already built a Flutter project. Now I need to print from an embedded POS device. I have been googling, but I can't find a solution.
Please help me if there is any solution.
Actually, I need it for the Android Q2 device.

I have the same device, have already built a Flutter application, and ran into the same problem.
I contacted the company and they provided me with an Android SDK, so I added a channel and called the print function from the Flutter code.
EDIT:
In your case, you can download the SDK provided by the company and write native code to print the receipt.
Example, in Flutter:
static const platform = const MethodChannel('com.example.myapplication');

Future<void> _print() async {
  try {
    final bool result =
        await platform.invokeMethod('print', {"data": Printdata});
    if (result) {
      // success
    } else {
      // failed
    }
  } on PlatformException catch (e) {
    // failed to print
  }
}
Then, in MainActivity.java / MainActivity.kt, implement the method from the SDK documentation.
Example:
public void onMethodCall(MethodCall call, MethodChannel.Result result) {
    if (call.method.equals("print")) {
        data = call.argument("data");
        // Code from SDK documentation goes here.
        result.success(true); // report the outcome back to the Dart side
    } else {
        result.notImplemented();
    }
}
ref: Example native code in Flutter
Add a third-party SDK to Android
Try the library "flutter_bluetooth_serial" and connect to the printer directly via its MAC address like this:
BluetoothConnection connection = await BluetoothConnection.toAddress("mac");
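Once connected, you can push raw bytes to the printer through the connection's output sink. A minimal sketch, assuming the printer accepts plain text (the helper function and the trailing newline are my own assumptions):

import 'dart:convert';
import 'dart:typed_data';
import 'package:flutter_bluetooth_serial/flutter_bluetooth_serial.dart';

Future<void> printText(String mac, String text) async {
  // Connect directly to the printer's MAC address.
  final BluetoothConnection connection = await BluetoothConnection.toAddress(mac);

  // Send the receipt text; many POS printers accept plain text plus '\n'.
  connection.output.add(Uint8List.fromList(utf8.encode('$text\n')));
  await connection.output.allSent;

  // Flush and close the connection.
  await connection.finish();
}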

Disconnect from App Center through my application using the Java SDK

I face an issue while running a disconnect from App Center via my application. Here is my source code:
private boolean disconnet(final String oauthConsumerKey, final String oauthConsumerSecret,
        final String accessToken, final String accessTokenSecret, final String realmID) {
    try {
        if (accessToken != null && accessTokenSecret != null
                && realmID != null) {
            final IAPlatformClient pClient = new IAPlatformClient();
            pClient.disconnect(oauthConsumerKey, oauthConsumerSecret, accessToken, accessTokenSecret);
            return true;
        }
    } catch (Exception e) {
        System.err.println("Exception : " + e.getMessage());
        return false;
    }
    return false;
}
I get the exception:
Exception :Failed to disconnect: java.lang.NullPointerException null
Can someone help me? Thanks in advance.
We currently have a defect in the development environment that prevents the disconnect-from-App-Center flow from executing properly. The defect only affects the development environment; the production environment is working correctly.
The defect presents a pop-up with a Close button at the end of the sequence, and control is returned to the App Center "Manage My Apps" page rather than directing the user to the application's disconnect URL.
To test this, after returning to App Center, simply paste the disconnect URL into the browser address bar and manually navigate to it.