This is a Flutter plugin to share content from your Flutter app via the platform's share dialog.
Reference: https://pub.dev/packages/share_plus/install
I am using the following calls to share a picture with a caption, or a text message only:
await Share.shareFiles([imagePath], text: text); // share picture + text
or
await Share.share(text); // share text only
Here .share returns a Future, and I would like to await until the sharing is done. That does not seem to happen. You can quickly test it by adding a print statement after one of the above calls, as follows:
await Share.share(text);
print('this should be printed after the sharing process');
You will notice the text is printed in the console before the sharing has completed (the platform's share dialog is still open).
Am I missing something? Or do you believe there is an issue within the package?
Many thanks in advance
This is expected behavior. A Future cannot provide its result immediately when it is started; you have to wait for it to complete. You can read the docs or watch a tutorial to learn how to use Futures. In your case, perhaps you should try:
Share.share(text).then((result) {
  print('this should be printed after the sharing process $result');
});
I want to run a specific job while the mobile app is in use, without copying and pasting the code into every screen's code. I have a lot of screens, each in a different Dart file. I tried to use streams but it did not work.
Here is my code:
Stream.periodic(Duration(seconds: 5), (count) async {
  final response = await get(
    Uri.parse('SOME REST API URI'),
    headers: headers,
  );
  if (response.statusCode == 200) {
    final responseDecoded = json.decode(response.body);
    // Do something with the decoded response
  }
});
The API call works fine, but the code is not being executed at all. I want this code to run every 5 seconds while the app is in use, regardless of which screen the user is on.
I was going to suggest using a mixin (https://dart.dev/guides/language/language-tour#adding-features-to-a-class-mixins), which is still possible, but having each individual widget do this seems silly.
Instead, you should launch a separate isolate (similar to what would be a thread in traditional languages). You can launch it from your main function and use something like a Provider to listen for its messages when it has updates, or write straight to a file/database from the isolate.
A package like flutter_isolate (https://pub.dev/packages/flutter_isolate) can help if you want to use plugins inside this isolate. An isolate will keep running regardless of what the app is doing. Note that you do need to terminate it when the app closes.
Their examples should be sufficient to get you going.
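A minimal pure-Dart sketch of that approach. The pollWorker name and the short interval are illustrative stand-ins for the question's 5-second HTTP poll; note also that the Stream.periodic in the question never has a listener attached, so its callback never runs:

```dart
import 'dart:isolate';

// Hypothetical worker entry point: polls on a fixed interval and reports
// back to the main isolate through a SendPort. The question's HTTP call
// would go where the send happens; the interval is shortened here so the
// sketch finishes quickly.
Future<void> pollWorker(SendPort sendPort) async {
  var count = 0;
  await for (final _ in Stream.periodic(const Duration(milliseconds: 100))) {
    sendPort.send('poll #${count++} completed');
  }
}

Future<void> demo() async {
  final receivePort = ReceivePort();
  final isolate = await Isolate.spawn(pollWorker, receivePort.sendPort);

  // Any screen (or a Provider) can consume these messages.
  final first = await receivePort.first as String;
  print(first);

  // Terminate the worker when the app shuts down.
  isolate.kill(priority: Isolate.immediate);
}
```

Because the worker lives in its own isolate, no screen needs to embed the polling code; screens only observe whatever state the messages update.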
I am new to Flutter, and I feel like if I could see what a widget test is "rendering", as a debugging aid, things could be a bit easier. I don't fully understand the mechanics of testWidgets at the moment, but I am wondering if it is possible to capture an actual rendered output in the middle of a widget test.
I am talking about widget tests, not integration tests.
I think the golden_toolkit package might be what you are looking for. Instead of testWidgets you will use testGoldens. You can even compare screenshots, to guard against regressions in your UI, using the provided screenMatchesGolden method.
Add to your pubspec.yaml:
# This is a development-only dependency
dev_dependencies:
  golden_toolkit: ^0.9.0
Code Sample:
testGoldens('MyWidget test', (tester) async {
  await tester.pumpWidget(MyWidget());
  await screenMatchesGolden(tester, 'my_widget-example');
});
Then run the following command to generate or regenerate your reference images:
flutter test --update-goldens
You can check the documentation about testGoldens here.
I am trying to create an automated performance-profiling test for our application. At the moment Dart allows collecting CPU samples and timeline info, as well as dumping them into a JSON file.
Example:
driver = await FlutterDriver.connect(printCommunication: true)
    .timeout(appConnectTimeout);
vms = await vmServiceConnectUri(vmUrl);
isolate = await driver.appIsolate.loadRunnable();
expect(await vms.setFlag("profiler", "true"), isA<Success>());
await driver.startTracing();
...
CpuSamples cpuSamples = await vms.getCpuSamples(
  (await vms.getVM()).isolates.first.id,
  isolate.startTime.microsecondsSinceEpoch,
  DateTime.now().microsecondsSinceEpoch,
);
flutter_timeline.Timeline timeline =
    await driver.stopTracingAndDownloadTimeline();
await TimelineSummary.summarize(timeline)
    .writeTimelineToFile("main", pretty: true);
But the timeline.json file can only be viewed in Chrome DevTools, which doesn't provide enough information about Dart execution. What I am looking for is something that gives me the same capabilities as the Observatory web page, but with the imported main.timeline.json results. Do you know of anything like this?
Ideally, I would like to open the Observatory page, upload this timeline there, and be able to see the timeline, CPU table, and call tree, navigate through the executed code, see which parts were not executed, and so on. In short, do all the things I can do with Observatory attached to a live VM. Do you have any tools in mind?
Question 2: Why, within this code, is cpuSamples.samples an empty list while cpuSamples.functions is fully populated?
Question 3: Why are Chrome DevTools able to resolve my app's function names using the dump from the code above, while Dart DevTools cannot read them and instead shows mangled function names from the dynamic library (the compiled .so file of my app)? The app is running in --debug mode for now because I don't have a physical device at the moment.
The issue resolved itself while I was discussing it with Google here, so look for the solution in the thread, along with source code:
https://github.com/dart-lang/sdk/issues/42591
I'm unable to get any error output from my Flutter web app. Printing to the console using
print('some text');
works fine, but no errors get printed. For example, throwing an exception
throw new Exception('testexception');
doesn't result in any output, neither in the browser console nor in IntelliJ. The log level settings in Chrome are set to [Info, Warnings, Errors].
I even tried implementing a custom error handler
void main() {
  FlutterError.onError = (FlutterErrorDetails details) {
    print('main.onError: details: ${details.toString()}');
  };
  runApp(new MyApp());
}
but no luck. Do I have to enable error outputs somewhere? I can't find any info about this in the documentation.
I tried running the app both using the Dart Dev Server (which is started when using Run from IntelliJ), as well as calling webdev serve and webdev serve --auto restart from the Terminal.
Flutter Web currently doesn't offer a full debugging workflow. The build will generate a main.dart.js, and you can debug that with the Chrome console.
One cool trick to "debug" your Web App, is by showing a popup to your browser:
import 'dart:js' as js;

@override
void initState() {
  super.initState();
  js.context.callMethod("alert", <String>["Your debug message"]);
}
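Separately, a pure-Dart way to keep uncaught errors from disappearing is to run the app's entry point inside runZonedGuarded from dart:async. A minimal sketch, where the throwing closure stands in for the call to runApp(MyApp()):

```dart
import 'dart:async';

void main() {
  // Any error thrown inside the zone, synchronously or from an async
  // callback, is delivered to the handler instead of being swallowed.
  runZonedGuarded(() {
    throw Exception('testexception');
  }, (Object error, StackTrace stack) {
    print('caught: $error');
  });
}
```

Combined with FlutterError.onError, this covers both framework errors and plain Dart exceptions.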
Did you look to see if it is a known issue? Flutter Web is only at developer-preview level. If there's no existing issue, create one and provide a minimal "how to reproduce". I'm sure the Flutter Web team would appreciate it.
I tried both IDEs, and it works correctly in IntelliJ but not in VS Code, which shows a blank white screen. Try running flutter run -d chrome again and wait on the blank screen for about two minutes.
Also check whether the Flutter SDK path configured in VS Code is correct.
In JavaScript, the console.log method is used to output to the Web browser's JavaScript console.
So you can do it from Dart by getting the JavaScript console object and invoking that method on it.
import 'dart:js' as javascript; // ignore: avoid_web_libraries_in_flutter

/// Invokes the JavaScript `console.log` method.
void logToBrowser(String message) {
  final console = javascript.context['console'] as javascript.JsObject;
  console.callMethod('log', [message]);
}
Since dart:js is only supported for Dart compiled to JavaScript, this means the Flutter app using it will only work as a Flutter Web app. Importing it will mean the code won't work for an app targeting iOS, Android, etc.
Tip: the JavaScript console object has other methods that can be invoked. Instead of log, you can use info, warn, and error; the last two highlight the entry when it appears in the console.
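Building on that tip, a hypothetical level-aware variant. The injected call parameter stands in for console.callMethod so the fallback logic can be shown outside a browser:

```dart
// Hypothetical level-aware logger (not part of dart:js). The `call`
// callback stands in for `console.callMethod` so this runs anywhere.
void logToBrowserWithLevel(
  void Function(String method, String message) call,
  String message, {
  String level = 'log',
}) {
  const allowed = {'log', 'info', 'warn', 'error'};
  // Fall back to plain `log` for unknown level names.
  call(allowed.contains(level) ? level : 'log', message);
}
```

On Flutter Web you would pass something like `(m, msg) => console.callMethod(m, [msg])` as the callback.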
I am able to use SimpleResponse, BasicCard, List, and other such rich responses. Can the following be supported?
a. speech only + BasicCard + SimpleResponse
If I build a response such as:
conv.ask('<speak> ...</speak>');
conv.ask(new BasicCard(
));
conv.ask(new SimpleResponse({
  speech: ...,
  text: ...,
}));
I notice that on display devices (phones), the content of the speak tag appears as text too. Is there a way to avoid this?
Next, the text of the SimpleResponse appears before the card. Is there a way to ensure it appears after the card?
Currently, for the first problem, I am forced to use a SimpleResponse with a short text (like "Hi"), and for the second problem, I have put the text as the card text and removed the SimpleResponse.
But I would like to know if there is a way out. Thanks
First of all: as stated in the reference docs for the Node.js library, the first item in your response should always be a SimpleResponse. And a SimpleResponse always shows text, whether it's a short text that you define or the transcription of its speech property. But I like that you're putting a short text instead, to avoid showing the user verbatim what your Action says.
Second: in my experience, the order of the responses isn't shown accurately in the simulator. I've tested your case in a dummy Action, and while the simulator shows the Final Response (which is last in my code) before the card, my phone shows them in the correct order.
Simulator: (screenshot omitted)
Smartphone: (screenshot omitted)
Test it on a device and see if the error persists. I currently don't have my Google Home near me, but test on it as well if you can.
For your first problem: if you want to use SSML tags, you are forced to use a SimpleResponse; that's how it's meant to be. In other words, your first problem is not a problem :)