I want to show a composed SVG image in flutter and I am currently using the flutter_svg library.
My picture consists of different layers that can be put together using the app's GUI (creating an avatar).
In principle it works very well with a Stack of SvgPicture widgets, but while loading, some of the SVGs are displayed a little later than others, and during that short time the graphic looks broken. For example, the upper body of an avatar is not loaded yet while the rest of the body is, so the avatar has a hole in the middle.
Is there a way to display a composition of SvgPictures only once all parts are loaded and can be displayed? Ideally, a placeholder should be shown in the meantime.
SVGs with a lot of layers will definitely cause a bit of lag while loading, so if you want them all to appear smoothly, you can try preloading the SVGs.
final svg = SvgPicture.asset('assets/vector.svg');
final svgAnother = SvgPicture.asset('assets/vector_other.svg');

@override
Widget build(BuildContext context) {
  return Stack(
    children: [
      svg, // preloaded SVGs appear together instead of popping in one by one
      svgAnother,
    ],
  );
}
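If you want to guarantee that the composition only appears once every layer is available, another option is to load the raw SVG strings up front and gate the whole Stack behind a FutureBuilder, since flutter_svg can also build from a string. This is only a minimal sketch, assuming the layers are asset files; the widget name, asset paths, and spinner placeholder are made up here:

import 'package:flutter/material.dart';
import 'package:flutter/services.dart' show rootBundle;
import 'package:flutter_svg/flutter_svg.dart';

class AvatarStack extends StatefulWidget {
  const AvatarStack({super.key, required this.assetPaths});

  final List<String> assetPaths; // e.g. ['assets/body.svg', 'assets/torso.svg']

  @override
  State<AvatarStack> createState() => _AvatarStackState();
}

class _AvatarStackState extends State<AvatarStack> {
  // Kick off loading once; FutureBuilder rebuilds when all layers are ready.
  late final Future<List<String>> _layers =
      Future.wait(widget.assetPaths.map(rootBundle.loadString));

  @override
  Widget build(BuildContext context) {
    return FutureBuilder<List<String>>(
      future: _layers,
      builder: (context, snapshot) {
        if (!snapshot.hasData) {
          // Placeholder shown while any of the layers is still loading.
          return const Center(child: CircularProgressIndicator());
        }
        return Stack(
          children: [for (final raw in snapshot.data!) SvgPicture.string(raw)],
        );
      },
    );
  }
}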
Flutter has a way to convert any widget to an image via boundary?.toImage() on a RenderRepaintBoundary, but it doesn't work with the Flutter web HTML renderer. So I am using the JS library html2canvas to take snapshots of widgets.
This is the JS function used on the web side.
async function captureImageHtmlCanvas(x, y, width, height, scale) {
  try {
    // Render the requested region of the page to a canvas.
    var canvas = await html2canvas(document.body, {
      x: x,
      y: y,
      width: width,
      height: height,
      scale: scale || 1
    });
    return canvas.toDataURL("image/jpeg", 1);
  } catch (e) {
    // Rethrow so the returned promise rejects (there is no `reject` in scope here).
    throw e;
  }
}
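For context, the Dart side can call that function roughly like this. This is a sketch using dart:js_util; only captureImageHtmlCanvas comes from the JS above, while the Dart function name and decoding step are assumptions of mine:

import 'dart:convert' show base64Decode;
import 'dart:js_util' as js_util;
import 'dart:typed_data';

Future<Uint8List> captureArea(num x, num y, num width, num height,
    [num scale = 1]) async {
  // Call the global JS function and wait for the promise it returns.
  final dataUrl = await js_util.promiseToFuture<String>(
    js_util.callMethod(js_util.globalThis, 'captureImageHtmlCanvas',
        [x, y, width, height, scale]),
  );
  // The data URL looks like "data:image/jpeg;base64,..."; keep only the payload.
  return base64Decode(dataUrl.split(',').last);
}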
It was working in Flutter 2, but after updating to Flutter 3.0 it no longer works: Flutter's custom ClipPath is ignored in the capture if the child contains any image.
Screenshot of the issue: https://github.com/shofizone/widget_to_image_flutter_web_html/blob/master/Screenshot%202022-12-01%20at%208.52.13%20AM.png?raw=true
The full code is in this Git repository:
https://github.com/shofizone/widget_to_image_flutter_web_html
Clone the repo.
Run flutter run -d chrome --web-renderer html.
Try to capture an image with the save FAB.
Expected results: the image should be captured as it looks in the UI, and it should appear below the heart shape.
Actual results: the image is captured, but the custom clip shape is missing from the captured image.
I haven't found any other way to make this work with the HTML renderer, and I don't want to use the CanvasKit renderer because it is heavy in size and the web app becomes slow when it loads many images. This issue is breaking the production site.
I have been following posts from people trying to use precacheImage to avoid flickering when loading an AssetImage for the first time. Some key code below:
late Image logo;

@override
void initState() {
  super.initState();
  logo = Image.asset("path_to_logo");
}

@override
void didChangeDependencies() {
  precacheImage(logo.image, context);
  super.didChangeDependencies();
}
Then later logo is used as a child widget:
Widget build(BuildContext context) {
...
logo
...
}
I also tried using precacheImage in main.dart, before the current page loads, but that didn't make any difference. The image is empty for a brief moment when first loaded, and then all content shifts to the correct position once the asset image loads.
Using sdk: '>=2.19.0-168.0.dev <3.0.0'
Any ideas what could be wrong?
EDIT:
I found out that, since I support multiple device pixel ratios, precaching only caches the 1x asset because I am using the generic path, while the device may be using e.g. 3x, and ImageCache also seems to have some internal scale property when it caches images.
This ticket might be about the issue I am facing: https://github.com/flutter/flutter/issues/3905
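One workaround for the flicker itself is to hide the image until precacheImage's future completes, so the first frame that shows it already has the decoded asset. A rough sketch of mine (the placeholder size is made up, and this does not by itself fix the pixel-ratio caching described in the linked issue):

late Image logo;
bool logoReady = false;

@override
void initState() {
  super.initState();
  logo = Image.asset("path_to_logo");
}

@override
void didChangeDependencies() {
  super.didChangeDependencies();
  // precacheImage returns a Future; only show the image once it has completed.
  precacheImage(logo.image, context).then((_) {
    if (mounted) setState(() => logoReady = true);
  });
}

@override
Widget build(BuildContext context) {
  // Reserve the logo's space so the rest of the content doesn't shift.
  return logoReady ? logo : const SizedBox(width: 120, height: 40);
}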
I pick an image from the camera or gallery on the edit profile page, and I need it to take effect immediately on the dashboard and in the drawer of the app. The photo must also still be there when the user closes the app and comes back. How can I achieve this?
Right now I store the image in a variable of type File, but it doesn't take effect immediately, and it is also lost when the app is closed.
Store the picked image in a variable of type File inside setState(() {}).
First, your problem is divided into two parts:
1. Once the image is picked, it should appear in the different places in your app.
One way to do that is to use the provider package for state management. Create a class that extends ChangeNotifier, and define a function that updates the image and notifies listeners. Something like this:
class YourClass extends ChangeNotifier {
  String? _image;
  String? get image => _image;

  void updateImageAndSave(String providedImage) {
    _image = providedImage;
    saveToLocalStorage(providedImage); // Save the image to any local storage of your preference.
    notifyListeners();
  }

  void updateImage(String providedImage) {
    _image = providedImage;
    notifyListeners();
  }
}
Note that I use a String because I actually store the path, not the image itself, but you can change that.
After that, wrap anything you want to be rebuilt when updateImage() is called in a Consumer widget.
Note that the image is nullable, so you need to check whether it is null and show your temporary widget (your placeholder) in that case, as in the sketch below.
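For example, somewhere in the widget tree it could look like this. This is only a sketch; it assumes a ChangeNotifierProvider<YourClass> is registered above this widget and that the stored string is a file path:

import 'dart:io';

import 'package:flutter/material.dart';
import 'package:provider/provider.dart';

// Rebuilds whenever updateImage()/updateImageAndSave() calls notifyListeners().
Widget buildProfileImage() {
  return Consumer<YourClass>(
    builder: (context, model, _) {
      final path = model.image;
      if (path == null) {
        // No image yet: show the temporary placeholder widget.
        return const CircleAvatar(child: Icon(Icons.person));
      }
      return CircleAvatar(backgroundImage: FileImage(File(path)));
    },
  );
}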
2. Read the image from your local storage.
Once you have done the part above, read the image from your local storage on app startup and update the image in your change notifier. This will display the image in all the different places you specified in your app.
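The saveToLocalStorage call above is not defined in the snippet; here is a minimal sketch of that save/restore pair, assuming the shared_preferences package as the local storage (the key name and the restoreImage helper are mine, and any other storage works the same way):

import 'package:shared_preferences/shared_preferences.dart';

// Called by updateImageAndSave() to persist the picked image's path.
Future<void> saveToLocalStorage(String path) async {
  final prefs = await SharedPreferences.getInstance();
  await prefs.setString('profile_image_path', path);
}

// Called once on app startup, e.g. before runApp or from a splash screen,
// to push the stored path back into the change notifier.
Future<void> restoreImage(YourClass model) async {
  final prefs = await SharedPreferences.getInstance();
  final path = prefs.getString('profile_image_path');
  if (path != null) {
    model.updateImage(path);
  }
}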
We uploaded a set of "floor plans" to Mapbox in GeoTIFF format. While the layers show up fine in Mapbox Studio, they appear with a huge black background surrounding the rendered area.
This is how it looks in Studio
And this is how it appears in our app
We tried following the guide in the documentation, but we don't quite understand it.
This is the current code in charge of loading the map:
mapView.onCreate(savedInstanceState);
mapView.getMapAsync(new OnMapReadyCallback() {
    @Override
    public void onMapReady(@NonNull MapboxMap mapboxMap) {
        mapboxMap.setStyle(new Style.Builder().fromUri("mapbox://styles/gustavjohannson/ckpzb9eop05mq18qviptlbm5d"), new Style.OnStyleLoaded() {
            @Override
            public void onStyleLoaded(@NonNull Style style) {
                // ...
            }
        });
    }
});
I just resolved this same exact issue. Check out this link.
https://docs.mapbox.com/help/troubleshooting/raster-transparency-issues/
It gives a useful example for Android.
Basically you have to use a standard style like SATELLITE_STREETS or STREETS or whatever you choose, but do NOT embed your map overlay into the style. Instead, add your overlay as a raster source tileset directly in your Android code.
Here's what my code looks like:
mapboxMap.setStyle(Style.Builder().fromUri(Style.SATELLITE_STREETS)) { loadedMapStyle ->
    loadedMapStyle.addSource(
        RasterSource(
            "source-id",
            TileSet("tileset-id", "https://api.mapbox.com/v4/HERE I PUT MY TILESET ID/{z}/{x}/{y}.png?access_token=pk.eyJ1IjoidG9tbXlib21iIiwiYSI6ImNqZXg5NHZlcjB6czEyd3J5MnAxZDdieGYifQ.GUqv9fHb__o5Xq8DLdNnnA"),
            256
        )
    )
    loadedMapStyle.addLayer(RasterLayer("raster-layer", "source-id"))
}
Remember: in Mapbox Studio, instead of creating a custom style with your map embedded, create a custom tileset with your map and use that tileset ID in your Android code.
Description:
I'm working on a slideshow app where I can import a video, add sound and text to it (subtitles and such), then render it and save it locally or share it on social media.
I've managed to do the first part with this package.
But I have three problems:
First: so far I'm only overlaying text widgets on top of the video player.
Second: how can I add a certain sound to the video?
Third: after all that, how can I render the output video so that I can save it locally or share it?
I think you could theoretically make it seem like you are combining video and audio based on how you render them, but for the third point it looks like you would need to get into actual video editing. Doing it from scratch could be quite a task; I know there is FFmpeg for Java, but I'm not sure about Dart. Here is another Stack Overflow discussion with more on the matter:
Video Editor in Flutter using dart
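If you do go the FFmpeg route from Dart, one commonly used option is the ffmpeg_kit_flutter package. The sketch below assumes that package and made-up file paths, and only shows muxing an audio track into a video (burning in text or subtitles would need an extra filter):

import 'package:ffmpeg_kit_flutter/ffmpeg_kit.dart';
import 'package:ffmpeg_kit_flutter/return_code.dart';

Future<bool> addAudioToVideo(
    String videoPath, String audioPath, String outPath) async {
  // Copy the video stream as-is, encode the audio to AAC, stop at the shorter input.
  final session = await FFmpegKit.execute(
    '-i $videoPath -i $audioPath -c:v copy -c:a aac -shortest $outPath',
  );
  final code = await session.getReturnCode();
  return ReturnCode.isSuccess(code);
}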
There is now a high-level render package that heavily optimizes the repaint-boundary capturing approach.
Wrap your widget with the Render widget:
import 'package:render/render.dart';

final controller = RenderController();

@override
Widget build(BuildContext context) {
  return Render(
    controller: controller,
    child: Container(),
  );
}
And then capture the motion with the controller:
final result = await controller.captureMotion(
  duration,
  format: Format.gif,
);
final file = result.output;