I’m pretty new to Flutter, so bear with me. I’m not sure about the difference between the “camera” and “image_picker” plugins.
I was able to capture video and images using “image_picker”. From my perspective, “image_picker” is more straightforward and easier to implement, but on the internet the “camera” plugin seems more popular.
So I want to ask: when it comes to taking pictures and especially video, what are the pros and cons of each? Any help appreciated!
The two plugins differ in functionality and most importantly in purpose:
camera allows you to embed the live camera feed into your own application as a widget, giving you full control over the preview and capture.
The image_picker plugin will launch a different application (a camera or gallery application) and return a File (an image or video file selected by the user in the other application) to your own application.
If you want to implement (and customize) the camera for your own purposes, you will have to use camera, but if you only want to retrieve imagery or video from the user, image_picker will be your choice.
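To make the contrast concrete, here is a minimal sketch of both approaches. This assumes the current plugin APIs (ImagePicker.pickVideo, availableCameras, CameraController); treat it as an illustration, not a drop-in implementation.

```dart
import 'package:camera/camera.dart';
import 'package:flutter/material.dart';
import 'package:image_picker/image_picker.dart';

// image_picker: hands off to the system camera/gallery app and
// returns the resulting file -- no preview widget inside your UI.
Future<XFile?> recordWithImagePicker() {
  return ImagePicker().pickVideo(source: ImageSource.camera);
}

// camera: you host the live preview yourself as a widget and keep
// control over it (start/stop recording, flash, zoom, ...).
Future<Widget> buildInAppPreview() async {
  final cameras = await availableCameras();
  final controller = CameraController(cameras.first, ResolutionPreset.medium);
  await controller.initialize();
  return CameraPreview(controller);
}
```

With image_picker your app leaves the foreground until the user is done; with camera the whole flow stays inside your widget tree.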
I need to create the following video player for my app: Screenshot from Netflix. I need a start/pause button, a back button, the title of the video, and a time indicator. The video should also always be in landscape mode. I've already found a lot of video players in flutter but I haven't found a way to modify them. If anybody knows a good library, a tutorial, or has some source code it would really help me out, thanks.
I think your best bet is the video_player package, as it's the most bare-bones option out there. You can build the player to your liking with a Stack widget, placing the video at its base and the other elements on top of it. It should be very doable; only the progress indicator is somewhat complex, but it's not too bad.
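The Stack idea above can be sketched roughly like this. It assumes the video_player API (VideoPlayerController, VideoPlayer, VideoProgressIndicator); the URL and title are placeholders.

```dart
import 'package:flutter/material.dart';
import 'package:flutter/services.dart';
import 'package:video_player/video_player.dart';

class SimplePlayer extends StatefulWidget {
  const SimplePlayer({super.key});
  @override
  State<SimplePlayer> createState() => _SimplePlayerState();
}

class _SimplePlayerState extends State<SimplePlayer> {
  late final VideoPlayerController _controller;

  @override
  void initState() {
    super.initState();
    // Force landscape while the player is on screen.
    SystemChrome.setPreferredOrientations(
        [DeviceOrientation.landscapeLeft, DeviceOrientation.landscapeRight]);
    _controller = VideoPlayerController.networkUrl(
        Uri.parse('https://example.com/video.mp4')) // placeholder URL
      ..initialize().then((_) => setState(() {}));
  }

  @override
  Widget build(BuildContext context) {
    return Stack(
      children: [
        // Base layer: the video itself.
        Center(
          child: AspectRatio(
            aspectRatio: _controller.value.aspectRatio,
            child: VideoPlayer(_controller),
          ),
        ),
        // Overlay: back button and title on top.
        const Positioned(top: 16, left: 16, child: BackButton()),
        const Positioned(top: 16, left: 64, child: Text('Video title')),
        // Overlay: play/pause plus a scrubbable time indicator.
        Positioned(
          bottom: 16,
          left: 16,
          right: 16,
          child: Column(
            mainAxisSize: MainAxisSize.min,
            children: [
              IconButton(
                icon: Icon(_controller.value.isPlaying
                    ? Icons.pause
                    : Icons.play_arrow),
                onPressed: () => setState(() {
                  _controller.value.isPlaying
                      ? _controller.pause()
                      : _controller.play();
                }),
              ),
              VideoProgressIndicator(_controller, allowScrubbing: true),
            ],
          ),
        ),
      ],
    );
  }

  @override
  void dispose() {
    _controller.dispose();
    super.dispose();
  }
}
```

Everything here is ordinary widget layering, which is exactly why the bare-bones package is the flexible choice.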
I'm wondering if there's a way to recreate the "Object" experience when viewing a .usdz file through Apple's AR Quick Look. I want an experience that showcases a 3D object without "augmenting reality".
Some options that I'm thinking of that might be able to recreate this feature:
1) Using ARKit, disabling the camera feed and setting my own background with a custom image. I would then place the usdz object in the center of the device's screen while keeping all the interaction functionality for the 3D object.
2) Web AR - recreate this 3D experience elsewhere and showcase this on a webview.
Any guidance or discussion about this is much appreciated - thank you!
You can use Google's model-viewer if you are going with the web solution. Another easy and effective option is echoAR (full disclosure, this is where I work). You can simply upload your models there and get a link to their model-view. You can upload models in different formats (OBJ, FBX, glTF, GLB, USDZ) and it will automatically convert them to the format needed to view on any device.
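For the model-viewer route, the markup is just a custom element; a minimal fragment looks like this (file names are placeholders, and ios-src is what serves a USDZ to Safari/Quick Look):

```html
<script type="module"
        src="https://unpkg.com/@google/model-viewer/dist/model-viewer.min.js"></script>

<model-viewer src="model.glb"
              ios-src="model.usdz"
              alt="A 3D model"
              camera-controls
              auto-rotate></model-viewer>
```

With camera-controls the user can orbit and zoom the object against a plain page background, which is essentially the "Object" mode experience without any reality being augmented.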
What I need is access to the camera and the normal features the camera would provide (e.g. changing the ISO value manually).
After doing some research I found two options to take a photo from within Flutter:
Image Picker: Here the built-in camera app will be used, but is heavily restricted. There are hardly any settings which can be changed and not a lot of freedom to make a photo with specific settings.
Camera: If I understood correctly, this is basically an access to the camera lens itself, so there is no camera software involved and I would need to implement all the basic features a camera provides, by myself.
My aim is to take a photo with all (or at least most) of the common functionality a camera app provides. Is there another plugin I haven't found yet, or some way to remove the restrictions of Image Picker?
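For reference, the camera plugin does expose a handful of manual controls, although a direct ISO setter is not among them. A quick sketch, assuming the current camera plugin API:

```dart
import 'package:camera/camera.dart';

Future<void> takeTunedPicture() async {
  final cameras = await availableCameras();
  final controller = CameraController(cameras.first, ResolutionPreset.max);
  await controller.initialize();

  // Manual-ish controls the plugin provides:
  await controller.setFlashMode(FlashMode.off);
  await controller.setExposureMode(ExposureMode.locked);
  await controller.setExposureOffset(1.0); // EV steps, clamped per device
  await controller.setZoomLevel(2.0);

  final XFile photo = await controller.takePicture();
  await controller.dispose();
  print('Saved to ${photo.path}');
}
```

Anything beyond this (true manual ISO, shutter speed, RAW capture) would need a platform channel into the native camera APIs.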
I am creating a video player app for Android. For that I need to create thumbnails for the videos present in the videos folder.
After searching the web I came to understand that Unity's MovieTexture doesn't support Android. I was able to solve that with a plugin.
For creating the thumbnails I planned to create a canvas and load GUI objects at runtime from a prefab: a "GUI Raw Image" whose image would be the thumbnail representing the video.
To my hard luck, I came to understand that a GUI Raw Image is a non-trigger object, so I switched to GUI Buttons instead of Raw Images.
But the issue is that I am not able to attach an image to my button prefab.
Can anyone help me?
Thanks in advance.
When using the new UI objects, as opposed to the legacy GUI, you must have the image imported with its Texture Type set to Sprite (2D and UI).
Unity Import Reference
I know that generally, it's no problem to overlay HTML (and even do advanced compositing operations) to HTML5 native video. I've seen cool tricks with keying out green screens in realtime, in the browser, for example.
What I haven't seen yet, though, is something that tracks in-video content, perhaps at the pixel level, and modifies the composited overlay accordingly. Motion tracking, basically. A good example would be an augmented-reality sort of app (though for simplicity's sake, let's say augmenting an overlay over on-demand video rather than live video).
Has anyone seen any projects like this, or even better, any frameworks for HTML5 video overlaying (other than transport controls)?
If you use the canvas element to capture frames of the video, you can get pixel-level information from it. From there, I think you can implement motion tracking yourself; HTML5's part ends at grabbing the pixel information, and detecting the things you need is your own work.
And I didn't find any such frameworks for the HTML5 video tag, as there is no common video format supported by all browsers...
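The grab-pixels-then-detect idea can be sketched as naive frame differencing. The diff function is pure (it just compares two RGBA arrays), while the wiring uses the standard canvas drawImage/getImageData APIs; the motion "detector" here is deliberately crude.

```javascript
// Naive motion detection by frame differencing on RGBA pixel data.
// countChangedPixels is pure, so it works on any RGBA-laid-out array.
function countChangedPixels(prev, curr, threshold = 32) {
  let changed = 0;
  for (let i = 0; i < curr.length; i += 4) {
    // Compare the average of R, G, B (alpha ignored).
    const a = (prev[i] + prev[i + 1] + prev[i + 2]) / 3;
    const b = (curr[i] + curr[i + 1] + curr[i + 2]) / 3;
    if (Math.abs(a - b) > threshold) changed++;
  }
  return changed;
}

// In the browser: draw each video frame into a canvas and diff it
// against the previous frame.
function trackMotion(video, canvas, onMotion) {
  const ctx = canvas.getContext('2d');
  let prev = null;
  function step() {
    ctx.drawImage(video, 0, 0, canvas.width, canvas.height);
    const curr = ctx.getImageData(0, 0, canvas.width, canvas.height).data;
    if (prev && countChangedPixels(prev, curr) > 0) onMotion();
    prev = curr;
    requestAnimationFrame(step);
  }
  requestAnimationFrame(step);
}
```

A real tracker would replace the pixel count with something like blob detection or feature matching, but the canvas capture step stays the same.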
The canvas tag video mode is not supported by iOS.
Wirewax built an open API for that. It works quite well, even on iPhone.
http://www.wirewax.com/