I am creating a video player app for Android. For that, I need to create thumbnails for the videos present in the videos folder.
After searching the web, I came to understand that Unity's MovieTexture is not supported on Android. That part I was able to solve using a plugin.
To create the thumbnails, I planned to create a Canvas and load GUI objects at runtime from a prefab: a "GUI Raw Image" whose image would be the thumbnail representing the video.
As my luck would have it, I came to understand that a GUI Raw Image is a non-trigger object, so I switched to GUI Buttons instead of Raw Images.
The issue now is that I am not able to attach an image to my button prefab.
Can anyone help me?
Thanks in advance.
When using the new UI objects, as opposed to the legacy GUI, you must have the image imported as Sprite (2D and UI).
Unity Import Reference
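For example, here is a minimal sketch of assigning a thumbnail to a UI Button at runtime. The names (ThumbnailLoader, the "Thumbnails/video1" path) are hypothetical, and it assumes the texture was imported with Texture Type set to Sprite (2D and UI) and placed under a Resources folder:

```csharp
using UnityEngine;
using UnityEngine.UI;

public class ThumbnailLoader : MonoBehaviour
{
    public Button buttonPrefab;   // prefab whose target graphic is an Image
    public Transform canvasRoot;  // the Canvas (or a layout group) to fill

    void Start()
    {
        // "Thumbnails/video1" is a hypothetical path under a Resources folder;
        // Resources.Load<Sprite> only succeeds if the texture was imported
        // as Sprite (2D and UI).
        Sprite thumb = Resources.Load<Sprite>("Thumbnails/video1");

        Button button = Instantiate(buttonPrefab, canvasRoot);
        button.image.sprite = thumb;  // Button.image is its target Image component
    }
}
```

If the sprite shows up as null here, the import type is the first thing to check.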
I'm wondering if there's a way to recreate the "Object" experience when viewing a .usdz file through Apple's AR Quick Look. I want an experience that showcases a 3D object without "augmenting reality".
Some options that I'm thinking of that might be able to recreate this feature:
1) Using ARKit, disabling the camera and setting my own background with a custom image. I would then place the usdz/object in the center of the device's screen while keeping all the interaction functionality for the 3D object.
2) Web AR - recreate this 3D experience elsewhere and showcase this on a webview.
Any guidance or discussion about this is much appreciated - thank you!
You can use Google's model-viewer if you are going with the web solution. Another easy and effective solution would be echoAR (full disclosure, this is where I work). You can simply upload your models there and then get a link to their model view. You can upload models in different formats (obj, fbx, glTF, glb, USDZ) and it will automatically convert them to the format you need to view on any device.
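For the web route, a minimal sketch using model-viewer (the model URLs are hypothetical); the camera-controls attribute gives the rotate/zoom "Object"-style interaction without any AR:

```html
<script type="module"
        src="https://unpkg.com/@google/model-viewer/dist/model-viewer.min.js"></script>

<!-- src is the glTF/GLB rendered in the page; ios-src only hands the USDZ
     to AR Quick Look if the user explicitly taps into AR. -->
<model-viewer src="model.glb"
              ios-src="model.usdz"
              camera-controls
              auto-rotate
              alt="A 3D model viewed without augmenting reality">
</model-viewer>
```

Dropping this into a WKWebView would cover option 2 from the question.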
I am trying to create an ImageRenderer for ExoPlayer. I'm not sure how to start and am struggling to understand all of the components needed. My use case is a playlist that can be a mix of HLS and PNG URL sources. I'm playing the HLS videos perfectly fine but get 403 download errors with the PNGs (as expected). I'm assuming I need a custom MediaSource and DataSource for the PNGs as well? I'm getting really confused by possibly needing a custom MediaPeriod and SampleStream too, and it's all one big jumble. I feel like it should be relatively simple to create a bitmap from the PNG stream and render it to a SurfaceView.
I'm pretty new to Flutter, so bear with me. I'm not sure about the difference between the plugins "camera" and "image picker".
I was able to capture video and images using "image picker". From my perspective, "image picker" is more straightforward and easier to implement, but on the internet the "camera" plugin seems to be more popular.
So I want to ask: when it comes to taking pictures and especially video, are there any pros and cons? Any help appreciated!
The two plugins differ in functionality and most importantly in purpose:
camera allows you to embed the camera feed into your own application as a widget, i.e. have control over it as well.
The image_picker plugin will launch a different application (a camera or gallery application) and return a File (an image or video file selected by the user in the other application) to your own application.
If you want to implement (and customize) the camera for your own purposes, you will have to use camera, but if you only want to retrieve imagery or video from the user, image_picker will be your choice.
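A minimal sketch of the image_picker flow, assuming a recent image_picker version (0.8+) where pickVideo returns an XFile:

```dart
import 'dart:io';

import 'package:image_picker/image_picker.dart';

/// Launches the device's own camera app, lets the user record a video,
/// and returns the resulting file (or null if the user cancelled).
Future<File?> recordVideo() async {
  final XFile? picked =
      await ImagePicker().pickVideo(source: ImageSource.camera);
  return picked == null ? null : File(picked.path);
}
```

Note that this hands control to another app entirely; with the camera plugin you would instead build a CameraController and render its preview as a widget inside your own UI.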
I am trying to make a slideshow of images from a folder, but I don't know how to load the images from a folder in order and then display them on screen. I am using an NSTimer to switch pictures, but it only works if I know the filename of each image.
This is for a program that runs on Mac. I have tried NSFileManager according to other answers I have found on Stack Overflow, but I couldn't get them to work. I am using NSImage to load the image onto the view.
You do not need to do it yourself. Apple created Image Kit for that (and more). (Read the docs about it.) Here is a short excerpt from the Image Kit Programming Guide:
A slideshow is a popular way for consumers to view digital images. The IKSlideshow class and the IKSlideshowDataSource protocol provide an easy way for your application to support slideshows of images . . .
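A minimal sketch in Swift (the FolderSlideshow name is hypothetical, and this assumes the modern Swift renaming of the Objective-C Image Kit API): it enumerates a folder with FileManager, sorts by filename so the order is stable, and feeds the URLs to IKSlideshow, which runs its own full-screen slideshow UI, so no NSTimer is needed.

```swift
import Cocoa
import Quartz  // Image Kit lives in the Quartz umbrella framework

final class FolderSlideshow: NSObject, IKSlideshowDataSource {
    private let imageURLs: [URL]

    init(folder: URL) {
        let contents = (try? FileManager.default.contentsOfDirectory(
            at: folder, includingPropertiesForKeys: nil)) ?? []
        // Keep common image types and sort by filename for a stable order.
        imageURLs = contents
            .filter { ["png", "jpg", "jpeg", "tiff"].contains($0.pathExtension.lowercased()) }
            .sorted { $0.lastPathComponent < $1.lastPathComponent }
    }

    // IKSlideshowDataSource: how many slides, and what each slide is.
    // The returned item may be an NSImage, NSURL, NSString path, etc.
    func numberOfSlideshowItems() -> Int { imageURLs.count }
    func slideshowItem(at index: Int) -> Any! { imageURLs[index] as NSURL }

    func run() {
        IKSlideshow.shared()?.runSlideshow(with: self,
                                           inMode: IKSlideshowModeImages,
                                           options: nil)
    }
}
```

If you would rather keep your NSTimer/NSImage approach, the FileManager enumeration above is also the missing piece for loading the filenames in order.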
I'm trying to use vision.VideoPlayer in a custom GUIDE GUI. The video source is a camera. Right now I can get it to work with the camera, but the vision.VideoPlayer object pops out of my GUI. I've read the example given, but it uses the VideoReader object rather than the video player to read a video file and display the frames in a GUI.
Is there any way to embed the vision.VideoPlayer in my GUI using input from a camera?
I just uploaded this to the FEX:
http://www.mathworks.com/matlabcentral/fileexchange/53600-fancyflowplayer
It has no high-level dependencies, only the "VideoReader" interface (which, by the way, vision.VideoPlayer also uses).
It is by no means a competitor to VLC (this IS MATLAB we are talking about), but it is all open source and shows how you can access and process video in real time. It also has a working draggable seek bar and extra keyboard and mouse controls that are really handy, which vision.VideoPlayer does not have.
cheers,
Stefan
Unfortunately, there is no way to embed vision.VideoPlayer in a custom GUI directly. However, here is an example of how to play a video inside a custom GUI without using vision.VideoPlayer; see the sketch below.
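A minimal sketch (the file name is hypothetical, and handles.axes1 stands in for whatever axes your GUIDE layout has): it reads frames with VideoReader and draws them into an ordinary axes, which is what keeps the playback inside your own figure.

```matlab
% Play a video inside ordinary MATLAB axes (e.g. handles.axes1 in GUIDE).
v  = VideoReader('myvideo.mp4');   % hypothetical file name
ax = gca;                          % in GUIDE, use handles.axes1 instead

% Create the image object once and only update its pixel data in the
% loop; this is much faster than calling image() for every frame.
hImage = image(zeros(v.Height, v.Width, 3, 'uint8'), 'Parent', ax);
axis(ax, 'image');
axis(ax, 'off');

while hasFrame(v)
    set(hImage, 'CData', readFrame(v));  % swap in the next frame
    drawnow limitrate;                   % flush graphics, pace playback
end
```

For a live camera instead of a file, the same pattern applies with snapshot() on a webcam object in place of readFrame().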