How can I customize the calling view in the Flutter Agora SDK?
I want to customize the AgoraRenderWidget.
For example, if someone disables their video, I want to show a dummy image instead of the video.
AgoraRenderWidget can be customized for your use case. For this particular case, whenever video is turned off during a call, i.e. disableVideo() is called, the onUserEnableVideo callback fires with false.
You can use this callback to show your dummy image instead of the video.
I hope this answers your query.
Related
I want to create an iPhone app where, if you shoot with the camera, it automatically saves the picture and shoots again until you tap Done.
Is that possible?
Look at the takePicture method of UIImagePickerController. From the documentation:
Use this method in conjunction with a custom overlay view to initiate the programmatic capture of a still image. This supports taking more than one picture without leaving the interface, but requires that you hide the default image picker controls.
Since you have to have the default controls hidden, you'll also want to look at cameraOverlayView, which lets you provide your own controls to use instead. That's where you can put your start and done buttons.
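Here is a minimal Swift sketch of that approach (the class and property names are only illustrative, not from the posts above): a custom overlay replaces the default controls, and each delegate callback saves the photo and immediately triggers the next takePicture() until Done is tapped.

```swift
import UIKit

// Illustrative sketch: CaptureViewController and its members are hypothetical names.
class CaptureViewController: UIViewController,
    UIImagePickerControllerDelegate, UINavigationControllerDelegate {

    let picker = UIImagePickerController()
    var capturedImages: [UIImage] = []

    func presentCamera() {
        guard UIImagePickerController.isSourceTypeAvailable(.camera) else { return }
        picker.sourceType = .camera
        picker.delegate = self
        // takePicture() requires the default controls to be hidden.
        picker.showsCameraControls = false
        picker.cameraOverlayView = makeOverlay()
        present(picker, animated: true)
    }

    // Custom overlay with Start and Done buttons replacing the default controls.
    private func makeOverlay() -> UIView {
        let overlay = UIView(frame: UIScreen.main.bounds)

        let start = UIButton(type: .system)
        start.setTitle("Start", for: .normal)
        start.frame = CGRect(x: 40, y: overlay.bounds.height - 80, width: 100, height: 44)
        start.addTarget(self, action: #selector(startShooting), for: .touchUpInside)

        let done = UIButton(type: .system)
        done.setTitle("Done", for: .normal)
        done.frame = CGRect(x: overlay.bounds.width - 140, y: overlay.bounds.height - 80,
                            width: 100, height: 44)
        done.addTarget(self, action: #selector(finish), for: .touchUpInside)

        overlay.addSubview(start)
        overlay.addSubview(done)
        return overlay
    }

    @objc private func startShooting() {
        picker.takePicture()            // programmatic capture
    }

    @objc private func finish() {
        picker.dismiss(animated: true)
    }

    // Each captured photo lands here; save it and immediately trigger the next shot.
    func imagePickerController(_ picker: UIImagePickerController,
                               didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey: Any]) {
        if let image = info[.originalImage] as? UIImage {
            capturedImages.append(image)
        }
        picker.takePicture()            // keep shooting until Done is tapped
    }
}
```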
I wanted to use the UIImagePickerController to record a video, then let the user browse the frames using the scroll view built into the UIImagePickerController and, while at it, select a few frames that interest them.
I know I can overlay a control on top of the UIImagePickerController to trigger the selection.
What I am not sure about is whether I have programmatic access to the current frame shown by the UIImagePickerController so I can extract an image from it.
Please let me know if this is doable/possible.
Are there any other time-efficient/elegant ways to achieve the above?
I had created my own view with video recording and frame selection (using AVCaptureSession and AVCaptureVideoDataOutput), but it would take me quite a bit of time to polish it to look as good as UIImagePickerController, plus everyone is familiar with the default camera app on the iPhone.
I hope it makes sense what I want to achieve.
I know one can kind of achieve the goal by taking a screenshot of the screen, but I just want the user to tap one button to capture the currently shown frame.
Thanks
The screenshot method did not work for the frozen frame from the video recording.
I went with a custom AVFoundation-based frame capture.
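For reference, a custom AVFoundation frame grab along those lines could look roughly like this in Swift; all names here are illustrative and this is only a sketch of the approach, not the poster's actual code.

```swift
import AVFoundation
import CoreImage
import UIKit

// Illustrative sketch: FrameGrabber and its members are hypothetical names.
class FrameGrabber: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {

    let session = AVCaptureSession()
    private let output = AVCaptureVideoDataOutput()
    private let queue = DispatchQueue(label: "frame.grabber.queue")
    var latestFrame: UIImage?

    func start() throws {
        guard let camera = AVCaptureDevice.default(for: .video) else { return }
        let input = try AVCaptureDeviceInput(device: camera)
        if session.canAddInput(input) { session.addInput(input) }

        output.setSampleBufferDelegate(self, queue: queue)
        if session.canAddOutput(output) { session.addOutput(output) }

        // In a real app this should run off the main thread.
        session.startRunning()
    }

    // Called for every video frame; convert the buffer to a UIImage and keep it.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
        if let cgImage = CIContext().createCGImage(ciImage, from: ciImage.extent) {
            latestFrame = UIImage(cgImage: cgImage)
        }
    }
}
```

A "select this frame" button can then simply read latestFrame at the moment it is tapped.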
I have compiled the sample PlayCap from the DirectShow.NET website in C#.
PlayCap - This application creates a preview window for the first video capture device.
Although the program works fine, I need to add buttons to the form. I cannot do that because the video capture is displayed on the entire form. I want it to be displayed in a PictureBox, for instance picturebox1, so that I can move picturebox1 up a little and add some buttons.
Does anyone know how to do that?
Many Thanks
You have to set the owner window handle of the video renderer to the PictureBox's handle instead of the form's.
Use the picturebox1.Handle property (in the PlayCap sample, this is the value passed to IVideoWindow.put_Owner).
Every time the camera of the iPhone captures a new image, I want to modify it and then display it on the iPhone screen. In other words: how do I modify the video stream from the iPhone camera in real time?
I need a method that is called every time a new image comes from the camera.
Thanks for your help! :-)
EDIT: what I want to do is like augmented reality: while I'm taking a video, every image is modified and shown in real time on the iPhone screen.
You could capture an image with the UIImagePickerController's takePicture method and draw your modified version to the cameraOverlayView.
You get the picture recorded as a result of the takePicture message via the UIImagePickerController delegate method imagePickerController:didFinishPickingMediaWithInfo:. From the dictionary supplied to that method, you can get the original image, which you modify and draw to the overlay.
Here is an example for using the cameraOverlayView. You should be able to re-use the captured image from your delegate for drawing your overlay view.
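Putting that together, a rough Swift sketch of the overlay approach might look like this; the sepia filter is just a placeholder for whatever modification you need, and the class and property names are illustrative.

```swift
import UIKit
import CoreImage

// Illustrative sketch: ModifiedCameraController and its members are hypothetical names.
class ModifiedCameraController: UIViewController,
    UIImagePickerControllerDelegate, UINavigationControllerDelegate {

    let picker = UIImagePickerController()
    let overlayImageView = UIImageView(frame: UIScreen.main.bounds)

    func presentCamera() {
        picker.sourceType = .camera
        picker.delegate = self
        picker.showsCameraControls = false          // required for takePicture()
        picker.cameraOverlayView = overlayImageView // the modified image is drawn here
        present(picker, animated: true) {
            self.picker.takePicture()               // grab the first frame
        }
    }

    func imagePickerController(_ picker: UIImagePickerController,
                               didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey: Any]) {
        guard let original = info[.originalImage] as? UIImage,
              let ciImage = CIImage(image: original) else { return }

        // Modify the captured frame (placeholder filter).
        let filter = CIFilter(name: "CISepiaTone")!
        filter.setValue(ciImage, forKey: kCIInputImageKey)
        if let outputImage = filter.outputImage,
           let cgImage = CIContext().createCGImage(outputImage, from: outputImage.extent) {
            overlayImageView.image = UIImage(cgImage: cgImage)
        }

        picker.takePicture()   // grab the next frame; this is capture-by-capture, not a true live stream
    }
}
```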
Many augmented reality apps do not actually modify the image but just overlay information based on what they think is on the screen from input from the accelerometer and compass. If this is the kind of AR you are looking to do then try looking at ARKit.
You cannot process the camera's image data in real time. There is no API to do this. File a request with Apple using their bug tracker. Many of us have done this already. More requests might lead to this being possible.
Oh, yes, just use an overlay view then. You were talking about modifying the video stream, which is clearly not needed.
I am creating an app in which, as soon as the UIImagePickerController loads (i.e. the camera view), it should start taking pictures without any click and store images in an array. How can I do this without clicking on the "shoot" button?
In the reference library, UIImagePickerController contains an instance method, -takePicture. Can somebody tell me if this method will do the trick if I call it through a timer?
Thanks in advance.
-takePicture should indeed do the trick. You have to provide a custom UI for the camera controls, because otherwise (for me) it doesn't work. Check out the developer documentation in Xcode and search for takePicture. The method description has everything you need.
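A rough Swift sketch of the timer-driven approach, with illustrative names and a placeholder empty overlay standing in for your custom controls:

```swift
import UIKit

// Illustrative sketch: AutoShootController and its members are hypothetical names.
class AutoShootController: UIViewController,
    UIImagePickerControllerDelegate, UINavigationControllerDelegate {

    let picker = UIImagePickerController()
    var images: [UIImage] = []
    private var timer: Timer?

    func startAutoCapture() {
        picker.sourceType = .camera
        picker.delegate = self
        picker.showsCameraControls = false     // takePicture() requires custom controls
        picker.cameraOverlayView = UIView()    // supply your own UI here
        present(picker, animated: true) {
            // Fire takePicture() every two seconds once the camera is up.
            self.timer = Timer.scheduledTimer(withTimeInterval: 2.0, repeats: true) { _ in
                self.picker.takePicture()
            }
        }
    }

    // Each captured photo is stored in the array.
    func imagePickerController(_ picker: UIImagePickerController,
                               didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey: Any]) {
        if let image = info[.originalImage] as? UIImage {
            images.append(image)
        }
    }

    func stopAutoCapture() {
        timer?.invalidate()
        picker.dismiss(animated: true)
    }
}
```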