How to set iPhone video output image size

I'm trying to do some image processing on iPhone.
I'm using http://developer.apple.com/library/ios/#qa/qa2010/qa1702.html to capture the camera frames.
I saw that I can set AVCaptureVideoDataOutput image format using setVideoSettings, but is it possible to get the images in lower resolution?
If not, is there an efficient way to downscale the resulting image?
Thanks,
Asaf.

This is how we can get a lower resolution output so we get a higher FPS when manipulating the image:
// sessionPreset governs the quality of the capture. we don't need high-resolution images,
// so we'll set the session preset to low quality.
self.captureSession.sessionPreset = AVCaptureSessionPresetLow;
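In case it's useful, here is a minimal sketch of the capture setup this preset plugs into, assuming ARC, a captureSession property, and a class that adopts AVCaptureVideoDataOutputSampleBufferDelegate (the setupCaptureSession method name is just for illustration):
#import <AVFoundation/AVFoundation.h>

- (void)setupCaptureSession {
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    // Low preset trades resolution for throughput, so per-frame processing stays cheap.
    session.sessionPreset = AVCaptureSessionPresetLow;

    AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];
    if (input) {
        [session addInput:input];
    }

    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
    // BGRA frames are convenient for CPU-side image processing.
    output.videoSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
    // Deliver frames on a serial queue so callbacks arrive in order.
    dispatch_queue_t queue = dispatch_queue_create("video.frames", DISPATCH_QUEUE_SERIAL);
    [output setSampleBufferDelegate:self queue:queue];
    [session addOutput:output];

    [session startRunning];
    self.captureSession = session;
}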
Asaf Pinhassi.

Related

Unity Agora screenshare blurry video quality

How can I improve the image quality while sharing the screen using the Agora SDK with Unity? I've used the settings below for the VideoProfile:
mRtcEngine.SetVideoEncoderConfiguration(new VideoEncoderConfiguration()
{
    // Sets the video encoding bitrate (Kbps).
    minBitrate = 100,
    bitrate = 1130,
    // Sets the video frame rate.
    minFrameRate = 10,
    frameRate = FRAME_RATE.FRAME_RATE_FPS_24,
    // Sets the video resolution.
    dimensions = new VideoDimensions() { width = EncodeWidth, height = EncodeHeight },
    // Sets the video encoding degradation preference under limited bandwidth.
    // MAINTAIN_QUALITY means to degrade the frame rate to maintain the video quality.
    degradationPreference = DEGRADATION_PREFERENCE.MAINTAIN_QUALITY,
    // Note: if the remote user's video surface is set to flip horizontally, we should flip before sending.
    mirrorMode = VIDEO_MIRROR_MODE_TYPE.VIDEO_MIRROR_MODE_ENABLED,
    // Sets the orientation mode of the video.
    orientationMode = ORIENTATION_MODE.ORIENTATION_MODE_FIXED_PORTRAIT
});
The output from the Editor to a device looks fine, while the output from a device to the Editor or to another device looks blurry.
I've tested over WiFi on both devices to make sure the connection is good, and I've also forced the settings to prefer image quality over frame rate:
mRtcEngine.SetVideoQualityParameters(false);
mRtcEngine.EnableDualStreamMode(false);
mRtcEngine.SetRemoteDefaultVideoStreamType(REMOTE_VIDEO_STREAM_TYPE.REMOTE_VIDEO_STREAM_HIGH);
Did I miss anything else that could improve the image quality?
Also, how can I share a rectangular part of the screen, with the rect draggable by the user?
Blurry videos may be caused by low bitrates and resolution ratios. Check the following:
Check videoProfile. If possible, set videoProfile to a higher level to see whether the video is clearer.
Check the stream type of the receiver. If the stream type is low, call the setRemoteVideoStreamType method to switch from the low stream to the high stream. (You already did this.)
Switch to another WiFi network to ensure that the blurry video is not caused by poor Internet connections.
Turn off all pre-processing options.
If this issue persists, contact Agora customer support (via ticket system) with the following information:
The uid of the user who sees the blurry video.
The time frame during which the blurry video appears.
SDK logs and screen recording files of the user.
You can check the statistics of every call in Agora Analytics in Dashboard.

Avoid video compression while picking video using UIImagePickerController

I'm using UIImagePickerController to allow my user to select a video.
When the user selects the "Choose" button on the second screen, the view displays a progress bar and a "Compressing Video..." message.
Is there some way I can avoid this compression operation?
Try setting the videoQuality property of UIImagePickerController:
The video quality setting specified by this property is used during video recording. It is also used whenever picking a recorded movie. Specifically, if the video quality setting is lower than the video quality of an existing movie, displaying that movie in the picker results in transcoding the movie to the lower quality. (UIImagePickerController Reference)
Available quality values are:
UIImagePickerControllerQualityTypeHigh
UIImagePickerControllerQualityTypeMedium
UIImagePickerControllerQualityTypeLow
UIImagePickerControllerQualityType640x480
UIImagePickerControllerQualityTypeIFrame1280x720
UIImagePickerControllerQualityTypeIFrame960x540
Please see the UIImagePickerController Reference for details.
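As a rough, untested sketch (the presenting view controller, delegate conformance, and the MobileCoreServices import are assumed), picking from the library with videoQuality set high so an existing movie is not transcoded down:
#import <MobileCoreServices/MobileCoreServices.h> // for kUTTypeMovie

UIImagePickerController *picker = [[UIImagePickerController alloc] init];
picker.sourceType = UIImagePickerControllerSourceTypePhotoLibrary;
picker.mediaTypes = @[ (NSString *)kUTTypeMovie ]; // only offer videos
// A movie at or below this quality should display without the "Compressing Video..." pass.
picker.videoQuality = UIImagePickerControllerQualityTypeHigh;
picker.delegate = self;
[self presentViewController:picker animated:YES completion:nil];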

How to play a video slowly for marking

I am creating an application for coaching, and I am stuck on marking up the video. I chose ffmpeg to convert the video into image frames, but that introduces delays as well as memory issues. I need to let the user play the video slowly, frame by frame. Is there any other way to do that without converting to images? V1 Golf does this very quickly. Please help me.
I would try converting video frames in a separate thread, extracting a few frames ahead as images in the background once the user enters 'slow motion' mode.
Here is an example for one frame, so you should be quick with the others: Video frame capture by NSOperation.
This should reduce delays, and frames can be converted while the user is still viewing the preceding ones.
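A sketch of that idea using AVAssetImageGenerator (videoURL and currentTime are placeholders, and the tolerance properties require iOS 5 or later):
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:videoURL options:nil];
AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
// Ask for exact frames rather than the nearest keyframe.
generator.requestedTimeToleranceBefore = kCMTimeZero;
generator.requestedTimeToleranceAfter = kCMTimeZero;

// Queue up the next five frames (here assuming 30 fps) ahead of the playhead.
NSMutableArray *times = [NSMutableArray array];
for (int i = 1; i <= 5; i++) {
    CMTime t = CMTimeAdd(currentTime, CMTimeMake(i, 30));
    [times addObject:[NSValue valueWithCMTime:t]];
}
[generator generateCGImagesAsynchronouslyForTimes:times
                                completionHandler:^(CMTime requestedTime, CGImageRef image,
                                                    CMTime actualTime,
                                                    AVAssetImageGeneratorResult result,
                                                    NSError *error) {
    if (result == AVAssetImageGeneratorSucceeded) {
        // Cache the frame so it is ready when the user steps to this time.
        UIImage *frame = [UIImage imageWithCGImage:image];
        // store `frame` in your prefetch cache here
    }
}];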

iPhone UIImagePickerController video mode

When using the UIImagePickerController, is there a way to specify the resolution of the captured video, or to restrict the length of the video (based on size)? I would love to be able to capture video at a standard lower-res mode rather than 720p.
For iOS 3.1 or later, you can use the videoQuality property of UIImagePickerController. The values are in the enumeration UIImagePickerControllerQualityType, which defines UIImagePickerControllerQualityType{Low|Medium|High}. The default is UIImagePickerControllerQualityTypeMedium. iOS 4.0 and later add further resolution options.
As for duration, the property videoMaximumDuration allows you to specify the maximum time in seconds.
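A minimal sketch combining the two properties (delegate wiring and the kUTTypeMovie import from MobileCoreServices are assumed):
UIImagePickerController *picker = [[UIImagePickerController alloc] init];
picker.sourceType = UIImagePickerControllerSourceTypeCamera;
picker.mediaTypes = @[ (NSString *)kUTTypeMovie ];
// Record at low quality instead of the medium default (and well below 720p).
picker.videoQuality = UIImagePickerControllerQualityTypeLow;
// Stop recording automatically after 30 seconds.
picker.videoMaximumDuration = 30.0;
picker.delegate = self;
[self presentViewController:picker animated:YES completion:nil];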

How many FPS can iPhone's UIGetScreenImage() actually do?

Now that Apple is officially allowing UIGetScreenImage() to be used in iPhone apps, I've seen a number of blogs saying that this "opens the floodgates" for video capture on iPhones, including older models. But I've also seen blogs that say the fastest frame rate they can get with UIGetScreenImage() is like 6 FPS.
Can anyone share specific frame-rate results you've gotten with UIGetScreenImage() (or other approved APIs)? Does restricting the area of the screen captured improve frame rate significantly?
Also, for the wishful thinking segment of today's program, does anyone have pointers to code/library that uses UIGetScreenImage() to capture video? For instance, I'd like an API something like Capture( int fps, Rect bounds, int durationMs ) that would turn on the camera and for the given duration record a sequence of .png files at the given frame rate, copying from the given screen rect.
There is no specific frame rate. UIGetScreenImage() is not a movie recorder; it just tries to return as soon as it can, which is unfortunately still very slow.
Restricting the area of the screen captured doesn't help: UIGetScreenImage() takes no input parameters, and cropping the output image could make the frame rate even worse because of the extra work.
UIGetScreenImage() returns an image of the current screen display. It's said to be slow, but whether it's fast enough depends on the use case. The video recording app iCamcorder uses this function.
According to their blog,
iCamcorder records at a remarkable average minimum of 10 frames per second and a maximum of 15 frames per second. The UIGetScreenImage method Apple recently allowed developers to use captures the current screen contents. Unfortunately it is really slow; about 15% of the App's processing time just goes into calling this method. http://www.drahtwerk.biz/EN/Blog.aspx/iCamcorder-v19-and-Giveaway/?newsID=27
So the raw performance of UIGetScreenImage() should be much higher than 15 fps.
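If you want a number for your own device, here is a quick sketch for timing the raw call (measureScreenCaptureRate is a made-up helper name; UIGetScreenImage() still needs the extern declaration):
extern CGImageRef UIGetScreenImage(void);

- (void)measureScreenCaptureRate {
    const int kFrameCount = 30;
    NSTimeInterval start = [NSDate timeIntervalSinceReferenceDate];
    for (int i = 0; i < kFrameCount; i++) {
        CGImageRef frame = UIGetScreenImage();
        CGImageRelease(frame); // release immediately; only the timing matters here
    }
    NSTimeInterval elapsed = [NSDate timeIntervalSinceReferenceDate] - start;
    NSLog(@"UIGetScreenImage: %.1f captures per second", kFrameCount / elapsed);
}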
To crop the returned image, you can try
extern CGImageRef UIGetScreenImage(void);
...
// Grab the full screen, then crop to the region of interest.
CGImageRef cgoriginal = UIGetScreenImage();
CGImageRef cgimg = CGImageCreateWithImageInRect(cgoriginal, rect);
// UIImage retains the cropped CGImage, so both refs can be released here.
UIImage *viewImage = [UIImage imageWithCGImage:cgimg];
CGImageRelease(cgoriginal);
CGImageRelease(cgimg);