I am recording video using MediaRecorder. When using the back camera it works fine, but when using the front camera the captured video is flipped/mirrored: an item on the right appears on the left. The camera preview works fine; only the final captured video is flipped.
Here is what the camera preview looks like:
But the final video appears like this (items on the left-hand side appear on the right-hand side):
What I have tried so far:
I tried to apply a matrix when preparing the recorder, but it does not seem to change anything.
private boolean prepareRecorder(int cameraId) {
    // Create a new instance of MediaRecorder
    mRecorder = new MediaRecorder();
    setCameraDisplayOrientation(this, cameraId, mCamera);
    int angle = getVideoOrientationAngle(this, cameraId);
    mRecorder.setOrientationHint(angle);
    if (cameraId == Camera.CameraInfo.CAMERA_FACING_FRONT) {
        // This Matrix is created but never applied to anything, so it
        // cannot affect the recording. Note also that preScale(1, -1)
        // flips vertically; a horizontal mirror would be preScale(-1, 1).
        Matrix matrix = new Matrix();
        matrix.preScale(1.0f, -1.0f);
    }
    // all other code to prepare the recorder here
}
I have already read all the questions below, but none of them solved my problem. For information, I am using SurfaceView for the camera preview, so this question here doesn't help.
1) Android flip front camera mirror flipped video
2) How to keep android from inverting the image from the front facing camera?
3) Prevent flipping of the front facing camera
So my questions are:
1) How can I capture a video with the front camera that is not mirrored (exactly the same as the camera preview)?
2) How can I achieve this when the camera preview uses SurfaceView rather than TextureView? (All the questions I mention above talk about TextureView.)
All possible solutions are most welcome. Thanks.
EDIT
I made two short video clips to clarify the problem; please download and take a look:
1) The camera preview during recording
2) The final recorded video
So, if the system camera app produces video similar to your app's, you didn't do anything wrong. Now it's time to understand what happens with front-facing camera video recording.
The front-facing camera is no different from the rear-facing camera in the way it captures still pictures or video. The difference is in how the phone displays the camera preview on the screen. To make it look more natural to the user, Android (and all other systems) mirrors the preview, so that you see yourself as if in a mirror.
It is important to understand that this only applies to the way the preview is presented to you. If you open any video-conferencing app, connect two devices that you hold one in each hand, and look at yourself, you will see to your surprise that the two instances of yourself are flipped.
This is not a bug; it is the natural way to present the video to the other party.
See the sketch:
This is how you see the scene:
This is how your peer sees the same scene:
Normally, recording of a video is done from the point of view of your peer, as in the second picture. This is the natural setup for, e.g., video conferencing.
But Snapchat and some other social apps choose to store the front-facing video clip as if you had recorded it from the mirror (as if the recorder is in your hand in the first picture). Some people like this feature, others hate it (see https://forums.androidcentral.com/general-help-how/664539-front-camera-pics-mirrored-reversed-only-snapchat.html and https://www.reddit.com/r/nexus6/comments/3846ay/has_anyone_found_a_fix_for_snapchat_flipping).
You cannot use MediaRecorder for that. You can use the lower-level MediaCodec API to record processed frames. You need to flip each frame 'manually', and this may be a significant performance hit, because normally MediaRecorder 'connects' the camera to the hardware encoder in a very efficient way, without even needing to copy the pixels to user memory. This answer shows how you can manipulate the way the camera is rendered to a texture.
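For illustration, here is a minimal sketch of the 'manual' flip, assuming you already render the camera into a SurfaceTexture and draw it onto the encoder's input surface with OpenGL ES (the class and method names here are mine, purely hypothetical): the only change needed is to mirror the texture transform horizontally before handing it to the shader that samples the camera texture.

import android.graphics.SurfaceTexture;
import android.opengl.Matrix;

// Hypothetical helper: produces a horizontally mirrored texture transform
// for the camera frame currently held by the SurfaceTexture.
public final class MirrorFix {
    private final float[] texMatrix = new float[16];

    // Returns the transform to feed to the OES-sampling shader.
    public float[] flippedTransform(SurfaceTexture surfaceTexture) {
        surfaceTexture.getTransformMatrix(texMatrix);
        // Mirror about the texture's vertical center line (x = 0.5):
        // scale x by -1, then translate by +1 so coordinates stay in [0, 1].
        Matrix.translateM(texMatrix, 0, 1f, 0f, 0f);
        Matrix.scaleM(texMatrix, 0, -1f, 1f, 1f);
        return texMatrix;
    }
}

Every frame drawn with this transform and fed to the MediaCodec input surface ends up un-mirrored in the recording, while the on-screen preview can keep using the original transform.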
You can achieve this by recording the video manually from the surface view.
In that case the preview and the recording will match exactly.
I've been using this library for this purpose:
https://github.com/spaceLenny/recordablesurfaceview
Here is a guide on how to use it (not with the camera, but with OpenGL drawing): https://withintent.uncorkedstudios.com/recording-screen-video-on-android-with-recordablesurfaceview-451c9daa213e
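A rough usage sketch follows, assuming the API as described in the library's README at the time of writing (initRecorder, startRecording, stopRecording; check these names against the version you install, and note you still have to supply the GL rendering of the camera frames yourself):

import java.io.File;
import java.io.IOException;
import android.app.Activity;
import android.os.Bundle;
import com.uncorkedstudios.android.view.recordablesurfaceview.RecordableSurfaceView;

// Sketch only: wire up a RecordableSurfaceView and toggle recording.
// Package and method names are taken from the library's README and
// may differ in newer versions.
public class RecordingActivity extends Activity {
    private RecordableSurfaceView mSurfaceView;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        mSurfaceView = new RecordableSurfaceView(this);
        setContentView(mSurfaceView);
    }

    @Override
    protected void onResume() {
        super.onResume();
        mSurfaceView.resume();
        try {
            File out = new File(getFilesDir(), "capture.mp4");
            // Width/height of the recorded video surface (assumed values).
            mSurfaceView.initRecorder(out, 720, 1280, null, null);
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    void startCapture() { mSurfaceView.startRecording(); }
    void stopCapture()  { mSurfaceView.stopRecording();  }
}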
Related
I am trying to create an app and/or web-based online school for which I am filming instruction from four different angles. I can't find anything I could code to let the user select a different camera view mid-video and keep their progress.
Ideally there will be four cameras filming the exact same technique, and the user should be able to jump to different views throughout the video without having to start again.
I have searched online for three days now but cannot find anything close to what I want/need.
I just want the video to play and the user to be able to switch to different camera views.
You can set the property:
VideoPlayer.frame
When the user "changes camera angle", record the frame of the currently playing video, stop it, start playing the new video from the different angle, and set its frame property to match the last video.
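To make the switching logic concrete, here is a minimal sketch in Java using ExoPlayer (my substitution for an Android app; the answer above refers to a player with a frame property, and the class name and URLs here are hypothetical): save the current position, swap the source, seek, and resume.

import android.content.Context;
import com.google.android.exoplayer2.ExoPlayer;
import com.google.android.exoplayer2.MediaItem;

// Hypothetical helper: switches between time-aligned videos of the same
// lesson filmed from different angles, preserving the playback position.
public final class AngleSwitcher {
    private final ExoPlayer player;

    public AngleSwitcher(Context context) {
        // ExoPlayer 2.16+ style builder
        player = new ExoPlayer.Builder(context).build();
    }

    public void play(String angleUrl) {
        player.setMediaItem(MediaItem.fromUri(angleUrl));
        player.prepare();
        player.play();
    }

    // Jump to the same moment in the clip filmed from another angle.
    // Assumes all four clips are time-aligned (start at the same instant).
    public void switchAngle(String otherAngleUrl) {
        long position = player.getCurrentPosition(); // remember where we are
        player.setMediaItem(MediaItem.fromUri(otherAngleUrl));
        player.prepare();
        player.seekTo(position); // resume at the same moment
        player.play();
    }
}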
I am using AVFoundation to display video in my UIView via an AVCaptureVideoPreviewLayer.
I then use AVCaptureStillImageOutput's -captureStillImageAsynchronouslyFromConnection: to capture a still image from the video with the AVCaptureSessionPresetPhoto preset.
I freeze the video using AVCaptureSession's -stopRunning in the completion block mentioned earlier. However, it is too late: the video keeps running while the still image is taken, so the freeze happens a second or two later, and when I display the image there is a jump.
How can I freeze the video at the exact moment the photo is taken?
Almost a year later... Your approach is all wrong. Instead of trying to pause the video at the precise moment the image is captured, why not pause the video and then capture the paused frame? To the user it makes no difference, and as the developer you don't have to worry about exact precision.
To reiterate: if you pause the video, flash a white visual, and play a click, the user will think you have just captured that frame, regardless of whether you actually did. In fact, you could consider pausing the video to be the same as capturing an image without saving it.
So I have an application that can currently capture video with the front-facing iPhone camera and then do some processing on the video feed in real time. What I'm trying to do, however, is make this process run in the background and put other controls onscreen. So, for example, say I'd like to run the camera and process the image feed, but I want the user to see a black screen with some buttons on it. Any ideas on how to do this?
Just so we get terminology right, by "in the background", you mean running the camera capture while your application is in the foreground, but not displaying the actual video feed. This is possible, but I wanted to make clear that if you move your whole application into the background you will not have access to the camera then.
There are a few ways to do this, but the one that I've spent the most time with is grabbing frames of video (or photos) via AV Foundation. Using an AVCaptureDevice and AVCaptureSession, you can grab the frames of video and route them to an encoder for saving to disk or for processing using your own custom code. None of this requires the camera feed to be displayed onscreen, so you can put up whatever interface you like and do this video recording or photo capture without any onscreen indication.
I would caution that you should make it explicit to your users what you are doing, so that you do not run the risk of violating someone's privacy. Apple does not react kindly to those who do this (for good reason).
I encapsulate a lot of this within my open source GPUImage video and photo processing framework, so you could look at the code for the GPUImageVideoCamera class there to see how I configure the capture inputs. I hand the video frames off to OpenGL ES for the application of filters and other processing operations, but you could ignore that portion of it if you just wanted to do your own encoding or processing.
Here's example code from Apple's docs:
http://developer.apple.com/library/ios/#samplecode/PhotoPicker/Introduction/Intro.html
It also shows how to customize the camera interface.
I have been facing this for many days now, but I have not reached any conclusion.
My problem is: I want to play an MP3 file, but not simply by clicking a play button.
This is how I want to play it:
There is a slider that I can drag with my finger. I want the MP3 to play at the speed with which I am dragging the slider, so that dragging it fast gives a fast-forwarding effect (a funny type of voice) and dragging it slowly makes the output slow.
The problem is that the sound output is not smooth; the voice is very distorted and disturbed.
I want the output to be smoother.
Please help; any suggestions are welcome. Presently I am using AVAudioPlayer and passing a time value based on the slider input to play the file. (It does not seem to be feasible, though.)
I feel it is only possible using OpenAL, because with OpenAL we can modify the frequency (pitch) of the sound file.
Can someone please refer me to a link to an OpenAL implementation for iPhone? I have never played a sound file using OpenAL.
Help!!
You won't be able to do it with AVAudioPlayer, as it does not support pitch operations.
You can load and decode the entire track into memory for playback with OpenAL (which supports pitch), or you can do real-time loading/decoding and pitch shifting using Audio Units (MUCH lower level, and more complicated, though).
I know how to record a video from the camera on the iPhone. My question: is it possible to take the recording, overlay it on a saved video, and save the result out as another file?
No, I don't think so, at least not using a standard framework. You could probably do something involving screen capture and combining a load of images to make a video, but it would be complicated.