I've searched everywhere and haven't managed to find an answer to this question, so I thought I'd ask it here.
I'm currently using AVCaptureVideoDataOutput and its captureOutput delegate callback to get frames from the camera in real time at 30 fps. I have left the default settings for auto-exposure, auto-focus, etc.
I want to be able to query the camera per frame (WITHOUT resorting to the still capture option) and ask what the camera's current exposure, aperture and focal length are.
Anyone have any ideas?
Thanks
Take a look at the results of:
CFDictionaryRef metadataDictionary = (CFDictionaryRef)CMGetAttachment(sampleBuffer, CFSTR("MetadataDictionary"), NULL);
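A rough sketch of how the per-frame values could be pulled out of that attachment inside the AVCaptureVideoDataOutput delegate callback. The {Exif} sub-dictionary and the ExposureTime / FNumber / FocalLength keys are assumptions about what the attachment typically carries, so log the full dictionary to confirm what your device actually provides:
#import <AVFoundation/AVFoundation.h>

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CFDictionaryRef metadataDictionary =
        (CFDictionaryRef)CMGetAttachment(sampleBuffer, CFSTR("MetadataDictionary"), NULL);
    if (metadataDictionary == NULL) {
        return;
    }

    // Drop __bridge if you are not compiling with ARC.
    NSDictionary *metadata = (__bridge NSDictionary *)metadataDictionary;
    NSDictionary *exif = [metadata objectForKey:@"{Exif}"];       // key name is an assumption

    NSNumber *exposureTime = [exif objectForKey:@"ExposureTime"]; // shutter speed in seconds
    NSNumber *fNumber      = [exif objectForKey:@"FNumber"];      // aperture
    NSNumber *focalLength  = [exif objectForKey:@"FocalLength"];  // millimetres

    NSLog(@"exposure: %@  aperture: %@  focal length: %@", exposureTime, fNumber, focalLength);
}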
I am trying to create an app and/or web-based online school where I am filming instruction from four different angles. I can't find anything that I could code to allow the user to select a different camera view and keep progressing through the video.
Ideally there will be four cameras filming the exact same technique, but the user should be able to jump to different views throughout the video without having to start again.
I have searched online for three days now but cannot find anything close to what I want/need.
I just want the video to play and the user to be able to switch to different camera views.
You can set the property:
VideoPlayer.frame
When the user "changes camera angle", record the frame (the current playback position) of the currently playing video, stop it, start playing the new video from the different angle, and set its frame property to match the last video.
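A rough sketch of that idea, assuming an AVPlayer-based player on iOS (the question doesn't name a player, so AVPlayer and the switchToAngleURL:withPlayer: method below are assumptions). The current playback time is captured, the other angle's item is swapped in, and the player seeks back to the same moment before resuming:
#import <AVFoundation/AVFoundation.h>

- (void)switchToAngleURL:(NSURL *)angleURL withPlayer:(AVPlayer *)player
{
    CMTime resumeTime = player.currentTime;             // how far into the old angle we are

    AVPlayerItem *newItem = [AVPlayerItem playerItemWithURL:angleURL];
    [player replaceCurrentItemWithPlayerItem:newItem];  // load the new camera angle

    [player seekToTime:resumeTime
       toleranceBefore:kCMTimeZero
        toleranceAfter:kCMTimeZero
     completionHandler:^(BOOL finished) {
         if (finished) {
             [player play];                              // continue from the same moment
         }
     }];
}
This assumes all four clips start at the same instant; if they don't, you would need to add a per-angle time offset before seeking.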
I'm recording video using MediaRecorder. When using the back camera it works fine, but when using the front camera the captured video is flipped/mirrored, meaning an item on the right appears on the left. The camera preview looks fine; only the final captured video is flipped.
Here is what the camera preview looks like
But the final video appears like this (everything on the left-hand side appears on the right-hand side)
What I tried so far:
I tried to apply a matrix when preparing the recorder, but it doesn't seem to change anything.
private boolean prepareRecorder(int cameraId) {
    // Create a new instance of MediaRecorder
    mRecorder = new MediaRecorder();
    setCameraDisplayOrientation(this, cameraId, mCamera);
    int angle = getVideoOrientationAngle(this, cameraId);
    mRecorder.setOrientationHint(angle);
    if (cameraId == Camera.CameraInfo.CAMERA_FACING_FRONT) {
        // Note: this Matrix is created and scaled but never handed to the recorder
        // or applied to the frames, so it has no effect on the recorded video.
        Matrix matrix = new Matrix();
        matrix.preScale(1.0f, -1.0f);
    }
    // all other code to prepare the recorder here
}
I have already read all of the questions below, but none of them seem to solve my problem. For information, I'm using SurfaceView for the camera preview, so this question here doesn't help.
1) Android flip front camera mirror flipped video
2) How to keep android from inverting the image from the front facing camera?
3) Prevent flipping of the front facing camera
So my questions are:
1) How can I capture a video with the front camera so that the video is not mirrored (i.e. exactly the same as the camera preview)?
2) How can I achieve this when the camera preview uses SurfaceView rather than TextureView? (All the questions I mentioned above are about using TextureView.)
All possible solutions are most welcome. Thanks.
EDIT
I made 2 short video clips to clarify the problem; please download and take a look:
1) The video of the camera preview during recording
2) The video of the final recorded product
So, if the system camera app produces video similar to your app's, you didn't do anything wrong. Now it's time to understand what happens with front-facing camera video recording.
The front-facing camera is no different from the rear-facing camera in the way it captures still pictures or video. The difference is in how the phone displays the camera preview on the screen. To make it look more natural to the user, Android (and all other systems) mirrors the preview, so that you see yourself as if in a mirror.
It is important to understand that this only applies to the way the preview is presented to you. If you pick up any video conferencing app, connect two devices that you hold in two hands, and look at yourself, you will see to your surprise that the two instances of yourself are flipped.
This is not a bug, this is the natural way to present the video to the other party.
See the sketch:
This is how you see the scene:
This is how your peer sees the same scene
Normally, recording of a video is done from the point of view of your peer, as in the second picture. This is the natural setup for, e.g., video conferencing.
But Snapchat and some other social apps choose to store the front-facing video clip as if you record it from the mirror (as if the recorder is in your hand on the first picture). Some people like this feature, others hate it (see https://forums.androidcentral.com/general-help-how/664539-front-camera-pics-mirrored-reversed-only-snapchat.html and https://www.reddit.com/r/nexus6/comments/3846ay/has_anyone_found_a_fix_for_snapchat_flipping)
You cannot use MediaRecorder for that. You can use the lower-level MediaCodec API to record processed frames. You need to flip each frame 'manually', and this may be a significant performance hit, because normally MediaRecorder 'connects' the camera to the hardware encoder in a very efficient way, without even needing to copy the pixels to user memory. This answer shows how you can manipulate the way the camera is rendered to a texture.
You can achieve this by recording the video manually from the surface view.
In that case the preview and the recording will match exactly.
I've been using this library for this purpose:
https://github.com/spaceLenny/recordablesurfaceview
Here is a guide on how to use it (not with the camera, but with OpenGL drawing): https://withintent.uncorkedstudios.com/recording-screen-video-on-android-with-recordablesurfaceview-451c9daa213e
I am trying to see if it is possible to record a video from the iPhone's camera and write it to a file, and then have the video start playing on the screen a set time later. This all needs to happen continuously. For example, I want the video on the screen to always be 20 seconds behind what the camera is recording.
Some background:
I have a friend who is a coach and would like for his players to be able to see their last play. This could be accomplished by a feed going to a TV from an iPad always 20 seconds behind what is recorded. This needs to continually run until practice is over. (I would connect the iPad to the TV either with a cable or AirPlay to an Apple TV). The video would never need to be saved and should just be discarded after playing.
Is this even possible with the APIs AVFoundation offers? Will the iPhone let you write to a file and read from a file at the same time to accomplish this? Any other better way to accomplish this?
Thanks for your time.
Instead of writing to a file, how about saving your frames in a circular buffer big enough to hold X seconds of video?
The way I would start to do this would be to look at what's provided in AVCaptureVideoDataOutput and its delegate methods (where you can get the frame data from).
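A minimal sketch of that idea, assuming a fixed 30 fps, a plain NSMutableArray used as the ring buffer (self.frameRing), a CIContext created once up front (self.ciContext), and a hypothetical delayedImageView for display. Each frame is copied out of the sample buffer so the capture pipeline can recycle its buffers; note that holding 20 seconds of full-resolution frames in memory is expensive, so in practice you would downscale or encode them:
#import <AVFoundation/AVFoundation.h>
#import <CoreImage/CoreImage.h>
#import <UIKit/UIKit.h>

static const NSTimeInterval kDelaySeconds    = 20.0;
static const NSUInteger     kFramesPerSecond = 30;

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Copy the pixels out of the sample buffer so the capture pool isn't starved.
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CIImage *ciImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];
    CGImageRef cgImage = [self.ciContext createCGImage:ciImage fromRect:ciImage.extent];
    UIImage *frame = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);

    [self.frameRing addObject:frame];

    NSUInteger capacity = (NSUInteger)(kDelaySeconds * kFramesPerSecond);
    if (self.frameRing.count > capacity) {
        // The oldest frame is now roughly 20 seconds old: show it, then drop it.
        UIImage *delayed = [self.frameRing objectAtIndex:0];
        [self.frameRing removeObjectAtIndex:0];
        dispatch_async(dispatch_get_main_queue(), ^{
            self.delayedImageView.image = delayed;   // hypothetical UIImageView
        });
    }
}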
Can somebody tell me how to scrub the AQPlayer (used in Apple's SpeakHere example) using a UISlider, like the iPod does?
I know how to handle the slider part, but once I have my value from the slider, what do I need to set/change/update in AQPlayer, or the AudioQueue, so that the player moves to that part of the Queue and continues playing from that point?
Is there an easy way to do this with a percentage of the playing time, or do I have to make some calculations with the packets?
Thanks for any input.
Al
For anyone who also needs to seek/scrub in an audio file, I found a solution to my question at the following link: Audio Queues
Have a look at the function
-(void)seek:(UInt64)packetOffset;
It worked perfectly after some initial fine tuning.
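A rough sketch of how a UISlider could drive that seek call, assuming the modified AQPlayer from the linked answer is exposed as an Objective-C object with an audioFileID accessor (that accessor name is made up) and the slider runs from 0.0 to 1.0. The packet-count query itself is the standard AudioFile API:
#import <AudioToolbox/AudioToolbox.h>
#import <UIKit/UIKit.h>

- (IBAction)sliderMoved:(UISlider *)slider
{
    // Ask the audio file how many packets it contains in total.
    UInt64 totalPackets = 0;
    UInt32 propertySize = sizeof(totalPackets);
    AudioFileGetProperty(self.player.audioFileID,              // assumed accessor on the player
                         kAudioFilePropertyAudioDataPacketCount,
                         &propertySize,
                         &totalPackets);

    // Map the slider's 0..1 position onto the packet range and seek there.
    UInt64 packetOffset = (UInt64)(slider.value * totalPackets);
    [self.player seek:packetOffset];
}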
Is there a way to have my iPhone program step frame by frame through a movie recorded by the iPhone? What I want to do is have the user record a QuickTime movie, then be able to step through the movie frame by frame.
Alternatively, I suppose if there were a way to extract every single frame from the movie to a JPEG, then I could easily step through the pictures. Anyone know of a way to do this?
I suppose the third option (which might not get past Apple's store) is to capture the movie the way the old jailbroken apps did, which is to somehow capture the pictures directly from the camera view?
Any help appreciated. Thanks in advance!!!!
You cannot step through a movie frame by frame. That functionality does not exist in the public API.
You can include your own media decoder code (open source or not) and use that to parse your movies of course. It is perfectly fine to do that.