Android: how to set captured video to .mp4 instead of .3gp

I'm using the default camera to capture video:
Intent intent = new Intent(MediaStore.ACTION_VIDEO_CAPTURE);
intent.putExtra(MediaStore.EXTRA_VIDEO_QUALITY, 1);
intent.putExtra(MediaStore.EXTRA_DURATION_LIMIT, MAX_DURATION);
intent.putExtra(MediaStore.EXTRA_SIZE_LIMIT, MAX_SIZE_BYTES);
startActivityForResult(intent, CAPTURE_VIDEO_ACTIVITY_REQUEST_CODE);
How can I set the file container to .mp4 instead of .3gp?

You can change the MediaStore.EXTRA_VIDEO_QUALITY value to 0, like this:
intent.putExtra(MediaStore.EXTRA_VIDEO_QUALITY, 0);

Related

How to get duration of an audio file given only its url

I have an mp3 audio file located under this link:
https://www.soundhelix.com/examples/mp3/SoundHelix-Song-1.mp3
How can I get its duration in my Flutter app without downloading the whole file? I just need its duration, and I want the quickest possible way to get it.
You could go with the just_audio package.
import 'package:just_audio/just_audio.dart';
final player = AudioPlayer(); // Create a player
final duration = await player.setUrl( // Load a URL; setUrl returns its duration
'https://example.com/bar.mp3'); // Schemes: (https: | file: | asset: )

AudioClip.Length is incorrect when loading from UnityWebRequestMultimedia GetAudioClip

When I download a Unity audio clip from Firebase by URL:
var request = UnityWebRequestMultimedia.GetAudioClip(url, AudioType.MPEG);
yield return request.SendWebRequest();
var clip = DownloadHandlerAudioClip.GetContent(request);
print("success..." + clip.length);
The audio clip's real length is 4.86, but after the download clip.length is 8.1.
The uploaded audio clip was recorded with the Unity Microphone and converted with "SavWav" to MP3 data.
The wrong length of the downloaded AudioClip is a Unity bug that has not been fixed.
I used SavWav.TrimSilence(clip, 0, (clip_) => {...}) to get the correct length.
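A minimal sketch of that workaround, assuming the callback-style SavWav.TrimSilence(AudioClip, float, Action<AudioClip>) overload used above (SavWav is a third-party script, so your copy may instead return the trimmed clip directly):
using System.Collections;
using UnityEngine;
using UnityEngine.Networking;

public class ClipLengthWorkaround : MonoBehaviour
{
    IEnumerator LoadClip(string url)
    {
        using (var request = UnityWebRequestMultimedia.GetAudioClip(url, AudioType.MPEG))
        {
            yield return request.SendWebRequest();

            var clip = DownloadHandlerAudioClip.GetContent(request);
            Debug.Log("reported length: " + clip.length); // may be padded (the Unity bug)

            // Trim the trailing silence to recover the real duration.
            SavWav.TrimSilence(clip, 0, trimmed =>
            {
                Debug.Log("real length: " + trimmed.length);
            });
        }
    }
}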
Hope that helps

White frame before video plays in Unity instead of a custom thumbnail

I load a thumbnail before the video starts to play, but when the video starts there is first a white frame and only then the video plays. How can I avoid this white frame?
Here is my code:
video.GetComponent<RawImage>().texture = thumbnailTex;
// Play the video:
RenderTexture rt = new RenderTexture(1920, 1080, 16, RenderTextureFormat.ARGB32);
rt.Create();
video.GetComponent<RawImage>().texture = rt;
video.GetComponent<VideoPlayer>().targetTexture = rt;
video.GetComponent<VideoPlayer>().url = "www....";
video.GetComponent<VideoPlayer>().Play();
// white frame, and then the video is playing
You first need to wait and test whether the video is ready to play.
If it isn't already, it would be better to have the code above in a coroutine. What is happening is that you call Play() before the player has had a chance to download/load the first frame. Only display your render texture after that.
video.GetComponent<VideoPlayer>().url = "...";
video.GetComponent<VideoPlayer>().Prepare();
while (!video.GetComponent<VideoPlayer>().isPrepared)
    yield return new WaitForEndOfFrame();
video.GetComponent<VideoPlayer>().frame = 0; // just in case it's not at the first frame
video.GetComponent<VideoPlayer>().Play();
// now display your render texture
Thank you very much for this solution to avoid frame jumps while the video plays!
But there are some changes to make:
In the "Start" function, I call a coroutine :
StartCoroutine(PrepareVideoCoroutine());
In this coroutine, I put :
while (!gameOverGlassVideo.GetComponent<VideoPlayer>().isPrepared) {
    gameOverGlassVideo.GetComponent<VideoPlayer>().Prepare();
    yield return new WaitForEndOfFrame();
}
// ::: I put the line that prepares the video inside the loop, NOT outside ::: //
When I want to play the video later, it is already prepared and there are no jumping frames!
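For reference, here is a minimal self-contained sketch that combines the steps above. It assumes the VideoPlayer and the RawImage live on the same GameObject; the field names (videoUrl, thumbnailTex) are illustrative:
using System.Collections;
using UnityEngine;
using UnityEngine.UI;
using UnityEngine.Video;

public class PreparedVideo : MonoBehaviour
{
    public string videoUrl;       // illustrative: set in the Inspector
    public Texture thumbnailTex;  // illustrative: custom thumbnail

    IEnumerator Start()
    {
        var rawImage = GetComponent<RawImage>();
        var player = GetComponent<VideoPlayer>();

        // Show the thumbnail while the video is being prepared.
        rawImage.texture = thumbnailTex;

        var rt = new RenderTexture(1920, 1080, 16, RenderTextureFormat.ARGB32);
        rt.Create();
        player.renderMode = VideoRenderMode.RenderTexture;
        player.targetTexture = rt;
        player.url = videoUrl;

        player.Prepare();
        while (!player.isPrepared)
            yield return new WaitForEndOfFrame();

        player.frame = 0;
        player.Play();

        // Swap in the render texture only after the first frame is ready,
        // so no white frame is shown.
        rawImage.texture = rt;
    }
}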

Unity 5.1 Distorted image after download from web

When I load my PNGs after compressing them with TinyPNG, they get distorted (all purple and transparent):
http://s22.postimg.org/b39g0bhn5/Screen_Shot_2015_06_28_at_10_39_50_AM.png
The background, for example, should be blue:
http://postimg.org/image/fez234o6d/
This only happens when I use pictures that were compressed by tinypng.com,
and only after I updated to Unity 5.1.
I'm downloading the image with the WWW class and loading the texture using Texture2D.
Is this problem known to anyone?
I had exactly the same issue. I was able to solve it using the following code:
mat.mainTexture = new Texture2D(32, 32, TextureFormat.DXT5, false);
Texture2D newTexture = new Texture2D(32, 32, TextureFormat.DXT5, false);
WWW stringWWW = new WWW(texture1URL);
yield return stringWWW;
if (stringWWW.error == null)
{
    stringWWW.LoadImageIntoTexture(newTexture);
    mat.mainTexture = newTexture;
}
The key seemed to be using DXT5 as the texture format, together with the LoadImageIntoTexture(...) method.
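Wrapped into a complete coroutine, the same idea looks roughly like this (a sketch only; mat and texture1URL are assumed to be assigned elsewhere, and WWW is the download API that existed in Unity 5.1):
using System.Collections;
using UnityEngine;

public class CompressedTextureLoader : MonoBehaviour
{
    public Material mat;        // material that should receive the downloaded texture
    public string texture1URL;  // URL of the TinyPNG-compressed image

    IEnumerator Start()
    {
        // Placeholder until the download finishes; DXT5 keeps the alpha channel.
        Texture2D newTexture = new Texture2D(32, 32, TextureFormat.DXT5, false);
        mat.mainTexture = newTexture;

        WWW stringWWW = new WWW(texture1URL);
        yield return stringWWW;

        if (stringWWW.error == null)
        {
            // LoadImageIntoTexture decodes the PNG and, because the target
            // texture is DXT5, compresses it into that format.
            stringWWW.LoadImageIntoTexture(newTexture);
            mat.mainTexture = newTexture;
        }
    }
}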

How to use multiple USB webcams in Matlab simultaneously?

I would like to capture live video with two USB webcams (Philips SPC 900NC), but I found that they cannot work simultaneously on my laptop. Either of the two USB webcams works on its own, or together with the webcam built into my laptop.
When I use the Simulink block 'From Video Device', MATLAB gives the error message 'Multiple VIDEOINPUT objects cannot access the same device simultaneously.' When I check the video input devices with the command 'imaqhwinfo', only one of the Philips USB webcams is detected.
I would like to know:
What is the reason for this? Is it a hardware limitation (USB bus bandwidth), or do MATLAB video input objects simply not support multiple identical video devices?
What is the solution? Could anyone give me some suggestions?
You may be interested in this link:
http://opencv.willowgarage.com/wiki/faq#How_to_use_2_cameras_.28multiple_cameras.29_with_cvCam_library
It contains the following:
First, get the number of cameras available to the cvcam library:
int ncams = cvcamGetCamerasCount( ); //returns the number of available cameras in the system
Show a dialog to choose which cameras to use:
int* out; int nselected = cvcamSelectCamera(&out);
Get the selected cams and enable them.
int cam1 = out[0];
int cam2 = out[1];
cvcamSetProperty(cam1, CVCAM_PROP_ENABLE, CVCAMTRUE);
cvcamSetProperty(cam1, CVCAM_PROP_RENDER, CVCAMTRUE); //We'll render stream from this source
cvNamedWindow("Cam1", 1);
cvcamWindow MyWin1 = (cvcamWindow)cvGetWindowHandle("Cam1");
cvcamSetProperty(cam1, CVCAM_PROP_WINDOW, &MyWin1); // Selects a window for video rendering
//Same code for camera 2
cvcamSetProperty(cam2, CVCAM_PROP_ENABLE, CVCAMTRUE);
cvcamSetProperty(cam2, CVCAM_PROP_RENDER, CVCAMTRUE);
cvNamedWindow("Cam2", 1);
cvcamWindow MyWin2 = (cvcamWindow)cvGetWindowHandle("Cam2");
cvcamSetProperty(cam2, CVCAM_PROP_WINDOW, &MyWin2);
//If you want to open the property dialog for setting the video format parameters, uncomment this line
//cvcamGetProperty(cam1, CVCAM_VIDEOFORMAT, NULL);
//cvcamGetProperty(cam2, CVCAM_VIDEOFORMAT, NULL);
Enable the stereo mode (two cameras working at the same time):
cvcamSetProperty(cam1, CVCAM_STEREO_CALLBACK, stereocallback); // stereocallback is the function called to process each pair of frames
cvcamInit();
cvcamStart();
//Your app is working
while (1)
{
    int key = cvWaitKey(5);
    if (key == 27) break;
}
cvcamStop( );
cvcamExit( );
Define the stereocallback function outside of the function above.
void stereocallback(IplImage* image1, IplImage* image2) {
//Process 2 images here
}