Android: Switch photo camera to video camera on the same surface - android-camera

I'm a new Android developer.
I have built an Activity that takes photos with the camera in a SurfaceView, and now I'd like to add a button to that Activity to switch the same SurfaceView over to video recording. Is this possible?
Thanks in advance.

I found the solution to my own question. I had built a SurfaceView and an Activity for a photo camera, and then wanted to add a button to record video in the same Activity with the same SurfaceView, without knowing whether that was possible. It is. I wrote this method in the Activity to prepare the MediaRecorder and attach it to the existing SurfaceView:
public boolean prepararCamaraVideo() {
    mMediaRecorder = new MediaRecorder();

    // Unlock the camera and hand it over to the MediaRecorder
    mCamera.unlock();
    mMediaRecorder.setCamera(mCamera);

    // Set the audio and video sources
    mMediaRecorder.setAudioSource(MediaRecorder.AudioSource.CAMCORDER);
    mMediaRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
    state = MediaRecorderState.INITIALIZED;

    // Use a CamcorderProfile where available, otherwise configure manually
    if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.FROYO) {
        mMediaRecorder.setProfile(CamcorderProfile.get(CamcorderProfile.QUALITY_HIGH));
    } else {
        mMediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
        mMediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
        mMediaRecorder.setVideoEncoder(MediaRecorder.VideoEncoder.MPEG_4_SP);
    }
    state = MediaRecorderState.DATA_SOURCE_CONFIGURED;

    // Set the output file and reuse the photo camera's surface as preview
    mOutputFile = Files.getExternalMediaFile(Files.MEDIA_TYPE_VIDEO).toString();
    mMediaRecorder.setOutputFile(mOutputFile);
    mMediaRecorder.setPreviewDisplay(mCameraPreview.getHolder().getSurface());

    try {
        mMediaRecorder.prepare();
    } catch (IllegalStateException e) {
        Log.d("Video", "IllegalStateException preparing MediaRecorder: " + e.getMessage());
        releaseMediaRecorder();
        return false;
    } catch (IOException e) {
        Log.d("Video", "IOException preparing MediaRecorder: " + e.getMessage());
        releaseMediaRecorder();
        return false;
    }
    return true;
}
The call mMediaRecorder.setPreviewDisplay(mCameraPreview.getHolder().getSurface()); attaches the video recorder to the surface the photo camera was already using.
Finally, the method that starts and stops recording:
public void grabaVideo(View v) {
    if (state != MediaRecorderState.RECORDING) {
        if (prepararCamaraVideo()) {
            mMediaRecorder.start();
            state = MediaRecorderState.RECORDING;
            Toast.makeText(getApplicationContext(), getString(R.string.capturing_video), Toast.LENGTH_SHORT).show();
        } else {
            // prepare() didn't work; release the camera and inform the user
            releaseMediaRecorder();
        }
    } else {
        mMediaRecorder.stop();   // stop the recording
        releaseMediaRecorder();  // release the MediaRecorder object
        mCamera.lock();          // take camera access back from MediaRecorder
        state = MediaRecorderState.INITIAL;
        Toast.makeText(getApplicationContext(), getString(R.string.video_stored_in) + " " + mOutputFile, Toast.LENGTH_SHORT).show();
    }
}
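For completeness, the MediaRecorderState enum used in these methods is not shown in the post; a minimal version inferred from the code above (the constant names are as used there, the comments are my assumption) could be:

```java
// Inferred from the code above (not shown in the original post):
// tracks where we are in the MediaRecorder lifecycle.
enum MediaRecorderState {
    INITIAL,                 // nothing configured yet
    INITIALIZED,             // camera and sources set
    DATA_SOURCE_CONFIGURED,  // profile, output file and preview set
    RECORDING                // start() has been called
}
```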
I hope this helps anyone who needs it.


Pass data from android to flutter

I have added my Android-side code below.
I know that I need to use a platform channel to pass the data, but I am unable to figure out how:
import io.flutter.embedding.android.FlutterActivity;

public class MainActivity extends AppCompatActivity {
    private Button Btn;
    // Intent defaultFlutter = FlutterActivity.createDefaultIntent(activity);
    String path;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        Btn = findViewById(R.id.btn);
        isStoragePermissionGranted();
        Btn.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View view) {
                path = takeScreenshot();
                // activity.startActivity(defaultFlutter);
            }
        });
        // write flutter code here
        // FlutterActivity.createDefaultIntent(this);
    }

    private String takeScreenshot() {
        Date now = new Date();
        // Use the formatted timestamp in the file name; colons are not valid
        // in file names on FAT-formatted external storage.
        CharSequence timestamp = android.text.format.DateFormat.format("yyyy-MM-dd_hh-mm-ss", now);
        try {
            // image naming and path, appending the name you choose for the file
            String mPath = Environment.getExternalStorageDirectory().toString() + "/" + timestamp + ".jpg";
            // create a bitmap screen capture
            View v1 = getWindow().getDecorView().getRootView();
            v1.setDrawingCacheEnabled(true);
            Bitmap bitmap = Bitmap.createBitmap(v1.getDrawingCache());
            v1.setDrawingCacheEnabled(false);
            File imageFile = new File(mPath);
            Log.d("path", mPath);
            FileOutputStream outputStream = new FileOutputStream(imageFile);
            int quality = 100;
            bitmap.compress(Bitmap.CompressFormat.JPEG, quality, outputStream);
            outputStream.flush();
            outputStream.close();
            return mPath;
            // openScreenshot(imageFile);
        } catch (Throwable e) {
            // Several errors can come out of file handling
            e.printStackTrace();
            return "Error";
        }
    }

    public boolean isStoragePermissionGranted() {
        String TAG = "Storage Permission";
        if (Build.VERSION.SDK_INT >= 23) {
            if (this.checkSelfPermission(android.Manifest.permission.WRITE_EXTERNAL_STORAGE)
                    == PackageManager.PERMISSION_GRANTED) {
                Log.v(TAG, "Permission is granted");
                return true;
            } else {
                Log.v(TAG, "Permission is revoked");
                ActivityCompat.requestPermissions(this, new String[]{Manifest.permission.WRITE_EXTERNAL_STORAGE}, 1);
                return false;
            }
        } else {
            // permission is automatically granted on sdk < 23 upon installation
            Log.v(TAG, "Permission is granted");
            return true;
        }
    }
}
I will receive a file path from the Android side; upon receiving it, I need to display the image in Flutter. I also need to use a cached engine for transferring the data, since starting a fresh engine normally causes a delay.
You can use a cached FlutterEngine; this covers the delay of starting a fresh engine.
Then add an invokeMethod call in your onClick handler, passing the method name and the data you want to send.
On the Flutter side, create a MethodChannel with the same name and set a method-call handler through which you can receive the data and process it further.

Camera2: onCaptureCompleted() is not called

I recently started learning the camera2 API, but I have run into trouble.
This code works when the camera is set to "Emulated".
However, when I use Webcam0 or the virtual scene, it gets stuck after executing captureBurst(), with no error message.
It only shows "The application may be doing too much work on its main thread."
I checked that onCaptureStarted() is called, but onCaptureCompleted() is never called.
AVD Manager: API 30, Pixel 3
public void takePicture() {
    if (cameraDevice == null) {
        return;
    }
    try {
        picturesRequestBuilder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
        mImageReader.setOnImageAvailableListener(new OnImageAvailableListener(), mainHandler);
        // imageSurface = mImageReader.getSurface();
        picturesRequestBuilder.addTarget(imageSurface);
        picturesRequestBuilder.addTarget(surface);
        picturesRequestBuilder.set(CaptureRequest.CONTROL_MODE, CameraMetadata.CONTROL_MODE_AUTO);

        CaptureRequest captureRequest = picturesRequestBuilder.build();
        ArrayList<CaptureRequest> captureRequests = new ArrayList<>();
        for (int i = 0; i < 5; i++) {
            captureRequests.add(captureRequest);
        }
        mCameraCaptureSession.captureBurst(captureRequests, new CaptureCallback(), mainHandler);
    } catch (CameraAccessException e) {
        e.printStackTrace();
    }
    Log.d("test", "finished");
}
private void setupCamera() {
    cameraManager = (CameraManager) getSystemService(CAMERA_SERVICE);
    try {
        cameraIdList = cameraManager.getCameraIdList();
        cameraId = cameraManager.getCameraIdList()[0]; // get the rear-facing camera
        cameraCharacteristics = cameraManager.getCameraCharacteristics(cameraId);
    } catch (CameraAccessException e) {
        e.printStackTrace();
    }
    StreamConfigurationMap streamConfigurationMap = cameraCharacteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
    Size[] outputSizes = streamConfigurationMap.getOutputSizes(ImageFormat.JPEG);
    // largest width and height the camera supports
    cameraWidth = outputSizes[0].getWidth();
    cameraHeight = outputSizes[0].getHeight();
    REQUIRED_PERMISSIONS.add(android.Manifest.permission.CAMERA);
    REQUIRED_PERMISSIONS.add(android.Manifest.permission.WRITE_EXTERNAL_STORAGE);
}
public void setPreview() {
    List<Surface> outputSurface = new ArrayList<>(2);
    SurfaceTexture surfaceTexture = cameraPreview.getSurfaceTexture();
    ImageReader imageReader = ImageReader.newInstance(1920, 1080, ImageFormat.JPEG, 1);
    imageReader.setOnImageAvailableListener(new OnImageAvailableListener(), childHandler);
    mImageReader = imageReader;
    imageSurface = imageReader.getSurface();
    surface = new Surface(surfaceTexture);
    outputSurface.add(surface);
    outputSurface.add(imageSurface);
    try {
        previewRequestBuilder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
        previewRequestBuilder.addTarget(surface);
        cameraDevice.createCaptureSession(outputSurface, new CameraCaptureSession.StateCallback() {
            @Override
            public void onConfigured(@NonNull CameraCaptureSession cameraCaptureSession) {
                if (cameraDevice == null) {
                    return;
                }
                mCameraCaptureSession = cameraCaptureSession;
                try {
                    previewRequestBuilder.set(CaptureRequest.CONTROL_MODE, CaptureRequest.CONTROL_MODE_AUTO);
                    mCameraCaptureSession.setRepeatingRequest(previewRequestBuilder.build(), null, childHandler);
                } catch (CameraAccessException e) {
                    Log.d("TAG", "Error creating the preview session");
                    Log.d("TAG", e.getMessage());
                }
            }

            @Override
            public void onConfigureFailed(@NonNull CameraCaptureSession cameraCaptureSession) {
                Log.d("fail", "Failed");
            }
        }, childHandler);
    } catch (CameraAccessException e) {
        Log.d("TAG", "Error setting up the camera preview");
        Log.d("TAG", e.getMessage());
    }
}
If you're using the emulator virtual scene for Android 11 beta (API level 30), there's a known bug in JPEG capture, which you may be running into: https://buganizer.corp.google.com/issues/160382725
If it's the bug in question, the entire system logcat should show a crash involving a JPEG library for the camera HAL process.
This will be fixed in a later update to the Android 11 SDK.

Players are unmuted when they relog

I have no idea why my code is not keeping players muted after they relog, even though I added them to the config.
This is where the muted players get saved:
private static ArrayList<Player> mutedPlayers = new ArrayList<>();
This is the event that handles the muted player and that should check if the player is muted or not:
@EventHandler
public void handlePlayerChat(AsyncPlayerChatEvent e) {
    Player p = e.getPlayer();
    if (mutedPlayers.contains(p)) {
        p.sendMessage(ChatColor.DARK_RED + "You've been muted!");
        e.setCancelled(true);
    }
}
This is the command:
if (command.getName().equals("mute")) {
    if (sender instanceof Player) {
        Player p = (Player) sender;
        if (p.hasPermission("shxr.mute")) {
            if (args.length == 1) {
                Player target = Bukkit.getPlayer(args[0]);
                if (target != null) {
                    if (!mutedPlayers.contains(target)) {
                        mutedPlayers.add(target);
                        p.sendMessage(ChatColor.GREEN + "You have successfully muted " + target.getName() + ChatColor.GREEN + "!");
                        target.sendMessage(ChatColor.DARK_RED + "You are muted!");
                        getConfig().set("mutedPlayers.Players", mutedPlayers);
                        saveConfig();
                    } else {
                        mutedPlayers.remove(target);
                        p.sendMessage(ChatColor.GREEN + target.getName() + ChatColor.GREEN + " has been unmuted!");
                        target.sendMessage(ChatColor.DARK_RED + "You have been unmuted!");
                        saveConfig();
                    }
                } else {
                    p.sendMessage(ChatColor.DARK_RED + "Cannot find the player.");
                }
            } else {
                p.sendMessage(ChatColor.DARK_RED + "Proper usage of this command is: /mute <player>");
            }
        } else {
            p.sendMessage(ChatColor.DARK_RED + "You do not have the permissions to mute players!");
        }
    }
}
Two issues:
1. You aren't saving this list to disk, so when the server restarts, you're going to lose it all.
2. You are storing references to the Player object, which gets recreated any time a user logs in or changes dimensions (Player is just an Entity class and is not a permanent reference). You need to store the user's UUID instead.
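A minimal sketch of both fixes in plain Java, with no Bukkit dependency (the MuteStore class, the one-UUID-per-line file format, and the method names are all hypothetical, not Bukkit API): keep the mutes as a Set of UUIDs and write them to a file whenever the list changes, reloading the file when the plugin starts.

```java
import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.HashSet;
import java.util.List;
import java.util.Set;
import java.util.UUID;
import java.util.stream.Collectors;

// Hypothetical helper: stores muted players by UUID and persists them to a
// plain text file, one UUID per line, so mutes survive relogs and restarts.
class MuteStore {
    private final Set<UUID> muted = new HashSet<>();
    private final Path file;

    MuteStore(Path file) { this.file = file; }

    // Mute if unmuted, unmute if muted; returns true if the player is now muted.
    boolean toggle(UUID id) {
        boolean nowMuted = muted.add(id);
        if (!nowMuted) muted.remove(id);
        return nowMuted;
    }

    boolean isMuted(UUID id) { return muted.contains(id); }

    void save() {
        try {
            List<String> lines = muted.stream().map(UUID::toString).collect(Collectors.toList());
            Files.write(file, lines);
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    void load() {
        try {
            muted.clear();
            if (Files.exists(file)) {
                for (String line : Files.readAllLines(file)) muted.add(UUID.fromString(line));
            }
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }
}
```

In the plugin you would call toggle(target.getUniqueId()) and save() from the /mute command, isMuted(e.getPlayer().getUniqueId()) in the chat event, and load() from onEnable().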

Image is stretching while capturing video using mediarecorder

I have made an application in which I capture video using MediaRecorder. But when the preview starts on the device, the image appears stretched. I have gone through all the related questions here and on Google, but without success. Here is my code.
video.xml
<SurfaceView
    android:id="@+id/camera_view"
    android:layout_width="fill_parent"
    android:layout_height="250dp"
    android:layout_gravity="center"
    android:layout_marginLeft="10dp"
    android:layout_marginRight="10dp"
    android:layout_marginTop="5dp" />
Video.java configuration:
public void startRecording() {
    try {
        if (some condition) {
            // recording call
            setCameraDisplayOrientaion();
            camera.unlock();
            mediaRecorder.setCamera(camera);
            mediaRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
            mediaRecorder.setVideoSource(MediaRecorder.VideoSource.DEFAULT);
            mediaRecorder.setAudioEncodingBitRate(16);
            mediaRecorder.setAudioSamplingRate(44100);
            CamcorderProfile profile =
                    CamcorderProfile.QUALITY_1080P == CamcorderProfile.QUALITY_HIGH
                            || CamcorderProfile.hasProfile(CamcorderProfile.QUALITY_480P)
                    ? CamcorderProfile.get(cameraId, CamcorderProfile.QUALITY_480P)
                    : CamcorderProfile.get(cameraId, CamcorderProfile.QUALITY_HIGH);
            profile.videoFrameRate = 15;
            profile.videoBitRate = 150000;
            profile.audioBitRate = 12200;
            mediaRecorder.setProfile(profile);
            mediaRecorder.setOutputFile(string);
            mediaRecorder.setPreviewDisplay(surfaceHolder.getSurface());
            try {
                mediaRecorder.prepare();
                mediaRecorder.start();
            } catch (Exception e) {
                Log.e(ResponseActivity.class.toString(), e.getMessage());
                releaseMediaRecorder();
            }
        } else {
            mediaRecorder = new MediaRecorder();
            setCameraDisplayOrientaion();
            camera.unlock();
            mediaRecorder.setCamera(camera);
            mediaRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
            mediaRecorder.setVideoSource(MediaRecorder.VideoSource.DEFAULT);
            mediaRecorder.setAudioEncodingBitRate(16);
            mediaRecorder.setAudioSamplingRate(44100);
            CamcorderProfile profile =
                    CamcorderProfile.QUALITY_1080P == CamcorderProfile.QUALITY_HIGH
                            || CamcorderProfile.hasProfile(CamcorderProfile.QUALITY_480P)
                    ? CamcorderProfile.get(cameraId, CamcorderProfile.QUALITY_480P)
                    : CamcorderProfile.get(cameraId, CamcorderProfile.QUALITY_HIGH);
            profile.videoFrameRate = videoFrameRate;
            profile.videoBitRate = videoBitrate;
            profile.audioBitRate = audioBitrate;
            mediaRecorder.setProfile(profile);
            mediaRecorder.setOutputFile(fullQuestionPath);
            mediaRecorder.setPreviewDisplay(surfaceHolder.getSurface());
            try {
                mediaRecorder.prepare();
                mediaRecorder.start();
            } catch (Exception e) {
                Log.e(ResponseActivity.class.toString(), e.getMessage());
                releaseMediaRecorder();
            }
        }
    } catch (Exception e) {
    }
}
Please let me know what configuration needs to change here. Any help would be appreciated.

SWT/JFACE progress bar in splash screen stop update while running external process in application

I have a problem with an application that shows a splash screen with a progress bar during initialization. This is part of the code inside my app:
try {
    Runtime runTime = Runtime.getRuntime();
    pbaw = runTime.exec(cmdGen);
    // pbaw = probuilder.start();
    try {
        String line;
        BufferedReader input =
                new BufferedReader(new InputStreamReader(pbaw.getInputStream()));
        SplashScreen ss = new SplashScreen(Display.getCurrent(), getShell());
        int progress = 0;
        ss.show();
        while ((line = input.readLine()) != null) {
            ss.setLabel(line);
            progress += 1;
            ss.advance(progress);
            // System.out.println(line); // <-- Parse data here.
        }
        input.close();
        ss.destroy(pbaw.waitFor());
    } catch (InterruptedException e) {
        e.printStackTrace();
    }
} catch (IOException e) {
    e.printStackTrace();
}
My question is: why does the progress bar in the splash screen stop updating while I click on the desktop or run another application, even though my application itself keeps running fine? Please help.
Here is part of my splash screen code to make things clear:
void advance(final int progress) {
    if (!progressBar.isDisposed()) {
        progressBar.setSelection(progress); // progressBar.getSelection() + progress
        progressBar.redraw();
    }
}

public void show() {
    mainShell.pack();
    mainShell.open();
}

void destroy(int val) {
    if (val == 0) {
        splashComposite.dispose();
        if (appImage != null) {
            appImage.dispose();
        }
        mainShell.dispose();
    }
}
You are running your code on the user-interface thread and never giving SWT a chance to process events. You need to run the reading code in a separate thread and use Display.asyncExec to update the splash screen from the UI thread.
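A sketch of that threading pattern in plain Java with the SWT parts stubbed out (the SplashUpdater class and all names are hypothetical): a background thread does the blocking reads and hands each update to the main thread through a queue, the way Display.asyncExec hands a Runnable back to the SWT event loop. In real SWT code the body of the drain loop is where you would call display.asyncExec(() -> ss.advance(...)).

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

// Illustration of the worker-thread / UI-thread split only; no SWT dependency.
class SplashUpdater {
    static List<String> run(List<String> lines) {
        BlockingQueue<String> uiQueue = new LinkedBlockingQueue<>();
        final String done = "__done__"; // sentinel marking end of input

        // Worker thread: stands in for reading the external process output.
        Thread worker = new Thread(() -> {
            for (String line : lines) uiQueue.add(line);
            uiQueue.add(done);
        });
        worker.start();

        // "UI" thread: drains updates; it is never tied up doing the reads itself.
        List<String> shown = new ArrayList<>();
        try {
            for (String line; !(line = uiQueue.take()).equals(done); ) {
                shown.add(line); // stands in for ss.setLabel(line) / ss.advance(...)
            }
            worker.join();
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return shown;
    }
}
```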