How to restrict use of third party camera app from your app - android-camera

I have the Power Cam app, a third-party camera app, installed on my device. I am opening the camera from my app, and when I click the open-camera button, the system gives me a choice of cameras: the device camera along with Power Cam. I want the device camera to open directly when the button is clicked; in other words, I want to prevent the user from using Power Cam from within my app.

If you want to launch only the stock camera app, you can use the following intent (based on the official tutorial):
static final int REQUEST_IMAGE_CAPTURE = 1;

private void dispatchTakePictureIntent() {
    Intent takePictureIntent = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
    // Restrict the intent to the stock camera package so no chooser appears.
    takePictureIntent.setPackage("com.android.camera");
    if (takePictureIntent.resolveActivity(getPackageManager()) != null) {
        startActivityForResult(takePictureIntent, REQUEST_IMAGE_CAPTURE);
    }
}
Unfortunately, many devices ship with custom preinstalled camera apps, and the com.android.camera package may not be available.
If you want to filter out specific packages that you don't like, you must prepare your own chooser dialog (see example here). You can skip the dialog entirely if you already know which package to launch. For example, it is possible to filter the list to include only "system" packages. But even then, there is no guarantee that exactly one system package is registered to handle the MediaStore.ACTION_IMAGE_CAPTURE intent.
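The system-package filtering idea can be sketched as follows (a minimal, untested sketch assuming it runs inside an Activity; queryIntentActivities and ApplicationInfo.FLAG_SYSTEM are the relevant framework APIs, and REQUEST_IMAGE_CAPTURE is the request code from the earlier snippet):

```java
// Resolve all activities that can handle the capture intent,
// then launch the first one that belongs to a system package.
Intent captureIntent = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
for (ResolveInfo info : getPackageManager()
        .queryIntentActivities(captureIntent, 0)) {
    boolean isSystem = (info.activityInfo.applicationInfo.flags
            & ApplicationInfo.FLAG_SYSTEM) != 0;
    if (isSystem) {
        // Pin the intent to this package so no chooser is shown.
        captureIntent.setPackage(info.activityInfo.packageName);
        startActivityForResult(captureIntent, REQUEST_IMAGE_CAPTURE);
        break;
    }
}
```

If several system packages match, you would still need a chooser; this sketch simply picks the first match.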

Related

Get list of available audio devices on iOS/Android with Flutter

In my Flutter application I want to switch between the headphones and the speaker, and vice versa.
I am looking for a way to get the available audio devices and to switch between them.
I found:
final mediaDevices = navigator.mediaDevices;
var devices = await mediaDevices.getSources();
It is not clear to me what this navigator is.
Is there a way to do this?
Add the audio_session package to your project and use
List<AudioDevice> audioDevices = (await session.getDevices()).toList();
to get the list of audio devices present.
Refer: https://pub.dev/packages/audio_session

Camera X - Accessing both Front and Back Cameras simultaneously using Camera X API

Trying to preview both cameras (front and back) concurrently using the CameraX API:
Camera camera = cameraProvider.bindToLifecycle((LifecycleOwner) this, cameraSelector, preview); // Back camera
Camera camera2 = cameraProvider.bindToLifecycle((LifecycleOwner) this, cameraSelector2, preview2); // Front camera
With the above code snippet, only the front camera comes up. If I swap the order of the two calls, the back camera shows up instead, as expected.
I also tried acquiring the camera provider instance (cameraProvider = ProcessCameraProvider.getInstance(this)) twice and mapping one camera per provider, but observed strange behavior: after pressing home and relaunching the app, only one of the previews (back or front) shows up, with no discernible pattern.
Can anyone shed more light on this? Could it be related to the target device, i.e. device incompatibility? The target device I am using is a OnePlus 5.
CameraX doesn't support opening more than 1 camera at a time, which is why when you attempt to open 2 cameras by calling ProcessCameraProvider.bindToLifecycle() twice, only the second camera is opened.
ProcessCameraProvider provides access to the cameras on the device and, as its name suggests, is scoped to the process/application, i.e. it's a singleton: once it's initialized, you'll get the same instance with each subsequent call to ProcessCameraProvider.getInstance().
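Since only one camera can be bound at a time, the usual pattern is to release the current binding before binding the other camera. A minimal sketch (assuming cameraProvider, the selectors, and the previews are set up as in the question; switchToFront is a hypothetical helper name, not a CameraX API):

```java
// Switch the single active camera: unbind everything first,
// then bind the front camera to this lifecycle.
private void switchToFront() {
    cameraProvider.unbindAll();
    camera = cameraProvider.bindToLifecycle(
            (LifecycleOwner) this, cameraSelector2, preview2);
}
```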

How can I adjust microphone input levels in HoloLens?

In our communications app, some people's voices are too quiet, so we want to be able to change the system-level volume of their microphone input.
I searched through all the Windows Universal App samples and the Unity documentation, and I couldn't find how to change the volume of the Windows microphone (on Windows or HoloLens).
I found that the property to adjust is the AudioDeviceController.VolumePercent property. The following code implements this:
MediaCapture mediaCapture = new MediaCapture();
var captureInitSettings = new MediaCaptureInitializationSettings
{
    StreamingCaptureMode = StreamingCaptureMode.Audio
};
await mediaCapture.InitializeAsync(captureInitSettings);
mediaCapture.AudioDeviceController.VolumePercent = volumeLevel;
I confirmed that this code works on both Desktop and HoloLens. It changes the system-level setting, so the new value persists automatically and affects all apps.

Can I run ARCore Preview 1 App on Preview 2 release?

I've built an app which runs on the ARCore Preview 1 package in Unity. I know Google has made major changes in Preview 2.
My question is: what changes will I have to make in order for my ARCore Preview 1 app to run on Preview 2?
Take a look at the code in the Preview 2 sample app(s) and update your code accordingly. For example, here is the new code for properly instantiating an object into the AR scene:
if (Session.Raycast(touch.position.x, touch.position.y, raycastFilter, out hit))
{
    var andyObject = Instantiate(AndyAndroidPrefab, hit.Pose.position,
        hit.Pose.rotation);

    // Create an anchor to allow ARCore to track the hitpoint
    // as understanding of the physical world evolves.
    var anchor = hit.Trackable.CreateAnchor(hit.Pose);

    // Andy should look at the camera but still be flush with the plane.
    andyObject.transform.LookAt(FirstPersonCamera.transform);
    andyObject.transform.rotation = Quaternion.Euler(0.0f,
        andyObject.transform.rotation.eulerAngles.y,
        andyObject.transform.rotation.z);

    // Make Andy model a child of the anchor.
    andyObject.transform.parent = anchor.transform;
}
Common changes:
Preview 1 used the Tango Core service; in Preview 2 it is replaced by the ARCore service.
Automatic screen rotation is now handled.
Some classes were altered, for reasons such as the following.
For users:
Introduces AR Stickers.
For developers:
A new C API for use with the Android NDK that complements the existing Java, Unity, and Unreal SDKs;
Functionality that lets AR apps pause and resume AR sessions, for example to let a user return to an AR app after taking a phone call;
Improved accuracy and runtime efficiency across the anchor, plane finding, and point cloud APIs.
I have updated my app from Preview 1 to Preview 2, and it wasn't a lot of work. There were minor API changes, like the ones for hit flags, Pose.position, etc. It would not be useful to post the whole change log here. I suggest that you follow the steps below:
Replace the old SDK with the new one in the Unity project.
Then, check for errors in your editor of choice (Visual Studio, VS Code, or MonoDevelop).
Check the relevant APIs in the ARCore developer docs.
It's not such a cumbersome job; it took me some 5-10 minutes to upgrade.
Cheers!

How to make video captured by front camera not being inverse Android?

I am recording video using MediaRecorder. When using the back camera it works fine, but when using the front camera the captured video is flipped/mirrored, meaning that an item on the right appears on the left. The camera preview works fine; only the final captured video is flipped.
Here is what the camera preview looks like.
But the final video appears like this (all the items on the left-hand side appear on the right-hand side).
What I tried so far:
I tried to apply a matrix when preparing the recorder, but it doesn't seem to change anything.
private boolean prepareRecorder(int cameraId) {
    // Create a new instance of MediaRecorder
    mRecorder = new MediaRecorder();
    setCameraDisplayOrientation(this, cameraId, mCamera);
    int angle = getVideoOrientationAngle(this, cameraId);
    mRecorder.setOrientationHint(angle);
    if (cameraId == Camera.CameraInfo.CAMERA_FACING_FRONT) {
        // Note: this Matrix is created but never applied to anything,
        // so it has no effect on the recording.
        Matrix matrix = new Matrix();
        matrix.preScale(1.0f, -1.0f);
    }
    // all other code to prepare the recorder here
}
I have already read all of the questions below, but none of them solved my problem. For information, I am using SurfaceView for the camera preview, so this question doesn't help.
1) Android flip front camera mirror flipped video
2) How to keep android from inverting the image from the front facing camera?
3) Prevent flipping of the front facing camera
So my questions are:
1) How can I capture a video with the front camera that is not mirrored (exactly the same as the camera preview)?
2) How can I achieve this when the camera preview uses a SurfaceView rather than a TextureView? (All the questions mentioned above are about TextureView.)
All possible solutions are most welcome. Thanks.
EDIT
I made two short video clips to clarify the problem; please download and take a look:
1) The video during the camera preview while recording
2) The final recorded video
So, if the system camera app produces video similar to your app's, you didn't do anything wrong. Now it's time to understand what happens with front-facing camera video recording.
The front facing camera is not different from the rear facing camera in the way it captures still pictures or video. There is a difference how the phone displays camera preview on the screen. To make it look more natural to the user, Android (and all other systems) mirrors the preview, so that you can see yourself as if in a mirror.
It is important to understand that this applies only to the way the preview is presented to you. If you open any video-conferencing app, connect two devices that you hold in your two hands, and look at yourself, you will see to your surprise that the two images of yourself are flipped relative to each other.
This is not a bug, this is the natural way to present the video to the other party.
See the sketch:
This is how you see the scene:
This is how your peer sees the same scene
Normally, recording of a video is done from the point of view of your peer, as in the second picture. This is the natural setup for, e.g., video conferencing.
But Snapchat and some other social apps choose to store the front-facing video clip as if you record it from the mirror (as if the recorder is in your hand on the first picture). Some people like this feature, others hate it (see https://forums.androidcentral.com/general-help-how/664539-front-camera-pics-mirrored-reversed-only-snapchat.html and https://www.reddit.com/r/nexus6/comments/3846ay/has_anyone_found_a_fix_for_snapchat_flipping)
You cannot use MediaRecorder for that. You can use the lower-level MediaCodec API to record processed frames. You need to flip each frame 'manually', and this may cause a significant performance hit, because normally MediaRecorder 'connects' the camera to the hardware encoder very efficiently, without even copying the pixels to user memory. This answer shows how you can manipulate the way the camera is rendered to a texture.
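To make "flipping each frame manually" concrete, here is a plain-Java sketch of the per-pixel operation (the class and method names are made up for illustration; a real implementation would do this on the GPU via the texture transform rather than on a CPU pixel array):

```java
public class FrameFlipper {
    /**
     * Mirrors an ARGB frame horizontally in place.
     * The frame is a row-major array of width * height pixels.
     */
    public static void flipHorizontally(int[] pixels, int width, int height) {
        for (int y = 0; y < height; y++) {
            int rowStart = y * width;
            // Swap pixel columns symmetrically around the row's center.
            for (int x = 0; x < width / 2; x++) {
                int left = rowStart + x;
                int right = rowStart + (width - 1 - x);
                int tmp = pixels[left];
                pixels[left] = pixels[right];
                pixels[right] = tmp;
            }
        }
    }

    public static void main(String[] args) {
        int[] frame = {1, 2, 3, 4, 5, 6}; // a tiny 3x2 "frame"
        flipHorizontally(frame, 3, 2);
        System.out.println(java.util.Arrays.toString(frame)); // [3, 2, 1, 6, 5, 4]
    }
}
```

Doing this per pixel on the CPU for every frame is exactly the performance hit mentioned above, which is why the texture-based approach is preferred.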
You can achieve this by recording the video manually from a SurfaceView.
In that case, the preview and the recording will match exactly.
I've been using this library for this purpose:
https://github.com/spaceLenny/recordablesurfaceview
Here is a guide on how to use it (not with the camera but with OpenGL drawing): https://withintent.uncorkedstudios.com/recording-screen-video-on-android-with-recordablesurfaceview-451c9daa213e