Hi, I am trying to switch between the rear- and front-facing cameras. If I start the session with the front-facing camera it works, but the other way round the AVCaptureVideoPreviewLayer is not visible and just presents a blank screen.
First, stop the preview session.
When it has stopped (use the delegate callback for this), set the camera device you want and restart the preview session.
In code:
[[PBJVision sharedInstance] stopPreview];

// implement the following delegate method
- (void)visionSessionDidStopPreview:(PBJVision *)vision {
    vision.cameraDevice = PBJCameraDeviceFront;
    [vision startPreview];
}
Trying to preview both cameras (front and back) concurrently using the AndroidX CameraX API:
Camera camera = cameraProvider.bindToLifecycle((LifecycleOwner)this, cameraSelector, preview); //Back-Camera
Camera camera2 = cameraProvider.bindToLifecycle((LifecycleOwner)this, cameraSelector2, preview2); //Front-Camera
With the above code snippet, only the front camera comes up. If I change the order above, the back camera shows up as expected.
I also tried acquiring the ProcessCameraProvider instance (cameraProvider = ProcessCameraProvider.getInstance(this)) twice and mapping one camera to each provider, but I observed strange behaviour: after pressing home and launching the app again, only one of the two previews (back or front) shows up, with no discernible pattern.
Can anyone shed more light on this? Is it anything to do with the target device, i.e. device incompatibility? The target device I am using is a OnePlus 5.
CameraX doesn't support opening more than one camera at a time, which is why when you attempt to open two cameras by calling ProcessCameraProvider.bindToLifecycle() twice, only the second camera is opened.
ProcessCameraProvider provides access to the cameras on the device and, as its name suggests, it is scoped to the process/application, i.e. it's a singleton: once it's initialized, you'll get the same instance with each subsequent call to ProcessCameraProvider.getInstance().
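If the goal is just to switch between the front and back cameras rather than run both at once, a minimal sketch of that pattern could look like this (Java, reusing the cameraProvider and preview names from the question; the useFrontCamera field and switchCamera method are illustrative assumptions):

// Sketch: toggle between front and back cameras with CameraX.
// Assumes `cameraProvider` and `preview` are initialized as in the
// question, inside an Activity that is also a LifecycleOwner.
private boolean useFrontCamera = false;

private void switchCamera() {
    // CameraX binds only one camera at a time, so release the
    // current binding before binding the other camera.
    cameraProvider.unbindAll();

    useFrontCamera = !useFrontCamera;
    CameraSelector selector = useFrontCamera
            ? CameraSelector.DEFAULT_FRONT_CAMERA
            : CameraSelector.DEFAULT_BACK_CAMERA;

    Camera camera = cameraProvider.bindToLifecycle(
            (LifecycleOwner) this, selector, preview);
}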
I am using the Vuforia and ARKit SDKs in my application.
I have two buttons (a Vuforia button and an ARKit button).
My app workflow:
Open the app.
Click on the “Vuforia AR” button (1st time).
If a marker is found, Vuforia AR works.
I switch to “ARKit” for the first time; it works fine.
If we press the back button, it redirects to the MainScreen.
Click on the “Vuforia AR” button again (2nd time).
Vuforia works well.
If we switch to “ARKit”, the ARKit camera does not reset.
The ARKit camera freezes here.
Note: both SDKs use a single scene; I also tried two different scenes, but my issue is not resolved.
Don't know if you solved your problem.
If not, did you try to reset your ARKit scene?
Here's some sample code I found in another post; it works fine for me to reset my ARKit session.
public void ResetScene()
{
    ARKitWorldTrackingSessionConfiguration sessionConfig =
        new ARKitWorldTrackingSessionConfiguration(
            UnityARAlignment.UnityARAlignmentGravity,
            UnityARPlaneDetection.Horizontal);

    UnityARSessionNativeInterface.GetARSessionNativeInterface().RunWithConfigAndOptions(
        sessionConfig,
        UnityARSessionRunOption.ARSessionRunOptionRemoveExistingAnchors |
        UnityARSessionRunOption.ARSessionRunOptionResetTracking);
}
Don't forget to change the sessionConfig parameters to your own! ;)
I can't seem to make iOS show the correct play/pause button in the remote audio controls. I do receive the remote control events and I set all the values of the nowPlayingInfo dictionary.
Everything works fine and I even see a cover photo on the lock screen, except for the pause/play button: it always shows the pause symbol, whatever the player's state, and it sends a pause event regardless of playback state.
How can I notify iOS that AVAudioPlayer is paused and that it should now show a play button in the remote control buttons bar?
Make sure that you're setting the MPNowPlayingInfoPropertyPlaybackRate property. 0.0f to indicate paused, 1.0f to indicate playing. You'll also need to set the MPNowPlayingInfoPropertyElapsedPlaybackTime when you change these values.
Here is example code where updateMetadata is a function that applies those changes to the MPNowPlayingInfoCenter.nowPlayingInfo dictionary. This would indicate to the center that the player is paused.
[self updateMetadata:[NSDictionary dictionaryWithObjectsAndKeys:
    [NSNumber numberWithDouble:audioFile.player.currentTime],
    MPNowPlayingInfoPropertyElapsedPlaybackTime,
    [NSNumber numberWithFloat:0.0f],
    MPNowPlayingInfoPropertyPlaybackRate,
    nil]];
I had this problem today: I found that recording audio while also playing it caused the button to show the pause symbol.
When I stopped the AVAudioRecorder from recording, the pause button became a play button.
Quite often, the problem is simply the iPhone Simulator. As soon as you are using the play() function of your AVAudioPlayer instance, the remote control bar is supposed to toggle pause/play automatically. If you run into problems where this doesn't happen, try to run your program on a device.
To toggle the button, you do not need to set any playingInfo of the MPNowPlayingInfoCenter, neither do you need to hold an active AVAudioSession.
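For what it's worth, here is a minimal sketch of that device-side setup in Objective-C (the controller class, the file name, and the toggle handling are illustrative assumptions, not part of the original answer):

#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>

@interface PlayerViewController : UIViewController
@property (nonatomic, strong) AVAudioPlayer *player;
@end

@implementation PlayerViewController

- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    // Opt in to remote control events; the controller must also be
    // first responder to receive them.
    [[UIApplication sharedApplication] beginReceivingRemoteControlEvents];
    [self becomeFirstResponder];

    // Placeholder file; use your own audio source.
    NSURL *url = [[NSBundle mainBundle] URLForResource:@"track" withExtension:@"mp3"];
    self.player = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:nil];
    [self.player play];
}

- (BOOL)canBecomeFirstResponder {
    return YES;
}

- (void)remoteControlReceivedWithEvent:(UIEvent *)event {
    // Toggle playback from the remote control bar / lock screen.
    if (event.subtype == UIEventSubtypeRemoteControlTogglePlayPause) {
        if (self.player.playing) {
            [self.player pause];
        } else {
            [self.player play];
        }
    }
}

@end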
Here is the same example in Swift, where updateMetadata is a function that applies those changes to the MPNowPlayingInfoCenter.nowPlayingInfo dictionary; this would indicate to the center that the player is paused:

self.updateMetadata([
    MPNowPlayingInfoPropertyElapsedPlaybackTime: NSNumber(value: audioFile.player.currentTime),
    MPNowPlayingInfoPropertyPlaybackRate: NSNumber(value: 0.0)
])
I am using the barcodereader sample application provided in the cascades samples to embed a QRCode scanner into my application.
As it stands the sample is great, but I want the scanner to open as soon as the user navigates to my screen and I want to get rid of the opening slider images that are in the sample.
Firstly, I have tried removing the images and their animations and adding the action:
onCreationCompleted: {
    camera.open()
}
to the Page. This opens the camera perfectly as expected, but for some reason, the barcode just doesn't scan.
So, I wound back a step, and this time I put the code in exactly as-is, changing it only to read:
onCreationCompleted: {
    startupAnimation.play()
}
As expected, the screen opens and plays the animation, but it still fails to read barcodes. However, if I invoke the animation again (by tapping the screen), the animation plays again and the scanner reads barcodes without any issues at all.
All I can think of is that this is a timing issue and that I need some sort of delay after the screen has been created before the camera can be started as a barcode reader?
Anyone able to help?
Thanks,
Douglas
To get scanning right away at application launch, you need to make sure the camera is actually set up and initialized.
Basically: in onCreationCompleted, open the camera; in onCameraOpened, start the viewfinder; in onViewfinderStarted, set the barcode detector's camera to be the camera.
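A minimal sketch of that chain in QML, reusing the camera id from the sample (the barcodeDetector id and the surrounding Page are assumptions about how the barcodereader sample is laid out):

// Open the camera as soon as the page exists, then drive the rest
// of the setup from the camera's own signals so the detector only
// ever gets a camera that is actually running.
Page {
    onCreationCompleted: {
        camera.open()
    }
    Camera {
        id: camera
        onCameraOpened: {
            // Start the viewfinder only once the camera is open.
            camera.startViewfinder()
        }
        onViewfinderStarted: {
            // Attach the detector last, when frames are flowing,
            // so scanning works on the very first launch.
            barcodeDetector.camera = camera
        }
    }
}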
My iPhone application has two states: UI and Game.
The game is played using device tilting only, so I switch auto-sleep off when the game starts:
[UIApplication sharedApplication].idleTimerDisabled = TRUE;
But as soon as I return to the UI, I want auto-sleep to be active again, so when the game finishes I restore it:
[UIApplication sharedApplication].idleTimerDisabled = FALSE;
After a long play session, this resulted in the screen dimming immediately on the first UI screen I went to after the game.
So it seems that while the idle timer was disabled, it was still counting time, and it fired immediately after being re-enabled. How can I fix this problem?
I don't know if the idle timer can be reset programmatically, but one option is to require the user to touch the screen before it goes back.
Another option is to set your own timeout once you are back in the UI and wait for it to complete before you set idleTimerDisabled = NO. Remember to cancel this timeout if you start the game again.
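A minimal sketch of that second option in Objective-C (the 30-second grace period, the controller class, and the method names are illustrative assumptions):

#import <UIKit/UIKit.h>

@interface GameViewController : UIViewController
@property (nonatomic, strong) NSTimer *idleRestoreTimer;
@end

@implementation GameViewController

- (void)gameDidStart {
    // Cancel any pending restore and keep the screen awake.
    [self.idleRestoreTimer invalidate];
    self.idleRestoreTimer = nil;
    [UIApplication sharedApplication].idleTimerDisabled = YES;
}

- (void)gameDidFinish {
    // Give the user a grace period before auto-sleep may kick in again.
    self.idleRestoreTimer =
        [NSTimer scheduledTimerWithTimeInterval:30.0
                                         target:self
                                       selector:@selector(restoreIdleTimer)
                                       userInfo:nil
                                        repeats:NO];
}

- (void)restoreIdleTimer {
    [UIApplication sharedApplication].idleTimerDisabled = NO;
}

@end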
This answer may be useful.
I also had problems when using Music/Audio players, which seemed to reactivate the timer.
P.S.: in Objective-C you should use YES/NO instead of TRUE/FALSE.