I'm building a calling feature in Flutter, and when the user wants to listen through the earpiece I need to turn off the screen and set the audio output to the earpiece. Is there a sensor that can tell whether the phone is close to the face/ear so I can turn off the screen?
I was able to detect proximity with the proximity_sensor plugin.
The only package that worked and actually turned off the screen on proximity is the flutter_incall package, used like this:
IncallManager().turnScreenOff();
IncallManager().turnScreenOn();
The problem is that this package is quite old and lacks modern Flutter support such as null safety and Android embedding v2.
I tried using wakelock, as many suggested, but its functionality is different from what I'm looking for.
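For reference, a minimal sketch of wiring the two packages together: proximity_sensor exposes a `ProximitySensor.events` stream (values greater than 0 mean something is near the sensor), and flutter_incall's `IncallManager` provides the screen calls quoted above. This assumes both packages are added to pubspec.yaml and is not a drop-in implementation:

```dart
import 'dart:async';

import 'package:flutter_incall/flutter_incall.dart';
import 'package:proximity_sensor/proximity_sensor.dart';

class EarpieceScreenController {
  final IncallManager _incall = IncallManager();
  StreamSubscription<int>? _sub;

  /// Start listening; turn the screen off while the phone is near the ear.
  void start() {
    _sub = ProximitySensor.events.listen((int event) {
      if (event > 0) {
        _incall.turnScreenOff();
      } else {
        _incall.turnScreenOn();
      }
    });
  }

  /// Stop listening, e.g. when the call ends.
  void stop() {
    _sub?.cancel();
    _sub = null;
  }
}
```

Remember to call `stop()` when the call ends so the screen behaves normally again.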
Excuse me
The camera package has startVideoRecording(); is there an equivalent in image_picker?
I want to build an app that automatically starts recording after receiving a message, but image_picker doesn't seem to offer anything similar.
Or is there another, easier way to do it?
Thanks
I tried using the camera package to build a similar app, but in the end I think image_picker is easier to use.
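One caveat for the "automatically start recording" requirement: image_picker launches the system camera UI and needs the user to tap record, so it can't start recording programmatically. If recording must begin without user interaction, the camera package's CameraController is the usual route. A rough sketch (assumes camera permission is already granted and at least one camera exists):

```dart
import 'package:camera/camera.dart';

/// Begin recording as soon as this is called, e.g. from a message handler.
Future<CameraController> startAutoRecording() async {
  // Pick the first available camera (usually the back camera).
  final List<CameraDescription> cameras = await availableCameras();
  final CameraController controller =
      CameraController(cameras.first, ResolutionPreset.medium);

  await controller.initialize();
  await controller.startVideoRecording(); // recording starts immediately

  return controller;
}

// Later, to finish and get the file:
// final XFile file = await controller.stopVideoRecording();
```

The trade-off is that you must build your own preview/recording UI, which is exactly the work image_picker normally saves you.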
I'm working with video_360 from pub.dev, and it ships with gyroscope sensing enabled, with no parameter to disable it.
I looked through all the source code but can't figure out where it is being read and updated.
Is there a way I can disable or freeze the gyroscope readings so that in such a plugin, the gyro sensor values would read the same values no matter how the phone is held?
This does not answer the question in the title, but for my specific scenario: since the package uses ExoPlayer, the sensor-handling code lives there, which is why it wasn't visible in the package's own source code.
This setter was what I needed: setUseSensorRotation(bool).
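For anyone hitting the same thing: if you can reach the underlying player instance on the Android side, the call looks roughly like this. This is a sketch, not a drop-in patch; `setUseSensorRotation` lives on `SimpleExoPlayer` in the ExoPlayer versions that ship spherical-video support, so check which version your plugin pins:

```java
import com.google.android.exoplayer2.SimpleExoPlayer;

class GyroFreezer {
    // Stop the spherical surface from rotating with the device sensors;
    // the 360 view then ignores how the phone is held (touch still works).
    static void freezeGyro(SimpleExoPlayer player) {
        player.setUseSensorRotation(false);
    }
}
```

In my case this meant patching the plugin's platform code, since video_360 doesn't expose the player to Dart.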
So, I am using the wear plugin for this. The first exciting thing I wanted to try was scrolling through a ListView or a PageView with the rotating bezel, but that didn't work, and searching turned up nothing for Flutter.
Is setting up a custom method channel the only way for now? Are there any details I'm missing?
UPDATE: The native approach requires a View object to call view.setOnGenericMotionListener. Since Flutter renders its own views, I can't access it via a method channel. Please correct me if I'm wrong and it is in fact possible to put a listener on the view generated by Flutter itself.
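One possible way around the View problem: unhandled generic motion events bubble up to the host Activity, so you may be able to override `onGenericMotionEvent` in your `FlutterActivity` instead of attaching a listener to Flutter's internal view, then forward the bezel delta to Dart over a MethodChannel. A hedged sketch (the channel name `example/bezel` and method name `rotate` are my own placeholders); rotary events arrive as `ACTION_SCROLL` from `SOURCE_ROTARY_ENCODER` with the delta on `AXIS_SCROLL`:

```java
import android.view.InputDevice;
import android.view.MotionEvent;

import io.flutter.embedding.android.FlutterActivity;
import io.flutter.embedding.engine.FlutterEngine;
import io.flutter.plugin.common.MethodChannel;

public class MainActivity extends FlutterActivity {
    private MethodChannel bezelChannel;

    @Override
    public void configureFlutterEngine(FlutterEngine flutterEngine) {
        super.configureFlutterEngine(flutterEngine);
        bezelChannel = new MethodChannel(
                flutterEngine.getDartExecutor().getBinaryMessenger(),
                "example/bezel"); // placeholder channel name
    }

    @Override
    public boolean onGenericMotionEvent(MotionEvent ev) {
        if (ev.getAction() == MotionEvent.ACTION_SCROLL
                && ev.isFromSource(InputDevice.SOURCE_ROTARY_ENCODER)) {
            // Sign of the delta indicates rotation direction.
            float delta = -ev.getAxisValue(MotionEvent.AXIS_SCROLL);
            bezelChannel.invokeMethod("rotate", delta);
            return true;
        }
        return super.onGenericMotionEvent(ev);
    }
}
```

On the Dart side you would listen on the same channel and drive your ScrollController from the received deltas.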
I built my unity game through Xcode into an iOS device for testing, and the colors look duller and more pastel-like than in the game view in the editor. The exact same thing has also happened on a modern android device. How can I make it so that the colors as seen in the built game better reflect the colors in the editor?
EDIT: I have sent a file to my phone and found out that the phone perceives colors differently than my monitor. I'm sorry if I'm asking for quite a bit here, but... is there any way to make the game look as I intend it to? The fact that I never see exactly how my game will turn out seems kinda... awkward. Thanks.
I'm not sure if you already tried this, but you might want to use Unity Remote for your iOS device, so you can check the color differences in real time instead of having to build it every time.
But other than what the comments said, it sounds like you should try to adjust your monitor's display settings for better color correlation between the built game and the one in the editor.
I'm using ZXing, and it's working fine in my new app, but I would like to add the option of choosing the front or rear camera.
So far the only reference to this I've found is in the Google group,
but it's not very clear what they mean by it,
so any pointers on what I have to do to accomplish this?
Thanks!
ZXWidgetController doesn't provide that functionality and it's not really set up to make it easy to change.
The code that needs to change is in - (void)initCapture. It calls [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo], which returns the default camera, and you don't want the default.
You need code similar to that in - (ZXCaptureDevice*)device in ZXCapture.mm. That code won't work out of the box (it's designed to work with both AVFoundation and QTKit), but it's the same idea: instead of using the default video input device, go through the devices and look at each device's position to find the one you want.
It would be nice to port that code to the widget, but that hasn't happened at this point.
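To make that concrete, the replacement for the defaultDeviceWithMediaType: call would look something like this inside initCapture. This is a sketch using the same era of AVFoundation API that ZXing's code is built on (devicesWithMediaType: was later deprecated in favor of discovery sessions):

```objc
// Instead of [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo],
// walk the video devices and pick one by position.
AVCaptureDevice *captureDevice = nil;
for (AVCaptureDevice *device in
         [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo]) {
  if (device.position == AVCaptureDevicePositionFront) {
    // Use AVCaptureDevicePositionBack for the rear camera.
    captureDevice = device;
    break;
  }
}
if (!captureDevice) {
  // Fall back to the default if the requested camera doesn't exist.
  captureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
}
```

You would then build the AVCaptureDeviceInput from captureDevice as the existing code already does with the default device.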