NativeScript ML Kit OCR text recognition crashes on resume (SDK < 23) - android-camera

In NativeScript, using ML Kit text recognition on devices with SDK less than 23, when I pause the app and resume it, it breaks with an error:
System.err: Caused by: java.lang.RuntimeException: Camera initialization failed because the camera device was already opened(-16)
On SDK >= 23 it seems to work fine, but the idea is to make it work on all devices (my client's device has Android 5.5, so SDK < 23). I'm using the plugin in real time:
<FirebaseMLKitTextRecognition:MLKitTextRecognition
id="ocrCam"
class="ocrCam"
width="100%"
height="50%"
processEveryNthFrame="3"
preferFrontCamera="false"
torchOn="{{ lightOn }}"
pause="{{ pauseOCR }}"
scanResult="onTextRecognitionResult" />
I read something about permissions: older versions don't need them but newer ones do, and the app crashes when asking for them. But that issue was closed with "min SDK 23".
Is there any method to control that crash, or to destroy the XML element on pause?

Try releasing the camera in the pause event of your application/activity.
// Where ocrCam is a reference to the MLKitTextRecognition element
ocrCam.camera.release();
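The underlying invariant can be illustrated with a minimal, self-contained Java sketch. This is a simulation, not the plugin's real API: `SimCamera` and its methods are made up here to show why a second open without an intervening release produces the "already opened(-16)" crash, and why releasing on pause lets the resume-time open succeed.

```java
// Minimal simulation of the "camera already opened" failure mode.
// SimCamera is a stand-in for the device camera; all names here are
// illustrative, not the plugin's actual API.
public class CameraLifecycle {
    static class SimCamera {
        private static boolean opened = false;

        static SimCamera open() {
            if (opened) {
                // Mirrors the runtime error from the question:
                throw new RuntimeException("camera device was already opened(-16)");
            }
            opened = true;
            return new SimCamera();
        }

        void release() { opened = false; }
    }

    public static void main(String[] args) {
        // Pause/resume done right: release before reopening.
        SimCamera cam = SimCamera.open();     // app starts, camera acquired
        cam.release();                        // on pause: release the camera
        SimCamera resumed = SimCamera.open(); // on resume: reopen succeeds
        resumed.release();

        // Pause/resume done wrong: the camera is still held, so the
        // second open throws, just like the crash on SDK < 23.
        SimCamera held = SimCamera.open();
        try {
            SimCamera.open();
            System.out.println("no crash");
        } catch (RuntimeException e) {
            System.out.println("crash: " + e.getMessage());
        }
        held.release();
    }
}
```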

Related

nw_socket_get_input_frames [C4:1] recvmsg(fd 35, 9216 bytes) [61: Connection refused]

We're using Apple's Network.framework to send data between an iOS app and a macOS app using UDP. We use Bonjour for discovery. It works great. Until it doesn't...
The problem is that when I stop using the iOS app for 2-3 minutes (no, I don't quit the app or put it in the background; I just don't interact with the UI for this short period, otherwise the iOS app is open and active), the connection stops working. By this I mean I press a button that should send some data to the macOS app, but it just stops working, and in the macOS Xcode console I see this.
The debug log of the macOS app says: 2020-12-01 22:40:40.274638+0100 Our app [10750:497151] [] nw_socket_get_input_frames [C4:1] recvmsg(fd 35, 9216 bytes) [61: Connection refused]
What is this and how do I fix it? Does the 2-3 minute pause as a symptom/trigger of the above error message ring any bells?
Maybe the most interesting thing is that when I turn off Wi-Fi on the iOS device, the app immediately discovers that the macOS app is gone. And when I turn Wi-Fi back on, the iOS app immediately discovers and connects to the macOS app. And yet this doesn't fix the problem. The only thing that does is rebuilding the macOS app in Xcode.
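For context on what errno 61 ("Connection refused") means on a UDP socket: a connected UDP socket can receive an ICMP port-unreachable error when nothing is listening on the peer's port anymore, which is consistent with the macOS listener's socket having gone away. This can be reproduced generically with a short Java demo (a sketch, not Network.framework code): it deliberately targets a loopback port that was just closed, so the send/receive fails with the UDP analogue of "connection refused".

```java
import java.net.*;

public class UdpRefusedDemo {
    public static void main(String[] args) throws Exception {
        // Bind and immediately close a socket to obtain a port that is
        // almost certainly not listening.
        DatagramSocket probe = new DatagramSocket();
        int closedPort = probe.getLocalPort();
        probe.close();

        DatagramSocket sock = new DatagramSocket();
        // connect() makes the socket eligible to surface ICMP errors
        sock.connect(InetAddress.getLoopbackAddress(), closedPort);
        sock.setSoTimeout(500);

        byte[] payload = "ping".getBytes();
        try {
            sock.send(new DatagramPacket(payload, payload.length));
            Thread.sleep(100); // allow the ICMP port-unreachable to arrive
            sock.send(new DatagramPacket(payload, payload.length));
            byte[] buf = new byte[1024];
            sock.receive(new DatagramPacket(buf, buf.length));
            System.out.println("unexpected reply");
        } catch (PortUnreachableException e) {
            // The UDP analogue of "[61: Connection refused]"
            System.out.println("connection refused");
        } catch (SocketTimeoutException e) {
            // Some platforms drop the ICMP error silently
            System.out.println("timed out");
        }
        sock.close();
    }
}
```

If this is what is happening, the message points at the macOS side's receive path being dead rather than at the network itself, which matches the observation that only rebuilding the macOS app helps.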

UnityGfxDeviceW Crash

I have a problem testing an Android app built with Unity. When I run the application on my phone, it opens and crashes immediately after, with this message in the logs:
Fatal signal 11 (SIGSEGV) at 0x64397000 (code=2), thread 28557 (UnityGfxDeviceW)
I suppose it is a video memory problem. I tested on different phones and it only works perfectly on a HUAWEI CE0197; on many other phones it crashes with that error.
I have some plugins installed:
Prime31 EtceteraAndroid (for using the camera and file manager)
AndroidNative (for using Facebook)
OnlineMaps by Infinity Code
NGUI (for managing the views)
Thanks.
Check the options in Player Settings; there may be some options that are not correct, like Minimum API Level and Stripping Level.

iOS app with iBeacon must restart device

I built a simple iOS app with iBeacon. I ran this app on an iPhone 4s with iOS 7.1 while running an iBeacon base station. Everything works correctly in the background or foreground (region exit has a 30-second delay in the background). But after a period of time, about 3 hours, the app cannot monitor any events even though the device settings were not changed (Bluetooth and location are normal). This situation requires restarting the device.
Please tell me what I should do about this situation.
Thanks!
I wonder if iOS Bluetooth scanning is slowing down when your app is in the background such that it appears that events never fire because they are just taking so long to happen. Rebooting may speed up the cycle.
One way to force a Bluetooth LE scan cycle to look for iBeacons is to run a different app in the foreground that uses the CoreLocation iBeacon ranging APIs.
Try installing Locate for iBeacon; then, as soon as your app appears to not be getting notifications in the background, launch Locate for iBeacon and tap Locate iBeacons. Do you see iBeacons? Does your background app get a notification?
If this works, then repeat the test and instead of using the Locate app to force a scan, just wait (an hour if needed). See if you eventually get your notification anyway, and note how long it took.
Edit: it appears that this is a case where iOS stops looking for iBeacons entirely, requiring a reboot. See the related question below.
iBeacon: didRangeBeacons stops getting called, must reset device for it to work again

How to get user input in a scene-based application on Samsung Smart TV

I'm new to developing apps for Samsung Smart TV. In a normal application I use the IME and get output; my problem is, in a scene-based application, how do I get user input and process it? Second, I'm using Samsung Smart TV SDK 4.5, and when I try to add a JS file in scene.js, $.sf.loadJS is not working; it shows: "Error Detail: Uncaught TypeError: Cannot call method 'loadJS' of undefined". Please tell me the solution for this; without it I cannot move forward. Thanks in advance.
I don't know what your code looks like, but usually code for native objects from the SDK is not compatible between scene-based and normal projects (the libraries loaded by those two templates are actually different).
The scene-based library is usually all wrapped in the "sf" object. Please look at the links below:
http://samsungdforum.com/Guide/ref00009/uiapi_textinput_textinput.html
http://samsungdforum.com/Guide/ref00009/uiapi_textinput_sftextinput.html

Cannot hear any audio from the Android SDK emulator launched from Eclipse

I am starting to develop for Android, and I am using Eclipse with the Android plugin and Android SDK.
No matter what I do, I cannot get the emulator to make any sound.
I have tried turning on audio in the virtual device setup.
I have also tried various command lines in the run configuration such as:
-audio oss: this gets an error message that there is no oss backend defined.
-useaudio: comes up as not a valid option.
-audion -winaudio: starts without error, but still no sound.
If I open the Windows 7 audio mixer on my laptop, I see that the Android virtual machine gets its own volume slider, but no sound plays when I move or click on that slider. (The other volume sliders produce the normal beep sound.)
I've been searching for "Android emulator no sound" for hours but no luck.
Any ideas?
The issue was not with the emulator not making sound (as verified by the fact that it would not work on my device either), but an unaccounted-for problem with the SoundPool class.
I looked in the logcat and found references like "sample 1 not ready". Researching this, I finally found an obscure thread in which it was mentioned that it takes some time before a SoundPool is ready to be used (and therefore all sounds should be loaded well before they are used).
I modified my playSound method to monitor the return value of the soundPool.play(...) call. It returns the id of the running sound stream, or 0 if it failed (i.e. "sample 1 not ready").
What I did was to put it into a loop: when the return value of the soundPool.play(...) call was 0, I had the thread sleep for 1 millisecond, then try again. With this method in place, I now always get a sound.
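The retry pattern described above can be sketched as a self-contained Java example. The real SoundPool.play() returns a nonzero stream id on success and 0 on failure; here `simulatedPlay()` is a made-up stand-in that fails for the first several calls to mimic the "sample not ready" warm-up, and the loop is capped so it cannot spin forever.

```java
public class SoundPoolRetry {
    // Stand-in for SoundPool.play(): returns 0 (failure) until the
    // sample is "ready", then a nonzero stream id. The readiness delay
    // simulates the real decoder warm-up seen in logcat.
    static int callCount = 0;

    static int simulatedPlay() {
        callCount++;
        return (callCount < 12) ? 0 : 42; // ready after ~11 failed tries
    }

    // The retry loop from the answer: poll until play() succeeds,
    // sleeping 1 ms between attempts, with a safety cap.
    static int playWhenReady(int maxAttempts) throws InterruptedException {
        for (int i = 0; i < maxAttempts; i++) {
            int streamId = simulatedPlay();
            if (streamId != 0) {
                return streamId; // sample became ready; sound is playing
            }
            Thread.sleep(1); // back off briefly, then retry
        }
        return 0; // gave up: sample never became ready
    }

    public static void main(String[] args) throws InterruptedException {
        int streamId = playWhenReady(1000);
        System.out.println("stream id: " + streamId);
    }
}
```

A capped attempt count is worth keeping in real code too, so a sample that never loads (e.g. a bad resource id) cannot hang the thread.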
As a side note, I have also been running the tutorials with my SDK set up for Android 2.2 instead of Android 2.3.1, because Android 2.2 is what is installed on my device, a Sprint LG Optimus S LS670.
I have run my modified code on both an Android 2.2 and an Android 2.3.1 virtual device.
When I ran the code on the older Android 2.2 VD, it typically took about 10 to 15 loops (so a 10 to 15 ms delay) before the SoundPool was ready to play the sound.
When I ran the same code on an Android 2.3.1 VD, the delay was MUCH worse, around 350 ms before it would play - yes, almost 35 times slower!
When I ran the same code deployed to my device running Android 2.2, the delay was about identical to running it on the emulator.