Face detection in Flutter using Google ML Kit

I am working on an application where the main screen automatically opens a live camera.
Here I need faces to be detected.
I'm getting confused about how to integrate ML Kit.
Do I have to sign up with Firebase?
I tried working with Google ML Kit but got too many errors.

There are two separate ML libraries now: Firebase ML and ML Kit. Firebase ML runs in the cloud, and you need to use Firebase for it. You can find the documentation here.
ML Kit, however, runs entirely on the user's device, and it does not require Firebase unless you want to use custom models.
The easiest way to integrate it into your Flutter app is the google_ml_kit library, which provides modules for different purposes. For face detection, you can use google_mlkit_face_detection.
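As a rough illustration, the basic flow with google_mlkit_face_detection looks like this. This is a minimal sketch for a single image file; check the package README for the exact API of the version you install, and for how to build an InputImage from live camera frames.

    import 'package:google_mlkit_face_detection/google_mlkit_face_detection.dart';

    Future<void> detectFaces(String imagePath) async {
      // Configure the detector; accurate mode trades speed for quality.
      final faceDetector = FaceDetector(
        options: FaceDetectorOptions(
          performanceMode: FaceDetectorMode.accurate,
        ),
      );

      // Build an InputImage from a file on disk. For a live camera feed you
      // would construct it from the camera plugin's frame bytes instead.
      final inputImage = InputImage.fromFilePath(imagePath);

      // Run on-device detection and inspect the results.
      final List<Face> faces = await faceDetector.processImage(inputImage);
      for (final face in faces) {
        print('Found face at ${face.boundingBox}');
      }

      await faceDetector.close();
    }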

Related

Flutter: Google ML Kit alternative for desktop?

I have an idea for a desktop app that needs to detect whether a user has his/her eyes closed and track the orientation of the user's head in real time.
Google ML Kit offers both of these features right out of the box, and with the google_ml_kit Flutter package it's as easy as pie to get it up and running in an Android or iOS app (see the sketch after this question).
But sadly Google ML Kit doesn't support macOS or Windows.
Is there an alternative package for desktop that doesn't require me to train the models myself and would be easy to integrate into a Flutter app?
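For reference, on Android/iOS the two signals mentioned in the question are exposed by the face detector once classification is enabled. A minimal sketch with google_mlkit_face_detection follows; the 0.3 eye-open threshold is an arbitrary choice, not something defined by the package.

    import 'package:google_mlkit_face_detection/google_mlkit_face_detection.dart';

    // Classification must be enabled to get eye-open probabilities.
    final faceDetector = FaceDetector(
      options: FaceDetectorOptions(enableClassification: true),
    );

    Future<void> inspectFace(InputImage image) async {
      final faces = await faceDetector.processImage(image);
      for (final face in faces) {
        final leftOpen = face.leftEyeOpenProbability;   // 0.0 to 1.0, or null
        final rightOpen = face.rightEyeOpenProbability; // 0.0 to 1.0, or null
        final yaw = face.headEulerAngleY;  // head rotation left/right, degrees
        final tilt = face.headEulerAngleZ; // sideways head tilt, degrees
        final eyesClosed = (leftOpen ?? 1) < 0.3 && (rightOpen ?? 1) < 0.3;
        print('eyesClosed=$eyesClosed yaw=$yaw tilt=$tilt');
      }
    }

A desktop alternative would need to expose equivalents of these fields (or the underlying landmarks) to cover both requirements.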

Liveness Detection in Flutter

I have an app for attendance that is based on a facial recognition system. I want to implement liveness detection, or anti-spoofing. I found some models and solutions, but none of them work in offline mode (no internet). My app works in offline mode, without internet.
I made my app in the Flutter framework and I am using ML Kit. If anyone has any idea how to do it, or has any solution or code regarding this issue, it would be very helpful to me.
You have to use the tflite dependency to achieve live face recognition in Flutter.
You can use the link below to learn more about tflite, which is used to recognize faces from the live camera.
https://pub.dev/packages/tflite
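For example, here is a minimal sketch of wiring a custom TFLite classifier to camera frames with the tflite and camera packages. The model and label file names are placeholders, and you would need to train or obtain a liveness model yourself.

    import 'package:camera/camera.dart';
    import 'package:tflite/tflite.dart';

    // Load a bundled liveness model once at startup (placeholder asset names).
    Future<void> loadLivenessModel() async {
      await Tflite.loadModel(
        model: 'assets/liveness_model.tflite',
        labels: 'assets/liveness_labels.txt',
      );
    }

    // Classify each frame from the camera's image stream entirely on-device.
    Future<void> classifyFrame(CameraImage frame) async {
      final results = await Tflite.runModelOnFrame(
        bytesList: frame.planes.map((plane) => plane.bytes).toList(),
        imageHeight: frame.height,
        imageWidth: frame.width,
        numResults: 2, // e.g. "real" vs "spoof"
      );
      print(results); // list of {index, label, confidence} maps
    }

Since the model is bundled with the app and inference runs on-device, this part also works without an internet connection.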
And for offline mode you have to implement a PWA (progressive web app) in your Flutter app. Use the document below to add PWA support to your Flutter app.
Note: a PWA only works over HTTPS, so you can build and host your Flutter app with Firebase to keep it secure.
https://medium.flutterdevs.com/progressive-web-app-flutter-62c7dea05fc5

Need help with Firebase face detection and recognition on Android. Can anyone guide me (programming language: Java)?

I'm developing an Android app in Java and I need some help. I want to store the user's face data in Firebase with the help of Firebase Face Detection, and my app will recognize the user's face whenever the user tries to log in to the system. If it matches, it will allow the user to access the app; otherwise the app will close.
Welcome to SO! Firebase Face Detection is part of ML Kit for Android, and according to its documentation it is the old version of the SDK. The new version was split into Firebase ML and ML Kit.
As face detection is a part that can be used without Firebase, it is currently included in ML Kit.
The documentation has a detailed guide with an example of how to implement it on Android in Java. There you will also find a link to the GitHub repository with the ML Kit quick-start sample.
First, get familiar with that documentation and try to implement the sample on your side. Then, once you know what you want to store, create an appropriate database for it.
If you run into any problems during the implementation, please create a new question with the exact details of the issue. Please follow the instructions for creating a good question.

Emotion analysis with ML Kit in Flutter

I am building a Flutter app which needs an emotion-analysis feature based on face recognition. I tried ML Kit, but it doesn't support emotion analysis. Then I tried Azure and AWS, but they are all APIs that take time to process an image, and I want it in real time. So is there any solution for integrating such a feature with Flutter?

How to create Augmented Reality Web app using Unity & Vuforia?

I am developing an Augmented Reality app to be integrated into a website using Unity. I need the output in WebGL. I am using Vuforia to create the AR experience. Since Vuforia does not support WebGL, I am not able to build. Please suggest an alternative method, or a way to do augmented reality in Unity for the web. Is there any alternative to Vuforia?
The good news is yes, you definitely can build an AR experience on the web!
The bad news is that none of the current libraries built for doing so offer a Unity plugin, meaning you'll either have to create a wrapper, do some complicated RPC call to talk to the JS library via Unity, or scrap Unity altogether and use only the library. To my knowledge, the best browser-based AR library is AR.js. I know this isn't the answer you were hoping for, but I hope you're able to achieve your goals. Good luck!
This is probably a bit late in the thread, but I'd like to add an option that might help. You can definitely build your AR app for the web with WebGL as the output, and there is an easy way to integrate it with a website too. SLAM-based AR like Google ARCore is a great example of this.
There are two options:
1. You can build such an app from scratch, which will obviously take more time, because apart from development, setting up the hosting infrastructure is a challenge.
2. Otherwise, if you want to scale such AR web app development with low or no code and cloud-ready hosting, you can use a SaaS platform called Marvin XR.
You can log in and try it out for free: https://www.marvinxr.com:8443
Hope this helps other folks who stumble upon this thread.