Is it possible to detect objects in a Flutter application, convert them to 3D, and display them in the app?
I have been looking into it for a day, but I didn't find any useful answer. I am not very experienced in Flutter, so I am looking forward to guidance on how to do this in Flutter.
I am developing an application in flutter.
CAT's S62 smartphone has a thermal camera.
I want to take pictures with this camera, but I don't know how to access it in Flutter.
I already know how to implement that app with a normal camera using the camera module.
I wonder if there is a library that accesses the thermal camera, or if there is a technique to choose the type of camera.
If anyone knows, please help.
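One thing worth trying: the `camera` plugin lets you enumerate every camera the platform reports via `availableCameras()`, so you can check whether the device exposes more than the standard front/back pair. Note this is a hedged sketch: whether the S62's thermal sensor shows up at all depends on whether CAT exposes it through the standard Android camera APIs (it may only be reachable through the vendor's own SDK), and `openSecondaryCamera` is just an illustrative name.

```dart
import 'package:camera/camera.dart';

Future<CameraController?> openSecondaryCamera() async {
  // availableCameras() returns every camera the platform reports.
  // The thermal sensor will only appear here if the vendor exposes
  // it through the standard Android camera stack — it may not.
  final cameras = await availableCameras();
  for (final description in cameras) {
    print('${description.name} facing ${description.lensDirection}');
  }
  if (cameras.length < 2) return null;

  // Pick a camera other than the default one and open it.
  final controller = CameraController(cameras[1], ResolutionPreset.medium);
  await controller.initialize();
  return controller;
}
```

If the thermal camera is not listed, you would likely need a platform channel to the vendor SDK instead.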
I am looking for support which can render the flutter widgets inside unity 3d engine.
This will be helpful to utilize in our current flutter application and for building 3D applications.
I am new to Flutter; it would be helpful if anyone could suggest how I can achieve this with the Flutter API or the Flutter Engine.
The other option is UIWidgets; version 2.0 recently debuted in preview.
https://github.com/Unity-Technologies/com.unity.uiwidgets
The sample project looks impressive:
https://github.com/UnityTech/ConnectAppCN
You may not be able to render Flutter widgets in Unity, but the other way around is certainly possible.
Check out flutter_unity_widget, which allows you to embed Unity game scenes in Flutter.
This Medium article might also be helpful: Unity 3D in Flutter.
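As a rough idea of what embedding looks like, here is a minimal sketch based on the flutter_unity_widget README. Treat the API as an assumption — it has changed between plugin versions — and the GameObject name `Cube` and method `SetRotationSpeed` are placeholders for whatever exists in your own Unity scene.

```dart
import 'package:flutter/material.dart';
import 'package:flutter_unity_widget/flutter_unity_widget.dart';

class UnityScreen extends StatelessWidget {
  const UnityScreen({super.key});

  @override
  Widget build(BuildContext context) {
    return Scaffold(
      body: UnityWidget(
        // Called once the embedded Unity player is ready.
        onUnityCreated: (controller) {
          // Send a message to a GameObject in the Unity scene.
          // 'Cube' and 'SetRotationSpeed' are placeholder names.
          controller.postMessage('Cube', 'SetRotationSpeed', '30');
        },
      ),
    );
  }
}
```

Messages from Unity back to Flutter are handled through the widget's message callback, so two-way communication is possible.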
I would like to know which technology I can use to create an Android application that can detect a random object, such as a hoarding board, and then play a video over that object. If the user moves the camera, the video must follow the object's position.
Here is an example of what I want to achieve, a video of a big burger: https://www.youtube.com/watch?v=lhXW8_7CaHM . But I want to play the video on any kind of hoarding the camera detects, not only on a specific trained object.
I have studied technologies like TensorFlow and Vuforia, which can be used to build AR applications.
But I'm not sure whether they can detect objects in real time while tracking the objects' positions.
I'm new to AR development. I want to create an AR demo app, but I am facing some problems.
Could anyone help me solve the problems below?
1. Is it possible to recognize the floor if I want to place a large 3D object (around 3 m x 1.5 m)?
2. How can I tap the screen to place only one object on the floor? After that, can I disable or enable plane detection (via buttons) while the placed 3D object remains visible and interactive?
3. After adding a 3D object, how can I interact with it, e.g. rotate or scale it?
Could you share tutorials or other links for solving these problems?
Thank you very much.
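For the rotation/scale part of the question: if the 3D view is embedded as a Flutter widget, pinch and two-finger-rotate gestures can be captured with a plain `GestureDetector` and forwarded to whatever updates the model. A minimal sketch, assuming the `applyToModel` callback is your own hook into the 3D scene (a placeholder, not part of any plugin):

```dart
import 'package:flutter/material.dart';

class ModelGestureLayer extends StatelessWidget {
  final Widget child;
  final void Function(double scale, double rotationRadians) applyToModel;

  const ModelGestureLayer({
    super.key,
    required this.child,
    required this.applyToModel,
  });

  @override
  Widget build(BuildContext context) {
    return GestureDetector(
      // onScaleUpdate reports both the pinch scale factor and the
      // two-finger rotation angle, so one handler covers both
      // interactions asked about in point 3.
      onScaleUpdate: (details) {
        applyToModel(details.scale, details.rotation);
      },
      child: child,
    );
  }
}
```

If the AR view is rendered natively (e.g. by ARCore/ARKit through a platform view), the gestures are usually handled on the native side instead, but the idea is the same.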
You're in luck: there is a video that shows how to build almost everything exactly as you described. If you also want to read about the components that make this possible, here is the link to the official Vuforia documentation, which covers each component and how it works.
Video link: https://www.youtube.com/watch?v=0O6VxnNRFyg
Vuforia link: https://library.vuforia.com/content/vuforia-library/en/features/overview.html
I am new to AR technology. I have done some research and learned that AR can work on marker detection.
I found some sample code using Vuforia, String, ARKit, etc.
But I have not yet been able to figure out how they detect a particular marker.
So my main questions are:
How do I create my own marker (for iOS)? Sample code or a link would be helpful.
How do I detect that particular marker with the camera in order to place my AR object?
Let me know if you require any more details.
Any help/information would be welcomed.
Did you see this one?
Augmented reality on the iPhone using NyARToolkit
Also I found some sample code here:
https://github.com/kamend/iPhone-Stuff