How to show ImageTarget's feature points on AR camera - unity3d

I want to show the feature points of image targets on the AR camera while scanning the image with the Unity tool. Is there any script or option to achieve that?
I have attached a screenshot showing image features the way I want them to appear on the AR camera.

Basically, there is no such option. Vuforia does not expose this information, and it is not supported in any API.
I did see, however, someone mention that they were able to do it by parsing the binary '.dat' file of the dataset - but I do not think it is something you can rely on (and Vuforia also does not encourage such usage). You can take a look here: Feature Points

Related

Face Detection in Preview Camera Feed on Flutter

How do we "draw a square" on detected faces on camera preview feed in Flutter? Is there a cross platform solution to this?
Flutter provides a Camera Plugin, but is there a way for us to draw a square box detecting faces on the preview feed? Any thoughts on this please?
Something like this example camera preview feed.
First, get the image data. This can be done either by using the camera plugin's output, or by communicating directly with the SurfaceView/TextureView.
Second, run a face detection algorithm. If you do not need a cross-platform solution, MLKit sounds good (https://medium.flutterdevs.com/face-detection-in-flutter-2af14455b90d?gi=f5ead7c6d7c9). If you do need cross-platform, you can use a Rust library like https://github.com/atomashpolskiy/rustface and bind the Rust code to Flutter via https://github.com/fzyzcjy/flutter_rust_bridge, or use a C++ face detection library and bind it to Flutter (though the setup may be a bit harder).
Lastly, once you know where the face is, draw a box around it - for example, with a Container widget.

Is it possible to change the focus for Camera Module V2?

I am using the camera for reading some text, and currently my images look quite blurry.
Is it possible to change the focus of the camera?
I am using
https://www.raspberrypi.org/products/camera-module-v2/
Yes, it's definitely possible; I have done it many times. Sometimes the camera box even includes a specific tool for rotating the lens (check whether you have it - in my experience it is not always present). If you don't have the tool, take thin pliers and rotate the lens; you can look here.

Apply custom camera filters on live camera preview - Swift

I'm looking to make a native iPhone iOS application in Swift 3/4 which uses the live preview of the back-facing camera and allows users to apply filters like in the built-in Camera app. The idea was for me to create my own filters by adjusting Hue/RGB/Brightness levels, etc. Eventually I want to create a HUE slider which allows users to filter for specific colours in the live preview.
All of the answers I came across for a similar problem were posted > 2 years ago and I'm not even sure if they provide me with the relevant, up-to-date solution I am looking for.
I'm not looking to take a photo and then apply a filter afterwards. I'm looking for the same functionality as the native Camera app. To apply the filter live as you are seeing the camera preview.
How can I create this functionality? Can this be achieved using AVFoundation? AVKit? Can this functionality be achieved with ARKit perhaps?
Yes, you can apply image filters to the camera feed by capturing video with the AVFoundation Capture system and using your own renderer to process and display video frames.
Apple has a sample code project called AVCamPhotoFilter that does just this, and shows multiple approaches to the process, using Metal or Core Image. The key points are to:
Use AVCaptureVideoDataOutput to get live video frames.
Use CVMetalTextureCache or CVPixelBufferPool to get the video pixel buffers accessible to your favorite rendering technology.
Draw the textures using Metal (or OpenGL or whatever), with a Metal shader or Core Image filter to do pixel processing during your render pass.
BTW, ARKit is overkill if all you want to do is apply image processing to the camera feed. ARKit is for when you want to know about the camera’s relationship to real-world space, primarily for purposes like drawing 3D content that appears to inhabit the real world.
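As a rough illustration of that pipeline - a minimal sketch, not Apple's AVCamPhotoFilter code, with a placeholder class name, preset, and filter choice - the following Swift view controller pulls frames from an AVCaptureVideoDataOutput, runs a Core Image filter over each one, and displays the result in a UIImageView. A production app would render through Metal as the sample project does.

    import AVFoundation
    import CoreImage
    import UIKit

    // Requires an NSCameraUsageDescription entry in Info.plist.
    final class FilteredCameraViewController: UIViewController, AVCaptureVideoDataOutputSampleBufferDelegate {
        private let session = AVCaptureSession()
        private let videoQueue = DispatchQueue(label: "video-frames")
        private let ciContext = CIContext()          // does the actual filter rendering
        private let imageView = UIImageView()        // simple display; an MTKView would be faster

        override func viewDidLoad() {
            super.viewDidLoad()
            imageView.frame = view.bounds
            imageView.contentMode = .scaleAspectFill
            view.addSubview(imageView)
            configureSession()
            session.startRunning()
        }

        private func configureSession() {
            session.sessionPreset = .hd1280x720
            guard let camera = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back),
                  let input = try? AVCaptureDeviceInput(device: camera),
                  session.canAddInput(input) else { return }
            session.addInput(input)

            let output = AVCaptureVideoDataOutput()
            output.setSampleBufferDelegate(self, queue: videoQueue)
            if session.canAddOutput(output) { session.addOutput(output) }
        }

        // Called once per video frame on videoQueue.
        func captureOutput(_ output: AVCaptureOutput,
                           didOutput sampleBuffer: CMSampleBuffer,
                           from connection: AVCaptureConnection) {
            guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
            let source = CIImage(cvPixelBuffer: pixelBuffer)
            // Example adjustment: rotate the hue; swap in any other CIFilter here.
            let filtered = source.applyingFilter("CIHueAdjust", parameters: [kCIInputAngleKey: Float.pi / 2])
            guard let cgImage = ciContext.createCGImage(filtered, from: filtered.extent) else { return }
            DispatchQueue.main.async { self.imageView.image = UIImage(cgImage: cgImage) }
        }
    }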

Apply a filter algorithm for each frame of the camera

I am working on an iPhone application.
I need to do the following: when the user clicks on the "Camera Tab", the camera opens inside the view with circle overlays.
I want to apply a filtering algorithm to the camera feed.
I am looking for the best way to do this. Is there a library that can help?
What I am doing currently:
I am using the OpenCV Library.
I define a timer.
On each timer tick I call the cvCaptureFromCam() method from the OpenCV framework (this captures a picture with the camera and returns it).
I apply the algorithm to the captured image.
I display the image in a UIImageView.
The idea is that on each timer tick I get the image, filter it, and put it in the UIImageView. If the timer ticks fast enough, it will appear continuous.
However, cvCaptureFromCam() is a little slow, and this whole process takes too much memory.
Any suggestions for a better way are greatly appreciated. Thanks!
Anything that's based on CPU-bound processing, such as OpenCV, is probably going to be too slow for live video filtering on current iOS devices. As I state in this answer, I highly recommend looking to OpenGL ES for this.
As mentioned by CSmith, I've written an open source framework called GPUImage for doing this style of GPU-based filtering without having to worry about the underlying OpenGL ES involved. Most of the filters in this framework can be applied to live video at 640x480 at well over the 30 FPS framerate of the iOS camera. I've been gradually adding filters with the goal of replacing all of those present in Core Image, as well as most of the image processing functions of OpenCV. If there's something I'm missing from OpenCV that you need, let me know on the issues page for the project.
Build and run the FilterShowcase example application to see a full listing of the available filters and how they perform on live video sources, and look at the SimplePhotoFilter example to see how you can apply those filters to preview video and photos taken by the camera.
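For a sense of how little glue code that pipeline needs, here is a minimal sketch using GPUImage2 (the Swift rewrite of the framework). The class names and the --> chaining operator follow that project's README, but the exact API differs between GPUImage versions, so treat this as approximate.

    import GPUImage
    import UIKit

    class LiveFilterViewController: UIViewController {
        // RenderView is GPUImage's GPU-backed view for showing filtered video.
        @IBOutlet weak var renderView: RenderView!

        var camera: Camera!
        let filter = SaturationAdjustment()

        override func viewDidLoad() {
            super.viewDidLoad()
            do {
                // Chain: camera frames -> GPU filter -> on-screen view.
                camera = try Camera(sessionPreset: .vga640x480)
                camera --> filter --> renderView
                camera.startCapture()
            } catch {
                fatalError("Could not initialize rendering pipeline: \(error)")
            }
        }
    }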

Filter / image texture iPhone

Hello!
I need to apply a filter / texture to the image from the iPhone's camera through a UIImagePickerController, but I do not know how to access the camera image directly, since I only have access through the viewfinder.
The filter could be black and white, sepia, or anything else.
Thank you very much! Regards
I finally fixed the problem by adding a mask to the image, so it was not necessary to apply a filter after all.
Thanks!
Look at the Apple Reference on Image Picker to see what you need in order to select an image taken with the camera.
You can then apply the filters you need to this image any way you want.
Apple provides a nice sample project to do this: PhotoPicker. You may want to try this out too.
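As a concrete example of that second step, here is a hedged Swift sketch (the original question dates from the Objective-C era, and the class and property names here are just placeholders) that takes the image returned by UIImagePickerController and applies a Core Image sepia filter to it:

    import UIKit
    import CoreImage

    // Hypothetical view controller: picks a photo from the camera and shows a filtered copy.
    class PhotoFilterViewController: UIViewController,
                                     UIImagePickerControllerDelegate, UINavigationControllerDelegate {
        let imageView = UIImageView()

        func pickFromCamera() {
            // Requires camera availability and an NSCameraUsageDescription entry in Info.plist.
            let picker = UIImagePickerController()
            picker.sourceType = .camera
            picker.delegate = self
            present(picker, animated: true)
        }

        func imagePickerController(_ picker: UIImagePickerController,
                                   didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey: Any]) {
            picker.dismiss(animated: true)
            guard let original = info[.originalImage] as? UIImage,
                  let input = CIImage(image: original) else { return }

            // Sepia is just an example; CIPhotoEffectMono, CIColorControls, etc. work the same way.
            let filtered = input.applyingFilter("CISepiaTone", parameters: [kCIInputIntensityKey: 0.8])

            if let cgImage = CIContext().createCGImage(filtered, from: filtered.extent) {
                imageView.image = UIImage(cgImage: cgImage)
            }
        }
    }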
If you need an effect that is added to the preview layer in real time, you'll need to drop down to lower-level APIs, like AVCaptureSession and OpenGL ES 2.0 shaders. There is a good introduction to real-time shaders for the iPhone 4 in "Introduction to Augmented Reality on the iPhone" by Nick Waynik. This should help you figure out how to add real-time augmentation to the preview layer.
Hope that helps!