I'm trying to connect my iPhone camera to a Python script built on the OpenCV library. OpenCV works fine with the built-in camera on OS X, but I can't figure out how to connect it to some kind of remote or mobile camera.
I tried software that emulates a webcam on OS X, but OpenCV can't recognize it.
I can't find a way around this. Any ideas?
Thanks.
Unfortunately, video capture is not supported on the iOS (iPhone/iPad) platform in the current OpenCV release (2.4.1). And as far as I know, there is no way to connect the hardware camera of your mobile device to a Python app like that.
In general, though, a good tutorial on how to set up video capture with the iPhone camera can be found here: OpenCV Tutorial - Part 3.
You can also read about image processing with OpenCV on iPhone and iPad in OpenCV Tutorial - Part 1 and OpenCV Tutorial - Part 2.
The Git repository with the code for the tutorials is here.
I have a Xiaomi Redmi Note 8 Pro and it doesn't seem to support ARCore. I found some ways to work around this, but it appears to be quite complicated and not very safe.
My question is:
What other tools would you recommend if I want to create an app in Unity that also needs to use the GPS module, maybe the altimeter, and of course the camera (AR stuff)?
I've heard that Vuforia might do the trick, and I've also read something about AR Foundation from Unity. But it looked to me like, depending on the chosen deployment target, they use ARCore or ARKit (even Vuforia).
Could anyone clarify this?
I suggest you don't mess with your device; it doesn't support ARCore for a good reason. You could also try the Android Studio emulator, but alas, when I tried it, the APK generated from Unity couldn't be installed on the emulator for some unknown reason, apparently something to do with the architecture.
If you want to use Unity anyway, I suggest you use Vuforia. It works on most modern devices and you don't even need a device to test: just hit Unity's Play mode and you can test from your PC (you need a webcam).
Vuforia Engine provides a simulator mode in the Game view that you can activate by pressing the Play button. You can use this feature to evaluate and rapidly prototype your scene(s) without having to deploy to a device. (Source: https://library.vuforia.com/articles/Training/getting-started-with-vuforia-in-unity.html)
For Unity with AR Foundation, you can't test on your PC the way you can with Vuforia; you need an ARCore/ARKit-supported device.
Lastly, if you want AR with GPS modules (although this is not with Unity), check out AR.js: https://github.com/AR-js-org/AR.js
I am developing an app for the iPhone/iPad using Xcode and iOS 5, and I need to change the pitch/tempo of audio in real time. Can anyone suggest any third-party libs to do this, or give me some direction as to how to do it natively?
Try using this SDK: http://www.izotope.com/tech/iZomobile/
It uses Audio Queue; it's a bit complicated, yet powerful.
You can use the BASS and BASS_FX libraries to change pitch/tempo in real time.
You can download them from http://www.un4seen.com/.
You can use the following call to slide the pitch (BASS_ChannelSlideAttribute is part of BASS; the BASS_ATTRIB_TEMPO_PITCH attribute applies to a tempo stream created with BASS_FX):
BASS_ChannelSlideAttribute(mainStream, BASS_ATTRIB_TEMPO_PITCH, 0.0, updatePeriod);
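For context, here is a minimal sketch of the usual BASS_FX pipeline, assuming bass.h and bass_fx.h are added to the project and that path points to a bundled audio file (the function name, file path and error handling are placeholders): create a decoding stream, wrap it in a tempo stream, then set or slide the tempo/pitch attributes on that stream.

#include "bass.h"
#include "bass_fx.h"

/* Sketch: play a file through a BASS_FX tempo stream and shift its pitch. */
void play_with_pitch_shift(const char *path)
{
    if (!BASS_Init(-1, 44100, 0, NULL, NULL))   /* default output device */
        return;

    /* Decoding stream: it feeds the tempo processor instead of the speakers. */
    HSTREAM decoder = BASS_StreamCreateFile(FALSE, path, 0, 0, BASS_STREAM_DECODE);
    if (!decoder)
        return;

    /* Wrap it in a tempo stream; BASS_FX_FREESOURCE frees the decoder with it. */
    HSTREAM stream = BASS_FX_TempoCreate(decoder, BASS_FX_FREESOURCE);
    if (!stream)
        return;

    /* Pitch is in semitones, tempo in percent; both can be changed while playing. */
    BASS_ChannelSetAttribute(stream, BASS_ATTRIB_TEMPO_PITCH, 2.0f);
    BASS_ChannelSetAttribute(stream, BASS_ATTRIB_TEMPO, 0.0f);

    /* Or slide the pitch back to normal over 500 ms, as in the call above. */
    BASS_ChannelSlideAttribute(stream, BASS_ATTRIB_TEMPO_PITCH, 0.0f, 500);

    BASS_ChannelPlay(stream, FALSE);
}

The same BASS_ChannelSetAttribute/BASS_ChannelSlideAttribute calls keep working while the stream is playing, which is what makes the real-time adjustment possible.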
How can I use face detection on a face image captured with the iPhone camera?
I want to develop an application for the iPhone in Xcode and use a face image in the application, but I do not know how to use face detection in my application. Any ideas?
You can take a look at OpenCV.
Here is a link to a question asking something similar to yours: iPhone and OpenCV.
And I think you can find a template on this site: http://www.ient.rwth-aachen.de/cms/software/opencv/
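To give an idea of what OpenCV face detection involves, here is a sketch using the old C API, which matches the OpenCV 2.x era of these tutorials. It assumes you already have the picture as a BGR IplImage and have bundled one of OpenCV's Haar cascade XML files; the cascade path and the helper name are placeholders.

#include <opencv/cv.h>
/* In an iOS project this file would typically be built as Objective-C++ (.mm)
   so the OpenCV 2.x headers compile. */

/* Sketch: detect faces in a BGR IplImage using a bundled Haar cascade
   (e.g. haarcascade_frontalface_alt.xml, which ships with OpenCV). */
int detect_faces(IplImage *img, const char *cascade_path)
{
    CvHaarClassifierCascade *cascade =
        (CvHaarClassifierCascade *)cvLoad(cascade_path, NULL, NULL, NULL);
    CvMemStorage *storage = cvCreateMemStorage(0);
    if (!cascade || !storage)
        return 0;

    /* Detection runs on a grayscale, histogram-equalized copy of the input. */
    IplImage *gray = cvCreateImage(cvGetSize(img), IPL_DEPTH_8U, 1);
    cvCvtColor(img, gray, CV_BGR2GRAY);
    cvEqualizeHist(gray, gray);

    CvSeq *faces = cvHaarDetectObjects(gray, cascade, storage,
                                       1.1, 3, CV_HAAR_DO_CANNY_PRUNING,
                                       cvSize(40, 40), cvSize(0, 0));
    int found = faces ? faces->total : 0;

    /* Draw a rectangle around each detected face. */
    int i;
    for (i = 0; i < found; i++) {
        CvRect *r = (CvRect *)cvGetSeqElem(faces, i);
        cvRectangle(img, cvPoint(r->x, r->y),
                    cvPoint(r->x + r->width, r->y + r->height),
                    CV_RGB(255, 0, 0), 2, 8, 0);
    }

    cvReleaseImage(&gray);
    cvReleaseMemStorage(&storage);
    cvReleaseHaarClassifierCascade(&cascade);
    return found;
}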
I want to make a face tracking application on the iPhone.
Can anyone tell me how to use
CvCapture *camera = cvCreateCameraCapture(CV_CAP_ANY); on the iPhone?
When I add this line in Xcode 4, I get the linker error "_cvCreateCameraCapture", referenced from:.
How can I solve this error?
Thanks,
Chetan
The latest OpenCV source from the SVN trunk supports video capture on iOS. Take a look at this article for a pre-compiled iOS framework and an example project.
Currently, the OpenCV camera capture API cannot be used in iOS apps. You have to create an AVCaptureSession and set up a video capture pipeline in order to get video frames from the iPhone camera.
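For the OpenCV side of that pipeline, one common pattern is to wrap each frame delivered by the capture callback in an IplImage header without copying the pixels. This is only a sketch: wrap_bgra_frame is a hypothetical helper, and it assumes the AVCaptureSession delegate hands you a locked BGRA pixel buffer whose base address, size and stride you have already queried (e.g. via CVPixelBufferGetBaseAddress and CVPixelBufferGetBytesPerRow).

#include <opencv/cv.h>

/* Sketch: wrap one BGRA camera frame (base address, size and stride taken
   from the capture callback) in an IplImage header for OpenCV processing.
   No pixel data is copied. */
IplImage *wrap_bgra_frame(void *base_address, int width, int height, int bytes_per_row)
{
    IplImage *header = cvCreateImageHeader(cvSize(width, height), IPL_DEPTH_8U, 4);
    cvSetData(header, base_address, bytes_per_row);
    return header;   /* release with cvReleaseImageHeader() after processing */
}

The wrapped image is only valid while the pixel buffer stays locked, so process it (or copy it with cvCloneImage) before the capture callback returns.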
I'm trying to make an iPhone/iPad application that uses VTK to visualize DICOM images and present them on the screen. The problem is that no matter what I do, I am not able to build VTK so that it works on the device (it works correctly in the simulator).
Is there any way to build VTK for the iPhone/iPad?
Thank you very much for your help! :)
VTK can render using the OpenGL API or, more recently, Manta. The iPhone (and other devices, such as Android ones) uses OpenGL ES, which is essentially a subset of OpenGL targeted at embedded systems. Until VTK is ported to use OpenGL ES as an alternative backend, it will not be possible to use VTK on mobile devices. I am not familiar with the iPhone simulator, but I imagine VTK is still able to use OpenGL in the simulator. Porting is possible, but it would not be a trivial thing to do.
Remove the flag -DIOS_SIMULATOR_ARCHITECTURES='' and just use the following:
cmake -DBUILD_SHARED_LIBS=OFF -DCMAKE_BUILD_TYPE=Release -DCMAKE_FRAMEWORK_INSTALL_PREFIX=$prefix/lib -DBUILD_EXAMPLES=OFF -DBUILD_TESTING=OFF -DVTK_IOS_BUILD=ON -DModule_vtkFiltersModeling=ON ..