Is the mentioned code available anywhere? I want to create the same functionality with the Insta360 One X2 camera.
Thanks,
Tibor
Is there any way to mirror and use an iPhone with a touch monitor?
I need to create an app demo presentation.
Thanks in advance.
iPhone screen sharing can only be done via AirPlay.
There are many programs that can screen-share from an iPhone to a Mac/desktop:
http://www.airsquirrels.com/reflector/
http://www.x-mirage.com/x-mirage/
AirServer is probably the best option for this; it supports both AirPlay and Miracast for projecting a smartphone onto a screen. It also lets you record the screen to a file as well as stream live to YouTube, with a webcam overlay and everything.
I am thinking of building a camera application, with the ability to do image processing (adjust contrast, apply different image filters) while you are taking a picture or after the picture has been taken.
The app will also support dragging and dropping icons.
At the end, you should be able to export the edited images either to the camera roll or to the app's own storage.
There are already many apps like this out there (Line Camera, etc.).
I'm just wondering what the best way to build such an app is.
Can I build the app purely with Objective-C and the iOS SDK, or do I need to build it with C++/cocos2d, etc.?
Thanks for your help!
Your question is very broad, so here is a broad answer...
Accessing the camera/photo library
First you'll need to access the camera using UIImagePickerController to either take a new photo or grab one from your photo library. You can read up on how to accomplish this here: Camera Programming Topics for iOS
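To make that concrete, here is a minimal sketch of presenting the camera and receiving the picked photo. The UIKit names are real; the class name and the fallback behavior are just illustrative choices:

// Minimal sketch: present the camera (or the photo library as a fallback) and
// receive the picked image. "CameraViewController" is just an illustrative name.
#import <UIKit/UIKit.h>

@interface CameraViewController : UIViewController
    <UIImagePickerControllerDelegate, UINavigationControllerDelegate>
@end

@implementation CameraViewController

- (IBAction)takePhoto:(id)sender {
    // Fall back to the photo library on devices without a camera (e.g. the Simulator).
    UIImagePickerControllerSourceType source =
        [UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera]
            ? UIImagePickerControllerSourceTypeCamera
            : UIImagePickerControllerSourceTypePhotoLibrary;

    UIImagePickerController *picker = [[UIImagePickerController alloc] init];
    picker.sourceType = source;
    picker.delegate = self;
    [self presentViewController:picker animated:YES completion:nil];
}

- (void)imagePickerController:(UIImagePickerController *)picker
didFinishPickingMediaWithInfo:(NSDictionary *)info {
    UIImage *image = info[UIImagePickerControllerOriginalImage];
    // Hand the image off to your editing/filtering code here.
    [picker dismissViewControllerAnimated:YES completion:nil];
}

@end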
Image Manipulation
AviarySDK has much of this already built for you. Very easy to set up and use in your apps. You can download their sample app for free in the app store if you want to see what it can do. Check it out here: http://aviary.com/
Alternatively, read up on Core Image if you'd like to avoid third-party libraries. See Core Image Programming Guide for more information.
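As a rough illustration of the Core Image route, something along these lines adjusts contrast on a UIImage. CIColorControls and its keys are standard Core Image; the 1.4 value is arbitrary:

// Rough sketch: increase contrast on a UIImage with Core Image's CIColorControls.
#import <UIKit/UIKit.h>
#import <CoreImage/CoreImage.h>

UIImage *ApplyContrast(UIImage *input) {
    CIImage *ciInput = [CIImage imageWithCGImage:input.CGImage];

    CIFilter *filter = [CIFilter filterWithName:@"CIColorControls"];
    [filter setValue:ciInput forKey:kCIInputImageKey];
    [filter setValue:@1.4 forKey:kCIInputContrastKey];   // 1.0 = unchanged; 1.4 is arbitrary

    CIImage *ciOutput = [filter outputImage];
    CIContext *context = [CIContext contextWithOptions:nil];
    CGImageRef cgImage = [context createCGImage:ciOutput fromRect:[ciOutput extent]];

    UIImage *result = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    return result;
}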
There is absolutely no need for cocos2d, which is a game engine.
You can accomplish everything you mentioned using only Objective-C.
If you want real-time effects, you will need to dive into OpenGL; you can use GLKit if you target iOS 5 and above.
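For the real-time case, one common pattern is to filter each camera frame with Core Image and draw it through a GL-backed CIContext into a GLKView. Below is a rough sketch of only the delegate callback; it assumes (not shown) that an AVCaptureSession with an AVCaptureVideoDataOutput delivers frames on the main queue, and that self.glkView and self.ciContext were created from the same EAGLContext:

// Sketch of the AVCaptureVideoDataOutput delegate callback only. Assumes a running
// AVCaptureSession delivering frames on the main queue (so the GL work stays on one
// thread), and that self.ciContext came from +[CIContext contextWithEAGLContext:]
// using the same EAGLContext as self.glkView.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CIImage *frame = [CIImage imageWithCVPixelBuffer:pixelBuffer];

    // Any Core Image filter works here; sepia is just a placeholder effect.
    CIFilter *sepia = [CIFilter filterWithName:@"CISepiaTone"];
    [sepia setValue:frame forKey:kCIInputImageKey];
    [sepia setValue:@0.8 forKey:kCIInputIntensityKey];
    CIImage *filtered = [sepia outputImage];

    [self.glkView bindDrawable];
    [self.ciContext drawImage:filtered
                       inRect:CGRectMake(0, 0, self.glkView.drawableWidth, self.glkView.drawableHeight)
                     fromRect:[filtered extent]];
    [self.glkView display];
}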
I created an app that plays a song and calculates the decibel level of the audio being played. It works fine.
But I want to make a change: I want to receive sound/audio from outside (when the user speaks) and calculate the decibels of that.
I don't want to record anything, just receive the audio/sound and calculate the decibels.
Any hints or tutorials please?
You could try using the source code for one of the sample apps (SpeakHere) in the iOS Developer Library as a starting point: http://developer.apple.com/library/ios/#samplecode/SpeakHere/Introduction/Intro.html
I found the source code at https://github.com/jkells/sc_listener_sample, which works without any modifications.
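For completeness, if you only need a live level reading and never want to keep the audio, one commonly used trick (my own sketch, not how SpeakHere or sc_listener_sample do it) is to point AVAudioRecorder at /dev/null, enable metering, and poll the power values. These methods would live inside a view controller; self.recorder is an assumed AVAudioRecorder property:

// Sketch: live input level without saving any audio. self.recorder is an assumed
// AVAudioRecorder property; averagePowerForChannel: reports dBFS (0 = full scale,
// silence is strongly negative).
#import <AVFoundation/AVFoundation.h>

- (void)startMetering {
    [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryRecord error:nil];
    [[AVAudioSession sharedInstance] setActive:YES error:nil];

    NSDictionary *settings = @{ AVFormatIDKey         : @(kAudioFormatAppleLossless),
                                AVSampleRateKey       : @44100.0,
                                AVNumberOfChannelsKey : @1 };
    NSURL *devNull = [NSURL fileURLWithPath:@"/dev/null"];   // nothing is actually stored
    self.recorder = [[AVAudioRecorder alloc] initWithURL:devNull settings:settings error:nil];
    self.recorder.meteringEnabled = YES;
    [self.recorder record];

    [NSTimer scheduledTimerWithTimeInterval:0.2 target:self
                                   selector:@selector(readLevel) userInfo:nil repeats:YES];
}

- (void)readLevel {
    [self.recorder updateMeters];
    float decibels = [self.recorder averagePowerForChannel:0];
    NSLog(@"current level: %.1f dBFS", decibels);
}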
http://www.google.com/products/catalog?oe=UTF-8&gfns=1&q=iphone+video+out+cable&um=1&ie=UTF-8&cid=17045329161497634089&ei=JpU1TcymOcnogQfC25C7Cw&sa=X&oi=product_catalog_result&ct=result&resnum=5&ved=0CCgQ8wIwBA#
I want to know if it's possible to play video from an app through a lead like this onto a TV or something similar. I've heard that this functionality is not available to apps. Is that true?
If it isn't true and it is perfectly possible, what exactly is possible? Is it possible to push a different video output to the external TV than what is shown on the device?
Thanks
Tom
I suppose that cable has the same functionality as a connector for a projector or a second display, right?
If that is the case, then the answer is: it IS possible.
But everything you want to show on the second display has to be done explicitly by you; there is no mirroring system or anything like that.
Read here; there is a sample app as well :)
http://mattgemmell.com/2010/06/01/ipad-vga-output
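The basic pattern (checking [UIScreen screens] for a second screen and attaching your own UIWindow to it) looks roughly like this; externalWindow is an assumed UIWindow property:

// Sketch: give an external screen (TV-out / VGA adapter) its own UIWindow.
// Whatever you put in that window is what shows up on the TV; nothing is mirrored for you.
// externalWindow is an assumed UIWindow property so the window isn't deallocated.
- (void)setUpExternalScreenIfPresent {
    if ([UIScreen screens].count < 2) {
        return;   // no cable/adapter connected (also watch UIScreenDidConnectNotification)
    }
    UIScreen *external = [UIScreen screens][1];

    self.externalWindow = [[UIWindow alloc] initWithFrame:external.bounds];
    self.externalWindow.screen = external;

    // Add whatever you want on the TV here, e.g. a movie player's view or your own views.
    self.externalWindow.hidden = NO;
}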
I want to develop an iPhone application in Xcode and integrate face recognition into it, to work alongside the other application functions, but I do not know how face recognition can be used in my application. Any ideas?
Check out this Wikipedia page. It has a lot of references to algorithms, applications, etc.
Check out OpenCV. Here's a blog post that should help get you started:
http://niw.at/articles/2009/03/14/using-opencv-on-iphone/en
See the Apple-published iOS sample code that implements face detection, SquareCam:
it integrates with Core Image's face detector (CIDetector) to find faces in a real-time VideoDataOutput, as well as in a captured still image. Found faces are indicated with a red square.
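The still-image half of that boils down to CIDetector. A minimal sketch (drawing the red squares is left out):

// Sketch: find faces in a still UIImage with Core Image's CIDetector.
#import <UIKit/UIKit.h>
#import <CoreImage/CoreImage.h>

NSArray *FacesInImage(UIImage *image) {
    CIImage *ciImage = [CIImage imageWithCGImage:image.CGImage];

    CIDetector *detector =
        [CIDetector detectorOfType:CIDetectorTypeFace
                           context:nil
                           options:@{ CIDetectorAccuracy : CIDetectorAccuracyHigh }];

    NSArray *features = [detector featuresInImage:ciImage];
    for (CIFaceFeature *face in features) {
        // bounds is in Core Image coordinates (origin at the bottom-left of the image).
        NSLog(@"face at %@", NSStringFromCGRect(face.bounds));
    }
    return features;
}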