Augmented Reality Application in iOS - iPhone

I am trying to create an iOS application that can capture a real-life object (e.g. a sofa or table) as a 3D object using the iPhone's camera. The 3D object data could be saved in a database and displayed as an augmented reality object when the iPhone camera is pointed at another part of the room.
I have searched the internet but couldn't find any information on where to get started converting real-life objects into 3D objects for viewing as augmented reality objects.

Check the link below, where you will find an SDK and sample code for implementing AR:
http://quickblox.com/developers/IOS

Whichever way you go with this, it's going to be a huge task. However, I've had good results with similar goals using OpenCV.
It has an iOS SDK, but it is written in C++. Unfortunately, I don't think there's anything available that will let you achieve this in pure Objective-C or Swift.

You can go through the following links:
https://www.qualcomm.com/products/vuforia
http://www.t-immersion.com/ar-key-words/augmented-reality-sdk#
http://dev.metaio.com/sdk/
https://www.layar.com

Related

Is there a way to perform multiple 3d object recognition in unity using vuforia?

I'm using the Vuforia scanner to detect and recognize a 3D object. It works well with one object, but now I want to recognize multiple objects at once. I have gone through many links, but they only discuss multiple image targets, not 3D objects. If not Vuforia, is there any other SDK that can do this?
I messed with object recognition once but I'm pretty sure the databases are basically the "same" as 2D image target databases. That is, you can tell Vuforia to load more than one of them and they'll run simultaneously. I don't have Vuforia installed at the moment, but I know the setting is in the main script attached to the camera (you have to fiddle with it when creating your project in the first place to get it to use something other than the sample targets).
There is, however, a limit on how many different targets Vuforia will recognize at once (IIRC it's something really small, like 2 or 3), so be aware of this when planning your project.

Fixing an object in place when the camera opens (Unity AR)

I'm trying to create an AR game in Unity for an educational project.
I want to create something like Pokémon Go: when the camera opens, the object will be fixed somewhere in the real world and you will have to search for it with the camera.
My problem is that ARCore and Vuforia ground detection (I don't want to use targets) are limited to only a few types of phones, and the Kudan SDK didn't work when I tried it.
Can anyone give me a tool or a tutorial on how to do this? I just need ideas or someone to tell me where to start.
Thanks in advance.
The reason why plane detection is limited to only some phones at this time is partially because older/less powerful phones cannot handle the required computing power.
If you want to make an app that has the largest reach, Vuforia is probably the way to go. Personally, I am not a fan of Vuforia, and I would suggest you use ARCore (and/or ARKit for iOS).
Since this is an educational tool and not a game, are you sure Unity is the way to go? I am sure you may be able to do it in Unity, but choosing the right platform for a project is important - just keep that in mind. You could make a native app instead.
If you want to work with ARCore and Unity (which is a great choice in general), here is the first in a series of tutorials that can get you started as a total beginner.
Let me know if you have other questions :)
You can use the phone's GPS data: when the user arrives at a specific place, you can display the object. Search for "GPS-based augmented reality" on Google. You can also check this video: https://www.youtube.com/watch?v=X6djed8e4n0
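The math behind that GPS approach is mostly spherical geometry: given the user's fix and the object's stored coordinates, you compute the distance (to decide whether the user has "arrived") and the compass bearing (to decide where to draw the object relative to the device heading). A minimal Swift sketch; the function name and the thresholds are my own, not from any SDK:

```swift
import Foundation

// Great-circle distance (metres) and initial bearing (degrees) between two
// GPS coordinates, using the haversine formula. Enough to decide whether the
// user has arrived, and in which direction the virtual object lies.
func distanceAndBearing(lat1: Double, lon1: Double,
                        lat2: Double, lon2: Double) -> (metres: Double, degrees: Double) {
    let r = 6_371_000.0                          // mean Earth radius in metres
    let p1 = lat1 * .pi / 180, p2 = lat2 * .pi / 180
    let dp = (lat2 - lat1) * .pi / 180
    let dl = (lon2 - lon1) * .pi / 180

    // Haversine distance.
    let a = sin(dp / 2) * sin(dp / 2) + cos(p1) * cos(p2) * sin(dl / 2) * sin(dl / 2)
    let distance = 2 * r * atan2(sqrt(a), sqrt(1 - a))

    // Initial bearing, normalized to 0..360 degrees.
    let y = sin(dl) * cos(p2)
    let x = cos(p1) * sin(p2) - sin(p1) * cos(p2) * cos(dl)
    let bearing = (atan2(y, x) * 180 / .pi + 360).truncatingRemainder(dividingBy: 360)
    return (distance, bearing)
}
```

In an app you would compare `metres` against an arrival radius (say 20 m) to trigger showing the object, and compare `degrees` against the heading reported by CLLocationManager to decide where on screen it belongs.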

Best way to build a camera app on iPhone

I am thinking of building a camera application with the ability to do image processing (adjust contrast, apply different image filters) while you are taking a picture or after the picture has been taken.
The app will also have the ability of drag and drop icons.
At the end you are able to export the edited images either to the camera roll or app memory.
There are already many apps out there like this (Line Camera, etc.).
I'm just wondering what the best way to build such an app is.
Can I build the app purely with Objective-C and the iOS SDK, or do I need to build it with C++/cocos2d, etc.?
Thanks for your help!
Your question is very broad, so here is a broad answer...
Accessing the camera/photo library
First you'll need to access the camera using UIImagePickerController to either take a new photo or grab one from your photo library. You can read up on how to accomplish this here: Camera Programming Topics for iOS
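As a minimal sketch of that flow in modern Swift (the linked guide predates Swift, so the Objective-C names differ slightly; the class and method names here are my own):

```swift
import UIKit

class CameraViewController: UIViewController,
        UIImagePickerControllerDelegate, UINavigationControllerDelegate {

    // Present the system camera UI (falls back to the photo library
    // where no camera is available, e.g. in the simulator).
    func presentPicker() {
        let picker = UIImagePickerController()
        picker.sourceType = UIImagePickerController.isSourceTypeAvailable(.camera)
            ? .camera : .photoLibrary
        picker.delegate = self
        present(picker, animated: true)
    }

    // Delegate callback with the captured or chosen image.
    func imagePickerController(_ picker: UIImagePickerController,
                               didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey: Any]) {
        if let image = info[.originalImage] as? UIImage {
            // Hand the image off to your editing pipeline here.
            _ = image
        }
        picker.dismiss(animated: true)
    }
}
```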
Image Manipulation
AviarySDK has much of this already built for you. Very easy to set up and use in your apps. You can download their sample app for free in the app store if you want to see what it can do. Check it out here: http://aviary.com/
Alternatively, read up on Core Image if you'd like to avoid third-party libraries. See Core Image Programming Guide for more information.
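For instance, a contrast adjustment with Core Image's built-in CIColorControls filter takes only a few lines. A sketch in Swift (the helper name is my own; a contrast of 1.0 leaves the image unchanged):

```swift
import CoreImage
import UIKit

// Apply a contrast adjustment to a UIImage using Core Image.
func adjustContrast(of image: UIImage, contrast: Float) -> UIImage? {
    guard let input = CIImage(image: image),
          let filter = CIFilter(name: "CIColorControls") else { return nil }
    filter.setValue(input, forKey: kCIInputImageKey)
    filter.setValue(contrast, forKey: kCIInputContrastKey)   // 1.0 = unchanged

    guard let output = filter.outputImage else { return nil }
    let context = CIContext()
    guard let cgImage = context.createCGImage(output, from: output.extent) else { return nil }
    return UIImage(cgImage: cgImage)
}
```

Swapping the filter name (e.g. for one of the CIPhotoEffect filters) gives you the "different image filters" part of your app with the same structure.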
There is absolutely no need for cocos2d, which is a game engine.
You can accomplish everything you mentioned using only Objective-C.
If you want real-time effects, you will need to dive into OpenGL. You can use GLKit if you target iOS 5 and above.

Load 3D objects dynamically in Cocos3D

I am an iOS developer but a newbie in Cocos. I am building an augmented reality app in which I want to load 3D objects at run time and show them when a specific marker is detected. The following questions are about Cocos3D and related to my own task; forgive me if I ask something silly.
Can I load 3D objects at run time in Cocos3D?
Is it possible to fetch those 3D objects from my server via a web service call? These objects may be in .fbx format, so does Cocos3D understand this format, or should I use something else?
Does Cocos3D support Objective-C? I only have knowledge of Objective-C.
Will integration be easy with the first part of my app, which is being developed natively for iOS? I was thinking of doing this task in Unity, but that integration is so hectic that I decided to try something else, and I was advised to go with Cocos3D.
Please help me with this. Thanks in advance.
Yes
Download from server: yes. You could use NSData (dataWithContentsOfURL:) to download a file. But the last time I checked (4 months ago or so), Cocos3D only supported the POD file format.
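In modern Swift the NSData call becomes Data(contentsOf:); a rough sketch of fetching a model file and caching it locally (the function and parameter names are my own) might look like:

```swift
import Foundation

// Synchronously fetch a model file from a URL and save it into a local
// directory (e.g. the app's Documents folder). Data(contentsOf:) blocks
// until the transfer finishes, so call this off the main thread.
func downloadModel(from url: URL, named name: String,
                   to directory: URL) throws -> URL {
    let data = try Data(contentsOf: url)
    let destination = directory.appendingPathComponent(name)
    try data.write(to: destination)
    return destination
}
```

In a real app you would dispatch this onto a background queue (or use an asynchronous URL-loading API) so the UI stays responsive during the download.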
Yes, Cocos3d is written in Objective-C (haven't you downloaded the code yet?).
Easy is relative. Might be super-easy for a pro and next to impossible for a beginner. Hard to say without knowing you, your project, your requirements, your goals.

Any idea about an iPhone Accelerometer Library?

I have looked for a long time for a library specialized in dealing with the iPhone accelerometer, but I couldn't find anything.
I have made a few sample apps, but none reaches the level of accuracy found in Labyrinth-style games, for example. Any idea about a library for that? Or maybe an open-source app?
It would be better if it were integrated into a physics library.
UPDATE: I didn't mention it, but I don't want to use game engines, especially now that their future is still unknown. Objective-C libraries or tutorials would be better.
I highly recommend looking at tweejump. It's basically an open source version of games like Doodle Jump. It really helped me learn how to use the accelerometer to control an object on the screen.
Although you said you didn't want any game engines, this is powered by the Cocos2D library. However, Cocos2D is written in Objective-C, so there shouldn't be any issue getting anything powered by Cocos2D past Apple.
Best of luck!
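For what it's worth, the raw readings themselves come from CMMotionManager in Core Motion (or UIAccelerometer on older SDKs), and much of the "Labyrinth feel" comes from smoothing those readings with a simple low-pass filter before applying them to your on-screen object. A sketch of such a filter in Swift (the type and property names are my own; the smoothing factor is something you tune by hand):

```swift
import Foundation

// Simple exponential low-pass filter for accelerometer samples.
// Feed it raw (x, y, z) readings -- e.g. from CMMotionManager's
// startAccelerometerUpdates callback -- and use the smoothed output
// to drive movement, which removes most of the hand jitter.
struct LowPassFilter {
    let factor: Double      // 0..1; smaller = smoother but laggier response
    var value: (x: Double, y: Double, z: Double) = (0, 0, 0)

    init(factor: Double) {
        self.factor = factor
    }

    // Blend a new raw sample into the running filtered value.
    mutating func add(x: Double, y: Double, z: Double) -> (x: Double, y: Double, z: Double) {
        value = (value.x + factor * (x - value.x),
                 value.y + factor * (y - value.y),
                 value.z + factor * (z - value.z))
        return value
    }
}
```

A factor around 0.1 at 60 Hz updates is a common starting point; lower it if the motion still feels twitchy.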
It seems it may be easier for you to use a game engine that works with the iPhone if you are looking to make a game. Here are two engines that export to iPhone: GameSalad and Unity 3D.