I am new to augmented reality.
I have been going through several tutorials, but none of them show how to create a side-by-side stereoscopic view for AR apps on Android.
How can I implement this? Please let me know if there are any tutorials on it.
Hello! Augmented reality with Unity can be done with AR Foundation; here is a tutorial:
https://www.youtube.com/watch?v=0mpsiO2lCx0
The side-by-side view is a virtual reality view, which can be achieved with the Cardboard SDK. Here is a tutorial for that:
https://www.youtube.com/watch?v=qZzhXHqXM-g
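To make the side-by-side idea concrete before diving into the Cardboard SDK: the usual Unity approach is two cameras whose normalized Viewport Rects each cover half the screen. This little sketch (plain Python, names are mine, not Unity API calls) just shows the geometry you would type into each camera's Viewport Rect:

```python
# Normalized viewport rects for a side-by-side stereo layout.
# Unity's Viewport Rect uses (x, y, width, height) in 0..1 screen space.

def side_by_side_viewports():
    """Return (left_eye, right_eye) viewport rects splitting the screen in half."""
    left_eye = (0.0, 0.0, 0.5, 1.0)   # left half of the screen
    right_eye = (0.5, 0.0, 0.5, 1.0)  # right half of the screen
    return left_eye, right_eye

left, right = side_by_side_viewports()
print(left, right)
```

With the Cardboard SDK you get this split (plus lens distortion correction) for free, which is why the tutorial above is the easier route.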
I am a beginner in Unity and not very familiar with HoloLens 2. I am trying to make a simple AR app for HoloLens 2. When I connect an external camera it works fine, but when I try to access the HoloLens 2 camera stream to recognize a marker, the marker is not recognized, and I cannot tell whether the HoloLens 2 camera stream is working at all. I made the main camera a child of MixedRealityPlaySpace and added some components from the AR camera, but it did not work; then I tried the AR camera on its own and as a child of MixedRealityPlaySpace, but the HoloLens camera stream is still not working. It would be a great help if anyone knows about these problems. Thanks in advance.
Which approach are you using for marker recognition: Vuforia or Unity AR Foundation? Without more specific context, the information above is hard to interpret.
For AR Foundation, the 2D image tracking feature is not currently supported on HoloLens platforms. You can refer to this link to learn more about feature support per platform: Platform Support.
For Vuforia, there is a closed issue that shows how to make Vuforia work with the MRTK: What's the current status of using Vuforia with the MR Toolkit? #1461
I have built a Unity3D + Google Tango based game on the NVidia Dev. device. Everything seems to work fine, but now I would like to play this game in stereoscopic view (For Dive Goggles). I looked at the ExperimentalVirtualReality example (https://github.com/googlesamples/tango-examples-unity/tree/master/UnityExamples/Assets/TangoExamples/ExperimentalVirtualReality) and was successfully able to port all the prefabs into my game, but for some reason the experience is not satisfactory.
The stereoscopic views of my game tend to overlap with each other when I look through the Dive goggles. The experience is quite off.
I noticed that there are some public parameters on the TangoVR Player object in the Unity project for 'IPD in MM', 'Screen Width in MM', 'Eye Offset in MM', etc. Do I have to adjust any of these? What do these values even represent?
Any help or pointers will be greatly helpful and appreciated.
IPD would be Inter-Pupillary Distance, while the offset is the distance from your eye to the 'point of articulation' when you move your head.
This describes it (with pictures!): http://gamasutra.com/blogs/NickWhiting/20130611/194007/Integrating_the_Oculus_Rift_into_Unreal_Engine_4.php
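To make the IPD parameter concrete: each eye's camera sits half the inter-pupillary distance to either side of the head position, along the head's right vector. A rough sketch of that math (plain Python; this is my own illustration, not the Tango sample's actual code):

```python
def eye_positions(head_pos, right_vec, ipd_mm):
    """Offset each eye camera +/- ipd/2 along the head's right vector.

    head_pos, right_vec: 3-component tuples; ipd_mm: inter-pupillary distance.
    Returns (left_eye_pos, right_eye_pos) in the same units as head_pos (mm here).
    """
    half = ipd_mm / 2.0
    left = tuple(p - half * r for p, r in zip(head_pos, right_vec))
    right = tuple(p + half * r for p, r in zip(head_pos, right_vec))
    return left, right

# A typical adult IPD is around 63 mm.
left, right = eye_positions((0.0, 0.0, 0.0), (1.0, 0.0, 0.0), 63.0)
print(left, right)  # eyes at x = -31.5 and x = +31.5
```

If the IPD value is far from your actual eye separation, the two images will not fuse properly, which matches the overlapping effect you describe.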
I've found that when trying to use Cardboard lenses on devices with displays wider than the field of view of the lenses, you get an unsatisfactory experience.
This has to do with the lenses not being centered on the frame when focused on the display.
To work around this on larger devices, you can push in the margins of the stereoscopic views. For the Tango, testing with standard Cardboard lenses, I found that things work nicely if they are pushed in about an inch. The apps on the Play Store, Tango Mini Town and Tango Mini Village, do a nice job of demonstrating this workaround.
The ideal way to get this working would be with Google Cardboard and a proper 7-inch Tango tablet view controller, but currently the Cardboard app is incompatible with the Tango. Fingers crossed for Cardboard support.
As for simply finding an optimal viewpoint in Unity, you can modify the Viewport Rect in the stereo camera's inspector to get the ideal experience for a specific device with whatever viewer you choose.
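The "push the views in about an inch" workaround can be expressed as viewport math: convert the physical inset to a fraction of the screen width, then move each eye's rect toward the center by shrinking it on its outer edge. A sketch under my own naming (plain Python; the screen width and inset values are examples, not Tango constants):

```python
def inset_stereo_viewports(screen_width_mm, inset_mm):
    """Side-by-side viewport rects (x, y, w, h in 0..1) with each eye's view
    pushed toward the screen center by a physical inset.

    Each half-screen rect loses `inset_mm` from its outer edge, so the image
    centers land closer to the lens centers on an over-wide display.
    """
    f = inset_mm / screen_width_mm           # inset as a fraction of screen width
    left_eye = (f, 0.0, 0.5 - f, 1.0)        # outer (left) edge pushed right
    right_eye = (0.5, 0.0, 0.5 - f, 1.0)     # outer (right) edge pushed left
    return left_eye, right_eye

# Example: a roughly 7-inch-wide display (~170 mm) with a 1-inch (25.4 mm) inset.
left, right = inset_stereo_viewports(170.0, 25.4)
print(left, right)
```

Tuning `inset_mm` per device is essentially what the Viewport Rect fiddling in the inspector accomplishes by hand.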
Thanks to all those who helped answer this. Many of my concepts definitely got cleared up, but nothing got me close to an actual solution. After researching a lot, I finally found this article (http://www.talkingquickly.co.uk/2014/11/google-cardboard-unity-tutorial/) super useful. It basically tells me to implement the Durovis SDK (https://www.durovis.com/sdk.html) with its Unity package.
Everything was pretty straightforward, and the experience I got from it was by far the best.
I am new to AR technology. I have researched and come to know that AR works on marker detection technology.
I found some sample code with Vuforia, String, ARKit, etc.
But I have not yet been able to work out how they detect a particular marker.
So my main questions are:
How do I create my own marker (for iOS)? Sample code or a link would be helpful.
How do I detect that particular marker using the camera, so I can place my AR object?
Let me know if you require any more details.
Any help/information would be welcomed.
Did you see this one?
Augmented reality on the iPhone using NyARToolkit
Also I found some sample code here:
https://github.com/kamend/iPhone-Stuff
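To demystify what the detection step is doing: classic ARToolKit-style markers are square, black-bordered images whose interior is binarized into a small grid and compared, in all four 90-degree rotations, against a known pattern. A toy illustration of that matching idea (pure Python, my own example pattern; real SDKs like Vuforia or NyARToolkit also handle thresholding, contour finding, and pose estimation):

```python
# Toy ARToolKit-style marker matching: compare a binarized camera patch
# against a known 4x4 pattern in all four 90-degree rotations.

def rotate(grid):
    """Rotate a square binary grid 90 degrees clockwise."""
    return [list(row) for row in zip(*grid[::-1])]

def matches_marker(patch, pattern):
    """Return True if `patch` equals `pattern` under some 90-degree rotation."""
    g = [list(row) for row in patch]
    target = [list(row) for row in pattern]
    for _ in range(4):
        if g == target:
            return True
        g = rotate(g)
    return False

# A deliberately rotation-asymmetric pattern, so orientation is unambiguous.
PATTERN = [[1, 1, 0, 0],
           [1, 0, 0, 0],
           [0, 0, 0, 0],
           [0, 0, 1, 1]]

# A patch showing the marker rotated 90 degrees still matches.
print(matches_marker(rotate(PATTERN), PATTERN))  # True
```

The rotation at which the match succeeds is also what tells the SDK which way the marker is oriented, which feeds into where the AR object gets placed.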
I'm new to iOS development, so I'm trying to learn iOS game design from the ground up. My first goal is to make an OpenGL ES app that does one thing: display a tile map. I made a map using Tiled, and I've been trying to figure out how to import it into my Xcode project, but I can't find any good up-to-date tutorials. Can someone help me?
The best game engine that supports tile maps is Cocos2D,
found here: http://www.cocos2d-iphone.org/
A good tutorial is http://www.raywenderlich.com/1163/how-to-make-a-tile-based-game-with-cocos2d
Once you start using cocos2d, tile map tutorials are easy to find and the implementation is relatively easy too! Hope this helped!
I need to use an image sequence as VR on iOS. Is there a component in the OS that does that? If not, is there any third-party view for it? I'm very interested in learning more about this!
Thanks!
Here is a tutorial to create your own OpenGL (iOS or Android) VR engine.
The principle is to create a 3D skybox onto which you apply your images to build your environment.
There are some ready-to-use libraries as well, like this one: Open AR
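To sketch the skybox idea from the tutorial above: each frame you take the view direction and sample whichever of the six cube faces it points at. Choosing the face is just a dominant-axis test; here is that test in plain Python (my own face naming, not from any particular engine):

```python
def skybox_face(dx, dy, dz):
    """Pick which of the six skybox faces a view direction (dx, dy, dz) hits.

    The face is the one whose axis has the largest absolute component;
    the sign selects between the two opposite faces on that axis.
    """
    ax, ay, az = abs(dx), abs(dy), abs(dz)
    if ax >= ay and ax >= az:
        return "+x" if dx > 0 else "-x"
    if ay >= az:
        return "+y" if dy > 0 else "-y"
    return "+z" if dz > 0 else "-z"

print(skybox_face(0.9, 0.1, 0.2))   # mostly along +x
print(skybox_face(0.0, -1.0, 0.0))  # straight down: -y
```

In the OpenGL engine, the remaining components of the direction become the texture coordinates within that face's image, which is how a set of six photos turns into a surrounding environment.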