As far as I know, the Samsung Gear VR is the only VR headset that has its own head-tracking sensors and sends sensor data to the smartphone (Galaxy series) mounted in it.
Am I getting that right?
The technical specifications of the ZEISS VR One say:
Tracking sensors: Internal tracking by smartphone sensors
Does this mean it has no built-in sensors?
Just like Google Cardboard, all the other low-cost VR headsets are FULLY dependent on smartphone sensors, right?
Cardboard-like headsets are just a way for your eyes to look at your phone in stereo mode. They don't provide any sensors (and very often not even an interaction button like the original Google Cardboard has); these devices typically use your phone's gyroscope to track your orientation.
Related
While preparing to develop a mobile game, we designed a game that affects the player using data obtained from the user's movement, with the sensors built into the smartphone as the control device. When implementing the game in Unreal Engine 4 with UE4duino, I tried to get the phone's acceleration and velocity values using the functions and Blueprints that had already been created, but I don't know how.
I'm currently developing an app for the HoloLens 2 that needs to stream audio from a desktop PC.
The idea is to send control information (position, orientation, etc.) to a Cycling '74 Max/MSP application running on a Windows 10 computer, which processes the audio for 3D playback. I then need to somehow stream the resulting sound to the Unity app running on the HoloLens. Both devices are on the same network.
At the moment I've achieved something using MRTK WebRTC for Unity in combination with a virtual audio cable as input. My issue is that this seems to be optimized for microphone use, as it applies options like noise reduction and a reduced bandwidth. I can't find a way to set the WebRTC options to stream what I need (music) at better quality.
Does anyone know how to change that in MRTK WebRTC, or have a better solution for streaming the audio to the HoloLens?
The WebRTC project for Mixed Reality is deprecated, and it is designed for real-time communication. If your requirement is media consumption, you need a different workaround.
For dedicated media streaming, you can set up a DLNA server on your PC for media access.
You may also set up Samba or NFS on your PC if you need to access files in other formats.
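If you need near-real-time playback rather than file access, another rough option is to stream raw PCM yourself and feed it into Unity's audio pipeline on the HoloLens. A minimal sketch of the receiving side, assuming (purely for illustration) that the Max/MSP patch sends mono 16-bit little-endian PCM over UDP to port 9000 at the project's output sample rate:

    using System;
    using System.Collections.Concurrent;
    using System.Net;
    using System.Net.Sockets;
    using UnityEngine;

    // Receives raw 16-bit mono PCM over UDP and plays it back through
    // Unity's audio pipeline. An AudioSource with no clip, once playing,
    // still drives OnAudioFilterRead as a generator.
    [RequireComponent(typeof(AudioSource))]
    public class PcmUdpReceiver : MonoBehaviour
    {
        const int Port = 9000; // hypothetical port; must match the sender
        UdpClient client;
        readonly ConcurrentQueue<float> samples = new ConcurrentQueue<float>();

        void Start()
        {
            client = new UdpClient(Port);
            client.BeginReceive(OnReceive, null);
            GetComponent<AudioSource>().Play();
        }

        void OnReceive(IAsyncResult ar)
        {
            IPEndPoint remote = null;
            byte[] data = client.EndReceive(ar, ref remote);
            // Convert little-endian 16-bit PCM to floats in [-1, 1].
            for (int i = 0; i + 1 < data.Length; i += 2)
                samples.Enqueue(BitConverter.ToInt16(data, i) / 32768f);
            client.BeginReceive(OnReceive, null);
        }

        // Runs on the audio thread; copy queued samples into the output buffer.
        void OnAudioFilterRead(float[] buffer, int channels)
        {
            for (int i = 0; i < buffer.Length; i += channels)
            {
                float s;
                samples.TryDequeue(out s); // s stays 0 (silence) if the queue is empty
                for (int c = 0; c < channels; c++)
                    buffer[i + c] = s;
            }
        }

        void OnDestroy()
        {
            if (client != null) client.Close();
        }
    }

This is only the plumbing; a real setup would also need buffering and clock-drift handling, and the Max/MSP side would have to be made to send matching raw packets.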
I'm developing an app based on AR and the orientation of the device. The problem is that different devices give me different data from the gyroscope, probably because of the different hardware. Is there a way to make the results uniform across devices?
I also have the constraint of working in Unity rather than Android Studio or Xcode, so the methods I can use are limited.
I'm using Input.gyro.attitude to get the data, and the results are not the same. Any suggestions?
Quaternion direction = Input.gyro.attitude;
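One known wrinkle with Input.gyro.attitude is that it reports a right-handed sensor attitude while Unity's coordinate frame is left-handed, so the value is typically remapped before use. A common sketch of that remapping (the 90-degree pre-rotation aligns the sensor's frame with Unity's; genuine per-device calibration differences can still remain on top of it):

    void Start()
    {
        Input.gyro.enabled = true; // the gyroscope is off by default and must be enabled
    }

    void Update()
    {
        // Pre-rotate 90 degrees around X so the sensor's "up" matches Unity's camera frame.
        transform.rotation = Quaternion.Euler(90f, 0f, 0f) * GyroToUnity(Input.gyro.attitude);
    }

    // Map the sensor's right-handed quaternion into Unity's left-handed frame.
    static Quaternion GyroToUnity(Quaternion q)
    {
        return new Quaternion(q.x, q.y, -q.z, -q.w);
    }

If results still differ after the remap, the remaining offset is usually a heading (yaw) difference, since not every device fuses a magnetometer for an absolute north reference; storing an initial correction quaternion at startup can zero that out.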
I'm developing VR using the Google Cardboard SDK.
I want to move in the virtual environment when I walk in the real world, like this: https://www.youtube.com/watch?v=sZG5__Z9pzs&feature=youtu.be&t=48
Is it possible to make a VR application like that for Android? Maybe using the accelerometer? How can I implement this using Unity?
I tried recording the accelerometer while walking with the smartphone; here are the results: https://www.youtube.com/watch?v=ltPwS7-3nOI (I think the accelerometer values are very noisy.)
Actually, it is not possible with the phone alone:
You're up against a fundamental limitation of the humble IMU (the primary motion sensor in a smartphone).
I won't go into detail, but basically you need an external reference frame when trying to extract positional data from acceleration data. This is the topic of a lot of research right now, and it's why VR headsets that track position, like the Oculus Rift, have external tracking cameras.
Unfortunately, what you're trying to do is impossible without using the camera on your phone to track visual features in the scene and using those as the external reference point, which is a hell of a task better suited to a lab full of computer vision experts.
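To put a number on it: a constant accelerometer bias b double-integrates into a position error of ½·b·t². So even a modest 0.1 m/s² bias, plausible for a phone IMU, leaves you roughly 45 m off after only 30 seconds, and noise makes it worse.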
Another possible but difficult way:
You could connect the device to the internet and track its position via satellite positioning (Google Maps or something like that), but GPS is far too coarse for tracking footsteps, and that would be a very hard thing to do.
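That said, demos like the one in the linked video usually fake it with step detection rather than true positional tracking: when the accelerometer magnitude spikes, move the camera rig forward a fixed amount. A rough Unity sketch, where the threshold, distance, and cooldown values are illustrative guesses to tune on a real device:

    using UnityEngine;

    // Naive "walk to move": treat spikes in accelerometer magnitude as steps
    // and nudge the player forward in the direction the camera is facing.
    public class StepMove : MonoBehaviour
    {
        public Transform head;             // the Cardboard camera
        public float stepThreshold = 1.3f; // in g; magnitude above this counts as a step
        public float stepDistance = 0.5f;  // metres to move per detected step
        public float cooldown = 0.3f;      // seconds to ignore further spikes per step

        float lastStepTime;

        void Update()
        {
            // Input.acceleration is in units of g and sits near 1.0 at rest (gravity).
            if (Input.acceleration.magnitude > stepThreshold &&
                Time.time - lastStepTime > cooldown)
            {
                lastStepTime = Time.time;
                Vector3 forward = head.forward;
                forward.y = 0f; // keep movement on the ground plane
                transform.position += forward.normalized * stepDistance;
            }
        }
    }

This won't know how far you actually walked, but for walking-in-place locomotion in Cardboard it is usually convincing enough.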
Can we enable the iPhone Simulator to capture an image via the webcam? I've written a program that takes an image with the iPhone camera. Can I test this with the iPhone Simulator? Please help.
You cannot take an image via the Mac's webcam from the iPhone Simulator. For the camera, you need to test on a device.
Imagine if this were possible in the Simulator: your Mac's camera resolution would be very different from the device's, and so would your Mac's performance. This could lead to bad surprises when moving the application to the device for testing.
(Sorry for my English.)
In the Android world, the phone emulator doesn't have a live camera preview, so the solution is to create a "webcam server" running on the emulator host (PC/Mac). This program is a socket server that captures frames from the built-in webcam and transmits them over a socket.
Then, in the phone code (Android emulator), you can read the frames through the socket and display them, simulating a real phone camera.
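For illustration, the host side of that scheme can be as small as a loop that length-prefixes each encoded frame. A sketch, where the port is arbitrary and GrabFrame() is a hypothetical stand-in for real webcam capture (plain .NET has no camera API, so a platform library would do the actual grabbing):

    using System;
    using System.Net;
    using System.Net.Sockets;
    using System.Threading;

    // Host-side "webcam server" sketch: sends each encoded frame as a
    // 4-byte big-endian length prefix followed by the frame bytes.
    class WebcamServer
    {
        // Hypothetical placeholder: a real implementation would grab and
        // JPEG-encode frames via a platform camera library.
        static byte[] GrabFrame()
        {
            return new byte[] { 0xFF, 0xD8, 0xFF, 0xD9 }; // placeholder JPEG markers
        }

        static void Main()
        {
            var listener = new TcpListener(IPAddress.Any, 5555); // arbitrary port
            listener.Start();
            using (var client = listener.AcceptTcpClient())
            using (var stream = client.GetStream())
            {
                while (client.Connected)
                {
                    byte[] frame = GrabFrame();
                    byte[] len = BitConverter.GetBytes(IPAddress.HostToNetworkOrder(frame.Length));
                    stream.Write(len, 0, 4);              // length prefix
                    stream.Write(frame, 0, frame.Length); // frame payload
                    Thread.Sleep(33);                     // ~30 fps pacing
                }
            }
        }
    }

The client then reads four bytes, reads that many frame bytes, and decodes the image.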
Is that possible in the iPhone Simulator?
No, you cannot use the Mac's camera. For the camera, you need to test on a device.
From the Apple documentation:
The hardware features that are not simulated as of iOS 8.2 are:
- Motion support (accelerometer and gyroscope)
- Audio and video input (camera and microphone)
- Proximity sensor
- Barometer
- Ambient light sensor