I am working on a Unity game, and there is a speaker upstairs in a house playing music. I want the music to get louder as I get closer and quieter as I move further away. I have attached an AudioSource to the object and selected Force To Mono.
I have also set the rolloff Min Distance and Max Distance to 0.2 and 12. Here is my Inspector:
However, for some reason the volume of the music stays the same no matter where I am in the house.
In the AudioSource, Spatial Blend is set to 0 (= 2D). This is the setting you would use for, e.g., background music in a 2D game, where the audio plays at the same volume no matter where the AudioListener is.
To get 3D sound, set Spatial Blend to 1 (= 3D). This enables the distance-based attenuation you are after.
After doing that, you can further adjust how the volume should fall off with distance to the object and tweak all the other 3D sound settings.
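The same settings can also be applied from a script. A minimal sketch (the class name is hypothetical; attach it to the speaker GameObject alongside its AudioSource):

```csharp
using UnityEngine;

// Configures an AudioSource for 3D distance attenuation.
[RequireComponent(typeof(AudioSource))]
public class Make3DSound : MonoBehaviour
{
    void Awake()
    {
        AudioSource src = GetComponent<AudioSource>();
        src.spatialBlend = 1f;                     // 1 = fully 3D
        src.rolloffMode = AudioRolloffMode.Linear; // predictable falloff indoors
        src.minDistance = 0.2f;                    // full volume inside this radius
        src.maxDistance = 12f;                     // silent beyond this radius (linear mode)
    }
}
```

Linear rolloff is often easier to tune for small indoor spaces than the default logarithmic curve, which can sound almost constant over short distances.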
I am wondering how many pixels correspond to 1 unit in Unity3D for the Oculus Rift. For example, how could a 1 × 1 × 1 unit cube be given its dimensions in pixels?
There's not a 1:1 correlation here. It depends on many factors, such as the distance to the object, the angle you're viewing the object, the field of view of your camera, and the pixel resolution of your headset.
This is sort of like asking how many feet an object should be in a movie so that it takes up 6 feet of a movie theater screen. It'll depend on the kind of lens the movie is shot with, how far away the movie camera is, how big the movie theater screen is, etc.
However, at runtime, you can get the current pixel position on the screen of a position in the 3D world using Camera.WorldToScreenPoint. You could then do this for multiple points (say, at each end) of an object of interest to determine how large it is currently appearing on the screen.
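A rough sketch of that idea, projecting the corners of a renderer's bounding box to screen space (the class name is hypothetical; assign the cube's Renderer in the Inspector):

```csharp
using UnityEngine;

// Estimates how many pixels an object's bounds span on screen each frame.
public class ScreenSizeProbe : MonoBehaviour
{
    public Renderer target; // assign the cube's Renderer in the Inspector

    void Update()
    {
        Bounds b = target.bounds;
        Vector3 min = Camera.main.WorldToScreenPoint(b.min);
        Vector3 max = Camera.main.WorldToScreenPoint(b.max);
        float widthPx = Mathf.Abs(max.x - min.x);
        float heightPx = Mathf.Abs(max.y - min.y);
        Debug.Log($"Approx. on-screen size: {widthPx} x {heightPx} px");
    }
}
```

Note this only approximates the apparent size from two opposite corners; for a more accurate footprint you would project all eight corners of the bounds and take the min/max of the results.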
Using cubes as my primitive game objects, I've created a runway, a ramp and a player. I expect my player to be flung into the sky on hitting the ramp at speed; however, the ramp stops my player cold in its tracks. I have friction and drag set low. Do I need to adjust mass? Should I use a rounded player?
Here are my object settings:
Ground
Ramp
Player
Ramp 3D
On the Collider component, you need to assign a Physic Material. In the Physic Material, set Bounciness to 1.
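You would normally create the Physic Material as an asset (Assets > Create > Physic Material) and drag it onto the collider, but it can also be done from code. A sketch, assuming it is attached to the ramp (the class name is hypothetical):

```csharp
using UnityEngine;

// Creates a bouncy, frictionless Physic Material at runtime and
// assigns it to this object's collider.
public class MakeBouncy : MonoBehaviour
{
    void Awake()
    {
        PhysicMaterial bouncy = new PhysicMaterial("Bouncy")
        {
            bounciness = 1f,
            dynamicFriction = 0f,
            staticFriction = 0f,
            // Use the larger bounciness of the two colliders in a contact.
            bounceCombine = PhysicMaterialCombine.Maximum
        };
        GetComponent<Collider>().material = bouncy;
    }
}
```

Low friction on the ramp also matters here: a sharp-edged cube player will still catch on the ramp's edge, so a rounded collider (sphere or capsule) on the player helps as well.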
I am developing an augmented reality application that tracks a real object via the camera (using Vuforia); my aim is to measure the distance the object travels.
I am using unity + Vuforia.
For each frame, I calculate the distance between the first position and the current position (vector math).
But the position values I get are wrong, and camera movement affects the result.
(I don't want to take the camera offset into account.)
Any solution?
To clarify, this is the experience I want to implement (video):
https://youtu.be/-c5GiXuATh4
From the comments and the question, I understand the problem is that the camera is used as the origin. In every frame of your application the camera is the origin, and the positions of all trackables are calculated relative to it. Therefore, even if you do not move your target, its position will change because of camera movement.
To eliminate this problem I would recommend enabling extended tracking. This will minimize the impact of camera movement on the position of your target. You can test this by adding a TrailRenderer to your image target: you will see that the image stays at a certain position regardless of camera movement.
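With extended tracking enabled, accumulating the travelled distance is then a matter of summing frame-to-frame displacement of the target's transform. A sketch, attached to the Vuforia image/object target (class and field names are hypothetical):

```csharp
using UnityEngine;

// Accumulates the distance a tracked target travels, sampled once per frame.
public class TravelDistance : MonoBehaviour
{
    public float traveled;              // total distance in world units
    public float jitterThreshold = 0.001f; // ignore sub-millimetre tracking noise

    private Vector3 lastPosition;

    void Start()
    {
        lastPosition = transform.position;
    }

    void Update()
    {
        float step = Vector3.Distance(lastPosition, transform.position);
        if (step > jitterThreshold)     // filter out tracking jitter
        {
            traveled += step;
            lastPosition = transform.position;
        }
    }
}
```

Summing per-frame steps (rather than measuring from the first position) gives you path length, which matches the travelled-distance behaviour shown in the linked video.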
I can play 360 mono videos with EasyMovieTexture, but is it possible to play stereoscopic videos as well? If so, how can this be done?
Yes you can, and it is fairly easy.
You need to create two layers, one for the left eye, one for the right eye.
Then, you duplicate both your camera and your spherical screen.
One sphere should be on the Left-Eye layer, and the other on the Right-Eye layer.
Then, you configure your cameras like so:
This is the right camera, so the Culling Mask has the Left-Eye layer disabled and the Target Eye is set to Right. You need to do the opposite with the left camera.
Note that both spheres and both cameras should be at the exact same position. The Stereo Separation is done automatically and can be configured on your cameras. (You can just keep the default values)
Alright, just one last thing: you need to configure the material on each sphere to show only one half of the video frame.
Here is an example for side-by-side stereoscopy. You can easily adapt that to handle top-bottom stereoscopy.
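The half-frame selection can be done with the material's UV tiling and offset. A sketch for side-by-side stereoscopy, attached to each sphere (the class name and the left-half/right-half layout convention are assumptions; some videos swap the eyes):

```csharp
using UnityEngine;

// Shows only one half of a side-by-side stereo video on this sphere
// by adjusting the material's UV tiling and offset.
public class StereoHalf : MonoBehaviour
{
    public bool rightEye; // tick this on the Right-Eye sphere

    void Start()
    {
        Material mat = GetComponent<Renderer>().material;
        mat.mainTextureScale = new Vector2(0.5f, 1f); // half the frame width
        mat.mainTextureOffset = new Vector2(rightEye ? 0.5f : 0f, 0f);
        // For top-bottom stereoscopy, use a (1, 0.5) scale and a y offset instead.
    }
}
```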
Is it possible to convert a 360 video into a VR world? Let's say I take a 360 video of my room and convert it into a 360 VR world; could a user with a headset click on a chair in my room to see its details?
One way to accomplish this: when you display the 360-degree picture or video of your room, place GameObjects on the line of sight from the camera towards each object of interest, just outside the sphere on which the photo/video is playing. Then attach the interactions you want to those GameObjects, so that when your raycaster hits one of them the desired function runs; for example, the details could appear on a canvas just above your chair. Look at the attached picture: I've placed a red sphere as my GameObject, used as an event trigger to run a scene-change function when the user looks at an object in the video that lies exactly on the line of sight from the camera to the red sphere.
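The raycaster side of this can be sketched as a simple gaze interactor on the headset camera (class names `GazeInteractor` and `Hotspot`, and the "Fire1" click mapping, are hypothetical; VR SDKs usually provide their own input events):

```csharp
using UnityEngine;

// Casts a ray from the headset camera each frame; if it hits a hotspot
// GameObject (e.g. the red sphere placed over the chair) and the user
// clicks, the hotspot's action is triggered.
public class GazeInteractor : MonoBehaviour
{
    public float maxDistance = 50f;

    void Update()
    {
        Ray gaze = new Ray(transform.position, transform.forward);
        if (Physics.Raycast(gaze, out RaycastHit hit, maxDistance))
        {
            Hotspot hotspot = hit.collider.GetComponent<Hotspot>();
            if (hotspot != null && Input.GetButtonDown("Fire1"))
                hotspot.Activate();
        }
    }
}

// Minimal hotspot that just logs; replace Activate() with your
// scene-change or detail-canvas logic.
public class Hotspot : MonoBehaviour
{
    public void Activate() => Debug.Log($"Activated {name}");
}
```

The hotspot GameObjects need colliders for the raycast to hit them, and they should sit on the camera-to-object line but inside (or just at) the video sphere so the ray reaches them.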