How to play sound only at a certain distance? - unity3d

I have Audio Source. On certain events, the sound is played through the function:
[PunRPC]
private void PlaySound() {
    GetComponent<AudioSource>().PlayOneShot(shootSound);
}
And then this function is called via
photonView.RPC(nameof(PlaySound), RpcTarget.AllBufferedViaServer);
And here's the problem: with this approach the sound plays simultaneously for every player, across the whole game map, but it should only be heard within a certain distance.
If I call the function without the RPC, the attenuation seems to work, but then the sound is not synchronized between players.
In the AudioSource settings I set the minimum distance to 1 and the maximum to 20.
I need the sound to be synchronized, but only audible within a certain area.
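A likely cause (not shown in the question) is that the AudioSource's Spatial Blend is 0 (2D); in that mode the min/max distance settings are ignored and the clip is audible everywhere. A minimal sketch of the RPC with a fully 3D source, assuming Photon PUN and the `shootSound` clip from the question:

```csharp
// Sketch only: the RPC still fires on every client, but because the
// AudioSource is fully 3D (spatialBlend = 1), Unity attenuates the sound
// by each listener's distance, so far-away players hear nothing.
[PunRPC]
private void PlaySound()
{
    AudioSource source = GetComponent<AudioSource>();
    source.spatialBlend = 1f;  // 1 = fully 3D; 0 = 2D (audible everywhere)
    source.rolloffMode = AudioRolloffMode.Linear;  // reaches silence at maxDistance
    source.minDistance = 1f;
    source.maxDistance = 20f;
    source.PlayOneShot(shootSound);
}
```

Note that the default Logarithmic rolloff never fully reaches zero volume; Linear rolloff (or a custom curve) is what actually silences the clip beyond maxDistance, which is why it is set explicitly here.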

Related

Unity AudioSource.Play has noise

I have multiple piano keys, each with an AudioSource component. If I press one key repeatedly, the sound starts with a tick noise that is not in the original audio clip.
I tried changing the audio clips and setting the Doppler Factor to 0, but nothing worked.
I managed to work around it by instantiating the key while it's playing, and the noise is gone now, but if I play multiple keys by moving my finger over them I get a "Hall Reverb" effect, and it's a little expensive too.
if (go.GetComponent<AudioSource>().isPlaying)
{
    AudioSource Note = Instantiate(go.GetComponent<AudioSource>(), Clones);
    Note.GetComponent<SpriteRenderer>().enabled = false;
    Note.GetComponent<BoxCollider>().enabled = false;
    Note.Play();
    Destroy(Note.gameObject, 2);
}
else
{
    go.GetComponent<AudioSource>().Play();
}
I tried PlayOneShot() and it had exactly the same effect as the code above.
Could you help me? I am using Unity 2018.2.14f1.
Correct me if I am wrong, but it looks to me like you want to play a note even when your AudioSource is already playing.
Instead of duplicating an existing GameObject (which is error-prone; use a prefab instead), you can play an audio clip at a point:
AudioSource.PlayClipAtPoint(clip, transform.position);
You would need to add a public variable to your script that holds the target clip.
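Put together, the suggestion might look like this minimal sketch (the class name, clip field, and the OnMouseDown trigger are assumptions, not from the question):

```csharp
// Sketch: PlayClipAtPoint spawns a temporary, self-destroying AudioSource,
// so rapid repeated presses overlap cleanly instead of restarting one source.
public class PianoKey : MonoBehaviour
{
    public AudioClip noteClip;  // assign this key's sample in the Inspector

    private void OnMouseDown()  // assumed input handler for the key
    {
        AudioSource.PlayClipAtPoint(noteClip, transform.position);
    }
}
```

One trade-off to be aware of: PlayClipAtPoint gives you no handle to the spawned source, so you cannot change its pitch or stop it early.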

Playing same sound on multiple object collide on Unity

I have a dice-rolling game with six dice, each launched with a random force in a random direction inside a box. When the dice collide with each other or with the box walls, a sound needs to be played.
Currently I add a sound to each die and trigger it when the die collides, but the result sounds weird when all of them play at the same time.
Is there a better way to produce a realistic sound when all six dice collide with each other and with the walls of the box?
The flange-like effect you hear happens when two identical sounds are played with a very small delay, causing their waveforms to alternately reinforce and cancel each other.
To avoid this effect you have several options:
Avoid playing the same SFX twice within a delay too short for the listener to perceive. (You are probably playing each dice-hit SFX twice right now.)
Use different samples and play them randomly. (If you can't record new samples, try modifying the ones you have by simply shifting their pitch by, say, 3%-10% to get enough distinct variations.)
If the second option doesn't satisfy your needs (because of the increase in project size), you can use third-party plugins such as Master Audio to create several custom-pitched sounds out of a single sound at run-time.
You can also change the pitch in code (at run-time) and make sure two nearly simultaneous hits never play with the same (or a very close) pitch.
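The last option might look like this minimal sketch (the `hitClip` field and the collision trigger are assumptions; it presumes an AudioSource on each die):

```csharp
// Sketch: randomize pitch per hit so overlapping dice impacts
// don't phase against each other and produce the flange effect.
private void OnCollisionEnter(Collision collision)
{
    AudioSource source = GetComponent<AudioSource>();
    // roughly ±5% pitch variation is usually enough to break up the flanging
    source.pitch = Random.Range(0.95f, 1.05f);
    source.PlayOneShot(hitClip);
}
```

You could go further and only play the sound when `collision.relativeVelocity` exceeds a threshold, which also cuts down on the number of simultaneous hits.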
It's actually pretty difficult to produce realistic collision sound for multiple objects collision.
If you use the same AudioClip for each dice-to-dice or dice-to-box collision event and trigger them on collision event, the end result will simply sound like an echoed version of the AudioClip with various delays.
If you choose to use a variety of collision AudioClips as a pool to choose from, you can get acceptable results as long as you guarantee that no two collision sounds with the same AudioClip play during the same time period.
The best solution is probably to obtain several recordings of the real scenario (dice rolling and colliding in a box) and randomly play one when simulating the collision in game. As long as the duration of the AudioClip matches the simulation, it will be relatively hard to tell it's faked.
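The clip-pool idea can be sketched roughly like this (field and method names are hypothetical):

```csharp
// Sketch: keep several real recordings of a full roll and pick one at random,
// avoiding an immediate repeat of the clip that just played.
public AudioClip[] rollClips;  // assign the recorded rolls in the Inspector
private int lastIndex = -1;

private void PlayRollSound()
{
    int index = Random.Range(0, rollClips.Length);
    if (rollClips.Length > 1 && index == lastIndex)
        index = (index + 1) % rollClips.Length;  // step past the last-played clip
    lastIndex = index;
    GetComponent<AudioSource>().PlayOneShot(rollClips[index]);
}
```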

SpriteKit - SKActions or updating positions in update function

I am quite new to SpriteKit, coming from about 4 months of playing around with HTML5. Following Mark Wihlberg's HTML5 YouTube game tutorials, my programs always updated a player's or object's position every frame.
Example if it helps:
function run() {
    var loop = function() {
        window.requestAnimationFrame(loop, canvas);
        update(); // here I would add maybe 3 to a player's x pos and redraw in draw()
        draw();
    };
    window.requestAnimationFrame(loop, canvas);
}
But looking at various SpriteKit tutorials online, many use SKAction to move nodes across the screen.
My question is whether constantly updating a node's position in SpriteKit is unorthodox or frowned upon, whether I should get used to using actions, and why.
No, it isn't frowned upon, and it's good to know how to do these things manually. If it's fairly straightforward, regular movement, then SKAction is fine. For more complicated behaviour (and SKActions can actually get complicated, with sprites following paths and Bézier curves, etc.), use the update() function, iterate over your sprites (using enumerateChildNodes(withName:) or similar) and move them as necessary.
You can combine the two: move the sprites in update() and animate them with an SKAction, or vice versa.
Bear in mind that you don't call update() yourself; it is called automatically by the game engine 60 times a second. It gets passed a time, so you can work out exactly how long it has been since update() was last called (not always 1/60 s).

Callback at certain times during audio playback

Is it possible to play an audio file from the user's iPod library and have a callback occur whenever the player reaches a certain time point? I need it to be very accurate, so simply polling methods like currentPlaybackTime might not be enough, because float equality comparison is unreliable. I could use an |A - B| < epsilon float check, but is there a better, more accurate way of achieving this?
If you can target iOS 4.0 or higher, try using AVPlayer. Then you will be able to use
- (id)addBoundaryTimeObserverForTimes:(NSArray *)times queue:(dispatch_queue_t)queue usingBlock:(void (^)(void))block
which takes an array of NSValues wrapping CMTimes and will run the contents of the block each time one of the boundary times is hit. (Beware of some behavior: if you pause the audio file inside the block, the callback will fire again when you unpause it.)
Since CMTime is not a float, this should be more accurate than repeatedly checking currentTime.

iPhone/iPodTouch. What are my options for syncing movement to a soundtrack?

What are the best strategies for syncing music to object movement? The app I envision would have an underlying soundtrack, with characters animating in time to the music. What I'm after is a strategy for having the soundtrack periodically send a message to an object (or objects), triggering it to commence its scripted movement.
Thanks,
Doug
FMOD Ex should allow this sort of thing. It's not built-in, but it's relatively cheap.
The point is writing the object-movement code so that it can use an external clock signal. For example, you can write the movement update method so that it takes a time delta:
- (void) updateMovementBy: (double) seconds {…}
In each loop you’ll poll the soundtrack player for the current playback time, compute the time delta from the last iteration, and update the model. But of course a lot depends on what exactly you want to do in response to the music.