Wait for event to finish playing - iPhone

How can I make sure an event finishes playing even though I move the listener around the 3D world at the same time? I want the event to keep playing with the same listener position it had when it started. Other event sounds need to know where the listener is, so I have to keep updating the listener position, but I don't want the already-started event sound to be affected...

You can register a callback with events so you know when they have finished: see Event::setCallback and the FMOD_EVENT_CALLBACKTYPE_EVENTFINISHED callback type.
Perhaps you should consider keeping the listener stationary and only moving the events; that way, once you start an event, it will keep playing at the same position (unless you move it). It's hard to say without knowing exactly what you are trying to achieve.

Related

Unity Input System event triggered without input

Using the "new" Unity Input System.
I have a start scene with a menu, and from there I can load the game scene, which has a TouchManager; the TouchManager lets me control the player and visualize the force applied to the player.
To visualize the force, I have a game object that starts disabled, and the TouchManager enables the visualizer only while there is touch input.
The issue is that at the start of the game scene the touch visualizer is already enabled, although after the first touch it works perfectly.
With debugging I can see that the event that signals "touch" is fired (without any touch), but the event that signals the release isn't.
Regarding the code, there is nothing particularly relevant as far as I am aware, so I'll explain briefly (scripts in order of execution):
GameManager:
Awake(): the game is set to pause;
Start(): the game is set to "unpause";
TouchManager:
Pause(bool isPause): input events are subscribed and unsubscribed;
Move is the input action that is causing the issue.
I tried disabling the visualizer, but it has to be done in Update, since the event that enables it is triggered after the Start method and I don't know how/when the touch event is triggered. Also, I would like to fix the issue at the source instead of covering it up.
Any ideas?

Handle pan, zoom and click with both Mouse/Touch on Unity3D

I'm working on a 2D turn-based game and need to handle both mouse and touch input.
The game has a hexagonal map and requires pan, zoom and click actions.
I decided to apply the delegate pattern: each object that requires an action sends the event to its delegate, which decides whether or not to act on it.
In this way all inputs converge on a TurnManager that, with a state machine, handles events according to the current game state.
Example:
MapCell.OnMouseUpAsButton() calls
delegate.OnCellClick(MapCell)
All works well, and this way I can control when to do something.
The problems arrived when I started to implement zoom and pan for the map.
For these two actions I had to avoid the classic MonoBehaviour methods (OnMouseDown, OnMouseUpAsButton, ...) and use LateUpdate instead.
So,
I created a CameraHandler that in the LateUpdate uses:
HandleMouse()
HandleTouch()
and, using the delegate pattern, invokes the actions below:
OnMapWillPan()
OnMapPan()
OnMapEnd()
To avoid pans or clicks over UI elements, the TurnManager filters received events with EventSystem.current.IsPointerOverGameObject().
Problem
On Mac with a mouse all works great! :D
On smartphone with touch I can't click on anything and only pan works. Debugging on the device is infernal because of the lack of breakpoints or a console.
Questions
Have you ever handled these things? How?
Which approach did you use?
What do you think I'm doing wrong?
Are there best practices to avoid problems like this and handle cross-platform input correctly?
Are there any good lectures/books on this topic?
PS: if needed I can show the code.

Do AKAudioPlayer nodes apply a 10 ms fade out once they are stopped before reaching the end of the file/buffer?

First off, I just want to say thanks to the team at AudioKit for shedding some light on some difficult problems through their code. I have a few questions.
1: It does not appear that the AKAudioPlayer class applies on-the-spot fades if a player is stopped before reaching the end of the file/buffer. Is this handled somewhere else in the AudioKit library?
2: Does anybody know whether the AVAudioMixerNode's volume can be adjusted in real time? E.g., can I make adjustments every 1/44.1 ms (once per sample at 44.1 kHz) to follow the curve of my fade envelope? There is also the AVAudioUnitEQ with its globalGain property.
3: Is it possible to write to an AVAudioPCMBuffer’s floatChannelData after it has been scheduled, and while it is being played?
I’m writing a sampler app with AVFoundation. When it came time to tackle the problem of applying fades to loaded audio files within AVAudioPlayerNodes my first plan was to adjust the volume of the mixer node attached to my player node(s) in real time. This did not seem to have any sort of effect. It is entirely possible that my timing was off when doing this.
When I finally looked at the AKAudioPlayer class, I realized that one could adjust the actual buffer associated with an audio file. After a day or two of debugging, I was able to adapt the code from the AKAudioPlayer class into my PadModel class, with a few minor differences, and it works great.
However, I’m still getting those nasty little clicks whenever I stop one of my Pads from playing before the end of the file because the fades I apply are only in place at the start and the end of the file/buffer.
As far as my first question is concerned, in looking through the AKAudioPlayer class, it appears that the only fades applied to the buffer occur at the beginning and end of the buffer. The stop() method does not appear to apply any sort of on-the-spot fade to the buffer.
In my mind, the only way to have a fade out happen once a stop event happens is to apply it after said stop event, correct?
I have tried doing this: playing a 10 ms long faded-out buffer, consisting of the 10 ms of audio following the stop position, immediately after I call stop on my player node. It does not have the desired effect. I did not have much confidence in this scheme from the onset, but it seemed worth a try.
To be clear: once my stop() method is called, before actually stopping the player node, I allocate the 10 ms fade buffer and read into it from the buffer's current play position, for the number of frames my fade buffer consists of. I then apply the envelope to the freshly allocated fade-out buffer, just as is done in the fadeBuffer() method of the AKAudioPlayer class. At that point I finally call stop() on the playing node, then schedule and play the fade-out buffer.
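For reference, here is a minimal Swift sketch of that sequence, assuming a single scheduled, non-looping buffer in a non-interleaved float format; the function and variable names are hypothetical, and it reproduces the attempt described above, including its flaw:

    import AVFoundation

    // Hypothetical stop-fade: grab ~10 ms of audio at the current play
    // position, apply a linear fade-out, then stop the node and play the
    // faded tail.
    func stopWithFade(player: AVAudioPlayerNode, sourceBuffer: AVAudioPCMBuffer) {
        let format = sourceBuffer.format
        let fadeFrames = AVAudioFrameCount(format.sampleRate * 0.010)  // 10 ms

        guard let nodeTime = player.lastRenderTime,
              let playerTime = player.playerTime(forNodeTime: nodeTime),
              let fadeBuffer = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: fadeFrames),
              let src = sourceBuffer.floatChannelData,
              let dst = fadeBuffer.floatChannelData else {
            player.stop()
            return
        }

        // Current play position as a frame offset into the source buffer
        // (simplified; ignores looping and multi-segment scheduling).
        let start = Int(playerTime.sampleTime) % Int(sourceBuffer.frameLength)
        let frames = min(Int(fadeFrames), Int(sourceBuffer.frameLength) - start)
        guard frames > 0 else { player.stop(); return }
        fadeBuffer.frameLength = AVAudioFrameCount(frames)

        // Copy the next ~10 ms and apply a linear fade-out envelope.
        for ch in 0..<Int(format.channelCount) {
            for i in 0..<frames {
                let gain = 1.0 - Float(i) / Float(frames)
                dst[ch][i] = src[ch][start + i] * gain
            }
        }

        // The audible discontinuity creeps in here: by the time stop()
        // takes effect, the render head has moved past `start`.
        player.stop()
        player.scheduleBuffer(fadeBuffer, at: nil, options: [], completionHandler: nil)
        player.play()
    }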
Obviously there is going to be a discontinuity between stopping the buffer and playing the fade-out buffer; e.g., by the time I apply the fade to the fade-out buffer, the stop frame position I assigned to a local variable will no longer be valid, etc. And indeed, once I let go of a pad, the sound that plays can only be described as discontinuous.
The only other solution I can think of strikes me as a daunting task: continually applying the fade envelope in real time to the samples immediately ahead of the current play position as the buffer is being played. I currently do not believe I have the coding chops to pull this off.
Anyway, I looked through all the questions on S.O. concerning AudioKit, and this particular subject did not seem to come up. Anybody's thoughts on the matter would be greatly appreciated. Thanks in advance!
If anybody wants to look at my code, the PadModel class starts on line 223 of this file:
https://github.com/mike-normal13/pad/blob/master/Pad.swift
AudioKit is lacking a fade-to-stop method. I would suggest requesting the feature, as it is a worthwhile endeavor. If you are using AVAudioUnitSampler, I believe you can set ADSR values to achieve the fading effect, but not in a very straightforward way: you have to create a preset using AULab, figure out how to get the release to work, and then import it into your project.
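On question 2, AVAudioMixerNode's outputVolume can be set at any time, but a Timer only gives you millisecond-level granularity, so you get a stepped fade rather than a sample-accurate envelope. A rough sketch of such a fade-to-stop, with the duration and step count as illustrative assumptions:

    import AVFoundation

    // Hypothetical fade-to-stop: step the mixer volume down on a timer,
    // then stop the player. Timer granularity is coarse (~ms), so this is
    // a stepped fade, not a sample-accurate envelope.
    func fadeToStop(player: AVAudioPlayerNode,
                    mixer: AVAudioMixerNode,
                    duration: TimeInterval = 0.05,
                    steps: Int = 20) {
        var step = 0
        Timer.scheduledTimer(withTimeInterval: duration / Double(steps), repeats: true) { timer in
            step += 1
            mixer.outputVolume = Float(steps - step) / Float(steps)
            if step >= steps {
                timer.invalidate()
                player.stop()
                mixer.outputVolume = 1.0  // restore for the next trigger
            }
        }
    }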

Swift SpriteKit Modify Action

I am making an iOS game in Swift with SpriteKit. I have a player sprite which I want to rotate and move toward wherever a touch lands on the screen. Currently I get the angle, create the action to turn, run the action, and do the same for the movement. This works well for a single touch; however, I now want to do the same when a touch moves. First I tried removing the action and then running the new one, but the sprite jitters or does not move at all, because the action is cancelled very soon after being created. I have also tried running it every 100 ms, but I still do not get smooth movement.
So I was wondering is there any way to modify an action as it is running? Or what is the right way of doing this?
You can override the didEvaluateActions() method in your scene class; it gets called after all actions have been processed for an update frame. In that method, remove the old action and start your new one. If you are still seeing jitter, then you need to re-evaluate when you remove actions.
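A minimal sketch of that approach, assuming the player node is named "player" and the latest touch position is recorded in touchesMoved (both assumptions, not part of the original question):

    import SpriteKit

    class GameScene: SKScene {
        private var pendingTarget: CGPoint?

        override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
            // Just record the latest touch position; don't touch actions here.
            pendingTarget = touches.first?.location(in: self)
        }

        // Called once per frame, after all actions have been evaluated:
        // a safe point to replace the running move/turn actions.
        override func didEvaluateActions() {
            guard let target = pendingTarget,
                  let player = childNode(withName: "player") else { return }
            pendingTarget = nil
            player.removeAction(forKey: "moveAndTurn")
            let angle = atan2(target.y - player.position.y,
                              target.x - player.position.x)
            let turn = SKAction.rotate(toAngle: angle, duration: 0.1, shortestUnitArc: true)
            let move = SKAction.move(to: target, duration: 0.5)
            player.run(SKAction.group([turn, move]), withKey: "moveAndTurn")
        }
    }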

When does a touchesBegan become a touchesMoved?

When you drag a finger across the iPhone touchscreen, it generates touchesMoved events at a nice, regular 60Hz.
However, the transition from the initial touchesBegan event to the first touchesMoved is less obvious: sometimes the device waits a while.
What's it waiting for? Larger time/distance deltas? More touches to lump into the event?
Does anybody know?
Importantly, this delay does not happen with subsequent fingers, which puts the first touch at a distinct disadvantage. It's very asymmetric and bad news for apps that demand precise input, like games and musical instruments.
To see this bug/phenomenon in action:
Slowly drag the iPhone screen-unlock slider to the right. Note the sudden jump, and note how it doesn't occur if you have another finger resting anywhere else on the screen.
Try "creeping" across a narrow bridge in any number of 3D games. Frustrating!
Try a dual virtual joystick game and note that the effect is mitigated, because you're obliged never to end either of the touches, which amortizes the unpleasantness.
I should've logged this as a bug 8 months ago.
After a touchesBegan event is fired, UIKit looks for positional movement of the touch, which translates into touchesMoved events as the finger's x/y changes, until the finger is lifted and the touchesEnded event is fired.
If the finger is held down in one place it will not fire the touchesMoved event until there is movement.
I am building an app where you draw based on touchesMoved, and while it does arrive at intervals, it is fast enough to give a smooth drawing appearance. Since it is an event buried in the SDK, you might have to do some testing in your scenario to see how fast it responds; depending on other actions or events it could vary with the situation. In my experience it fires within a few ms of movement, and that is with about 2-3k other sprites on the screen.
The drawing does start on the touchesBegan event, though, so the first placement is set; then it chains to touchesMoved and ends with touchesEnded. I use all the events for the drag operation, so maybe the initial move is less laggy perceptually in this case.
To test this in your app, you could put a timestamp on each event, if it is crucial to your design, and work out some sort of easing (see the sketch after the link below).
http://developer.apple.com/IPhone/library/documentation/UIKit/Reference/UIResponder_Class/Reference/Reference.html#//apple_ref/occ/instm/UIResponder/touchesMoved:withEvent:
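A minimal sketch of that timestamp test, using each UITouch's own timestamp property (the class and variable names are illustrative):

    import UIKit

    class LatencyProbeView: UIView {
        private var lastTimestamp: TimeInterval = 0

        override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
            lastTimestamp = touches.first?.timestamp ?? 0
        }

        override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
            guard let touch = touches.first else { return }
            // Gap since the previous began/moved event, in milliseconds.
            let gap = (touch.timestamp - lastTimestamp) * 1000
            lastTimestamp = touch.timestamp
            print("touchesMoved after \(gap) ms")
        }
    }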
I don't think it's a bug; it's more of a missing feature.
Ordinarily this is intended behavior: it filters out accidental micro-movements that would turn a tap or long press into a slide when the user didn't intend one.
This is nothing new; it has always been there. For instance, there are a few pixels of tolerance for double-clicks in pointer-based GUIs, and the same tolerance applies before a drag is started, because users sometimes inadvertently drag when they just meant to click. Try slowly moving an item on the desktop (OS X or Windows) to see it.
The missing feature is that it doesn't appear to be configurable.
An idea: Is it possible to enter a timed loop on touchesBegan that periodically checks the touch's locationInView:?
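A sketch of that idea follows, with two caveats: Apple's documentation says not to retain UITouch objects, and the stored touch's location may only update when the system actually delivers an event, so this may not get around the threshold at all:

    import UIKit

    class PollingView: UIView {
        private weak var activeTouch: UITouch?   // docs warn against retaining touches
        private var displayLink: CADisplayLink?

        override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
            activeTouch = touches.first
            displayLink = CADisplayLink(target: self, selector: #selector(poll))
            displayLink?.add(to: .main, forMode: .common)
        }

        @objc private func poll() {
            guard let touch = activeTouch else { return }
            // Read the position every frame instead of waiting for touchesMoved.
            let location = touch.location(in: self)
            print("polled location: \(location)")
        }

        override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
            displayLink?.invalidate()
            activeTouch = nil
        }
    }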
I can't give any kind of official answer, but it makes sense that touchesBegan->touchesMoved has a longer duration than touchesMoved->touchesMoved. It would be frustrating to developers if every touchesBegan came along with a bunch of accidental touchesMoved events. Apple must have determined (experimentally) some distance at which a touch becomes a drag. Once touchesMoved has begun, there is no need to perform this test any more, because every point until the next touchesEnded is guaranteed to be a touchesMoved.
This seems to be what you are saying in your original post, Rythmic Fistman, and I just wanted to elaborate a bit and say that I agree with your reasoning. It means that if you're calculating a "drag velocity" of some sort, you should use distance traveled as a factor, rather than depending on the frequency of the update timer (which is better practice anyway).
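For example, a drag velocity derived from distance traveled and the touches' own timestamps, rather than from an assumed event rate (a sketch; the class name is illustrative):

    import UIKit

    class DragVelocityView: UIView {
        private var lastTimestamp: TimeInterval = 0

        override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
            lastTimestamp = touches.first?.timestamp ?? 0
        }

        override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
            guard let touch = touches.first else { return }
            let dt = touch.timestamp - lastTimestamp
            lastTimestamp = touch.timestamp
            guard dt > 0 else { return }
            let p0 = touch.previousLocation(in: self)
            let p1 = touch.location(in: self)
            // Points per second, independent of how often events arrive.
            let velocity = hypot(p1.x - p0.x, p1.y - p0.y) / CGFloat(dt)
            print("drag velocity: \(velocity) pt/s")
        }
    }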
It's waiting for the first move.
That's how the OS distinguishes a drag from a tap. Once you drag, all subsequent notifications are touchesMoved.
This is also why you should write code that executes on the touch-up event.
Currently, this "delay" between touchesBegan and touchesMoved is present even when other fingers are touching the screen. Unfortunately, it seems that an option to disable it doesn't exist yet. I'm also a music app developer (and player), and I find this behavior very annoying.