I am trying to make something where I record a sound and, based on that sound (pitch or frequency, I am not sure which), an image should move.
I have the recording working, and I also have the image sequence in place, but the two are separate.
I am not sure how to link them. For reference, I am trying to achieve something like a
mouth mover app:
app url here
My question is: how can I move/animate an image based on sound frequency?
Thanks
I am done with the solution. I used Dirac and the problem is solved.
Edit:
What is it?
DiracAudioPlayer is a new set of Cocoa classes that wrap the entire Dirac functionality in a convenient way, exposing an API that is similar to what AVAudioPlayer offers. Note that this is not an AVAudioPlayer subclass.
Following are the core features and a description of the API.
DiracAudioPlayer Core Features
DiracAudioPlayer is a set of classes that allow file based playback of a variety of audio formats (including MPMediaItems) while simultaneously changing speed and pitch of the audio file in real time. Version 3.6 consists of DiracAudioPlayerBase (the base class taking care of file IO and playback), DiracAudioPlayer (wrapping the Dirac Core API) and DiracFxAudioPlayer (wrapping the DiracFx API).
Make sure you include all 3 classes in your project as well as the "ExtAudioFile" and "util" folders, and add Accelerate.framework and CoreAudio.framework to the project. On MacOS X you will have to add the AudioUnit.framework as well, on iOS you will have to add AudioToolbox.framework, AVFoundation.framework, MediaPlayer.framework and CoreMedia.framework instead.
DiracAudioPlayer is…
…an Apple-compatible class to play back time stretched audio that works on both iOS (version 4 and higher) and MacOS X (version 10.6 and higher)
…very easy to use
…fully ARC compatible
…delivered to you including the full source code
DiracAudioPlayer API
Version 3.6 released in November 2012 offers the following calls:
- (id) initWithContentsOfURL:(NSURL*)inUrl channels:(int)channels error: (NSError **)error;
Initializes and returns an audio player for playing a designated sound file. inUrl is a URL identifying the sound file to play; the audio data must be in a format supported by Core Audio. For error, pass in the address of a nil-initialized NSError object; if an error occurs, upon return the NSError object describes the error. To use an item from the user's iPod library, supply the URL that you get via MPMediaItem's MPMediaItemPropertyAssetURL property as inUrl. Note that FairPlay protected content can NOT be processed.
- (void) setDelegate:(id)delegate;
- (id) delegate;
Set/get delegate of the class. If you implement the delegate protocol, DiracAudioPlayer will call your implementation of
- (void)diracPlayerDidFinishPlaying:(DiracAudioPlayerBase *)player successfully:(BOOL)flag
when it is done playing.
- (void) changeDuration:(float)duration;
- (void) changePitch:(float)pitch;
Change playback speed and pitch
- (NSInteger) numberOfLoops;
- (void) setNumberOfLoops:(NSInteger)loops;
A value of 0, which is the default, means to play the sound once. Set a positive integer value to specify the number of times to return to the start and play again. For example, specifying a value of 1 results in a total of two plays of the sound. Set any negative integer value to loop the sound indefinitely until you call the stop method.
- (void) updateMeters;
Must be called prior to calling -peakPowerForChannel in order to update its internal measurements
- (float) peakPowerForChannel:(NSUInteger)channelNumber;
A floating-point representation, in decibels, of a given audio channel’s current peak power. A return value of 0 dB indicates full scale, or maximum power; a return value of -160 dB indicates minimum power (that is, near silence). If the signal provided to the audio player exceeds ±full scale, then the return value may exceed 0 (that is, it may enter the positive range). To obtain a current peak power value, you must call the updateMeters method before calling this method.
- (BOOL) prepareToPlay;
Starts the Dirac processing thread and prepares the sound file for playback. If you don't call this explicitly, it will be called automatically when you call -play.
- (NSUInteger) numberOfChannels;
The number of audio channels in the sound associated with the audio player. (read-only)
- (NSTimeInterval) fileDuration;
Returns the total duration, in seconds, of the sound associated with the audio player. (read-only)
- (NSTimeInterval) currentTime;
- (void) setCurrentTime:(NSTimeInterval)time;
Returns the current play time in the input file. Note that if you apply time stretching, -currentTime will reflect the slowed down time depending on the time stretch factor.
IMPORTANT CHANGE: In previous versions this value returned the total play time independent of the position in the file. Please update your code accordingly to reflect the change
Setting this property causes playback to fast forward or rewind to the specified play time.
- (void) play;
Plays a sound asynchronously. Calling this method implicitly calls the -prepareToPlay method if the audio player is not already prepared to play.
- (NSURL*) url;
The URL for the sound associated with the audio player. (read-only)
- (void) setVolume:(float)volume;
- (float) volume;
The playback gain for the audio player, ranging from 0.0 through 1.0.
- (BOOL) playing;
A Boolean value that indicates whether the audio player is playing (YES) or not (NO). (read-only). To find out when playback has stopped, use the diracPlayerDidFinishPlaying:successfully: delegate method.
- (void) pause;
Pauses playback; sound remains ready to resume playback from where it left off. Calling pause leaves the audio player prepared to play; it does not release the audio hardware that was acquired upon calling -play or -prepareToPlay.
- (void) stop;
Stops playback and undoes the setup needed for playback. Calling this method, or allowing a sound to finish playing, undoes the setup performed upon calling the -play or -prepareToPlay methods.
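For context, here is a minimal sketch of how these calls could drive an image the way the original question asks. It is only a sketch: diracPlayer, meterTimer, mouthImageView and the sound file name are assumptions, while the Dirac calls are the ones documented above. It polls -updateMeters and -peakPowerForChannel: from a timer and scales the image with the measured peak power.

// Sketch only: diracPlayer, meterTimer and mouthImageView are hypothetical ivars/outlets.
- (void)startMouthAnimation
{
    NSURL *url = [[NSBundle mainBundle] URLForResource:@"voice" withExtension:@"aif"];
    NSError *error = nil;
    diracPlayer = [[DiracAudioPlayer alloc] initWithContentsOfURL:url channels:1 error:&error];
    [diracPlayer play];

    // Poll the meters at roughly display rate.
    meterTimer = [NSTimer scheduledTimerWithTimeInterval:1.0/30.0
                                                  target:self
                                                selector:@selector(meterTick:)
                                                userInfo:nil
                                                 repeats:YES];
}

- (void)meterTick:(NSTimer *)timer
{
    [diracPlayer updateMeters];
    // 0 dB is full scale, -160 dB is near silence (see -peakPowerForChannel: above).
    float db = [diracPlayer peakPowerForChannel:0];
    // Map roughly -60..0 dB onto 0..1 and use it to "open" the mouth image.
    float level = MAX(0.0f, (db + 60.0f) / 60.0f);
    mouthImageView.transform = CGAffineTransformMakeScale(1.0f, 0.5f + 0.5f * level);
}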
Most text-to-speech systems will allow you to register a callback function that will send you the phoneme (in layman's terms, the sound) that is being produced. Look at the following link: click on Callbacks on the left-hand side and look for SpeechPhonemeProcPtr, which lets you register a function that will be called when the noise being made is "uh", "th", "ah", or whatever it happens to be. You would then update your image to look like a person's mouth making that particular sound. This was very easy in IBM's ViaVoice, and while I have never coded such an application on an iPhone, I think this is a better approach than trying to match the raw audio.
If this is truly unfiltered audio you are trying to match, then you could pass it to a voice recognition system and feed the recognized text into the TTS system to get the phonemes.
I'm new to Swift, and I'm creating a randomly generated game using a timer. I used this implementation for the timer: https://github.com/yuzixun/swift-timer/tree/master/example
My game was working fine until I ran into a problem: I want to speed the game up depending on the player's score, but I can't change the speed of my game's timer.
My Gamescene class:
let myGame = Timer.repeat(after: 1) {
    // my generated game code here
}
myGame.start()
myGame: a timer whose closure generates a random object for my game every second, using Timer.repeat(after: 1).
let LevelUpdate = Timer.repeat(after: 0.1) {
    // update my game and verify the player's score
    if self.score >= 1000 && self.score <= 2500 {
        // speed up the game
    }
}
LevelUpdate: a timer whose closure updates some of my game's variables and checks the player's score every 0.1 seconds.
My objective is to change the myGame timer once the player has scored more than 1000 points, speeding myGame up to 0.8 seconds. My question: is it possible to change the time interval of myGame?
I need to find a way to speed up my game based on the player's score.
Ok, now we have enough information to answer your question.
To summarize:
You're using a third party library from Github (link provided) that lets you create timers in various different ways.
You're using a class called Timer from that library, and creating a repeating timer with the method Timer.repeat(after:).
You create 2 repeating timers. The first runs on a 1 second interval and you save it in a constant myGame.
You also create another repeating timer that runs on a .1 second interval and save it to a constant called LevelUpdate.
You want to know how to change the interval of your myGame timer.
That is the kind of description you should write. Clear, specific, and easy to follow.
Now the answer:
You can't change the interval on the system class NSTimer once it's created.
Looking at the library you're using, it appears that it doesn't offer that feature either.
What you have to do is to kill the timer and replace it with a new one with a different interval.
You could probably modify the library you're using so that it handles this internally when you change the interval on the timer.
Failing that, you'd need to change your myGame to a var.
You should create a method func createGameTimer (interval: NSTimeInterval) -> NSTimer that takes a timer interval as input and creates and returns your game timer. That way you can kill the old timer and create a new one with a different interval when you need to do that.
I'm creating an application where the user can draw a line on a screen from an object to the location they want to send it and the object will then follow the line to the final location. In order to do this, I've already created working methods to allow the user to draw the lines and then to store the coordinates of the line in a MutableArray. However, I'm having some trouble when I try to animate. As I'm pretty new to the iPhone OS, this could be a simple problem, but I haven't been able to find a solution yet.
I am NOT using Bezier paths, since the user is drawing the line manually; I'm not drawing it programmatically.
Here's the code that I've tried
- (void)animateButtonWasPressed
{
    for (int f = 0; f < [cordArrayY count]; f++) {
        NSString *newY = [cordArrayY objectAtIndex:f];
        NSString *newX = [cordArray objectAtIndex:f];
        [self myAnimate:newX :newY];
    }
}

- (void)myAnimate:(NSString *)PntX :(NSString *)PntY
{
    [UIView animateWithDuration:.5 animations:^{
        object.center = CGPointMake([PntX floatValue], [PntY floatValue]);
    }];
}
SYNTAX:
object - the object I am trying to move
cordArray - the mutable array containing the x-coordinates
cordArrayY - the mutable array containing the y-coordinates
Everything else is either defined elsewhere in the code or is an Apple method.
The problem: the object moves instantly from its original location directly to the final location. I get a NSLog which tells me this:
-[UIApplication endIgnoringInteractionEvents] called without matching -beginIgnoringInteractionEvents. Ignoring.
Any help would be appreciated!
The method you're using to animate object seems to be OK. I believe the problem is the loop in which you invoke that method: you are trying to animate the same property of the object over and over in every step of that loop, and I think this is what causes the jump.
Take a look at this quote from Apple's docs:
Important: Changing the value of a property while an animation
involving that property is already in progress does not stop the
current animation. Instead, the current animation continues and
animates to the new value you just assigned to the property.
http://developer.apple.com/library/ios/#documentation/WindowsViews/Conceptual/ViewPG_iPhoneOS/AnimatingViews/AnimatingViews.html#//apple_ref/doc/uid/TP40009503-CH6
As a consequence of invoking the animation in each step, you end up animating object over roughly 0.5 seconds straight to the last position in your coordinates array.
I think you should link those animations together, waiting for each animation to finish before starting the following one. Take a look at this:
Another thing that both the animateWithDuration:animations:completion:
and animateWithDuration:delay:options:animations:completion: methods
support is the ability to specify a completion handler block. You
might use a completion handler to signal your application that a
specific animation has finished. Completion handlers are also the way
to link separate animations together.
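For example, the chaining could look roughly like this (a sketch only, reusing object, cordArray and cordArrayY from the question; animateStep: is a new helper method, not something from the original code):

// Kick off the first segment, then start each following segment from the
// completion block of the previous one.
- (void)animateButtonWasPressed
{
    [self animateStep:0];
}

- (void)animateStep:(NSUInteger)index
{
    if (index >= [cordArray count])
        return;

    CGFloat x = [[cordArray objectAtIndex:index] floatValue];
    CGFloat y = [[cordArrayY objectAtIndex:index] floatValue];

    [UIView animateWithDuration:0.5
                     animations:^{
                         object.center = CGPointMake(x, y);
                     }
                     completion:^(BOOL finished) {
                         // Only start the next segment once this one has finished.
                         [self animateStep:index + 1];
                     }];
}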
Hope this helps,
Cheers.
I was finally able to solve this problem:
At first I tried to keep using the approach I had above. Lio's advice to use the completion block was exactly right, but since I needed it to loop an undefined number of times I needed a counter variable, and I couldn't get that working with the restrictions on modifying external variables from inside a block (the __block declaration didn't work out for me), so I abandoned that approach.
However, I eventually created my entire animation using NSTimer, following the method described here but using my own array of coordinates instead:
http://www.icodeblog.com/2008/10/28/iphone-programming-tutorial-animating-a-ball-using-an-nstimer/
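In rough outline, the NSTimer version looks something like this (again just a sketch reusing cordArray, cordArrayY and object from above; currentIndex and moveTimer are hypothetical ivars):

// Step through the recorded coordinates on a timer instead of chained animations.
- (void)startFollowingLine
{
    currentIndex = 0;
    moveTimer = [NSTimer scheduledTimerWithTimeInterval:0.05
                                                 target:self
                                               selector:@selector(moveStep:)
                                               userInfo:nil
                                                repeats:YES];
}

- (void)moveStep:(NSTimer *)timer
{
    if (currentIndex >= [cordArray count]) {
        [moveTimer invalidate];
        moveTimer = nil;
        return;
    }
    object.center = CGPointMake([[cordArray objectAtIndex:currentIndex] floatValue],
                                [[cordArrayY objectAtIndex:currentIndex] floatValue]);
    currentIndex++;
}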
How do you actually slow down the orientation rotation when using the willAnimateSecondHalfOfRotationFromInterfaceOrientation: method?
I currently have this:
- (void)willAnimateSecondHalfOfRotationFromInterfaceOrientation:(UIInterfaceOrientation)fromInterfaceOrientation
                                                       duration:(NSTimeInterval)duration
{
    [self positionViews];
}
And I understand that this willAnimateSecondHalf... method gets called automatically when the rotation happens, but where do I actually get to change its duration value?
If you want to change the overall timing of the app's rotation, it can't be done. willAnimateSecondHalfOfRotationFromInterfaceOrientation: is meant for adding custom code that runs during the second half of the rotation, like setting custom coordinates, properties, and things like that.
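For example, that custom code might look something like this (a sketch only; toolbarView and the frame values are made-up placeholders):

// Reposition a view for the new orientation from inside the second-half callback.
- (void)willAnimateSecondHalfOfRotationFromInterfaceOrientation:(UIInterfaceOrientation)fromInterfaceOrientation
                                                       duration:(NSTimeInterval)duration
{
    if (UIInterfaceOrientationIsPortrait(fromInterfaceOrientation)) {
        // rotating away from portrait: hypothetical landscape frame
        toolbarView.frame = CGRectMake(0, 256, 480, 44);
    } else {
        // rotating away from landscape: hypothetical portrait frame
        toolbarView.frame = CGRectMake(0, 416, 320, 44);
    }
}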
How can I implement a countdown timer and a score counter in Cocos2D, with both displayed on the screen?
I am still new to this
Please help me out
Thanks a lot
You should have a look at the Timer class. Here is a link to the Timer class overview - cocos2d for iPhone 0.8.2. You won't need to use one directly (only indirectly), but it's good to know how it works.
For displaying the actual counter on the screen using cocos2d, have a look at LabelAtlas. Here is a link to the LabelAtlas class overview - cocos2d for iPhone 0.8.2:
/* Basically you create a LabelAtlas subclass that has a method for
   updating itself with the current time or score. I will call it
   ScoreLabel in this example. */
ScoreLabel *bonusScore = [[ScoreLabel alloc] initWithString:@"0"
                                                charMapFile:@"charMap.png"
                                                  itemWidth:10
                                                 itemHeight:10
                                               startCharMap:'0'];

// Then schedule a method to be called that updates the current score.
// schedule: takes a parameter of type SEL (the method to be scheduled);
// interval is in seconds and determines how often the method is called.
[bonusScore schedule:@selector(updateScore) interval:0.5];

/* The implementation of the updateScore method could be: */
-(void)updateScore {
    [self setString:[[World sharedWorld] timeLeft]];
}
Have a look at the cocos2d examples for AtlasSprite (and for LabelAtlas) to see how to create the character-map image you need to support the class.
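The countdown itself can be driven the same way: schedule a one-second tick on your layer and push the remaining time into the label. A rough sketch (timeLeft and timeLabel are illustrative names on your own layer, not cocos2d API):

// The layer schedules a tick every second, decrements the remaining time
// and writes it into a LabelAtlas-based label.
- (void)startCountdown
{
    timeLeft = 60;
    [self schedule:@selector(countdownTick:) interval:1.0f];
}

- (void)countdownTick:(ccTime)dt
{
    timeLeft--;
    [timeLabel setString:[NSString stringWithFormat:@"%d", timeLeft]];
    if (timeLeft <= 0) {
        [self unschedule:@selector(countdownTick:)];
        // game-over handling goes here
    }
}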
I'm using OpenAL on iPhone to play multiple audio samples simultaneously.
Can I get OpenAL to notify me when a single sample is done playing?
I'd like to avoid hardcoding the sample length and setting a timer.
I didn't have much luck with callbacks in OpenAL. In my state machines, I simply poll the source and delay the transition until it's done.
- (BOOL)playing {
    ALint sourceState;
    alGetSourcei(sourceID, AL_SOURCE_STATE, &sourceState);
    return sourceState == AL_PLAYING;
}

// ... //

case QSTATE_DYING:
    if (![audioSource playing])
        [self transitionTo:QSTATE_DEAD];
If this isn't what you need, then your best bet is probably a timer. You shouldn't need to hardcode any values; you can determine the playback time when you're populating your buffers.
A bit of insight into the "why" of the question might offer some additional choices.
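To expand on determining the playback time from the buffers themselves, something along these lines should work (a sketch; bufferID is whatever ALuint you filled with alBufferData):

// Compute a buffer's duration from its OpenAL attributes instead of hardcoding it.
- (NSTimeInterval)durationOfBuffer:(ALuint)bufferID
{
    ALint sizeInBytes = 0, channels = 0, bits = 0, frequency = 0;
    alGetBufferi(bufferID, AL_SIZE, &sizeInBytes);
    alGetBufferi(bufferID, AL_CHANNELS, &channels);
    alGetBufferi(bufferID, AL_BITS, &bits);
    alGetBufferi(bufferID, AL_FREQUENCY, &frequency);

    ALint bytesPerSample = channels * (bits / 8);
    ALint totalSamples = sizeInBytes / bytesPerSample;
    return (NSTimeInterval)totalSamples / (NSTimeInterval)frequency;
}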
If you have the OpenAL source abstracted into a class, I guess you can simply call performSelector:withObject:afterDelay: when you start the sound:
- (void)play
{
    [delegate performSelector:@selector(soundHasFinishedPlaying)
                   withObject:nil
                   afterDelay:self.length];
    …
}
(If you stop the sound manually in the meantime, the callback can be cancelled; see the NSObject class reference.) Or you can poll the AL_SOURCE_STATE:
- (void)checkStatus
{
    ALint state;
    alGetSourcei(source, AL_SOURCE_STATE, &state);
    if (state == AL_PLAYING)
        return;
    [timer invalidate];
    [delegate soundHasFinishedPlaying];
}
I don’t know how to have OpenAL call you back. What exactly do you want the callback for? Some things can be solved better without a callback.
This OpenAL guide suggests a possible solution:
The 'stream' function also tells us if the stream is finished playing.
...and provides sample source code to illustrate the usage.
Wait, are you talking about having finished one sample (e.g., 1/44100 of a second for 44.1 kHz audio)? Or are you talking about knowing that a source has played through its buffer and has no more audio to play?
For the latter, I've had good results polling a source for the AL_BUFFERS_PROCESSED property when I stream buffers to a source; it might work for the single-buffer case to look for a non-zero value of this property.
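As a sketch of that polling approach (sourceID and queuedBufferCount are assumptions standing in for your own streaming state):

// Poll how many of the queued buffers the source has finished with, and
// whether it is still playing.
- (BOOL)sourceHasFinishedPlaying
{
    ALint processed = 0;
    ALint state = 0;
    alGetSourcei(sourceID, AL_BUFFERS_PROCESSED, &processed);
    alGetSourcei(sourceID, AL_SOURCE_STATE, &state);
    // Done when every queued buffer has been processed and playback has stopped.
    return (processed == queuedBufferCount) && (state != AL_PLAYING);
}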