CocosDenshion::SimpleAudioEngine::getInstance()->preloadEffect("Sounds/Hit.mp3");
CocosDenshion::SimpleAudioEngine::getInstance()->preloadEffect("Sounds/Point.mp3");
CocosDenshion::SimpleAudioEngine::getInstance()->preloadEffect("Sounds/Wing.mp3");
I am making this game clone with cocos2d-x version 4.0. The cocos2d-x v4 documentation says I can use SimpleAudioEngine, but when I try this code and build it, I get "SimpleAudioEngine unidentified". How can I change this to AudioEngine? Thanks for your answer and for reading.
You can try:
cocos2d::AudioEngine::preload("Sounds/Hit.mp3");
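For a fuller picture, here is roughly what the v4 replacement for your three preload calls plus playback looks like (a sketch; the include path is the one used by the stock cocos2d-x 4.0 template, adjust it if your project lays the engine out differently):

#include "audio/include/AudioEngine.h"  // AudioEngine header in cocos2d-x 4.0

// Preload the effects once, e.g. in your scene's init():
cocos2d::AudioEngine::preload("Sounds/Hit.mp3");
cocos2d::AudioEngine::preload("Sounds/Point.mp3");
cocos2d::AudioEngine::preload("Sounds/Wing.mp3");

// Later, play an effect; play2d returns an audio ID you can use to stop it:
int hitId = cocos2d::AudioEngine::play2d("Sounds/Hit.mp3");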
I want to make a turret that shoots lasers, and I watched a tutorial on it. The person in the video used (scenename).position(get.node("Position2D").get_global_pos()) in order to spawn a projectile at the Position2D. When I run this code, the game freezes and the debugger says "Invalid call. Nonexistent function 'get_global_pos' in base 2D". In the comments someone said that the tutorial uses Godot 2.0 and that in 3.0 you don't use get_ and you write position instead of pos, so I tried changing get_global_pos to global_position, but it didn't work. I am very new to this, so if you need any further information please let me know.
As you already found out, global_position is the equivalent of get_global_pos.
The difference is that global_position is not a function but a property.
So instead of:
(scenename).position(get.node("Position2D").get_global_pos())
you have to write:
(scenename).position = get_node("Position2D").global_position
(Note that the method is get_node, not get.node.)
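If it helps, a typical Godot 3.x way to spawn a projectile at the Position2D marker looks roughly like this (a sketch; the scene path and node names are assumptions based on your description):

# Assumed laser scene; adjust the path to match your project.
var Laser = preload("res://Laser.tscn")

func shoot():
    var laser = Laser.instance()
    get_parent().add_child(laser)
    # Place the new instance at the Position2D marker's global position.
    laser.global_position = $Position2D.global_position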
I've been looking through the Swift documentation for a way to save the audio output of an AVAudioEngine, but I couldn't find any useful tips.
Any suggestions?
Solution
I found a way around this thanks to matt's answer.
Here is some sample code showing how to save audio after passing it through an AVAudioEngine (technically, the tap captures it just before the output):
// newAudio is the file we will save the changed audio into (its url is assumed to have been
// prepared elsewhere); we reopen it here for writing so it can be filled with the new buffers.
newAudio = AVAudioFile(forWriting: newAudio.url, settings: nil, error: NSErrorPointer())

var audioPlayerNode = AVAudioPlayerNode() // or your time-pitch unit, if the pitch was changed

// Now install a tap on the output bus to "record" the transformed audio into our newAudio file.
// (opffb is the processing format of that bus, e.g. audioPlayerNode.outputFormatForBus(0).)
audioPlayerNode.installTapOnBus(0, bufferSize: AVAudioFrameCount(audioPlayer.duration), format: opffb) {
    (buffer: AVAudioPCMBuffer!, time: AVAudioTime!) in
    if self.newAudio.length < self.audioFile.length { // stop once the whole source has been written, otherwise we keep saving forever
        self.newAudio.writeFromBuffer(buffer, error: NSErrorPointer()) // write the buffered result into our file
    } else {
        audioPlayerNode.removeTapOnBus(0) // if we don't remove the tap, it keeps tapping forever
        println("Did you like it? Please, vote up for my question")
    }
}
Hope this helps!
One issue to solve:
Sometimes your output is shorter than the input: if you speed the playback rate up by 2, the audio will be half as long. This is the issue I'm facing right now, since my condition for saving the file is the length check shown above:
if newAudio.length < self.audioFile.length // audioFile being the original (long) audio and newAudio being the new, changed (shorter) audio
Any help here?
Yes, it's quite easy. You simply put a tap on a node and save the buffer into a file.
Unfortunately this means you have to play through the node. I was hoping that AVAudioEngine would let me process one sound file into another directly, but apparently that's impossible - you have to play and process in real time.
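For anyone reading this with a current SDK, the same tap-and-write idea in modern Swift looks roughly like this (a minimal sketch; the configured engine and the writable outputURL are assumed to exist already):

import AVFoundation

// Tap the engine's main mixer and append every buffer to a file.
let format = engine.mainMixerNode.outputFormat(forBus: 0)
let outputFile = try AVAudioFile(forWriting: outputURL, settings: format.settings)

engine.mainMixerNode.installTap(onBus: 0, bufferSize: 4096, format: format) { buffer, _ in
    do {
        try outputFile.write(from: buffer)  // append the tapped buffer to the file
    } catch {
        print("Failed to write buffer: \(error)")
    }
}

// Play your nodes, then call engine.mainMixerNode.removeTap(onBus: 0) when you are done.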
Offline rendering worked for me, using the GenericOutput AudioUnit. Please check this link; I have mixed two or three audio files offline and combined them into a single file. It is not the same scenario, but it may give you some ideas: core audio offline rendering GenericOutput
I'm having trouble compiling the current release.
I was able to download a copy of the source distribution today using:
hg clone https://core-plot.googlecode.com/hg/ core-plot
I opened the "core-plot/framework" folder.
I then double clicked on CorePlot-CocoaTouch.xcodeproj to launch Xcode.
When I build the project I get the following error:
-(void)bind:(NSString *)binding toObject:(id)observable withKeyPath:(NSString *)keyPath options:(NSDictionary *)options
{
#if TARGET_IPHONE_SIMULATOR || TARGET_OS_IPHONE
    [NSException raise:CPException format:BindingsNotSupportedString];
    // warning on the line above: "Format not a string literal and no formal arguments"
#else
    [super bind:binding toObject:observable withKeyPath:keyPath options:options];
#endif
}
I am running on a new MacBook with OS X 10.6 and the iPhone Simulator 4.0.
Any help will be greatly appreciated.
Charles
A more appropriate place to ask this question would be the Core Plot mailing list, because I'm one of the few developers for the project that regularly visits here.
That said, the issue here is that we're using a string constant for a format string, which Xcode now seems to be warning about (rightly so, as this can lead to problems). To work around this for now, you can replace the line in CPLayer.m
static NSString * const BindingsNotSupportedString = @"Bindings are not supported on the iPhone in Core Plot";
with
#define BindingsNotSupportedString @"Bindings are not supported on the iPhone in Core Plot"
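Another way to keep the string constant and still satisfy the compiler (a sketch on my part, not part of the original workaround) is to pass it through a literal format string:

[NSException raise:CPException format:@"%@", BindingsNotSupportedString];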
Search your project for BindingsNotSupportedString; it seems it is not defined in the current file and therefore needs to be included. Or just change it to an acceptable format string.
An Apple guy tried to be funny and wrote in the docs:
("Headphone," "Speaker," etc.)
What kind of return values are possible in reality?
I ran 'strings' on the CoreMedia framework (iOS 4.2 SDK), and the following strings seem reasonable and are grouped together:
ReceiverAndMicrophone
HeadsetInOut
HeadphonesAndMicrophone
SpeakerAndMicrophone
HeadsetBT
LineInOut
Default
Command was:
strings -a -o CoreMedia | less
# CoreMedia is from /Developer/Platforms/iPhoneOS.platform/Developer \
# /SDKs/iPhoneOS4.2.sdk/System/Library/Frameworks/CoreMedia.framework
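If you want to see which of these your own device reports at runtime, the value comes from the audio route property. A quick check (a sketch; it assumes the audio session has already been initialized and made active) looks like:

#import <AudioToolbox/AudioToolbox.h>

CFStringRef route = NULL;
UInt32 size = sizeof(route);
OSStatus status = AudioSessionGetProperty(kAudioSessionProperty_AudioRoute, &size, &route);
if (status == noErr && route != NULL) {
    NSLog(@"Current audio route: %@", (__bridge NSString *)route);  // e.g. "Headphone", "Speaker", "SpeakerAndMicrophone", ...
}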
He wasn't being funny; those are actual values. The only one I've seen that he didn't outline is "LineOut".
According to http://lists.apple.com/archives/coreaudio-api/2009/Jan/msg00084.html there are also LineOut, HeadsetInOut, ReceiverAndMicrophone, and HeadphonesAndMicrophone, but the person who asked whether there are more values received no answer.
I just got MicrophoneWired from it. (I actually have a special piece of hardware plugged in, a temperature probe, but we use it through the headphone jack.)
Then I got MicrophoneBuiltIn with nothing plugged in. This is on an iPod touch running 4.3, by the way.
The values provided by l8nite above are reserved for when your audio session is configured for both input and output. Other values are used when you're only doing audio output (I used the same trick as l8nite - thanks!):
LineOut
HeadphonesBT (used for Bluetooth audio output; observed when hooked up via Bluetooth to a car audio system)
AirTunes (used for AirPlay output)
How is HeadphonesBT different from HeadsetBT? My app could successfully use the HeadsetBT device to send and receive audio, while HeadphonesBT failed to do anything. This is on iOS 6.
No matter what I try (Build Action set to Content, NSUrl, plain filename), I get a 'null exception': file not found when I try to play a .caf sound file in MonoTouch.
//var path = NSBundle.MainBundle.PathForResource("MatchGame", "caf");
//var gameSong = SystemSound.FromFile( new NSUrl(path, false));
var gameSong = SystemSound.FromFile("MatchGame.caf");
gameSong.PlaySystemSound();
I also tried combinations using the folder name ("images/MatchGame.caf") and moving MatchGame.caf into the root folder.
What am I missing? Thanks a lot.
Here is a link to a video of adding the sound in MonoTouch: http://www.screencast.com/t/MmE0ZmFh What is wrong?
Bryan,
From looking at your screencast, you are trying to play an mp3 file and not a caf file. Mp3 files are encoded differently and will not play with the SystemSound class (I can't remember whether you can do this in Obj-C or not).
You'll want to use the AVFoundation Namespace and AVAudioPlayer class.
using MonoTouch.AVFoundation;
var mediaFile = NSUrl.FromFilename("myMp3.mp3");
var audioPlayer = AVAudioPlayer.FromUrl(mediaFile);
audioPlayer.FinishedPlaying += delegate { audioPlayer.Dispose(); };
audioPlayer.Play();
You might need to tweak the code above - I don't have MonoDevelop to hand, but that should help you a little further.
Cheers,
ChrisNTR
You need the first line:
var path = NSBundle.MainBundle.PathForResource("MatchGame", "caf");
Then make sure that your audio file is included in the application by flagging the CAF file as "Content" in the Properties pane; otherwise the file is not copied into the resulting application package (your .app).
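Putting the pieces together, the working call ends up looking roughly like this (a sketch; it assumes MatchGame.caf sits at the project root with its Build Action set to Content):

var path = NSBundle.MainBundle.PathForResource("MatchGame", "caf");
var gameSong = SystemSound.FromFile(NSUrl.FromFilename(path));
gameSong.PlaySystemSound();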
You can follow the steps documented here (they are for images, but apply the same to audio files):
http://wiki.monotouch.net/HowTo/Images/Add_an_Image_to_your_Project
Additionally, this is a good resource for where you should store files:
http://wiki.monotouch.net/HowTo/Files/HowTo%3a_Store_Files