I'm making a game with Swift and Sprite Kit.
I would like to add background music and sound effects.
The "classic" way is to use AVFoundation, but it does not seem to have new Swift APIs, only Objective-C ones.
Is there a modern Swift sound API I can use easily with Sprite Kit?
No frameworks have new Swift APIs in iOS 8 or OS X Yosemite. All Objective-C frameworks are automatically bridged to Swift, and those include both SpriteKit and AVFoundation.
If you're looking for a quick high-level alternative to the more detailed AVFoundation APIs for adding sounds and music to your game, look no further than SpriteKit itself. (Regardless of whether you use Swift or Objective-C.)
SKAction provides a playSoundFileNamed:waitForCompletion: action. Pass false for the second parameter to play one-shot sound effects, or pass true and wrap the action in a repeating action to loop music.
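A minimal sketch of both uses in current Swift syntax; the file names "zap.wav" and "theme.mp3" are placeholders for whatever audio assets you bundle with your game:

```swift
import SpriteKit

class GameScene: SKScene {
    override func didMove(to view: SKView) {
        // One-shot effect: waitForCompletion: false means the action
        // completes immediately while the sound plays.
        run(SKAction.playSoundFileNamed("zap.wav", waitForCompletion: false))

        // Looping music: waitForCompletion: true makes each play last the
        // file's full duration, so repeatForever loops it back-to-back.
        let music = SKAction.playSoundFileNamed("theme.mp3", waitForCompletion: true)
        run(SKAction.repeatForever(music), withKey: "backgroundMusic")
    }
}
```

Running the music action with a key lets you stop it later via removeAction(forKey:).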
What are the possible methods to create an AR solution like this?
https://www.youtube.com/watch?v=9zD_YQBaZEY
You need an AR device, an AR SDK, and a 3D engine; for example, iPhone + ARKit + Unity (or Unreal).
However, the content shown in your YouTube link is unlikely to be real-time AR content; it looks like it was made with post-production software such as After Effects.
A quick search turns up this: tracking-3d-camera-movement, which I think is what you want.
I'm making an iOS video player using FFmpeg. The flow looks like this:
Video File---> [FFMPEG Decoder] --> decoded frames --> [a media director] --> /iphone screen (full and partial)/
The media director will handle rendering decoded video frames to the iOS UI (UIView, UIWindow, etc.), outputting audio samples to the iOS speaker, and thread management.
SDL is one such library, but SDL is mainly built for game development and doesn't seem very mature on iOS.
What can be the substitute for SDL?
On Mac OS X I used CoreImage/CoreVideo for this, decoding frames into a CVImageBuffer and rendering them into a CoreImage context. I'm not sure CoreImage contexts are supported on iOS, though. Maybe this thread will help: How to turn a CVPixelBuffer into a UIImage?
A better way on iOS might be to draw your frames with OpenGLES.
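For the CVPixelBuffer-to-UIImage route mentioned above, here is a hedged sketch (CIContext is in fact available on iOS); the helper name and the idea of caching one shared context are my own conventions, not from the linked thread:

```swift
import UIKit
import CoreImage
import CoreVideo

// Creating a CIContext is expensive, so reuse one for all frames.
let sharedContext = CIContext()

// Wrap a decoded CVPixelBuffer in a CIImage, render it to a CGImage,
// and hand back a UIImage suitable for display in a UIImageView.
func image(from pixelBuffer: CVPixelBuffer) -> UIImage? {
    let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
    guard let cgImage = sharedContext.createCGImage(ciImage, from: ciImage.extent) else {
        return nil
    }
    return UIImage(cgImage: cgImage)
}
```

For video frame rates, rendering through OpenGL ES (or nowadays Metal) avoids the CGImage round trip and is usually faster.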
SDL uses OpenGL and FFmpeg; you can come pretty close using FFmpeg together with Apple's native APIs. We've done it with several video players.
This should get you started:
https://github.com/mooncatventures-group
I'm trying to use the MPMediaPlayback protocol's currentPlaybackRate property to slow down a video. I'm confused, though, because the MPMoviePlayerController class documentation states that:
You can control most aspects of playback programmatically using the methods and properties of the MPMediaPlayback protocol, to which this class conforms.
Yet the header just above that statement, here: http://developer.apple.com/iphone/library/documentation/MediaPlayer/Reference/MPMoviePlayerController_Class/MPMoviePlayerController/MPMoviePlayerController.html, suggests it doesn't.
All I want to do is slow down the playback rate for a video.
That protocol is only available in iPhone OS 3.2 and later, which currently means iPad only.
You should check the Availability section on the newer classes' methods, since there are now two platforms.
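For completeness, a hedged sketch of the property in question, in Swift; note that MPMoviePlayerController has long since been deprecated in favor of AVPlayer, whose `rate` property serves the same purpose, and `movieURL` here is a placeholder for your video file:

```swift
import MediaPlayer

// Play a movie at half speed using the MPMediaPlayback protocol,
// to which MPMoviePlayerController conforms (iPhone OS 3.2+).
func playAtHalfSpeed(movieURL: URL) {
    let player = MPMoviePlayerController(contentURL: movieURL)
    player.play()
    // currentPlaybackRate: 1.0 is normal speed, 0.5 is half speed.
    player.currentPlaybackRate = 0.5
}
```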
Is it possible to load a short video file and - once loaded - select a specific frame and display that frame in a view? If there is no native support for this, how about an open source alternative?
Thanks in advance.
-Doug
I think that in iPhone programming you're stuck with the full-screen video solution proposed by Apple. You could write your own controller to do it differently, but I think it would be difficult to achieve good performance, and you'd be cut out of the App Store for sure.
edit:
It looks like in iPhone SDK 3.2 Apple added something for you:
The MPMoviePlayerController class defines an interface for managing the playback of a movie. Playback occurs either in full-screen mode or in a custom view that is vended by the movie player controller. You can incorporate the view into your own view hierarchies or use a MPMoviePlayerViewController object to manage the presentation for you.
and again
Behavior in iPhone OS 3.1 and Earlier
In iPhone OS 3.1 and earlier, this class implemented a full-screen movie player only. After creating the movie player and initializing it with a single movie file, you called the play method to present the movie. (The definition of the play method has since moved out of this class and into the MPMediaPlayback protocol.) The movie player object itself handled the actual presentation of the movie content.
I haven't tested it yet, but have a look at the official documentation under the MPMoviePlayerController Class Reference; it may help.
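For the original goal of showing a single frame in a view, MPMoviePlayerController can also decode one frame as a UIImage. A hedged sketch in Swift, where `movieURL` and `container` are placeholder names (this API was itself later deprecated in favor of AVAssetImageGenerator):

```swift
import MediaPlayer
import UIKit

// Decode one frame of a movie at the given time and display it in a view.
func showFrame(of movieURL: URL, atSecond time: TimeInterval, in container: UIView) {
    let player = MPMoviePlayerController(contentURL: movieURL)
    // thumbnailImage(at:timeOption:) decodes a single frame; .exact seeks
    // precisely to the requested time rather than the nearest keyframe.
    if let frame = player.thumbnailImage(at: time, timeOption: .exact) {
        container.addSubview(UIImageView(image: frame))
    }
}
```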
I am implementing a sound effect that plays while a user is dragging a UISlider.
Here is the IBAction: called by the UISlider's Value Changed event
-(IBAction)playTone4 {
    AudioServicesPlaySystemSound(soundID4);
}
I would like the sound to halt when the user is not dragging the slider but has not released it.
Is there a way to do that? There doesn't seem to be an AudioServicesStopSystemSound() function.
System sounds cannot be stopped.
See the iPhone Programming Guide, section "Multimedia Support", for more information.
To accomplish the desired effect, I would recommend using AVAudioPlayer or Audio Queues. The Programming Guide I mentioned covers everything you need to know about both techniques.
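A hedged sketch of the AVAudioPlayer route in Swift; unlike a system sound, an AVAudioPlayer can be stopped at any time. The file name "tone4.wav", the class name, and the choice of slider events to wire up are all placeholders:

```swift
import AVFoundation
import UIKit

class SliderSoundController: UIViewController {
    var player: AVAudioPlayer?

    override func viewDidLoad() {
        super.viewDidLoad()
        if let url = Bundle.main.url(forResource: "tone4", withExtension: "wav") {
            player = try? AVAudioPlayer(contentsOf: url)
            player?.numberOfLoops = -1   // loop for as long as the drag lasts
            player?.prepareToPlay()
        }
    }

    // Wire to the slider's Value Changed event: start the tone on first movement.
    @IBAction func sliderMoved(_ sender: UISlider) {
        if player?.isPlaying == false { player?.play() }
    }

    // Wire to Touch Up Inside / Touch Up Outside: halt the tone on release.
    @IBAction func sliderReleased(_ sender: UISlider) {
        player?.stop()
        player?.currentTime = 0
    }
}
```

Halting while the finger is down but stationary would additionally need a short timer that stops the player when no Value Changed events arrive for a moment.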