How do you display song lyrics in Karaoke style on the iPhone?

I am currently creating an app that plays music. I would like to add a feature that shows the lyrics while the music plays, with the current position in the text marked to match the current position in the song: the bouncing-ball effect you see on every karaoke screen as the song plays.
I have been looking into extending my caf files, adding "string chunks" and then reading them out again. Is this the correct way to do this? Does anybody know of a better/easier/normal way to achieve my goal?
As I am not sure how I would synchronize everything, I would be happy for any suggestions, code examples or helpful comments. Maybe somebody has done this before and would be happy to advise me.
Thanks in advance for any information offered.
Alan

Most karaoke apps these days use the .kar file extension. It's a slightly butchered MIDI file with some annotations, and it's pretty small in file size.
I have no idea where you'd start to read it though. You could ask these guys: http://www.ikaraokeapp.com/
Most small indie devs are happy to help others out.

The standard way to do this is to use the .kar file extension, as suggested in the other answer. Here are some more details.
The .kar extension just signifies that a standard MIDI file contains lyric information. To build your karaoke player you need to do two things: play the instrument part of the MIDI file and display the lyrics.
This can be achieved in two ways. The first is to use the built-in MusicPlayer to play the MIDI file and then access the lyric information from a virtual endpoint: here
You would need to access MIDI meta messages with codes 0x01 (text) or 0x05 (lyric). These messages store the lyric data as a sequence of bytes, each byte being an ASCII character.
This approach allows you to play the MIDI file and access the lyrics. The problem is that you need to be able to display the lyrics ahead of time so the user can read them before they're played.
To do this you could instead parse the MIDI file manually: here
You would need to loop over the MIDI file and extract the text lyrics with their timestamps. Then you could display a section of lyrics ahead of time and change the colour, or do a bouncing-ball effect, as each lyric becomes due.
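If you go the manual-parsing route, the AudioToolbox MusicSequence/MusicEventIterator API can do most of the heavy lifting. A rough sketch in Objective-C (the midiURL variable and the lyrics array are placeholders, and error checking is omitted):

#import <AudioToolbox/AudioToolbox.h>

NSMutableArray *lyrics = [NSMutableArray array];   // will hold (seconds, text) pairs

MusicSequence sequence;
NewMusicSequence(&sequence);
MusicSequenceFileLoad(sequence, (__bridge CFURLRef)midiURL, kMusicSequenceFile_MIDIType, 0);

UInt32 trackCount = 0;
MusicSequenceGetTrackCount(sequence, &trackCount);

for (UInt32 i = 0; i < trackCount; i++) {
    MusicTrack track;
    MusicSequenceGetIndTrack(sequence, i, &track);

    MusicEventIterator iterator;
    NewMusicEventIterator(track, &iterator);

    Boolean hasCurrent = false;
    MusicEventIteratorHasCurrentEvent(iterator, &hasCurrent);
    while (hasCurrent) {
        MusicTimeStamp beats = 0;
        MusicEventType type = 0;
        const void *data = NULL;
        UInt32 dataSize = 0;
        MusicEventIteratorGetEventInfo(iterator, &beats, &type, &data, &dataSize);

        if (type == kMusicEventType_MIDIMetaEvent) {
            const MIDIMetaEvent *meta = (const MIDIMetaEvent *)data;
            if (meta->metaEventType == 0x01 || meta->metaEventType == 0x05) {
                NSString *text = [[NSString alloc] initWithBytes:meta->data
                                                          length:meta->dataLength
                                                        encoding:NSASCIIStringEncoding];
                Float64 seconds = 0;   // convert beat time to wall-clock time
                MusicSequenceGetSecondsForBeats(sequence, beats, &seconds);
                if (text) [lyrics addObject:@[ @(seconds), text ]];
            }
        }
        MusicEventIteratorNextEvent(iterator);
        MusicEventIteratorHasCurrentEvent(iterator, &hasCurrent);
    }
    DisposeMusicEventIterator(iterator);
}

Sorting the pairs by time then gives you the full lyric sheet to lay out ahead of playback.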

You should be able to do this through the MusicPlayer API, specifically the MusicTrack MIDIMetaEvent, assuming MIDI-type music is acceptable.
MIDIMetaEvent
Describes a MIDI metaevent such as lyric text, time signature, and so on.
typedef struct MIDIMetaEvent {
    UInt8  metaEventType;
    UInt8  unused1;
    UInt8  unused2;
    UInt8  unused3;
    UInt32 dataLength;
    UInt8  data[1];
} MIDIMetaEvent;
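To drive the highlighting from the MusicPlayer side, you can poll the player's current time and look it up in a parsed lyric list. A minimal sketch, assuming _player, _sequence and _lyrics ivars (an array of (seconds, text) pairs such as the one built in the parsing sketch earlier) and your own highlightLyricAtIndex: method:

- (void)startKaraoke {
    NewMusicPlayer(&_player);
    MusicPlayerSetSequence(_player, _sequence);   // the MusicSequence loaded from the .kar file
    MusicPlayerPreroll(_player);
    MusicPlayerStart(_player);
    // Redraw often enough to get a smooth bouncing ball.
    [NSTimer scheduledTimerWithTimeInterval:1.0 / 30.0
                                     target:self
                                   selector:@selector(updateKaraokeDisplay:)
                                   userInfo:nil
                                    repeats:YES];
}

- (void)updateKaraokeDisplay:(NSTimer *)timer {
    MusicTimeStamp beats = 0;
    MusicPlayerGetTime(_player, &beats);
    Float64 seconds = 0;
    MusicSequenceGetSecondsForBeats(_sequence, beats, &seconds);
    // Find the most recent lyric whose time has passed and highlight it.
    for (NSInteger i = (NSInteger)_lyrics.count - 1; i >= 0; i--) {
        if ([_lyrics[i][0] doubleValue] <= seconds) {
            [self highlightLyricAtIndex:i];   // your own label/attributed-string drawing code
            break;
        }
    }
}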
Alternately, you might also make some headway by looking at ID3 metadata for storing lyrics and timing info.
Either way, there will be lots of file prep in getting the lyrics and timing into the music files.

Related

Is it possible to record the audio that comes out of the iPhone?

I am working on an app that allows the user to create a sort of dub. There is an audio file playing, and the user can tap at certain moments to insert sound (kind of like a censor button.) I'm wondering how to go about capturing the final product.
Capturing audio directly from the iPhone seems the easiest route, as the user already hears the finished product as it is made. However, I can't find anything on how to do this. If not possible, are there any suggestions?
The best way would probably be to use the AV Foundation framework for mixing and buffering the audio, as well as playing it. This allows for a high level of abstraction while guaranteeing that the played-back and saved audio are identical.
Apart from that, from a "how can I achieve this with minimum code" perspective, the question is far too broad and/or opinion-based without more information about your setup.
You will have to work with buffers. I don't know off-hand how it is done in Swift, but you can implement it in Obj-C and then bridge it.
You can refer to these answers here on Stack Overflow (they are a bit old):
https://stackoverflow.com/a/11218339/2683201
https://stackoverflow.com/a/10101877/2683201
and a project also exists (but it is in Obj-C):
https://github.com/alexbw/novocaine
Mainly, the idea for your case would be to have two separate buffers plus your sound effect.
Then you would play from buffer A (your music) and copy the played data into buffer B (the final output), except while you are playing the effect, in which case you copy the effect data into buffer B instead.
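If an AV Foundation level of abstraction is acceptable (iOS 8 and later), AVAudioEngine can do that buffer copying for you: play the music and the effect through the mixer, and install a tap on the mixer that writes whatever is actually heard to a file. A rough sketch; musicURL, effectURL and outputURL are placeholders (outputURL should point at a .caf file for these LPCM settings) and error handling is minimal:

#import <AVFoundation/AVFoundation.h>

NSError *error = nil;
AVAudioEngine *engine = [[AVAudioEngine alloc] init];
AVAudioPlayerNode *musicNode = [[AVAudioPlayerNode alloc] init];
AVAudioPlayerNode *effectNode = [[AVAudioPlayerNode alloc] init];
[engine attachNode:musicNode];
[engine attachNode:effectNode];

AVAudioFile *musicFile = [[AVAudioFile alloc] initForReading:musicURL error:&error];
AVAudioFile *effectFile = [[AVAudioFile alloc] initForReading:effectURL error:&error];
[engine connect:musicNode to:engine.mainMixerNode format:musicFile.processingFormat];
[engine connect:effectNode to:engine.mainMixerNode format:effectFile.processingFormat];

// Tap the mixer: every buffer that reaches the speaker is also written to disk.
AVAudioFormat *mixFormat = [engine.mainMixerNode outputFormatForBus:0];
AVAudioFile *output = [[AVAudioFile alloc] initForWriting:outputURL
                                                 settings:mixFormat.settings
                                                    error:&error];
[engine.mainMixerNode installTapOnBus:0
                           bufferSize:4096
                               format:mixFormat
                                block:^(AVAudioPCMBuffer *buffer, AVAudioTime *when) {
    [output writeFromBuffer:buffer error:nil];
}];

[engine startAndReturnError:&error];
[musicNode scheduleFile:musicFile atTime:nil completionHandler:nil];
[musicNode play];

// Later, when the user taps the censor button:
[effectNode scheduleFile:effectFile atTime:nil completionHandler:nil];
[effectNode play];

Since AVAudioPlayerNode adopts AVAudioMixing, you can also drop musicNode.volume to 0 while the effect plays if you want true censoring rather than an overlay.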
Other option is to do it offline:
Play your music (or audio) and keep a timer running synced with the elapsed time of your "to be censored audio".
Save the timestamp of when you start and end tapping the censor button (for example).
Overlap buffer A with your effect in those recorded (start-end) timestamps.
Save the buffer as a file (or do whatever you need to do with it)
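A rough offline sketch of those steps using AVAudioFile and AVAudioPCMBuffer (it assumes the whole song fits comfortably in memory, that both files share the music file's sample rate, and that musicURL, effectURL and outputURL are placeholders):

NSError *error = nil;
AVAudioFile *musicFile = [[AVAudioFile alloc] initForReading:musicURL error:&error];
AVAudioFile *effectFile = [[AVAudioFile alloc] initForReading:effectURL error:&error];

AVAudioFormat *format = musicFile.processingFormat;   // deinterleaved float
AVAudioPCMBuffer *music = [[AVAudioPCMBuffer alloc] initWithPCMFormat:format
                                                        frameCapacity:(AVAudioFrameCount)musicFile.length];
[musicFile readIntoBuffer:music error:&error];
AVAudioPCMBuffer *effect = [[AVAudioPCMBuffer alloc] initWithPCMFormat:effectFile.processingFormat
                                                         frameCapacity:(AVAudioFrameCount)effectFile.length];
[effectFile readIntoBuffer:effect error:&error];

// One censored region recorded from the button presses (seconds -> frames).
double startSeconds = 3.0, endSeconds = 4.5;   // example timestamps
AVAudioFrameCount start = (AVAudioFrameCount)(startSeconds * format.sampleRate);
AVAudioFrameCount end = (AVAudioFrameCount)(endSeconds * format.sampleRate);

for (AVAudioChannelCount ch = 0; ch < format.channelCount; ch++) {
    float *dst = music.floatChannelData[ch];
    const float *src = effect.floatChannelData[ch % effect.format.channelCount];
    for (AVAudioFrameCount i = start; i < end && i < music.frameLength; i++) {
        dst[i] = src[(i - start) % effect.frameLength];   // loop the effect if it is shorter
    }
}

AVAudioFile *output = [[AVAudioFile alloc] initForWriting:outputURL
                                                 settings:format.settings
                                                    error:&error];
[output writeFromBuffer:music error:&error];

Repeat the overwrite loop for each recorded (start, end) pair before writing the output.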
UPDATE:
You should take a look at Apple's implementation of something like this:
https://developer.apple.com/library/ios/samplecode/AVAEMixerSample/Introduction/Intro.html

How to display song lyric line by line while playing a song in iPhone? [duplicate]


How can i detect CODEC in MPMoviePlayerController in iphone-sdk

When a video is encoded with the Sorenson codec, MPMoviePlayerController just plays the audio (and not the video). Instead, I want to show my own error message at that point. How can I detect programmatically which codec a particular file uses?
EDIT: I am not using QuickTime in my code, so that solution won't work.
Thanks
Check this documentation to understand the QuickTime file format:
http://developer.apple.com/library/mac/documentation/QuickTime/QTFF/qtff.pdf
The field you are looking for is the "vfmt" code that contains the video fourcc code (there is one for each video track in your file, so take care if your file contains several video tracks). The fourcc codes for the Sorenson codec are "SVQ1" and "SVQ3".
Now you'll have to write some code to parse the QT file to find the correct atom, extract the "vfmt" value and compare it to SVQ1/SVQ3!
Apple provides some classes to easily parse QuickTime files, but they are only available on Mac OS, not on iOS!
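One way to dig the per-track format code out yourself is to walk the atom tree (moov > trak > mdia > minf > stbl > stsd); each sample description entry starts with its size followed by the codec fourcc. A hedged sketch that simply returns the first sample description it finds (a stricter parser would first check the track's 'hdlr' atom for a 'vide' handler, and would handle 64-bit atom sizes):

#import <libkern/OSByteOrder.h>

// Atoms: 4-byte big-endian size, 4-byte type, then the payload.
// 'stsd' payload: 1 byte version + 3 bytes flags + 4-byte entry count,
// then entries of (4-byte size, 4-byte format fourcc, ...).
static NSString *FirstSampleDescriptionFormat(NSData *data, NSUInteger start, NSUInteger end) {
    NSSet *containers = [NSSet setWithObjects:@"moov", @"trak", @"mdia", @"minf", @"stbl", nil];
    const uint8_t *bytes = data.bytes;
    NSUInteger offset = start;
    while (offset + 8 <= end) {
        uint32_t size = OSReadBigInt32(bytes, offset);
        NSString *type = [[NSString alloc] initWithBytes:bytes + offset + 4 length:4
                                                encoding:NSMacOSRomanStringEncoding];
        if (size < 8 || offset + size > end) break;          // malformed or 64-bit size: give up

        if ([type isEqualToString:@"stsd"] && size >= 24) {
            // Skip version/flags (4) + entry count (4) + entry size (4); next 4 bytes = fourcc.
            return [[NSString alloc] initWithBytes:bytes + offset + 20 length:4
                                          encoding:NSMacOSRomanStringEncoding];
        }
        if ([containers containsObject:type]) {              // descend into container atoms
            NSString *found = FirstSampleDescriptionFormat(data, offset + 8, offset + size);
            if (found) return found;
        }
        offset += size;
    }
    return nil;
}

// Usage (loading the whole file into memory is fine for short clips):
NSData *movieData = [NSData dataWithContentsOfURL:movieURL];
NSString *codec = FirstSampleDescriptionFormat(movieData, 0, movieData.length);
BOOL isSorenson = [codec isEqualToString:@"SVQ1"] || [codec isEqualToString:@"SVQ3"];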

Is it possible to play a video on iPhone and have a subtitles synchronized with it show up?

I want to add a "subtitles" to a video played in an iPhone app. I don't want those subtitles encoded into the video itself - ideally I'd love to have a view showing the video (with pause, play, volume and such standard controls) together with a view displaying the text that changes together with movie time changing.
If I drew it, it would look roughly like a movie view with a text label underneath that changes as the movie plays.
So, basically, I would need a way to get a method called while the movie is playing, and then synchronize the text displayed in the label with the movie timing.
Anyone used a solution that was able to do it?
I've recently done something that syncs graphics to times in an audio track. The way I did it was by using the currentPlaybackTime property of the MPMediaPlayback interface (which the movie player controller should also conform to). This returns the seconds elapsed in the media as a double (typedef'ed as NSTimeInterval). The actual synchronisation in my app was not done with notifications, as I couldn't find any resembling a "tick"; instead I created a timer that called a function which queried currentPlaybackTime and updated the graphics based on it.
In terms of your implementation, I would assume you have some kind of system for associating label text (subtitles) with a particular time. You could then compare the text's time range with the time returned from currentPlaybackTime to find the correct text to display.
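A minimal sketch of that timer-based approach (the subtitles array of start/end/text dictionaries, the moviePlayer, subtitleTimer and subtitleLabel properties are illustrative names, not framework API):

// self.moviePlayer is an MPMoviePlayerController; self.subtitles is an array of
// @{ @"start": seconds, @"end": seconds, @"text": line } dictionaries.
- (void)startSubtitleTimer {
    self.subtitleTimer = [NSTimer scheduledTimerWithTimeInterval:0.1
                                                          target:self
                                                        selector:@selector(syncSubtitle:)
                                                        userInfo:nil
                                                         repeats:YES];
}

- (void)syncSubtitle:(NSTimer *)timer {
    NSTimeInterval now = self.moviePlayer.currentPlaybackTime;
    NSString *current = @"";
    for (NSDictionary *line in self.subtitles) {
        if (now >= [line[@"start"] doubleValue] && now <= [line[@"end"] doubleValue]) {
            current = line[@"text"];
            break;
        }
    }
    self.subtitleLabel.text = current;   // UILabel sitting below the movie view
}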

Is it possible to play a movie file in a UIView layer?

If yes, what movie format has the best performance? And what would a simple setup look like? I have some views, and I want to play a short movie inside a view (not fullscreen). The movie is about 5 seconds long.
Looks like the system frameworks only support playing video full screen with the MPMoviePlayerController. Supported formats are basically flavors of H.264 and MPEG-4; more in the documentation.
Theoretically, you might be able to roll your own decoding and playback code, but I doubt you'd get acceptable performance. (And most of the Open Source media player examples I can think of are GPL. Not that I imagine they'd fare much better.)
If it's only 5 seconds long, you can fake it by playing the audio file in the background and running an animated UIImageView at the same time. Lots of apps do this, like the ~50 or so baby sign language ones.
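A quick sketch of that trick (the frame image names, audio file name, frame count and the clipAudio property are placeholders; keep the frames small, since they are all held in memory at once):

#import <AVFoundation/AVFoundation.h>

NSMutableArray *frames = [NSMutableArray array];
for (int i = 1; i <= 50; i++) {   // e.g. 50 frames for a 5 second clip at 10 fps
    [frames addObject:[UIImage imageNamed:[NSString stringWithFormat:@"clip_frame_%02d.png", i]]];
}

UIImageView *clipView = [[UIImageView alloc] initWithFrame:CGRectMake(20, 40, 280, 160)];
clipView.animationImages = frames;
clipView.animationDuration = 5.0;    // match the clip length
clipView.animationRepeatCount = 1;   // play once
[self.view addSubview:clipView];

NSError *error = nil;
NSURL *audioURL = [[NSBundle mainBundle] URLForResource:@"clip_audio" withExtension:@"caf"];
self.clipAudio = [[AVAudioPlayer alloc] initWithContentsOfURL:audioURL error:&error];   // keep a strong reference

[self.clipAudio play];
[clipView startAnimating];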