Set a ScheduledAudioFileRegion to start again - iOS 5

I am trying to play a sound when a button is tapped, using the iOS 5 file player audio unit.
The file player plays audio using a ScheduledAudioFileRegion and can be scheduled to play as many frames as needed:
ScheduledAudioFileRegion rgn;
memset (&rgn.mTimeStamp, 0, sizeof(rgn.mTimeStamp));
rgn.mTimeStamp.mFlags = kAudioTimeStampSampleTimeValid;
rgn.mTimeStamp.mSampleTime = 0;
rgn.mCompletionProc = NULL;
rgn.mCompletionProcUserData = NULL;
rgn.mAudioFile = audioFile;
rgn.mLoopCount = INT_MAX;
rgn.mStartFrame = 0;
rgn.mFramesToPlay = nPackets * fileASBD.mFramesPerPacket; // plays entire file.
How can I tell this file player to play the sound from the start whenever the button is pressed, or do I have to create a new region and memset it each time? I would like the sound to play from start to finish when the button is pressed, but when the button is tapped again it should start from the beginning, even if the file is currently playing. Is this possible with the file player audio unit?

Call AudioUnitReset to stop playback, then prime and play again. It happens instantly (or as near as makes no difference).
Do this whenever your user presses the button:
// Reset
AudioUnitReset(filePlayerUnit, kAudioUnitScope_Global, 0);
// Prime (you have to do this before every play call.)
UInt32 defaultVal = 0;
AudioUnitSetProperty(filePlayerUnit,
                     kAudioUnitProperty_ScheduledFilePrime,
                     kAudioUnitScope_Global,
                     0,
                     &defaultVal,
                     sizeof(defaultVal));
// Play (again)
AudioUnitSetProperty(filePlayerUnit,
                     kAudioUnitProperty_ScheduleStartTimeStamp,
                     kAudioUnitScope_Global,
                     0,
                     &startTime,
                     sizeof(startTime));
You already have timestamps in the region. You need those, but you also have to create an AudioTimeStamp for the schedule start time:
AudioTimeStamp startTime;
memset(&startTime, 0, sizeof(startTime));
startTime.mFlags = kAudioTimeStampSampleTimeValid;
startTime.mSampleTime = -1;
Create this, and make the associated AudioUnitSetProperty call from the first snippet above, before you call play the first time as well.
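Put together, the button handler ends up looking roughly like this. This is only a sketch using the filePlayerUnit and rgn names from above; I am also re-scheduling the region after the reset, since as far as I can tell the reset clears the unit's schedule (check the return codes in real code):
// Stop playback and clear the file player's schedule.
AudioUnitReset(filePlayerUnit, kAudioUnitScope_Global, 0);
// Re-schedule the region so playback starts at frame 0 again (the reset appears to clear the schedule).
AudioUnitSetProperty(filePlayerUnit,
                     kAudioUnitProperty_ScheduledFileRegion,
                     kAudioUnitScope_Global,
                     0,
                     &rgn,
                     sizeof(rgn));
// Prime.
UInt32 defaultVal = 0;
AudioUnitSetProperty(filePlayerUnit,
                     kAudioUnitProperty_ScheduledFilePrime,
                     kAudioUnitScope_Global,
                     0,
                     &defaultVal,
                     sizeof(defaultVal));
// Start as soon as possible (mSampleTime = -1 means "at the next render cycle").
AudioTimeStamp startTime;
memset(&startTime, 0, sizeof(startTime));
startTime.mFlags = kAudioTimeStampSampleTimeValid;
startTime.mSampleTime = -1;
AudioUnitSetProperty(filePlayerUnit,
                     kAudioUnitProperty_ScheduleStartTimeStamp,
                     kAudioUnitScope_Global,
                     0,
                     &startTime,
                     sizeof(startTime));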

Can you not just have your UIButton trigger a Stop action before it calls Play?

Related

White frame before video plays in Unity instead of a custom thumbnail

I load a thumbnail before the video starts to play, but when the video then starts, a white frame shows first before the video appears. How can I avoid this white frame?
Here is my code:
video.GetComponent<RawImage>().texture = thumbnailTex;
// Play the video:
RenderTexture rt = new RenderTexture(1920, 1080, 16, RenderTextureFormat.ARGB32);
rt.Create();
video.GetComponent<RawImage>().texture = rt;
video.GetComponent<VideoPlayer>().targetTexture = rt;
video.GetComponent<VideoPlayer>().url = "www....";
video.GetComponent<VideoPlayer>().Play();
// white frame, and then the video plays
You need to wait first and test whether the video is ready to play.
If it is not already, it would be better to have the above code in a coroutine. What is happening is that you call Play before the player has had a chance to download/load the first frame. Then display your render texture.
video.GetComponent<VideoPlayer>().url = "...";
video.GetComponent<VideoPlayer>().Prepare();
while (!video.GetComponent<VideoPlayer>().isPrepared)
    yield return new WaitForEndOfFrame();
video.GetComponent<VideoPlayer>().frame = 0; // just in case it's not at the first frame
video.GetComponent<VideoPlayer>().Play();
// now display your render texture
Thank you very much for this solution to avoid jumped frames while playing a video!
But there are some changes to make:
In the Start function, I call a coroutine:
StartCoroutine(PrepareVideoCoroutine());
In this coroutine, I put:
while (!gameOverGlassVideo.GetComponent<VideoPlayer>().isPrepared) {
    gameOverGlassVideo.GetComponent<VideoPlayer>().Prepare();
    yield return new WaitForEndOfFrame();
}
// Note: I put the line that prepares the video inside the loop, NOT outside it.
When I want to play the video afterwards, it is already prepared and there are no jumping frames!

Single AudioSource for multiple clips?

I have a couple of audio sources in a script, the idea being that one is used for effects and the other for music. I am noticing that if an effect plays at the same time as another one, it actually stops the previous one from playing, probably because they are playing from the same audio source.
My question is: do I really have to instantiate a new audio source every time I want to play a new sound, just to make sure that one sound will not stop a currently playing one? Is there a better way?
Here are the Awake method and the method that plays a sound effect in my class:
void Awake() {
audioSource = GetComponents<AudioSource>()[0];
audioEffectsSource = GetComponents<AudioSource>()[1];
audioSource.volume = PlayerPrefsManager.getMasterVolume();
audioEffectsSource.volume = PlayerPrefsManager.getSFXMasterVolume();
}
public void playSoundEffect(AudioClip clip)
{
if(clip != null)
{
audioEffectsSource.clip = clip;
audioEffectsSource.volume = PlayerPrefsManager.getSFXMasterVolume();
audioEffectsSource.PlayOneShot(clip);
}
}
The script is attached to a MusicManager GameObject with two AudioSources, one for sound effects and one for music tracks.

MovieTexture won't play audio

I'm trying to dynamically load and play a video file. No matter what I do, I cannot seem to figure out why the audio does not play.
var www = new WWW("http://unity3d.com/files/docs/sample.ogg");
var movieTexture = www.movie;
var movieAudio = www.movie.audioClip;
while (!movieTexture.isReadyToPlay) yield return 0;
// Assign movie texture and audio
var videoAnimation = videoAnimationPrefab.GetComponent<VideoAnimation>();
var videoRenderer = videoAnimation.GetVideoRenderer();
var audioSource = videoAnimation.GetAudioSource();
videoRenderer.material.mainTexture = movieTexture;
audioSource.clip = movieAudio;
// Play the movie and sound
movieTexture.Play();
audioSource.Play();
// Double check audio is playing...
Debug.Log("Audio playing: " + audioSource.isPlaying);
Every time I receive Audio playing: False
I've also tried using a GUITexture using this as a guide, but no dice. There are no errors displayed in the console.
What am I doing wrong that makes the audio never work?
Thanks in advance for any help!
Changed to:
while (!movieTexture.isReadyToPlay) yield return 0;
var movieAudio = movieTexture.audioClip;
Even though AudioClip inherits from Object, a call to movieTexture.audioClip seems to return a copy rather than a reference to the underlying object. So at the time I was assigning it, it had not been created yet, and I had to wait until the movie was "Ready to Play" before fetching the audioClip.

Why might my AudioQueueOutputCallback not be called?

I'm using the Audio Queue Services API to play audio streamed from a server over a TCP socket connection on an iPhone. I can play the buffers that were filled from the socket connection, I just cannot seem to make my AudioQueue call my AudioQueueOutputCallback function, and I'm out of ideas.
High level design
Data is passed to the player from the socket connection and written immediately into circular buffers in memory.
As AudioQueueBuffers become available, data is copied from the circular buffers into the available AudioQueueBuffer, which is immediately re-enqueued. (Or would be, if my callback happened.)
What happens
The buffers are all filled and enqueued successfully, and I hear the audio stream clearly. For testing, I use a large number of buffers (15) and all of them play through seamlessly, but the AudioQueueOutputCallback is never called, so I never re-queue any of those buffers, despite the fact that everything seems to be working perfectly. If I don't wait for my callback, assuming it will never be called, and instead drive the enqueueing of buffers based on the data as it is written, I can play the audio stream indefinitely, reusing and re-enqueueing buffers as if they had been explicitly returned to me by the callback. It is that fact: that I can play the stream perfectly while reusing buffers as needed, that confuses me the most. Why isn't the callback being called?
Possibly Relevant Code
The format of the stream is 16 bit linear PCM, 8 kHz, Mono:
_streamDescription.mSampleRate = 8000.0f;
_streamDescription.mFormatID = kAudioFormatLinearPCM;
_streamDescription.mBytesPerPacket = 2;
_streamDescription.mFramesPerPacket = 1;
_streamDescription.mBytesPerFrame = sizeof(AudioSampleType);
_streamDescription.mChannelsPerFrame = 1;
_streamDescription.mBitsPerChannel = 8 * sizeof(AudioSampleType);
_streamDescription.mReserved = 0;
_streamDescription.mFormatFlags = (kLinearPCMFormatFlagIsBigEndian |
kLinearPCMFormatFlagIsPacked);
My prototype and implementation of the callback are as follows. Nothing fancy, and pretty much identical to every example I've seen so far:
// Prototype, declared above the class's #implementation
void AQBufferCallback(void* inUserData, AudioQueueRef inAudioQueue, AudioQueueBufferRef inAudioQueueBuffer);
// Definition at the bottom of the file.
void AQBufferCallback(void* inUserData, AudioQueueRef inAudioQueue, AudioQueueBufferRef inAudioQueueBuffer) {
printf("callback\n");
[(MyAudioPlayer *)inUserData audioQueue:inAudioQueue didAquireBufferForReuse:inAudioQueueBuffer];
}
I create the AudioQueue like this:
OSStatus status = 0;
status = AudioQueueNewOutput(&_streamDescription,
AQBufferCallback, // <-- Doesn't work...
self,
CFRunLoopGetCurrent(),
kCFRunLoopCommonModes,
0,
&_audioQueue);
if (status) {
// This is not called...
NSLog(#"Error creating new audio output queue: %#", [MyAudioPlayer stringForOSStatus:status]);
return;
}
And I enqueue buffers like this. At this point, it is known that the local buffer contains the correct amount of data for copying:
memcpy(aqBuffer->mAudioData, localBuffer, kAQBufferSize);
aqBuffer->mAudioDataByteSize = kAQBufferSize;
OSStatus status = AudioQueueEnqueueBuffer(_audioQueue, aqBuffer, 0, NULL);
if (status) {
// This is also not called.
NSLog(#"Error enqueueing buffer %#", [MyAudioPlayer stringForOSStatus:status]);
}
Please save me.
Is this executed on the main thread or a background thread? It is probably not good if CFRunLoopGetCurrent() returns the run loop of a thread that could disappear (a thread pool, etc.) or a run loop that doesn't care about kCFRunLoopCommonModes.
Try changing CFRunLoopGetCurrent() to CFRunLoopGetMain(), or make sure AudioQueueNewOutput() and CFRunLoopGetCurrent() are executed on the main thread or on a thread that you control and that has a proper run loop.
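For example, just swapping in the main run loop (same variables as in the question):
status = AudioQueueNewOutput(&_streamDescription,
                             AQBufferCallback,
                             self,
                             CFRunLoopGetMain(),   // callbacks are delivered on the main run loop
                             kCFRunLoopCommonModes,
                             0,
                             &_audioQueue);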
Try changing self to (void*)self, like this:
status = AudioQueueNewOutput(&_streamDescription,
AQBufferCallback,
(void*)self,
CFRunLoopGetCurrent(),
kCFRunLoopCommonModes,
0,
&_audioQueue);

How to start an audio file from a certain location using AudioQueue?

I have analyzed the "SpeakHere" sample code from the iPhone dev forum.
There is code for starting the AudioQueue, as follows:
AudioTimeStamp ats = {0};
AudioQueueStart(mQueue, &ats);
But I have no idea how to start from the middle of the file.
I changed the AudioTimeStamp with various values, including negative ones, but it does not work.
Please let me know your opinion. Thanks.
AudioQueueStart is not the function that will help you do that. The timestamp there is like a delay; if you pass NULL, it means the queue will start as soon as possible.
You have to pass the frame you want to play from when you enqueue; to calculate that, you have to know the number of frames your file has and the (relative) position you want to play from.
Here are instructions for doing it in SpeakHere.
In the new (objc++ based) SpeakHere
In AQPlayer.h add a private instance variable:
UInt64 mPacketCount;
and a public method:
void SetQueuePosition(float position) { mCurrentPacket = mPacketCount*position; };
In AQPlayer.mm inside AQPlayer::SetupNewQueue() before mIsInitialized = true; add:
// get the total number of packets
UInt32 sizeOfPacketsCount = sizeof(mPacketCount);
XThrowIfError (AudioFileGetProperty (mAudioFile, kAudioFilePropertyAudioDataPacketCount, &sizeOfPacketsCount, &mPacketCount), "get packet count");
Now you have to use it (in SpeakHereControler.mm, add this and link it to a UISlider, for example):
- (IBAction) sliderValueChanged:(UISlider *) sender
{
    float value = [sender value];
    player->SetQueuePosition(value);
}
Why this works:
The playback callback function (an AudioQueueOutputCallback) that feeds the queue with new packets, which in the new SpeakHere is AQPlayer::AQBufferCallback(...), calls AudioFileReadPackets to read and enqueue a certain part of the file. For that task it uses mCurrentPacket, which is exactly what we adjusted in the methods above; hence the part you wanted to play is read, enqueued and finally played :)
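Roughly, the relevant part of that callback looks like this (a simplified paraphrase, not the exact SpeakHere source; the member names are from memory and only illustrative):
void AQPlayer::AQBufferCallback(void *inUserData, AudioQueueRef inAQ, AudioQueueBufferRef inBuffer)
{
    AQPlayer *THIS = (AQPlayer *)inUserData;
    UInt32 numBytes = 0;
    UInt32 numPackets = THIS->mNumPacketsToRead;
    // Read starting at mCurrentPacket -- the value SetQueuePosition() just changed.
    // (Simplified; error handling and end-of-file/looping logic omitted.)
    OSStatus result = AudioFileReadPackets(THIS->mAudioFile, false, &numBytes,
                                           inBuffer->mPacketDescriptions, THIS->mCurrentPacket,
                                           &numPackets, inBuffer->mAudioData);
    if (result == noErr && numPackets > 0) {
        inBuffer->mAudioDataByteSize = numBytes;
        inBuffer->mPacketDescriptionCount = numPackets;
        AudioQueueEnqueueBuffer(inAQ, inBuffer, 0, NULL);
        // Advance so the next callback continues where this read stopped.
        THIS->mCurrentPacket += numPackets;
    }
}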
Just for historical reasons :)
In the old (objc based) SpeakHere
In AudioPlayer.h add an instance variable:
UInt64 totalFrames;
In AudioPlayer.m, inside
- (void) openPlaybackFile: (CFURLRef) soundFile
add:
UInt32 sizeOfTotalFrames = sizeof(UInt64);
AudioFileGetProperty (
[self audioFileID],
kAudioFilePropertyAudioDataPacketCount,
&sizeOfTotalFrames,
&totalFrames
);
Then add a method to AudioPlayer.h and .m
- (void) setRelativePlaybackPosition: (float) position
{
startingPacketNumber = totalFrames * position;
}
Now you have to use it (in AudioViewController, add this and link it to a UISlider, for example):
- (IBAction) setPlaybackPosition: (UISlider *) sender
{
float value = [sender value];
[audioPlayer setRelativePlaybackPosition: value];
}
When value is 0 you will play from the beginning, 0.5 from the middle, etc.
Hope this helps.