I am using the attached code (Click here) for mixing audio files, but when I use it to mix a recorded voice with an audio file, it does not work for me. How can I resolve this issue? You can see my code via this link: Code Link
I'm using the AoG Trivia sample code (there's so much depth to this code!) so that it's easier for me to grapple with its functions. I'm trying to create audio-only questions (I host .ogg files in a GCP bucket), but when I use the .audio method from ssml.js, it fails to use the URL to speak the .ogg file. Is there a special way to enter questions in the question.json file that are URLs to audio files? I checked that the SSML was valid using the simulator.
Thanks for your help!
OK, so my bad: in the code I was leaving out AUDIO_BASE_URL, which is used to point to where the hosted audio files are in Firebase. However ... a new problem has arisen, but I'll close this question. (I get different behaviour playing the audio on the simulator & Google Assistant on Android vs. Google Home, coupled with some intermittent network time-outs. I've raised it with Google.)
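For the record, the SSML that ends up being generated needs a full HTTPS URL inside the <audio> element. A minimal sketch of what that looks like (the bucket and file name below are placeholders, not from the sample):

<speak>
  Question one:
  <audio src="https://storage.googleapis.com/my-trivia-bucket/question1.ogg">
    Sorry, the audio clip could not be played.
  </audio>
</speak>

The text inside the <audio> element is the fallback that is spoken if the clip cannot be fetched or played.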
Hi, I am using the Twilio Video SDK to implement a video-calling feature inside my app. I implemented the video call successfully and the video works, but the voice is not transmitting between the two users.
I added the Microphone Usage key to the Info.plist file, but that did not solve the problem. I tried with and without the microphone and headset, but there is no voice.
I see that the addedVideoTrack function is being called and its print statements appear, but the addedAudioTrack function is not being called at all.
Can somebody provide a solution to this problem or point me in the right direction?
I am using the code from the Quickstart example provided by Twilio.
Here is the link to the tutorial I am referring to.
It was my mistake: I had commented out the line that prepares the audio. Once I found that and restored it, the audio worked. Now both video and audio are fully functional.
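For anyone who hits the same thing: audio is only published if the local audio track is actually created and added to the connect options. Roughly like this (a sketch assuming the TVI-prefixed 2.x API from that era's Quickstart, inside a view controller that conforms to TVIRoomDelegate and has room/accessToken properties; exact names differ between SDK versions):

#import <TwilioVideo/TwilioVideo.h>

- (void)connectToRoom {
    // Prepare the local audio track. If this step is commented out,
    // no audio is published to the room even though video still works.
    TVILocalAudioTrack *localAudioTrack = [TVILocalAudioTrack track];

    TVIConnectOptions *options =
        [TVIConnectOptions optionsWithToken:self.accessToken
                                      block:^(TVIConnectOptionsBuilder *builder) {
            // Publish the audio track alongside any video tracks.
            builder.audioTracks = localAudioTrack ? @[ localAudioTrack ] : @[];
            builder.roomName = @"my-room"; // placeholder room name
        }];

    self.room = [TwilioVideo connectWithOptions:options delegate:self];
}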
I'm using Matt Gallagher's AudioStreamer to perform audio streaming in my app. The URL comes from the server; right now it is in .m3u8 format, and that is when the problem appears. Before, it was of the .mp3 type and it streamed normally. It shows the error "No audio data found". However, I tried playing the URL in the Safari browser on the simulator and it plays fine, so there's no problem with the URL.
I've been searching Google for a long time, but I only ended up with these two SO questions: Question-1 and Question-2. Neither has the answer; one does have a solution using MPMoviePlayerController, but I want to stream it with the same AudioStreamer I already have.
So I dug into the AudioStreamer .h and .m files, where I found that the file-type selection logic is at line no. 555:
+ (AudioFileTypeID)hintForFileExtension:(NSString *)fileExtension { .... }
An AudioFileTypeID needs to be returned there, but the list defined in AudioFile.h of AudioToolbox.framework doesn't contain an .m3u8 file type, so I can't return one here (I tried patching in different types there).
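That method boils down to mapping a file extension onto one of the AudioFileTypeID constants from AudioFile.h, roughly like this (paraphrased from memory, not the exact AudioStreamer source):

+ (AudioFileTypeID)hintForFileExtension:(NSString *)fileExtension {
    AudioFileTypeID fileTypeHint = kAudioFileAAC_ADTSType; // default hint
    if ([fileExtension isEqual:@"mp3"]) {
        fileTypeHint = kAudioFileMP3Type;
    } else if ([fileExtension isEqual:@"wav"]) {
        fileTypeHint = kAudioFileWAVEType;
    } else if ([fileExtension isEqual:@"aiff"]) {
        fileTypeHint = kAudioFileAIFFType;
    } else if ([fileExtension isEqual:@"m4a"]) {
        fileTypeHint = kAudioFileM4AType;
    } else if ([fileExtension isEqual:@"caf"]) {
        fileTypeHint = kAudioFileCAFType;
    }
    // There is no kAudioFile...Type for .m3u8: a playlist is text listing
    // segment URLs, not an audio container, so Audio File Services cannot
    // parse it and "No audio data found" is reported.
    return fileTypeHint;
}

Since .m3u8 is an HLS playlist rather than an audio container, there seems to be no constant I could return for it.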
I tried to find alternative types that could be used instead, but got no results. Then I went through the Apple docs and the issues discussion, but none of it helped me.
P.S. I've checked AudioToolbox.framework in iOS 6.0 for the availability of such a file type, but it doesn't exist at all.
Any solution?
You can try MPMoviePlayerController. I am also using it for playing streaming audio in one of my apps. For streaming content (like audio/video from web services or from the internet) it is a good fit, and it also looks like the iPhone's default player. Search for a tutorial on it and implement it; it is easy to implement.
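Something along these lines (a minimal sketch; it assumes a view controller with a strong moviePlayer property, and the stream URL is a placeholder):

#import <MediaPlayer/MediaPlayer.h>

// Keep a strong reference (e.g. a property), otherwise the player is deallocated.
NSURL *streamURL = [NSURL URLWithString:@"http://example.com/stream/playlist.m3u8"];
self.moviePlayer = [[MPMoviePlayerController alloc] initWithContentURL:streamURL];
self.moviePlayer.movieSourceType = MPMovieSourceTypeStreaming;

// For audio-only streams the view can be small or hidden; adding it gives
// you the standard transport controls if you want them.
self.moviePlayer.view.frame = self.view.bounds;
[self.view addSubview:self.moviePlayer.view];

[self.moviePlayer prepareToPlay];
[self.moviePlayer play];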
Hi, I've seen many tutorials but couldn't find a workable answer. I need to play audio and video files from my web service; for this I want to download the file and play it in the app. I've seen tutorials for AVPlayer, but nothing workable. Please provide a link or a guide on how to do this.
Use MPMoviePlayerController, as it can open network streams and, despite the name, works with audio. The only catch is that it pops up the modal player with controls.
First download the network file using NSURLConnection and store it locally, then use AVAudioPlayer to play the local file.
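For example, something like this (a sketch; the URL is a placeholder, it assumes a strong audioPlayer property on the owning object, and a real app would add proper error handling):

#import <AVFoundation/AVFoundation.h>

NSURL *remoteURL = [NSURL URLWithString:@"http://example.com/audio/song.mp3"];
NSURLRequest *request = [NSURLRequest requestWithURL:remoteURL];

[NSURLConnection sendAsynchronousRequest:request
                                   queue:[NSOperationQueue mainQueue]
                       completionHandler:^(NSURLResponse *response, NSData *data, NSError *error) {
    if (error != nil || data == nil) {
        NSLog(@"Download failed: %@", error);
        return;
    }

    // Write the data to the Documents directory so it can be reused offline.
    NSString *docs = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES)[0];
    NSString *localPath = [docs stringByAppendingPathComponent:@"song.mp3"];
    [data writeToFile:localPath atomically:YES];

    // Play the local file. Keep a strong reference to the player,
    // otherwise it is released before it finishes playing.
    NSError *playerError = nil;
    self.audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:[NSURL fileURLWithPath:localPath]
                                                              error:&playerError];
    [self.audioPlayer play];
}];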
I have used a local notification in my app and successfully generated an alert at the correct time.
But when I tried the following code for playing a sound, it does not play:
localNot.soundName = @"Ghulam Ali-Chamkte Chaad Ko.mp3";
Can anyone tell me the reason? Does the playing length of the sound file affect it?
According to the Apple Developer Documentation, you need to use "aiff", "caf" or "wav" files. The length matters too: notification sounds longer than 30 seconds are not supported, and the default sound plays instead, so a full song will not work regardless of format.
The link provided shows some ways of converting audio to these formats on your Mac.
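For example (a sketch using UILocalNotification; "alert.caf" here stands in for your converted file, which must be bundled with the app):

UILocalNotification *localNot = [[UILocalNotification alloc] init];
localNot.fireDate = [NSDate dateWithTimeIntervalSinceNow:60]; // fire in one minute
localNot.alertBody = @"Time's up!";
// The sound must be an .aiff, .caf or .wav file in the app bundle,
// and shorter than 30 seconds -- otherwise the default sound plays.
localNot.soundName = @"alert.caf";
[[UIApplication sharedApplication] scheduleLocalNotification:localNot];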