Hi, I am using the Twilio Video SDK to implement a video-calling feature in my app. I implemented the video call successfully and video works fine, but no voice is transmitted between the two users.
I added the microphone usage key (NSMicrophoneUsageDescription) to the Info.plist file, but that did not solve the problem. I tried with and without the microphone and a headset, but still no voice.
I can see that the addedVideoTrack function is being called and its print statements are executed, but the addedAudioTrack function is never called at all.
Can somebody provide a solution for this problem or point me in the right direction?
I am using the code from the Quickstart example provided by Twilio.
Here is the link to the tutorial I am referring to.
It turned out to be my own mistake: I had commented out a line, and that line was the one that prepared the audio. I just found it, restored the line, and it worked. Now both video and audio are fully functional.
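For anyone hitting the same symptom, the audio setup in the Twilio Quickstart boils down to creating a local audio track and publishing it when connecting; skip that step and the room connects with video only. A minimal sketch (API names follow Twilio Video iOS SDK 3.x, which drops the older TVI prefix; `accessToken` and `localVideoTrack` are assumed to exist as in the Quickstart):

```swift
import TwilioVideo

// Create a local audio track before connecting. If this line is missing
// (e.g. commented out), no audio is ever published to the room.
let localAudioTrack = LocalAudioTrack()

let connectOptions = ConnectOptions(token: accessToken) { builder in
    // Publish the audio track alongside the video track.
    if let audioTrack = localAudioTrack {
        builder.audioTracks = [audioTrack]
    }
    builder.videoTracks = localVideoTrack.map { [$0] } ?? []
    builder.roomName = "my-room"
}
let room = TwilioVideoSDK.connect(options: connectOptions, delegate: self)
```

With the audio track published, the remote side's addedAudioTrack / didSubscribeToAudioTrack callback should fire.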
Related
I am using TwilioVideo for video calling. My problem is that when I connect to a video room, only the voice works. My code and the issue are shown below.
This is a compilation error: it looks like the addRenderer method does not exist on subscribedVideoTrack. I am not familiar with Swift, but I think you should add the renderer via the publication object.
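One way this is commonly done in the Swift SDK is to attach the renderer in the remote participant delegate callback, where the subscribed track is handed to you directly. A hedged sketch (delegate and method names as in Twilio Video iOS SDK 3.x; `remoteView` is assumed to be a VideoView in your view controller):

```swift
import TwilioVideo

extension ViewController: RemoteParticipantDelegate {
    // Fired once the video track is actually subscribed. The track passed
    // in here is non-nil and supports addRenderer, unlike the bare
    // publication before subscription completes.
    func didSubscribeToVideoTrack(videoTrack: RemoteVideoTrack,
                                  publication: RemoteVideoTrackPublication,
                                  participant: RemoteParticipant) {
        videoTrack.addRenderer(self.remoteView)
    }
}
```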
I am using the attached code (see Code Link) for mixing audio files. But when I use it to mix a recorded voice with an audio file, it does not work for me. How can I resolve this issue? You can see my code via the link: Code Link
I created an app that plays a song and calculates the decibel level of the audio being played. That part works fine.
But I want to make a change: I want to receive the sound from outside (when the user speaks) and calculate its decibel level.
I don't want to record anything, just receive the audio and calculate the decibels.
Any hints or tutorials, please?
You could try using the source code of one of the sample apps (SpeakHere) in the iOS Developer Library as a starting point: http://developer.apple.com/library/ios/#samplecode/SpeakHere/Introduction/Intro.html
I found the source code at https://github.com/jkells/sc_listener_sample, which works without any modifications.
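As a more modern alternative to the SpeakHere sample, a level meter can be built on AVAudioEngine by installing a tap on the input node and computing RMS power, without writing anything to disk. A sketch (the buffer size and single-channel handling are assumptions; request microphone permission before starting):

```swift
import AVFoundation

let engine = AVAudioEngine()
let input = engine.inputNode
let format = input.outputFormat(forBus: 0)

// Tap the microphone input and compute the level in decibels (dBFS).
// Nothing is recorded: each buffer is inspected and then discarded.
input.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
    guard let samples = buffer.floatChannelData?[0] else { return }
    let count = Int(buffer.frameLength)
    var sum: Float = 0
    for i in 0..<count {
        sum += samples[i] * samples[i]
    }
    let rms = sqrt(sum / Float(max(count, 1)))
    let decibels = 20 * log10(max(rms, .leastNonzeroMagnitude))
    print("level: \(decibels) dBFS")
}
try engine.start()
```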
I've been developing an iPhone application that streams audio using Matt Gallagher's audio streamer, found here: GitHub: AudioStreamer
However, I'm having problems when the iPhone loses its internet connection: the stream cuts out and doesn't reconnect until the user presses the play button again. I've even tried using Apple's Reachability classes to automatically stop and reconnect the stream, but this isn't working reliably.
I've also been reading about something called HTTP Live Streaming that can supposedly be used to stream audio on the iPhone. However, I can't find any examples of how to use it, so could anyone help by giving a brief description and pointing me to any sources that might help get this working, please?
Thanks in advance,
Luke
There isn't enough detail for me to answer this entirely, but I use a set of calls to be notified of reachability changes. When I get a failure, I change the play image to stop. I then wait for a notification that the network is back, and then programmatically press play for the user.
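That stop-then-auto-resume approach can be sketched with NWPathMonitor as the reachability source (Network framework, iOS 12+; `player`, `playButton`, and `stopImage` are placeholders for your own streamer and UI):

```swift
import Network

let monitor = NWPathMonitor()
var wasInterrupted = false

monitor.pathUpdateHandler = { path in
    DispatchQueue.main.async {
        if path.status == .satisfied {
            // Network is back: programmatically press play for the user.
            if wasInterrupted {
                wasInterrupted = false
                player.play()  // placeholder for your streamer's play call
            }
        } else {
            // Connection lost: stop the stream and swap play for stop.
            wasInterrupted = true
            player.stop()      // placeholder for your streamer's stop call
            playButton.setImage(stopImage, for: .normal)
        }
    }
}
monitor.start(queue: DispatchQueue(label: "reachability"))
```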
I have used a local notification in my app and successfully generated an alert at the correct time.
But when I tried the following code to play a sound, it does not play:
localNot.soundName = @"Ghulam Ali-Chamkte Chaad Ko.mp3";
Can anyone tell me the reason? Does the playing length of the sound file affect it?
According to the Apple Developer Documentation, you need to use "aiff", "caf" or "wav" files; MP3 is not supported for notification sounds. The length matters too: notification sounds must be shorter than 30 seconds, otherwise the default sound is played instead.
The link provided shows some ways of converting audio to these formats on your Mac.
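For reference, macOS ships a command-line converter, afconvert, that can do this; a hypothetical invocation (the file names here are placeholders):

```shell
# Convert an MP3 to a CAF file (16-bit little-endian PCM), a format
# accepted for local notification sounds. Run this on macOS.
afconvert -f caff -d LEI16 "Ghulam Ali-Chamkte Chaad Ko.mp3" notification.caf
```

Then bundle the resulting .caf file in the app and set soundName to its file name.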